We designed and implemented an evaluation of the CHCI and the EHSDI that was both formative and summative and that focused on implementation issues and impacts. As we approached the evaluation design and assessed the feasibility of addressing the evaluation objectives, it became apparent that specific issues and limitations influenced the approach to evaluation:
- There was little documented program theory or program logic developed specifically for the programs, particularly for the CHCI. In particular, there was a noticeable absence of documented program context and assumptions, which can be valuable tools for informing evaluation design.
- There was an absence of documented policy on the CHCI, limiting our ability to establish the theory, context and assumptions for the program. The stated objectives for the program were process-orientated rather than outcome- or impact-orientated.
- The context for the EHSDI was rich and complex, having evolved over some time. There were at least two sets of stated objectives for the program, reflecting its early state and the evolving nature of development and implementation. There is some consistency between the two sets of objectives, with a focus on expanding services and moving towards regional approaches to service delivery. The first set of objectives also has a focus on the RAHC while the second set introduces quality improvement, equitable distribution of resources and has a stronger focus on increasing Aboriginal involvement in delivery, management, and control of PHC services (see Appendix A).
- The EHSDI was seen by the evaluation partner organisations to be part of a wider process of reforming the NT PHC system that had been progressing for some years, and consequently the evaluation needed to consider this wider context.
- The evaluation of the CHCI needed to add value to the program’s existing monitoring and reporting processes, undertaken by AIHW.
- There was a strong appetite among the evaluation partners for a formative approach to the evaluation of the EHSDI—an approach that would support ongoing improvements to the implementation of the program.
- Not surprisingly, the evaluation partners’ needs and expectations of the evaluation differed. DoHA required both a summative evaluation of the CHCI that ‘completed the story’ on this program, and a formative evaluation of the EHSDI that would support ongoing improvements. The other partners advocated a greater focus on the formative evaluation of the EHSDI. Both sets of expectations had to be managed within the design and execution of the evaluation.
- The evaluation considers the trajectory of the NT health system with regard to child health and primary health care and assesses the impact, if any, of the CHCI and the EHSDI on this trajectory.
- The evaluation draws on and synthesises performance and impact from multiple levels (including NT, regional, community, and individual) and perspectives, including governance, management, funders, service providers and service users.
2.1.1 The CHCI: a summative evaluation

The overall evaluation approach for the CHCI is a summative assessment that documents and reports on the impacts and lessons from the program. It can be used to support improvements to wellness checks and child health more generally.
The approach recognises that many of the impacts of the CHCI, such as coverage, diagnosis of health conditions, health status and treatment, are either largely addressed by existing CHCI monitoring and reporting processes, or were not feasible to assess given the short time that has elapsed between the program and this evaluation. The overall approach draws on existing quantitative analyses and includes some additional analyses of NT data collections. It also draws on information about how the child health checks and follow-up services were run in different communities, and what else was happening in these communities at the time, to help contextualise the quantitative analyses.
2.1.2 The EHSDI: a formative evaluation

The evaluation of the EHSDI is based on a formative assessment. It focuses on current experience that can be used to improve the ongoing implementation of the program. This approach recognises that the implementation of the EHSDI is evolving and developing. An evaluation approach that is flexible and responsive to the dynamic nature of the program keeps the findings relevant and credible. To be formative, the evaluation has had to interact with program implementation. This means the evaluation team has engaged with those responsible for implementing the program at regular intervals to discuss and agree on evaluation questions, identify and discuss current issues affecting implementation, and share interim evaluation findings.
While the initial objectives for the EHSDI evaluation remain important to this formative approach, the way we have considered these objectives and the priority we have given them have evolved since the evaluation design phase. This evolution reflects what we have learnt from our interaction with the program and with those implementing it, including current implementation issues that the evaluation could assist with and systems planning frameworks that the evaluation could usefully inform. As a result, the evaluation questions for the EHSDI are generally more focused on analysis of implementation processes and on the potential for the EHSDI to affect health services in the future, rather than on assessing actual outcomes and impacts. The analysis draws on program data and progress reports, and on qualitative information from informant interviews and from participants at the regional and community levels about how the EHSDI is being implemented.
The formative approach required that emerging findings be identified, discussed and reported back to those responsible for implementing the EHSDI in real time, so that the findings remained relevant to the program at that time. To accomplish this, we held two workshops and produced reports for participants from the partner organisations—DoHA, DHF and AMSANT. The findings included in these two workshop reports are not repeated in this final evaluation report. However, the issues raised at the workshops have informed the analysis in this report, and we have referred to the workshops as sources of information where appropriate.