Development evaluation

Good practice examples

The following are good practice examples of evaluation reports, terms of reference, evaluation plans and management responses. These examples were identified as part of the Review of Quality of 2022 Evaluations and the Review of 2017 Program Evaluations.

PNG Transport Sector Support Program Phase 2 (2022)

The evaluation report is clear, with a logical layout. There are eight appropriate evaluation questions, which are clear and specific: three relate to program design and five to program implementation. The report is structured around the evaluation questions, which makes it easy to navigate. The authors clearly answer each question, bolding the key points that directly answer it. There is a description of the adequacy of the program's M&E system, and some evidence that the report draws on that system, particularly in relation to the achievement of outcomes. The methodology is appropriate, and the review draws on a range of sources with a coherent analysis of the evidence. The executive summary is clear and concise, providing the information needed for future decision-making.

PNG-Australia Transport Sector Support Program Review 2022

Australia, New Zealand, International Finance Corporation: Papua New Guinea Partnership (2021)

The scope of this mid-term review (MTR), as detailed in the terms of reference (TOR), was well suited to the time and resources available for the evaluation. The purpose described in the TOR was clear. The evaluation used mixed methods to assess the partnership, incorporating quantitative and qualitative measures. The methods were appropriate for the task and well explained in the TOR.

Terms of Reference (see Annex 6.1)

The MTR report was structured around the six key evaluation questions and related sub-questions. It addressed the questions in a succinct but comprehensive way, identifying key strategic issues while also providing relevant operational and technical detail in the body of the report and in accompanying annexes. The consultant's evaluative judgements were clear and backed by evidence. The report was well written and not overly long (28 pages for the main report, plus annexes).

Midterm evaluation of the Australia, New Zealand, International Finance Corporation: PNG Partnership

Justice Services and Stability for Development Program (JSS4D) (2022)

The TOR for this evaluation were good, including a list of evaluation questions which were amended after discussion with the evaluation team. The scope of the evaluation was suitable for the time and resources available (90 days in total for the three-person team). The roles and responsibilities of the team members were clearly set out in the evaluation plan, which was detailed and comprehensive and appended to the main report.

Terms of Reference (see Annex C)

The methodology described in the evaluation plan was of very high quality. It described the analysis and triangulation of data from a variety of sources across the evidence matrix to make a judgement about the strength of the evidence. The evaluation used purposive sampling of stakeholders for interview, noting the option to use snowball or referral sampling where needed. The evaluation report included a table of possible limitations and was also clear on the limitations of evidence from the program MERL system. The evaluation plan included a section on safety and ethical practice, which covered cultural issues and the confidentiality of interviewees, and noted procedures for the safekeeping of data.

Evaluation plan (see Annex D)

Overall, the report was well structured around the evaluation questions, each of which was clearly answered. The presentation of evidence was credible and convincing, and the findings clearly flowed from the evidence and narrative. Where there were limitations in the data in specific areas, this was made clear in the text. Given the lack of evidence from the MERL system for assessing achievements against end-of-program outcomes (EOPOs), the evaluation team compared their own assessments of progress, based on documentary and stakeholder evidence, with the program's assessment of progress, in order to validate the program's reporting and assessments.

Australian Infrastructure Financing Facility for Pacific (AIFFP) (2022)

This evaluation was of very high quality in nearly every respect. The purpose and scope were very clear, and the methodology was particularly sound. Of particular value was the use of an Evaluation Framework, which set out the evidence required to answer each key evaluation question, the data sources, the data collection methods and the analysis approach. Document review and interviews were the primary sources, supplemented by landscape analysis and benchmarking exercises; the comparison to other financing initiatives was useful. Documents and interviews were coded against the key indicators required to answer the evaluation questions, and NVivo software was used to support the analysis.

This is a good practice example of being clear about data analysis approaches, and it is easy to see how the data was intended to be triangulated. Certain factors were excluded where there was insufficient evidence to inform robust analysis, and this was noted. There was third-party verification of the report and validation of the findings, and limitations were clearly identified. Ethical considerations were also included, covering informed consent, privacy and confidentiality.

Australian Infrastructure Financing Facility for the Pacific Two-Year System-Wide Review and Management Response

Penyediaan Air Minum dan Sanitasi Berbasis Masyarakat (PAMSIMAS) (2022) (Community-based Rural Water Supply and Sanitation Program)

The report clearly set out the evidence for its conclusions, with a clear line of sight along the chain of evidence. Notably, the evaluation interrogated the program MIS data where that data was not consistent with the evaluation's own data collection, and made recommendations for a future MIS. Evidence for the evaluation was collected from a range of sources, including program MIS data, key informant interviews, focus group discussions, a small household survey, water quality testing and infrastructure quality checks. Data was triangulated to draw evidence-based conclusions.

Independent Evaluation of the Community-based Rural Water Supply and Sanitation (PAMSIMAS) Program

ASEAN Australia Smart Cities Trust Fund (AASCTF) (2022)

This evaluation is very clearly structured, making it easy to find information in the findings. The evaluation questions are appropriate and well answered, with a balance between operational and strategic issues. The conclusions are easy to read and flow well from the findings. A strength of the report is the executive summary, which clearly summarises the findings, conclusions and recommendations with no significant information gaps. At three pages, it is proportionate to the length of the 36-page report.

ASEAN-Australia Smart Cities Trust Fund: Mid-term Review Report and Management Response

The TOR are provided in the annexes and are a good practice example. The background is a good summary of a complicated program implemented in multiple locations. The evaluation questions are prioritised, and the sub-questions are clear and relevant. The section on the requirements for the aide memoire is clear.

Terms of Reference (see Annex 5)

Independent Evaluation of the Australian NGO Cooperation Program (ANCP) (2022)

The evaluation considers the issue of the Australian NGO Cooperation Program modality from a range of perspectives. The evaluation report answers each of the five key evaluation questions and is structured around them, drawing out different perspectives well. The evaluation's findings are based on strong evidence: the key findings are triangulated across multiple reliable sources (including the 2015 ODE evaluation, international literature, project documents and stakeholder consultations), and where there is a lack of evidence, this is noted. A range of formats was used to present data, particularly in the annexes.

Tropical Cyclone Pam Recovery Program Evaluation (2018)

The evaluation report synthesises a number of complex program and policy matters in an easy-to-read, well-structured and coherent way. The report uses a range of formats to present findings and conclusions, such as graphs, text boxes, quotes and photos. There is a clear line of sight between lines of enquiry, findings, evidence, conclusions and recommendations. Although some data needed to measure the program's impact is not available due to weaknesses in the monitoring system, findings and recommendations are well supported by a good balance of quantitative and qualitative evidence. Another significant strength of this evaluation is its use in informing Australia's subsequent recovery efforts in response to the Ambae volcano crisis in Vanuatu.

Humanitarian assistance to Myanmar (2017)

The Terms of Reference provide a clear description of the purpose, background, timeframes and deliverables. Key evaluation questions and methods are outlined, with a note that these will be refined in consultation with the evaluation team. The scope is suitable for the time and resources available for the evaluation. The TOR could be strengthened by outlining the roles and responsibilities of the two DFAT officers who are part of the team and by stating that the final evaluation report will be published on the DFAT website.

A clearly presented and well-structured evaluation plan that covers evaluation use, lines of enquiry, methods, limitations, and the evaluation team's roles and responsibilities. Annexes provide a matrix of data sources, collection methods and relevant tools against each evaluation question, and include interview and focus group discussion guides.

This is a good quality report: clear, well written and well structured. Evaluation questions are addressed well. There is a clear upfront statement of findings, followed by supporting evidence from multiple sources and/or methods. Data sources are clearly identified, including numbered interviews. Recommendations are clearly linked to key findings in table form.

The management response is concrete and provides intended actions and timeframes. It specifies who within DFAT will take things forward, where appropriate by work area and post (see page 6).

Pacific Women Shaping Pacific Development: Fiji Country Plan Review (2017)

The evaluation plan presents the key elements of an evaluation in a succinct and accessible manner. It demonstrates triangulation well by presenting a matrix that clearly shows multiple sources of evidence and methods for each evaluation question. The plan could have been improved by outlining a sampling strategy and addressing any ethical issues.

This is a well-researched, well-presented report. Its strengths include: evidence of change presented at program, partner and beneficiary levels; findings relevant to specific sub-groups (e.g. the most vulnerable); and good analysis of contributing, enabling and constraining factors and of lessons learned. The report provides a good analysis of the program's M&E system and recommendations for improving it. Questions about the degree of independence (the evaluation was led by staff from the program's regional Support Unit) were offset by the balanced presentation of evidence, which was verified through a collaborative workshop with a range of stakeholders.

Fiji Community Development Program (2017)

These Terms of Reference clearly set out the background, purpose and scope, including a detailed timeline, an explanation of expected outputs and the team make-up. They attach key documents and refer to the DFAT M&E standards. The evaluation questions are clear and balanced, and the number and type of questions match the time available and the team's capacity.

Fiji Community Development Program (2017) Terms of Reference [PDF 595 KB]

A comprehensive evaluation plan which provides sound justification for the methodology proposed and clearly outlines the methods and data sources to be used for each evaluation question in a table format. The evaluation plan includes a clear sampling strategy and a framework to guide analysis of data. There is a comprehensive discussion of limitations, including identification of mitigation strategies. Interview guides are provided in the Annex.

This is a high-quality report, presented in a clear and methodical manner. The use of evidence is rich. Each finding is presented in a balanced way and supported with evidence from a variety of sources. The position of the author is unambiguous. There is good analysis of the risks that the context presents and how these impact on the program's performance. The recommendations are clear, actionable and fit for purpose.

Pakistan Trade and Investment Policy Program (2017)

This is a good example of Terms of Reference for a jointly conducted evaluation. The roles and responsibilities of DFAT and World Bank staff in the evaluation are clearly outlined. The Terms of Reference clearly state DFAT's evaluation requirements, including the need to publish the report and comply with the DFAT M&E Standards.

Terms of Reference (see Annex 1, page 18 of the evaluation report)

A clear, concise report, supported by a good stand-alone executive summary and annexes that provide further detail to substantiate the findings. Findings are clearly supported by evidence from a range of sources, including data from the M&E system where available. The evaluation addresses an appropriate balance of operational and strategic performance issues. The evaluators clearly state their position on program progress. Conclusions and recommendations flow logically from the findings.
A good example of a management response for a jointly conducted evaluation. The actions being taken by both DFAT and the World Bank to address the recommendations are clear, and include additional actions by DFAT to improve program monitoring and performance. Most actions are ongoing, while others have specific timeframes for implementation. Clear explanations are provided for why two recommendations were only partially accepted.

Pakistan Trade and Investment Policy Program (2017) Management Response

Timor-Leste National Program for Village Development (2017)

The evaluation plan's discussion of the approach, users, and the methods for data collection, sampling and analysis is of good quality. The methods and tools to be used for each evaluation question are clear, and the annexes include detailed interview and focus group guides tailored to different stakeholder groups. The evaluation plan could be improved by further discussion of limitations and ethical issues, and by allocating tasks to team members.

Timor-Leste Nutrition Strategic Review (2017)

The evaluation plan was developed in close collaboration with the primary audience. It is a comprehensive document that covers all the key elements of an evaluation plan. Limitations, ethical issues and risks are well addressed.
