8.2 Learning and generating knowledge from monitoring and evaluation

Accountability for learning

The increasing focus of UNDP on MfDR and outcomes has shifted attention from inputs, outputs and processes to development results at the outcome level. When the focus is on outcomes, which are influenced by multiple factors and are beyond the direct control of UNDP, the traditional view of assigning accountability to individuals for delivering outputs is no longer adequate. Accountability for outcomes encompasses RBM. Learning constructively from past mistakes and experiences is a critical part of MfDR and the UNDP accountability framework.

Monitoring and evaluation can play a significant role in the accountability process only if measures to enhance learning are put in place. Through regular exchange of information, reporting, knowledge products, learning sessions and the evaluation management response system, information from monitoring and evaluation can be fed back into learning and planning. To contribute meaningfully to accountability for outcome achievement and to encourage innovation for better results, UNDP needs to focus on learning from monitoring and evaluation.

Using knowledge in planning and programming

One of the most direct ways of using knowledge gained from monitoring and evaluation is to inform ongoing and future planning and programming. Lessons from evaluations of programmes, projects and initiatives, together with management responses, should be available when new outcomes are being formulated or when projects or programmes are identified, designed and appraised. When revising or developing new programmes, projects, policies, strategies and other initiatives, UNDP should convene a consultative meeting with key partners and stakeholders to review and share evaluative knowledge in a systematic and substantive manner.

Institutionalization of the learning process can be achieved in part by better incorporating learning into existing tools and processes. Knowledge from monitoring and evaluation should be incorporated in the following:

  • Project revisions—Monitoring and evaluation should together answer a number of useful questions: whether the project initiatives are relevant to development needs, whether implementation is on track (outputs are being delivered on time), whether the strategy and logic of the results chain are working, whether the partnership strategy is efficient, and whether the project is reaching its target beneficiaries as intended. In addition to answering these questions, evaluation provides information as to ‘why’ things are or are not working. Such information should feed into improvements to the project strategy and trigger adjustments in a timely manner. When budget or other revisions are made to the project document, the lessons behind such changes should also be stated. Good documentation of lessons and their internalization in project revisions help UNDP and its partners manage for results and foster a culture of systematic learning.
  • Replication and up-scaling—Evaluation of pilot initiatives is a must before such initiatives are replicated or scaled up. Lessons on what has and has not worked should inform the replication process. Again, good documentation of lessons and their internalization in the replication and up-scaling processes will help UNDP and its partners ensure that mistakes are not repeated.

Monitoring and evaluation lessons should be incorporated into the formulation of:

  • New programme documents—Country, regional and global programmes are formulated taking into account results achieved and lessons learned from regular reporting tools, internal reviews and relevant evaluations, including project and outcome evaluations and independent evaluations conducted by the Evaluation Office, such as the ADR, which looks at the UNDP contribution towards development results in a given country. The evaluations of the regional and global cooperation frameworks should also provide substantive inputs to the design of the respective programmes. It is also helpful to consult reviews and evaluations conducted by UNDP partners and non-partners in a similar thematic or subject area to find out whether any lessons can be drawn from their experiences. It is good practice to document the sources of such evaluative information in a programme document, both for future reference and for transparency. Members of the Programme Appraisal Committee should ensure that there is clear evidence that relevant independent and decentralized evaluations have been used in the formulation of new programme documents.
  • Project documents or AWPs—Project documents should include reference to the findings of relevant reviews or evaluations in the situation analysis section. Members of the Project Appraisal Committee should ensure compliance with this requirement by requesting explicitly which evaluation findings and lessons have informed the project design.

Box 42. Experience from the Nepal country office: Using evaluations in the CPD and project design
The Nepal country office has been making a concerted effort to learn from and use evaluations. Most recently, in preparation for the development of the new CPD (2008-2010), the office reviewed all outcome evaluations under the current programme, project evaluations from 2006 (approximately eight were conducted), and other reviews and assessments conducted between 2003 and the end of 2006. The office synthesized the main findings and recommendations—focusing on the recurring points, common lessons and most relevant issues for the development of the new programme—into a 40-page document that was used as a reference while preparing the CPD. The office has also referred to it and shared relevant sections summarizing lessons learned when discussing joint programming or collaboration possibilities with other UN organizations.

The country office uses evaluations, particularly project evaluations, when preparing successor projects or extensions. It has developed a checklist for the approval of new projects and substantive revisions, which includes a section for the monitoring and evaluation team. In addition to checking the monitoring and evaluation sections of the narrative, the results frameworks and other monitoring tools, the monitoring and evaluation unit in the office reviews any recent evaluation together with the project document to ensure that relevant recommendations have been incorporated in the new project or revision.

Source: UNDP Nepal—extract from contribution to the EvalNet discussion, June 2007
 
Contribution to national, regional and global knowledge in development and evaluation 

As a partner in development, UNDP should ensure that its evaluations contribute to a better understanding of development effectiveness in the development community beyond UNDP. Key findings, conclusions and recommendations from evaluations should be widely shared and made available to potential users, as dissemination to audiences beyond UNDP and its immediate stakeholders can increase the impact of evaluations in important ways. However, users often find evaluation reports too long and not easily accessible. Lessons and knowledge from evaluations can therefore be ‘packaged’ as knowledge products to meet the needs of a wider audience.

In order to target a broader audience effectively, there should be a thorough analysis of who the potential users of evaluation knowledge and lessons are, what they do, what their information needs are, how they learn, and what kinds of communication and knowledge products are most suitable for sharing knowledge with them. The commissioning programme unit should designate an individual (for example, a communications officer or knowledge management officer) to lead the process and coordinate activities to ensure effective sharing and dissemination of evaluation reports, lessons, knowledge and knowledge products.60

There are numerous ways to share information from evaluations. Below are some examples:

  • Upload evaluation reports and other knowledge products based on evaluations to the organization’s public websites. Ensure that the reports and knowledge products are written clearly and made available in the most commonly used local languages.
  • Organize a meeting with interested stakeholders to discuss lessons from the evaluation(s).
  • Incorporate evaluation findings and lessons learned in the organization’s existing publications, such as annual reports, newsletters or bulletins.
  • Present findings and lessons at annual stakeholder meetings, such as CPAP review meetings, and at forums with the media.
  • Develop a brochure on UNDP activities and accomplishments.
  • Develop a brief with a concise summary in plain language and circulate it widely. UNDP may include the development of a brief in the ToR of the evaluators. Alternatively, the evaluation manager or a UNDP communications officer may develop it in consultation with the evaluators.
  • Publish an article in an academic journal based on the evaluation findings.
  • Present a paper at a conference related to the evaluation subject area.
  • Invite local researchers and academics to discuss the data collected for the evaluation, or the methodology and methods applied in it. This effort can also be supported by the evaluators.
  • Share findings, recommendations and lessons learned at training sessions and workshops for UNDP staff, government counterparts and other partners. Training should focus on areas such as improving the quality of UNDP programmes and projects and developing skills in methodological innovation.
  • Share lessons through knowledge networks within and beyond UNDP. For BDP, feed lessons into practice notes and other knowledge products developed by the policy and practice bureaux and units in Headquarters.

It is critical to make information from evaluations user-friendly, easily accessible and relevant to the audience. The following section provides guidance on how to develop a useful knowledge product.