TERMS OF REFERENCE
Review of the functionality of the evaluation management response process in UNICEF
Consultancy: Two positions, Evaluation Consultant
Location: Evaluation Office, New York
Duration: Three months (from mid-December 2015 to mid-March 2016)
Background and Context: Management Responses to Evaluations
UNICEF offices complete between 85 and 120 evaluations per year. Since 2009, it has been a requirement that each evaluation be followed by an evaluation management response (EMR) in which management details the actions it will take in reaction to the findings and recommendations. These commitments are recorded and subsequently tracked in a Management Response Database (MRDB). This process is governed by guidance and standard operating procedures. The EMR system operates by self-reporting: offices upload their intended MRs and subsequently update the actions taken and whether each intended action has been completed. The results are reported to UNICEF’s Executive Board and to internal users on a periodic basis.
A summary of the logic and guiding policy for MRs is contained in the final two pages of this document. This is drawn from the guidance at the introductory pages of the MRDB. Additional written guidance exists. This summary makes reference to three other documents that provide additional insight. Those documents may be viewed here by following this link: http://www.unicef.org/evaluation/index_86334.html
Periodically, UNICEF undertakes major reviews of some or all aspects of its evaluation function. In 2016, UNICEF intends to request that an external peer review be conducted, potentially led by the UN Evaluation Group. In preparation for this peer review, a set of assessment exercises is being undertaken under the guidance of the UNICEF NY Evaluation Office. This review will be one of that set of exercises. It will be an independent look at the operation of the management response process, focusing on the actions taken by offices to implement commitments made in the MRs.
The purpose of the review is to generate evidence and analysis on the performance of the UNICEF evaluation management response system and to advise UNICEF on the steps required to strengthen it. This will require a review of compliance with the requirements of the system, an assessment of user engagement with the system, and the preparation of recommendations concerning actions to make the system more timely, comprehensive, and impactful. Comparison with similar systems is expected to yield suggestions for adjustments required to strengthen the system and increase its ease of use across the organization.
The remainder of this Terms of Reference details the Scope of Work and the other information necessary for interested persons to submit an application to be considered for this work.
Scope of Work
- Develop a methodology for testing how much of each promised MR has been completed, under these parameters: an easy-to-understand grading system is developed and employed that conveys how completely the MR has been satisfied.
The grading system is to include at least three elements (an illustrative scoring sketch follows this list):
Quantity: what percentage of the intended response is completed
Quality: whether the response is a suitable action to take
Timing: when it was done
Qualitative information about user experience and satisfaction with the MR follow-up and the MR database is needed. The offices accountable for the evaluation may be contacted and asked to describe their actions.
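To make the grading parameter concrete, the following is a minimal Python sketch of how such a grading record might be structured. All field names, scales, thresholds, and the aggregation rule are illustrative assumptions, not requirements of this Terms of Reference; the actual grading system is to be proposed in the inception report.

```python
# Illustrative sketch only: field names, scales, and the aggregation rule are
# assumptions for discussion, not requirements of this Terms of Reference.
from dataclasses import dataclass

@dataclass
class MRGrade:
    quantity_pct: float      # share of the intended response completed (0-100)
    quality_ok: bool         # was the action taken a suitable response?
    months_to_complete: int  # elapsed time from MR upload to completion

    def overall(self) -> str:
        """Collapse the three elements into a simple, easy-to-read grade."""
        if self.quantity_pct >= 90 and self.quality_ok and self.months_to_complete <= 12:
            return "fully satisfied"
        if self.quantity_pct >= 50:
            return "partially satisfied"
        return "not satisfied"

# Example: an MR 75% complete, of suitable quality, finished after 18 months
print(MRGrade(quantity_pct=75, quality_ok=True, months_to_complete=18).overall())
```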
- Apply this methodology to both global-level evaluations (those commissioned by the Evaluation Office in UNICEF HQ) and field-level evaluations (those from regional and country offices) contained in the MR database. The MRs for all global evaluations completed since 2011 should be included in the review.
A statistically valid random sample of field office evaluations completed since 2010 should be drawn; an illustrative sampling sketch is given below.
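As one possible reading of "statistically valid random sample", the sketch below computes a sample size for estimating a completion proportion (with a finite population correction) and then draws a simple random sample. The confidence level, margin of error, population size, and evaluation identifiers are assumed figures for illustration only; the actual sampling design is left to the inception report.

```python
# Illustrative only: the confidence level, margin of error, and population size
# below are assumptions; the actual sampling design is for the inception report.
import math
import random

def sample_size(population: int, margin: float = 0.10, z: float = 1.96, p: float = 0.5) -> int:
    """Sample size for estimating a proportion, with finite population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# e.g. ~600 field-office evaluations completed since 2010 (hypothetical figure)
population = 600
n = sample_size(population)                                    # roughly 83 with these assumptions
evaluation_ids = [f"EVAL-{i:04d}" for i in range(population)]  # placeholder identifiers
sample = random.sample(evaluation_ids, n)                      # simple random sample without replacement
print(n, sample[:5])
```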
- On the assumption that offices will be ‘inspired’ to become more active with their MRs as a result of the present assessment, develop a process to measure how much unexpected new activity occurs due to this work. It can be assumed that the Evaluation Office can provide a copy of the MR Database at two or more points in time so that the change in activity levels can be measured. For purposes of contrast, MRs not investigated in this exercise will serve as a control; a simple sketch of such a snapshot comparison is given below.
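The sketch below illustrates one way the two (or more) MRDB snapshots could be compared: count, separately for the MRs investigated in this exercise and for the control MRs, how many records show changed activity between the snapshots. The snapshot format, field names, and status values are assumptions; the real MRDB extracts may be structured quite differently.

```python
# Illustrative only: the snapshot format, field names, and the set of reviewed MRs
# are assumptions; the real MRDB extracts may be structured quite differently.
def count_new_activity(snapshot_t0: dict, snapshot_t1: dict, mr_ids: set) -> int:
    """Count MRs in mr_ids whose recorded actions changed between the two snapshots."""
    return sum(
        1 for mr_id in mr_ids
        if snapshot_t1.get(mr_id) != snapshot_t0.get(mr_id)
    )

# Hypothetical snapshots: MR id -> latest recorded action status
snapshot_t0 = {"MR-001": "planned", "MR-002": "planned", "MR-003": "in progress"}
snapshot_t1 = {"MR-001": "completed", "MR-002": "planned", "MR-003": "in progress"}

reviewed = {"MR-001"}                  # MRs examined in this exercise (treatment group)
control = set(snapshot_t0) - reviewed  # MRs left untouched (control group)

print("reviewed group changes:", count_new_activity(snapshot_t0, snapshot_t1, reviewed))
print("control group changes:", count_new_activity(snapshot_t0, snapshot_t1, control))
```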
- Compare the UNICEF evaluation MR experience with three comparators: the UNICEF Office of Internal Audit (for audit responses) and two development agencies (to be approached by UNICEF to seek their cooperation). Issues to focus on in the comparison are the following (a sketch of a possible comparison matrix follows this list):
Contrast the rules and expectations regarding completed evaluations and management responses, to show how much agreement or difference there is between UNICEF’s structure and theirs. Contrast the information required within their MRs with the information that must be uploaded to the MRDB. Develop this contrast so it shows the comparisons for both the initial upload and successive updates.
Contrast the processes by which claimed MR actions are reviewed (e.g. must proof be attached? are periodic random samples taken?). Similarly, contrast the process, if any, by which MR actions are closed.
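One possible way to organise this comparison is an empty matrix of comparators against the dimensions listed above, to be filled in during data collection. The comparator placeholders ("Agency A", "Agency B") and the matrix format are assumptions for illustration only.

```python
# Illustrative only: comparator placeholders and the matrix format are assumptions;
# the dimensions simply restate the comparison issues listed in this Terms of Reference.
comparison_dimensions = [
    "Rules and expectations for completed evaluations and management responses",
    "Information required at initial MR upload",
    "Information required at successive updates",
    "Verification of claimed MR actions (e.g. attached proof, periodic random samples)",
    "Process, if any, for closing MR actions",
]

comparators = ["UNICEF MRDB", "UNICEF Office of Internal Audit", "Agency A", "Agency B"]

# Empty comparison matrix to be filled in during data collection
matrix = {dim: {org: None for org in comparators} for dim in comparison_dimensions}
for dim, row in matrix.items():
    print(dim, row)
```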
- Collect user input on experience with and perceptions of the Management Response process. This may be collected through surveys or interviews. This feedback should cover at least these issues:
Ease of use of the MR database; quality of guidance; help desk availability
Ease and difficulty in arriving at appropriate MR actions: quality of the recommendations made in the evaluation reports; determining who is management; ‘span of control’ sufficient to make changes
Obstacles and facilitators for completing MR commitments; reasons for high and low levels of commitment to complete MR actions
Perceptions about the value of the MR requirements and process
- Prepare a report that supplies the following information:
MR completeness at the HQ and field levels, including by the quantity, timing, and quality metrics
User reaction to the MR requirement, the MR processes, and the MRDB
A contrast of the UNICEF findings with those of the Office of Internal Audit and the two other agencies
Recommendations on what could make the MRDB and the MR process more timely, comprehensive, and impactful. The final draft of the recommendations is to follow the dissemination exercise described in the next item.
- Lead a dissemination exercise for the draft report. The specific focus is to be the development of recommendations for improving the MR process. The exercise is to be a participatory engagement with a set of key stakeholders. The final recommendations need not echo stakeholder conclusions, but should take their views into account.
Deliverables
1. An inception report that details the methodology to be applied
2. Short weekly interim progress reports
3. Draft report
4. Final report
The inception report may determine that other deliverables are required, such as the survey data if a survey tool is employed.
Calendar
Expected end dates for the required work outputs:
1. Methodology finalized: 7 January 2016
2. Data collection completed: 30 January 2016
3. Measurement of unexpected new activity: 10 February 2016
4. Contrast of UNICEF evaluation MR experience with comparators: 20 February 2016
5. Key findings available in draft form: 20 February 2016
6. Complete draft of final report: 1 March 2016
7. Dissemination exercise for draft report: 8 March 2016
8. Final version of final report: 15 March 2016
Team composition/Individual requirements
It is estimated that the work will require a two-person team to complete the overall assignment within the allotted time. Bidders must indicate how they wish to be considered with respect to team membership, according to these options:
a. Team only: As part of a two-person team in which both persons are named in the response to this invitation. The two persons can only be hired as a team.
b. Individual: As an individual and not as part of a team. Persons opting to bid as individuals are signalling they are willing to be matched with another person by UNICEF: i.e. two strong individuals will be directed to work as a team.
Level of Effort
It is anticipated that this work will require full-time attention for three months, especially during the data collection phase in January.
Skills and Competencies Required
The work has been estimated to require skills and competencies at the ‘4’ level in the United Nations job classification scheme. The following requirements are based on this level. The actual level and compensation rate will depend on the selected candidate’s work history and established fee range.
Required
1. Advanced degree in a research-related discipline (e.g. Evaluation, Statistics, Research) or in a substantive social sciences or development studies field requiring methods training (e.g. Economics, Geography, Development Studies, Education, etc.)
2. At least eight years of professional work in the evaluation field
3. At least two of those years involving evaluation in developing nation settings
4. Having been a team member or team leader in at least three evaluations that generated recommendations for which the client was later required to supply a management response
5. Having been a participant in the process of formulating the management response to an evaluation
6. Complete English fluency
7. Strong analytic skills
8. Strong writing skills
Advantages
1. Sectoral expertise in one or several areas featured in the UNICEF Strategic Plan (e.g. Health, HIV/AIDS, Gender, Education, etc.)
2. Experience with management response database details: structure, guidance, etc.
3. Experience in the quality review of management response efforts
4. French language capability