United States of America: Evaluation Expert--Consultant

Project: Youth and Democracy, Human Rights and Governance (DRG) Research and Learning Project

Background
USAID released its first Agency-wide Youth in Development (YID) Policy in November 2012, and since then has been focused on policy implementation. One key component of policy implementation involves assessing existing knowledge and research related to youth programming, identifying any potential gaps in the evidence, and filling those gaps to equip field officers to effectively design, implement, and evaluate evidence-based youth programming.

Project Overview
Counterpart International is implementing the Youth and DRG Research and Learning Project as part of the Global Civil Society Strengthening (GCSS) Leader with Associates (LWA) Program, which provides technical and thought leadership on civil society issues, facilitates creative new programming ideas, and assists in knowledge management that feeds content into the broader USAID civil society strategic framework. The goal of the project is to enhance the effectiveness of DRG programming — which includes the areas of Leadership, Organizations, Civic Education, and Voice and Civic Engagement, with a specific focus on youth — by strengthening the knowledge and skills of USAID field officers and other staff in the design, management, and evidence-based evaluation of youth programming. The project has the following core objectives:
Objective 1: Identification of current evidence of what works or doesn’t work in Youth and DRG programming as well as a prioritization of critical knowledge gaps that need to be addressed through the DRG Center’s Youth and DRG Learning Agenda.
Objective 2: Creation of technical guidance and analytical tools to assist USAID field officers in designing, managing, and implementing evidence-based youth programming and in integrating youth development principles and approaches into general DRG programs.
To achieve these two objectives, Counterpart International is seeking a Research Team of three consultants: an Evaluation Expert, an Academic Specialist with expertise in public policy, democracy, and/or governance, and an Academic Specialist with expertise in youth, sociology, and/or development. This team will be supported by a Team Leader and a Project Assistant from Counterpart International. Over a seven-month period, this coordinated group of specialists will collaboratively produce the materials outlined in the diagram below. The implementation process illustrated below requires close coordination to identify existing documented methodologies and tools that lend themselves to effective design, management, and evaluation of evidence-based youth programming. The majority of the work detailed in the individual scopes of work below would take place between December 2014 and March 2015.
Sequencing of Project Activities

Scope of Work 1: Evaluation Expert
The role of the Evaluation Expert in Activity 1 above is to identify and analyze youth- and DRG-related program evaluations to determine whether they meet the rigor necessary to be considered “evidence-based.” The Evaluation Expert will also provide recommendations on how the program activities and evaluation findings may contribute to Youth and DRG programming. The results of the Evaluation Expert's tasks in Activity 1 will help inform the Academic Review in Activity 2.
Tasks:
• Develop in collaboration with the Team Leader a definition of “evidence-based” evaluation and establish the criteria (e.g. sampling frame; research design; causal identification strategy; threats to and tradeoffs between internal/external validity; etc.) to determine whether a given evaluation meets that definition;
• Conduct a thorough review of the Assessment of Youth Programming Across DRF Sectors report (a desk review exercise conducted by a team of graduate students as a preliminary research effort under this topic) to determine the report's quality in terms of breadth, depth, and methodological rigor;
• Provide recommendations for how and whether to incorporate the Assessment report findings into the project’s research process and findings;
• Define the role that qualitative data findings play in the overall research effort (e.g. for descriptive inference; to unveil complex interactive causes; etc.), and whether those meet satisfactory academic standards of methodological rigor to be included in the effort to identify good youth and DRG practices;
• Conduct further desk review of relevant evaluations, as needed, by applying a rigorous methodology;
• Categorize the literature/evidence under the four research areas: Leadership, Organizations, Civic Education, and Voice and Civic Engagement. Develop a program matrix that aligns with these broader research areas;
• Identify and engage relevant subject matter experts (SMEs) in the DRG programming areas of Leadership, Organizations, Civic Education, and Voice and Civic Engagement to suggest additional evaluations — qualitative or quantitative — to supplement the Assessment report; conduct interviews with relevant subject matter experts as needed. This should include, but not be limited to, work conducted by the Alliance for International Youth Development (AIYD) and the United Nations Development Program (UNDP).
Deliverables:
• Youth and DRG Programming and Program Evaluation Assessment Report (including but not limited to research methodology, analysis criteria for the evaluations selected and assessed, sources, categorization of all evaluations reviewed, gaps in knowledge, findings, and recommendations), including:
• Summary of the review of Assessment of Youth Programming Across DRF Sectors;
• Individual interview summaries;
• Cross-analysis of all interviews;
• List of additional relevant evaluations to incorporate and supplement this effort;
• Contributions to Summary Report.

Requirements:
• PhD in relevant field;
• 10 years’ experience designing and conducting quantitative and qualitative evaluations;
• Experience designing and conducting meta-analyses and/or critical appraisals of evaluations in relevant field;
• Recent experience working on a research team that required collaboration with other team members to produce deliverables;
• Knowledge of USAID and/or international development a plus.
Submit a resume, references, a list of evaluation work, and a list of publications.