Job Description

Introduction

Established in 1951, IOM is a Related Organization of the United Nations, and as the leading UN agency in the field of migration, works closely with governmental, intergovernmental and non-governmental partners. IOM is dedicated to promoting humane and orderly migration for the benefit of all. It does so by providing services and advice to governments and migrants.

 

IOM is committed to ensuring a workplace where all employees can thrive professionally, while working towards harnessing the full potential of migration. Read more about IOM's workplace culture at IOM workplace culture | International Organization for Migration

 

Project Context and Scope

The project “Improving the Control Capacity of DG Customs Enforcement”, funded by the European Union (EU) and implemented by IOM Türkiye in cooperation with the Ministry of Interior’s Department of European Union Affairs and Foreign Relations, aims to improve the surveillance and control capacity of customs in the movement of goods in the country. The main beneficiary of the project is the Ministry of Trade, Directorate General for Customs Enforcement (DGCE). The DGCE is the first national body at the borders tasked with checking personal belongings, goods, and vehicles entering and exiting the country. Therefore, the DGCE plays a critical role in providing the first response for the detection of nuclear and radiological materials on persons, in goods and in vehicles at borders. An international Evaluation Consultant will be engaged to conduct the final evaluation.
 

Organizational Department / Unit to which the Consultant is contributing

 

Commissioned by: Monitoring, Evaluation, Accountability, and Learning (MEAL) Unit, IOM Türkiye

Managed by: Senior National MEAL Officer   

Responsibilities

  1. Evaluation Context

The International Organization for Migration (IOM), the United Nations (UN) Migration Agency, was established in 1951 and is the leading intergovernmental organization in the field of migration, working closely with governmental, intergovernmental, and non-governmental partners. With 175 member states, 8 states holding observer status, and offices in 172 countries, IOM is dedicated to promoting humane and orderly migration for the benefit of all. 

The IOM established its operations in Türkiye in 1991. IOM’s partnership with the Government of Türkiye (GoT) was formalized in November 2004, when Türkiye became an IOM Member State. IOM Türkiye works closely with the government of the Republic of Türkiye, regional authorities, the UN, donors, and civil society organizations to address migration challenges by implementing programmes through three pillars: Resilience, Mobility, and Governance. Across the country, IOM Türkiye provides a comprehensive response to the humanitarian needs of migrants, internally displaced persons, returnees, and host communities through direct humanitarian assistance, recreational activities, and various other efforts. Alongside IOM’s role in addressing the needs of migrants during crises, the mission works in close collaboration with the Government of Türkiye to address the longer-term impact of migration, including migrant assistance programmes, labour integration and migration management, immigration and border management, and research and data collection on migrant movement.

PROJECT INFORMATION


 

Title of the Project:

Improving the Control Capacity of DG Customs Enforcement
Project ID: IB0439
Implementing Agency:

International Organization for Migration (IOM)

UNSDCF 2021-2025 Outcome:

Outcome 4.2: By 2025, the effectiveness of the international protection and migration management system is improved.

IOM Strategic Plan 2024-2028 Objective and Long-Term Outcome:

Objective 3: Facilitating pathways for regular migration.

Long-Term Outcome 3b: Migration flows and cross-border mobility are well managed, with measures to ensure well-being, including health, security and safety throughout the mobility continuum.

IOM Türkiye Mission Strategy 2021-2025 Priority: Strategic Priority 4 - Governance: Strengthen cooperative development and implementation of evidence-based and inclusive migration governance that addresses migration challenges, leverages opportunities and facilitates safe, orderly and regular migration.
Beneficiary (Partner) Institutions:

Lead Institution: Ministry of Interior (MoI), Department of European Union Affairs and Foreign Relations (DIAB)

Beneficiary Institution: Ministry of Trade (MoT), Directorate General for Customs Enforcement (DGCE)

Location of the Project: Border Crossing Points in the Republic of Türkiye (land, sea, and air)

Start and End Date of the Project: 1 May 2024 – 30 April 2026

Project Extensions (if any): N/A
Total Planned Project Budget:

Planned Budget: EUR 1,000,000

Funding Source/Donor: European Union (EU)

Overall Objective (Impact) of the Project: To improve the surveillance and control capacity of customs in the movement of goods in the country.
Target Groups[1]:

Primary:

• Ministry of Trade (MoT)

Secondary:

• Ministry of Interior (Turkish National Police)

• Ministry of Energy and Natural Resources (MoENR)

• Turkish Nuclear Regulatory Authority (NDK)

• Turkish Energy, Nuclear & Mining Research Institution (TENMAK)

• Turkish Postal Services (International Mail & Cargo Processing Center [IMCPC])

• Express Courier Companies 

• Private Sector Entities

Final Beneficiaries[2]:

Travellers crossing the border and local communities
Estimated Results (Outcomes and Outputs):

Specific Objective (Outcome): To strengthen the customs control capacity in the areas of nuclear security and smuggling via parcel carriers.

Output 1: Radiation control capacity of DG for Customs Enforcement is improved.

Output 2: Mail and Express Courier Control Capacity is improved.

Evaluation Coverage: The thematic scope of the evaluation covers project activities implemented at DGCE Headquarters, selected border points, mail processing centers, training sites, and study visit beneficiary groups. It includes all technical, operational, capacity-building, and coordination components. The final evaluation aims to comprehensively assess the project’s performance during the period between 1 May 2024 and 30 April 2026.
  2. Evaluation Purpose, Objectives and Scope

This evaluation is being conducted in accordance with IOM’s Evaluation Policy and Guidelines, which set out a number of guiding principles and key norms for evaluation in the Organization, following the Norms and Standards of the United Nations Evaluation Group (UNEG).

The final evaluation aims to comprehensively assess and demonstrate progress achieved towards the expected results against those stated in the project documents, and to identify lessons learned and recommendations relevant to the planning, preparation and implementation of activities under the subsequent phase. The purpose of the external final evaluation is to provide a summative assessment encompassing all six core OECD-DAC criteria: relevance, coherence, effectiveness, efficiency, sustainability, and likely impact (including early signs of results, whether intended or unintended, positive or negative), as well as cross-cutting issues.

Specific Objectives of the Final Evaluation

The specific objectives of the evaluation are to:

  • Assess the extent to which the project has achieved its stated objectives (outcome and impact levels) and outputs, while identifying the positive and negative factors that have facilitated or hampered progress in achieving the project outcomes, including external factors/environment and weaknesses in design, management, and resource allocation.

  • Measure the project’s degree of implementation, efficiency and quality delivered on intended and unintended results (outputs) and specific objectives (outcomes), against what was originally planned.
  • Provide summative evaluative evidence on the contribution of the project towards the improvement of the surveillance and control capacity of customs in the movement of goods in the areas of nuclear security and smuggling via parcel carriers.
  • Measure the project contribution to the objectives set in the IOM Strategic Plan 2024-2028, IOM Türkiye Mission Strategy 2021-2025, United Nations Sustainable Development Cooperation Framework (UNSDCF) for 2021-2025, IOM Immigration and Border Governance (IBG) Guidance, Community Engagement & Policing (CEP) Document, and Migration Governance Framework, the EU Integrated Border Management principles, national border management and customs enforcement policies and strategies of Türkiye (11th and 12th National Development Plans, Türkiye's Strategy Document and National Action Plan 2021 – 2025, MoT Strategic Plan).
  • Assess the effectiveness of the implementation strategy (i.e. implementation modalities, issues of coordination and partnership arrangements, and synergy among similar projects as well as with other initiatives/programmes of IOM Türkiye).

  • Assess the management and financial efficiency of the project.

  • Assess the extent to which the project's outcomes will be sustainable (without the need for external support) and contribute towards the objective of the project.

  • Assess project’s contribution to gender equality and women’s empowerment and the broader “Leave No One Behind” agenda where applicable. 
  • Generate substantive evidence-based knowledge by identifying best practices and lessons learned that could be useful for remaining project implementation, other interventions at national (scale up) and international level (replicability).

  • Provide a forward-looking perspective for IOM’s positioning in relation to integrated border management for subsequent phases.

  • Provide formative recommendations toward design and implementation arrangements of the subsequent phase.

Through a participatory approach, the Evaluation Consultant will actively engage relevant stakeholders, including IOM Türkiye staff, focal points of the national counterparts and beneficiaries from different activities. A tentative Data Collection Mission Agenda is provided under section 10 to indicate the level of effort and in-country travel required for data collection.

Recommendations emerging from the evaluation should be strongly linked to the evaluation findings and should provide clear guidance to IOM Türkiye and its stakeholders on how to address them.

Furthermore, the evaluation will have a focus on what worked, what did not work, and why, based on feedback against evidence and provide actionable recommendations, highlight best practices, share lessons learned, and offer valuable insights, which will inform the remainder of the implementation of the project. An essential aspect of the evaluation involves analysing the integration of IOM cross-cutting themes of gender equality, non-discrimination and human rights-based approach into project activities and implementation.  

  3. Evaluation Criteria

The OECD-DAC evaluation criteria of relevance, coherence, effectiveness, efficiency, likely impact, and sustainability will guide the final evaluation. All assessments will be conducted in accordance with the OECD-DAC definitions to ensure methodological rigour and consistency with international standards.

Relevance: The extent to which the intervention's objectives and design respond to the needs, policies, and priorities of beneficiaries at the global, national, and institutional levels, and continue to do so as circumstances evolve.

Coherence: The extent to which the intervention is compatible with other interventions within the same country, sector, or institution.

Effectiveness: The extent to which the intervention has achieved, or is expected to achieve, its stated objectives and results, including any differential results across groups.

Efficiency: The extent to which the intervention delivers, or is likely to deliver, results in a cost-effective and timely manner.

(Likely) Impact: The extent to which the intervention has produced or is expected to produce significant positive or negative, intended or unintended, higher-level effects.

Sustainability: The extent to which the benefits of the intervention continue or are likely to continue.

  4. Evaluation Questions

In light of the evaluation parameters, the Evaluation Consultant is expected to analyze data and share the findings, conclusions and recommendations generated by this analysis. As a reference point for the evaluation, the Evaluation Consultant is provided with indicative key and sub-evaluation questions below, which are expected to be amended, elaborated, aggregated, consolidated and submitted as part of the Inception Report, and shall be included as an annex to the final evaluation report described below. In consultation with the Commissioning Unit, the Evaluation Consultant will further develop and refine the evaluation questions in the inception phase to ensure detailed and specific information is gathered for each criterion. The evaluation matrix will be reviewed collaboratively with the Commissioning Unit. Any fundamental changes to the evaluation criteria and questions should be agreed upon between the evaluator(s) and the Commissioning Unit and reflected in the inception report.

The sub-questions included in the ToR are intentionally designed to guide the prospective evaluation team by outlining the expected scope of the assessment, key thematic areas to be explored, and the breadth of analysis anticipated. They are meant to provide a comprehensive picture of what the evaluation should cover and to support evaluators in developing a robust evaluation matrix. Further refinement and consolidation of the evaluation questions is an essential step during the inception phase. The evaluation team will be expected to prioritize, aggregate, and streamline these questions to ensure analytical depth, coherence, and a focused final report.

The following key questions will guide the evaluation process:

Evaluation Criteria and Evaluation Questions

Relevance

Key evaluation question-1: To what extent was the project’s design, theory of change, and interventions appropriate and aligned with the needs and priorities of target beneficiaries and stakeholders, as well as with national, regional, and international policy frameworks and IOM’s comparative advantages?

Sub-questions:

  1. To what extent were the theory of change and the interventions designed under the project relevant to serving the needs of the target groups, beneficiaries and stakeholders (MoT, MoI, NDK, TENMAK, IMCPC)?

  2. To what extent was the project aligned with national border management and customs enforcement policies and strategies of Türkiye (11th and 12th National Development Plans, Türkiye's Strategy Document and National Action Plan 2021 – 2025, MoT Strategic Plan) and priorities of the global and regional initiatives such as the Almaty Process, Italy–Türkiye bilateral engagement, and other frameworks?
  3. To what extent were the design and strategy of the project activities aligned with the objectives set in the IOM strategies, UNSDCF 2021-2025, IOM Immigration and Border Governance (IBG) Guidance, Community Engagement & Policing (CEP) Document, Migration Governance Framework, and the EU Integrated Border Management principles?

  4. To what extent does the project fit into the comparative advantages of IOM’s global experience and presence in the area of border management?

  5. To what extent were the participants of project activities representative of public, civil society, academia, private sector and other relevant target groups where applicable?

  6. What opportunities are there to better align the support/activities to the changed context and the needs of the beneficiaries for future needs?

Coherence

Key evaluation question-2: To what extent was the project internally consistent and logically designed, and how well did it complement and create synergies with other relevant interventions and institutional frameworks in border management and customs enforcement?

Sub-questions:

  1. (External coherence) To what extent did the project create synergy/linkages with other projects and interventions in the country i.e. other projects implemented for border security, customs enforcement and surveillance? How well does the project complement government-led border management systems?
  2. (Internal coherence) To what extent are the project’s objectives, outcomes, outputs, and activities logically aligned and mutually reinforcing? How coherent is the project’s theory of change in linking project interventions (e.g. ToTs, cascaded trainings, study visits, team building, assessments) to intended border management outcomes?

Effectiveness

Key evaluation question-3: To what extent did the project achieve its intended outputs and outcomes, strengthen institutional capacities and coordination, and respond effectively to beneficiary needs while enabling adaptive implementation and learning?

Sub-questions:

  1. To what extent has the project achieved its intended outputs and outcomes by the end of the project duration? (The Evaluation Consultant is expected to provide a detailed analysis of 1) planned activities and results and 2) achievement of results.)

  2. How effective have the project’s interventions (e.g. ToTs, cascaded trainings, study visits, team building, assessments) been in improving border security and surveillance in line with the EU’s integrated border management policies and strategies? To what extent have project activities responded to the needs of target groups and beneficiaries (e.g., national and provincial administrations, private sector, regulatory authorities)?

  3. To what extent have the project partners/beneficiary institutions increased their capacity on implementation of the project interventions such as DGCE’s radiation detection response capability (e.g. alarm handling, NORM identification, emergency protocols) and prevention of smuggling via postal/express courier channels (e.g. risk targeting, profile-based selection, OSINT use)?

  4. How did the project contribute to enhancing inter-agency information sharing and operational coordination?

  5. What are the key factors contributing to project success or underachievement during the project execution?  How might this be improved for future programming?

  6. To what extent have monitoring mechanisms supported effective implementation and results achievement?

  7. To what extent and in what ways has ownership - or the lack of it - by the partners/beneficiary institutions impacted the effectiveness of the project?


Efficiency

Key evaluation question-4: To what extent were project resources (financial, human, and time) used optimally to achieve results, and how well did management, partnerships, and adaptive approaches support cost-effective and timely implementation?

Sub-questions:

  1. How well did the resources (funds, expertise, and time) convert into results? To what extent were project outputs delivered on time and with high quality? To what extent has the project ensured value for money? 

  2. How efficient are the project’s approaches to improving border security and surveillance (e.g. ToTs, cascaded trainings, study visits, team building, assessments) compared to alternative approaches?

  3. What was the progress of the project in financial terms, indicating amounts committed and disbursed across results (total amounts & as percentage of total) by IOM? Was funding sufficient for the achievement of results? (funding analysis)

  4. To what extent and in what ways has ownership - or the lack of it - by the partners/beneficiaries impacted the efficiency of the project?

  5. Was there any identified synergy between IOM initiatives/projects that contributed to reducing costs and optimizing resources while supporting results? 

  6. How well did implementation arrangements work for achievement of results? Did the project have the necessary coordination mechanisms, implementation arrangements and communication flow to ensure that the allocated resources efficiently converted into the expected outputs?

  7. To what extent did project MEAL systems provide management with a stream of data that allowed it to learn and adjust implementation accordingly?

  8. What type of (administrative, financial and managerial) obstacles did the project face and to what extent has this affected its efficiency?

  9. How appropriate and effective has IOM’s partnership strategy employed by the project been? What factors contributed to its effectiveness or ineffectiveness?

  10. How well did the project respond to external shocks or contextual shifts? (adaptive management)

Likely Impact 

Key evaluation question-5: To what extent is the project contributing, or likely to contribute, to sustainable improvements in border security, surveillance capacity, and broader institutional and systemic outcomes at different levels?

Sub-questions:

  1. What evidence exists of emerging changes in capacity of beneficiary institutions as a result of the project?

  2. To what extent is the project likely to contribute to longer-term improvement of border security and surveillance capacity in line with the EU’s integrated border management policies and strategies?

  3. What broader effects (positive or negative/intended or unintended) are emerging from the project’s interventions at the individual, organizational, or national level?

  4. What factors are likely to influence the project’s longer-term impact? How can future design be strengthened to scale up results?

Sustainability

Key evaluation question-6: To what extent are the project’s results, capacities, and systems likely to be sustained over time through institutional ownership, policy integration, and availability of financial and human resources?

Sub-questions:

  1. To what extent have partners/beneficiary institutions demonstrated ownership of the intervention modalities created by the project? To what extent has this project induced prospect for active policies targeting target groups to be pursued by the beneficiary institutions to improve the overall efficiency of their services?

  2. To what extent are project results embedded within existing institutional systems (e.g. Radiation Control Plan, risk profile portfolios, training curricula, OSINT unit)?

  3. What systems, capacities, and resources are in place to sustain results after project completion? To what extent do legal frameworks and governance structures support continuity of project benefits?

  4. What are the main risks to sustainability (institutional, financial, political, operational)?

  5. Are the legal frameworks, policies and governance structures and processes in place for sustaining project benefits?

  6. To what extent can project approaches be replicated or scaled up across Türkiye or in other contexts? What measures (including exit strategy elements) are needed to strengthen sustainability?

Crosscutting Issues

Key evaluation question-7: To what extent did the project integrate and uphold principles of human rights, gender equality, inclusion, conflict sensitivity, and equity in its design, implementation, and outcomes?

Sub-questions:

  1. To what extent has the project integrated human rights, gender equality, disability inclusion, and Leave No One Behind principles in its design, implementation, and monitoring where applicable?

  2. To what extent are project activities accessible and responsive to the needs of vulnerable groups, including disadvantaged groups of women and persons with disabilities where applicable?

  3. What barriers exist to equitable participation and benefit among stakeholders and beneficiaries?

  4. What measures could strengthen inclusion and equity in future programming?

  5. To what extent has the project applied conflict-sensitive and Do No Harm approaches, particularly in sensitive migration contexts?

  6. To what extent has the project contributed to strengthening the protection and human rights of migrants in Türkiye?

  5. Evaluation Approach and Methodology

The evaluation should be transparent, inclusive, participatory and utilization-focused. The overall methodology should be implemented following a theory of change approach, framed by the UN/OECD DAC[3] evaluation criteria and drawing upon mixed-methods (quantitative and qualitative) data to capture direct project results as well as (likely) contributions.

In line with good practice in evaluating this type of complex system change-focused intervention, the overall methodology should be based on three concrete pillars:

  1. the project’s theory of change;
  2. an evaluation matrix grouping key evaluation questions and sub-questions by broad OECD/DAC criterion, allowing analysis of programme results at different levels of its results chain; and
  3. a data collection toolkit for the evaluation describing the quantitative and qualitative primary and secondary data collection tools that will be deployed to collect and analyse data to answer the evaluation questions.

The evaluation process should be participatory, engaging government officials, implementing and development partners, project staff, key stakeholders and a wide cross-section of staff and beneficiaries while ensuring inclusion of elements of gender equality.

The main analytical framework for the evaluation is provided by the Project’s theory of change, which should be used to organize the evaluation questions according to the Project’s expected results at each level of its results chain. In doing so, the evaluation should use a broad Contribution Analysis (CA) approach to causal inference with a view to understanding the influence of relevant contextual factors, and alternative and additional drivers or obstacles to change at the regional and national levels that may have influenced the Project’s direct and indirect, intended and unintended results.

The evaluation should also seek to apply additional evaluation techniques that can further strengthen the plausibility of links between the results of the different strands of work on various intended Project outcomes at the policy, community and individual beneficiary levels as well as telling the story of how and why both intended and unintended change has or has not happened as a result of the intervention. The methodological prism may involve contribution analysis (effectiveness), process tracing (case studies), outcome harvesting/most significant change (unplanned/emerging results), landscape analysis (relevance and coherence), quantitative analysis (indicator achievement and funding analysis), document review (relevance and efficiency), light foresight (forward-looking aspects) and techniques linked to participatory evaluation[4].

In line with UN evaluation practice, the scope of the evaluation should cover the six core UN/OECD DAC evaluation criteria (relevance, coherence, effectiveness, efficiency, likely impact, and sustainability) as well as crosscutting issues. In proposing how to conduct the evaluation, the Evaluation Consultant should use an evaluation matrix to operationalize the theory of change and its agreed framework of direct and indirect results into a set of measurable categories of evaluative analysis following the results chain of the intervention. The evaluation matrix should properly address gender equality (GE) and human rights (HR) dimensions, including age, disability and vulnerability.

The evaluation questions above present a set of preliminary questions that the Evaluation Consultant should address in their proposed approach, following the revised UN/OECD DAC criteria. A final, more detailed evaluation matrix will be developed during the inception phase on the basis of document review and initial consultation with key Project stakeholders.

On the basis of the questions included above and the information present elsewhere in this Terms of Reference, the Evaluation Consultant should deploy a set of data collection methods and tools (that includes gender disaggregation) and allow for rigorous triangulation. These methods and tools will allow leveraging existing secondary data as well as collecting new primary data to be gathered during the field visit, which together will be able to answer the initial questions listed above. 

The combination of primary and secondary tools, or the number of separate ‘lines of evidence’, should be at least five, and should be designed, as with the rest of the evaluation, with triangulation and complementary assessment of the sub-questions in the matrix in mind.

The Evaluation Consultant is requested to propose a set of mixed methods data collection/analysis methodologies and techniques to answer the evaluation questions.[5] This will be refined in the inception phase. The following lines of evidence should be considered: 

  • Document, literature and monitoring systems review: The Commissioning Unit will provide access to all relevant documentation, data collected, and analysis. Further documents may be requested by the Evaluation Consultant. The Project team will share information and provide guided walk-throughs of the Project and project management methods, platforms and tools. This should include a review of:
  • Project document and description of the action
  • Result Framework/MEAL Framework and Plan 
  • Work Plan
  • Donor/Progress Reports 
  • Monitoring Reports
  • Project Steering Committee meeting minutes
  • Studies relating to the country context and situation 
  • Financial documentation and reports.
  • Background documents and other documentation.
  • Analysis of deliverables and financial reports: Comprehensive access to deliverables, financial reports, and reporting dashboards will be provided alongside documentation.
  • Structured, semi-structured and/or in-depth interviews or Key Informant Interviews (KIIs): The team will provide a stakeholder list, including a wide range of stakeholders from the donor, national government and local administrations, as well as project partners, IOM representatives and the project team, and selected provincial stakeholders. All interviews should be undertaken with full confidentiality and anonymity. (The final evaluation report should not attribute specific comments to individuals.)
  • Focus group discussions with Training of Trainers (ToTs), training participants, and express courier company representatives.
  • Quantitative surveys: Surveys should have a clearly defined scope and seek to answer specific questions about the Project outcomes/objectives. Online surveys targeting 180 respondents from cascaded training participants and open-source intelligence team members are expected.
  • Secondary data analysis.
  • Direct observations.
  • Case studies/deep dives.

The Evaluation Consultant will ensure triangulation of the various data sources. Data and evidence will be triangulated across multiple sources to address the evaluation questions. The final methodological approach, including the interview schedule and the data to be used in the evaluation, should be clearly outlined in the inception report and fully discussed and agreed between IOM, stakeholders and the Evaluation Consultant.

Data collection tools should be gender sensitive, ensure that the data collection is disaggregated by sex and take into account the broader cross-cutting issues as presented below and elsewhere in the ToR.

Cross-cutting

As noted above, the promotion and protection of Human Rights (HR), Gender Equality (GE), Disability Issues (DI) and LNOB are central principles of the UN’s mandate, and all UN agencies must work to fundamentally enhance and contribute to their realization by addressing the underlying causes of human rights violations, including discrimination against women and girls, and by utilizing processes that are in line with and support these principles. UN interventions that do not consider these principles risk reinforcing patterns of discrimination and exclusion, or leaving them unchanged. It is therefore important that evaluations commissioned by IOM take these aspects into account.

The methodology and techniques to be used in the evaluation should be described in detail in the Inception Report and the Final Evaluation Report, and should contain, at minimum, information on the instruments used for data collection and analysis, whether these be documents, interviews, questionnaires or participatory techniques, applying high standards of research ethics and impartiality. 

Final decisions about the specific design and methods for the evaluation will be made through consultation among IOM, the Evaluation Consultant and key stakeholders about what is appropriate and feasible to meet the evaluation purpose and objectives as well as answering the evaluation questions, given limitations of budget, time and data.

  1. Ethics, Norms, and Standards for Evaluation 

The Evaluation Consultant must follow the IOM Data Protection Principles, the UN Evaluation Group (UNEG) norms and standards, and the relevant UNEG ethical conduct guidelines while carrying out the final evaluation, which is to be conducted according to the ethical principles and standards established by UNEG. 

  • Anonymity and confidentiality. The evaluation must respect the rights of individuals who provide information, ensuring their anonymity and confidentiality. 
  • Responsibility. The report must mention any dispute or difference of opinion that may have arisen between the Evaluation Consultant and the Project Team in connection with the findings and/or recommendations. The Evaluation Consultant must corroborate all assertions and note any disagreements. 
  • Integrity. The Evaluation Consultant will be responsible for highlighting issues not specifically mentioned in the ToR if this is needed to obtain a more complete analysis of the intervention. 
  • Independence. The Evaluation Consultant should ensure its independence from the intervention under review and must not be associated with its management or any element thereof. 
  • Incidents. If problems arise during the interviews, or at any other stage of the evaluation, they must be reported immediately to IOM. If this is not done, the existence of such problems may in no case be used to justify the failure to obtain the results stipulated by IOM in this Terms of Reference. 
  • Validation of information. The Evaluation Consultant will be responsible for ensuring the accuracy of the information collected while preparing the reports and will be ultimately responsible for the information presented in the evaluation report. 
  • Intellectual property. In handling information sources, the Evaluation Consultant shall respect the intellectual property rights of the institutions and communities that are under review.
  • Delivery of reports/deliverables. If delivery of a report/deliverable is delayed, or if the quality of a delivered report is lower than that required by IOM, the Evaluation Consultant will not be entitled to any payment for that specific report/deliverable, even if person/days have been invested in its preparation.
  1. Human Rights, Gender Equality, Vulnerable Groups and Disability Issues

The methodology used in the final evaluation, including data collection and analysis methods, should be human rights- and gender-sensitive to the greatest extent possible, with evaluation data and findings disaggregated by sex, ethnicity, age, etc. Detailed analysis of disaggregated data will be undertaken as part of the final evaluation, and findings will be consolidated into recommendations and lessons learned to enhance the gender-responsive and rights-based approach of the project. The evaluation approach and methodology should consider the different groups in the project intervention: women, youth, minorities, and vulnerable groups.

The promotion and protection of Human Rights (HR), Gender Equality (GE), vulnerable groups and Persons with Disabilities are central to the mandate of the UN, and all UN agencies must work to fundamentally enhance and contribute to their realization by addressing the underlying causes of human rights violations, including discrimination against women and girls, and by using processes that are in line with and supportive of these principles. UN interventions that do not consider these principles risk reinforcing patterns of discrimination and exclusion or leaving them unchanged. It is, therefore, important and required that evaluations commissioned by IOM take these aspects into account.

  • Guidance on Integrating Disability Inclusion (DI) in Evaluations[6]
  • Integrating Human Rights and Gender Equality in Evaluations[7]

Concretely, the Evaluation Consultant is requested to incorporate the following key principles from the UNEG guidance for integrating human rights, gender equality, and disability inclusion:

  1. Inclusion. Evaluating HR, GE, and DI requires paying attention to which groups benefit and which groups contribute to the intervention under review. Groups need to be disaggregated by relevant criteria: disadvantaged and advantaged groups depending on their gender or status (women/men, age, location, etc.), duty-bearers of various types, and rights-holders of various types, in order to assess whether benefits and contributions were fairly distributed by the intervention being evaluated. In terms of HR & GE, it is important to note that women and men, boys and girls who belong to advantaged groups are not exempt from being denied their human rights or equal rights. The same applies to persons with disabilities. Therefore, the concept of inclusion must assess criteria beyond advantage. Likewise, it is not unusual that some groups may be negatively affected by an intervention. An evaluation must acknowledge who these stakeholders are and how they are affected, and shed light on how to minimize the negative effects.
  2. Participation. Evaluating HR, GE and DI must be participatory. Stakeholders of the intervention have a right to be consulted and participate in decisions about what will be evaluated and how the evaluation will be done. In addition, the evaluation will assess whether the stakeholders have been able to participate in the design, implementation and monitoring of the intervention. It is important to measure stakeholder group participation in the process as well as how they benefit from results.
  3. Fair Power Relations. Approaches to human rights, gender equality and disability inclusion issues seek, inter alia, to balance power relations between or within advantaged and disadvantaged groups. The nature of the relationship between implementers and stakeholders in an intervention can support or undermine this change. When evaluators assess the degree to which power relations changed as a result of an intervention, they must have a full understanding of the context, and conduct the evaluation in a way that supports the empowerment of disadvantaged groups. In addition, evaluators should be aware of their own position of power, which can influence the responses to queries through their interactions with stakeholders. There is a need to be sensitive to these dynamics.
  1. Evaluation Deliverables and Schedule

The Final Evaluation is expected to be conducted between 4 May and 30 September 2026. In total, the evaluation is expected to require a maximum of 28 working days to complete, including all contributions to the inception, country visit and write-up phases of the evaluation, inclusive of all related costs. 

The below-proposed timeframe and expected deliverables will be discussed with the Evaluation Consultant and refined during the inception phase. The final schedule of deliverables should be presented in the Inception Report. The MEAL Unit reserves the right to request revisions to the evaluation deliverables until they meet the quality standards set by the IOM guidelines. 

The Evaluation Consultant is expected to submit the following deliverables to the satisfaction of IOM:

# | Deliverable | Due Date | Review and Approvals Required
1 | Final Inception Report and Data Collection Toolkit | 1 June 2026 | Reviewed and approved by Evaluation Manager in consultation with Senior National Programme Officer (IBG) and Regional Office
2 | Draft Evaluation Report | 12 July 2026 | Reviewed and approved by Evaluation Manager in consultation with Senior National Programme Officer (IBG) and Regional Office
3 | Final Evaluation Report with Audit Trail + Presentation slide deck + Management Response Matrix + Evaluation Brief and Infographics | 23 August 2026 | Reviewed and approved by Evaluation Manager in consultation with Senior National Programme Officer (IBG) and Regional Office
Deliverable and Linked Payment (indicative total required level of effort*) | Related Activity | Responsible Party | Expected Date of Completion**

1. Inception Report + Data Collection Toolkit (7 person/days)

  • Kick-off meeting | IOM | 6 May 2026
  • Review of relevant documentation and secondary data, scoping interviews with the project team, and submission of the draft Inception Report + Data Collection Toolkit | Evaluation Consultant | 18 May 2026
  • Feedback on the draft Inception Report | IOM | 22 May 2026
  • Finalized Inception Report + Data Collection Toolkit based on the feedback received | Evaluation Consultant | 1 June 2026

2. Draft Evaluation Report (14 person/days)

  • Primary data collection and interviews with IOM and key stakeholders (field mission) | Evaluation Consultant | 7-12 June 2026 or 14-19 June 2026 (indicative)
  • Online Mission Wrap-Up Meeting to present initial findings | Evaluation Consultant | 1 July 2026
  • Delivery of the Draft Evaluation Report compiling findings from data collection, interviews with key stakeholders, and secondary data review | Evaluation Consultant | 12 July 2026

3. Final Evaluation Report with Audit Trail + Presentation slide deck + Management Response Matrix + Evaluation Brief and Infographics (7 person/days)

  • Review of the Draft Evaluation Report and provision of feedback | IOM | 20 July 2026
  • Addressing of IOM comments | Evaluation Consultant | 2 August 2026
  • Review of the revised Draft Evaluation Report and provision of comments | Evaluation Reference Group | 17 August 2026
  • Delivery of the Final Evaluation Report and Audit Trail, taking into consideration the feedback from IOM and the Evaluation Reference Group (the package will also include the Presentation slide deck + Management Response Matrix + Evaluation Brief and Infographics) | Evaluation Consultant | 23 August 2026
  • Online presentation to IOM and stakeholders | Evaluation Consultant | 1 September 2026

* The number of person/days is provided solely to give the Evaluation Consultant an idea of the work to be undertaken, inclusive of all related evaluation costs. Payments shall be made in accordance with the Evaluation Costs and Terms of Payment section, irrespective of the number of person/days invested in the completion of each deliverable. The person/days indicated include all costs related to the field mission and data collection, including travel, accommodation, local transportation, and translation/interpretation.

** Dates may be changed according to the actual contract start date.

The final evaluation report will be shared with IOM and key stakeholders and will be annexed to the project's final report for donor submission. The results of the evaluation will be presented and discussed with IOM Türkiye, stakeholders, and the donor.

The external Evaluation Consultant is expected to deliver the following:

  • Inception report + Data collection toolkit

This report will be a maximum of 30 pages in length and will propose the methods, sources and procedures to be used for carrying out the independent evaluation. The report should justify why the proposed methods are the most appropriate, given the set of evaluation questions identified in the ToR. It will also include a mission programme indicating the proposed timeline of activities and submission of deliverables, as well as an Evaluation Matrix. The Evaluation Matrix will demonstrate the Evaluation Consultant's understanding of the ToR and outline data collection and analysis plans, to be completed and reviewed with IOM Türkiye prior to the field visit. The report should be submitted to IOM Türkiye for feedback and discussion after the document review phase and prior to the interview phase. The Inception Report should include as an Annex a Data Collection Toolkit containing the qualitative and quantitative data collection instruments to be used in the course of the evaluation (e.g. for qualitative data: interview guides, a focus group discussion guide, direct observation forms, and questionnaires for consultations with stakeholders; for quantitative data: surveys and relevant templates to assess change in the basic financial and operational performance of the partners over the period supported by IOM). The toolkit should also include a proposal on how the different data sources will be organized and synthesized, and should be adaptable to both English- and local-language data collection needs. The Evaluation Consultant will maintain an audit trail of the comments received and provide a response on how the comments were addressed in the revised drafts. This document will serve as the initial point of agreement and understanding between the Evaluation Consultant and IOM. In principle, the report is expected to follow the outline provided in the footnote link.[8]

  • Presentation of the initial findings

Following the field visits phase, the Evaluation Consultant should prepare a presentation of the preliminary evaluation findings, tentative conclusions, and recommendations. This will be used to debrief the IOM team and address any misinterpretations or gaps. 

  • A draft evaluation report

Based on the debrief and initial feedback, the Evaluation Consultant should prepare a draft evaluation report[9] incorporating lessons learned and recommendations, and share it with the IOM Türkiye team. The report will contain an executive summary of no more than 5 pages that includes a brief description of the project, its context and current situation, the purpose of the evaluation, its methodology, and its main findings, conclusions and recommendations. The findings section must be organized by the evaluation criteria and refined sub-questions agreed at the inception stage, presenting evaluation findings and recommendations for the Project, aggregated and synthesized on the basis of the results of the different data collection and analysis tools (35-45 pages). Annexes must present a summary of findings from each of the 'lines of evidence' used to support the evaluation findings. All completed tools and datasets making up the different lines of evidence should be made available to IOM. IOM will disseminate the draft evaluation report to the Evaluation Reference Group to seek their comments and suggestions. Comments and suggestions from IOM and the Evaluation Reference Group will be collected in an audit trail and shared with the Evaluation Consultant for final revisions. There could be up to three rounds of review: typically first by IOM for quality assurance, then by the Evaluation Reference Group, and finally by senior management.

  • Final evaluation report + Audit Trail

Once finalized, the Evaluation Consultant will submit the final evaluation report in the IOM-provided template. The report should include an executive summary, a list of acronyms, an introduction, the evaluation context and purpose, the evaluation framework and methodology, findings, conclusions, and recommendations. Annexes should include the ToR, the inception report, the list of documents reviewed, the persons interviewed or consulted, and the data collection instruments. In addition, the Final Evaluation Report should contain clear recommendations that are concise, feasible, actionable and clearly linked to conclusions and findings. The Final Evaluation Report will be shared with IOM for dissemination to the key stakeholders. The Evaluation Consultant will also submit its responses to the Audit Trail, showing the actions taken or not taken and the revisions made or not made in line with the suggestions and recommendations of IOM and the Evaluation Reference Group, with detailed justifications in each case. 

  • Evaluation brief (two-page summary according to the IOM template) and infographics

The Evaluation Consultant will prepare a concise two-page evaluation brief[10] in English, summarizing key findings, lessons learned, and recommendations. IOM will provide a template that may be adapted, but the brief should not exceed two pages. Page one should include identification of the audience; project information (project title, countries covered, project type and code, project duration, project period, donor(s), and budget); evaluation background (purpose, team, timeframe, type of evaluation, and methodology); and a brief description of the project. Page two should summarize the most important evaluation results: key findings and/or conclusions, best practices and lessons learned (optional), and key recommendations. The Evaluation Consultant will also prepare infographics summarizing key findings, good practices, lessons learned, conclusions, and recommendations. The purpose of these infographics is to enhance the accessibility of results for diverse audiences and support effective dissemination. 

  •  Management response matrix (IOM will provide the template)

After IOM Türkiye approves the evaluation report and brief, the Evaluation Consultant should draft a Management Response Matrix[11] using the IOM template, listing recommendations and indicative timelines for implementation. The IOM team will finalize the matrix in coordination with project stakeholders.

  • The final presentation of the evaluation report (online briefing with a slide deck for IOM staff, the donor, and key stakeholders to be identified and agreed upon)

A meeting will be organized with key stakeholders, including IOM and Evaluation Reference Group members, to present findings, conclusions, and recommendations. The meeting will be held online. The presentation will cover main findings and lessons learned but will also be forward-looking, proposing recommendations that are actionable by IOM and stakeholders. A draft slide deck should be shared with the IOM project and MEAL teams in advance, and feedback should be reflected in the final version.

All deliverables are to be written in English and must meet good language standards. The final report should meet the standards laid out in the UNEG Quality Checklist for Evaluation Reports. IOM Türkiye will not cover any costs related to translation/interpretation during data collection and reporting; all such expenses shall be included in the Evaluation Consultant's fee.

  1. Institutional Arrangements/Reporting Lines 

IOM has full ownership of this assignment and of its final products. Thus, any public mention (including through social media) of the activity should clearly state IOM's ownership. In addition, any public appearance or published work related to the activity should be coordinated with and approved by IOM in advance. Likewise, any visibility material or product produced for this assignment must be in the name of IOM and shall not be used without prior approval from IOM.

The Evaluation Consultant shall be responsible to the Evaluation Manager (in this case, IOM's Senior National MEAL Officer) for the completion of the tasks and duties assigned throughout this Terms of Reference. All reports are subject to written approval from the Evaluation Manager before payments are effected to the Service Provider/Individual Consultant.

The following are the key actors involved in the implementation of this Final Evaluation:

1. Evaluation Manager

This role will be conducted by the Senior National MEAL Officer of IOM who will have the following functions: 

  • Supervise the evaluation process throughout the main phases of the evaluation (preparation of the ToR, implementation and management and use of the evaluation)
  • Participate in the selection and recruitment of the Evaluation Consultant
  • Provide the Evaluation Consultant with administrative support and required data and documentation
  • Ensure the evaluation deliverables meet the required quality 
  • Safeguard the independence of the exercise, including the selection of the Evaluation Consultant 
  • Review the Inception Report, Draft Evaluation and Final Evaluation Reports and provide necessary approvals on behalf of IOM Commissioning Unit
  • Collect and consolidate comments on draft evaluation reports and share with the Evaluation Consultant for finalization of the evaluation report
  • Contribute to the development of management responses and key actions to all recommendations addressed to IOM
  • Facilitate, monitor and report on implementation of management responses on a periodic basis

2. Senior National Programme Officer (IBG) will have the following functions: 

  • Establish the Evaluation Reference Group with key project partners when needed
  • Ensure and safeguard the independence of the evaluation
  • Provide comments and clarifications on the Terms of Reference, Draft Inception Report and Draft Evaluation Reports
  • Ensure the Evaluation Consultant’s access to all information, data and documentation relevant to the intervention, as well as to key actors and informants who are expected to participate in interviews, focus groups or other information-gathering methods 
  • Respond to evaluation recommendations by providing management responses and key actions
  • Ensure dissemination of the evaluation report to key stakeholders
  • Be responsible for implementation of key actions of the management response

3. Evaluation Consultant (who will be recruited under this Terms of Reference) will be responsible for the overall coordination and quality of the final evaluation report to be produced. It is the Evaluation Consultant who will be held accountable to IOM for the quality of the final product. The Evaluation Consultant will conduct the evaluation study by fulfilling their contractual duties and responsibilities in line with this ToR, the United Nations Evaluation Group (UNEG) norms, standards and ethical guidelines, and in full compliance with IOM's Evaluation Policy and Guidelines. This includes submission of all deliverables stipulated under this ToR document, to the satisfaction of IOM. The Evaluation Consultant's functions do not include any managerial, supervisory and/or representative functions in IOM, partners or beneficiaries. All documents and data provided to the Evaluation Consultant are confidential and cannot be used for any other purpose or shared with a third party without written approval from IOM. 

An Evaluation Consultant will conduct the Final Evaluation. The consultant must not have participated in the programme/project preparation, formulation, and/or implementation (including the writing of the Project Document) and must not have a conflict of interest with programme/project-related activities. The scope of work for the Evaluation Consultant will include, but not be limited to: 

  • To develop and finalize the inception report that will include elaboration of how each evaluation question will be answered along with proposed methods, proposed sources of data, and data collection and analysis procedures; 
  • To design the data collection tools; 
  • To reconstruct the Theory of Change if required;
  • To conduct data collection, analysis and interpretation; 
  • To develop the draft evaluation report; 
  • To finalize the evaluation report; 
  • To present findings and debrief;
  • To plan, execute and report on kick-off and feedback meetings and debriefings; 
  • To ensure compliance with the TOR; and 
  • To utilize best practice evaluation methodologies.

4. Evaluation Reference Group: This group is composed of the representatives of the major stakeholders in the project and will review and provide advice on the quality of the evaluation process, as well as on the evaluation products (more specifically comments and suggestions on the draft report and final report) and options for improvement.

Reporting Line

The Evaluation Consultant will be responsible to the Evaluation Manager (in this case, the Senior National MEAL Officer of IOM) for the completion of the tasks and duties assigned throughout this Terms of Reference document. All reports are subject to written approval from the Evaluation Manager, following consultation with the Senior Programme Coordinator, before payments are effected to the Evaluation Consultant. 

Reporting Language and Conditions

The reporting language will be English. All information should be provided in an electronic version in Word document format. The Evaluation Consultant shall be solely liable for the accuracy and reliability of the data provided, along with links to sources of information used.

Title Rights

The title rights, copyrights and all other rights whatsoever nature in any material produced under the provisions of this ToR will be vested exclusively in IOM.

  1.  Duty Station, Travel and Facilities to be Provided

The Evaluation Consultant will be requested to travel for at least 5 days (excluding travel days and weekends) to Ankara and up to 3 other selected provinces in Türkiye where the Project has been implemented as tentatively indicated in the expected data collection mission agenda below. All the costs associated with visa, travel, accommodation and any other living costs shall be borne by the Evaluation Consultant.

The methodology, including the final sampling strategy and the choice of locations to be visited, should be further developed by the Evaluation Consultant during the inception phase under the supervision of the MEAL Unit. In total, the evaluation is expected to require a maximum of 28 working days' input to complete, including all contributions to the inception, country visit and write-up phases of the evaluation.

TENTATIVE DATA COLLECTION MISSION AGENDA

Partners/Stakeholder(s) to be Interviewed | Location | Level of Effort (person/days; 1 day = 8 hours) | Method

  • MoT – DG Customs Enforcement (DGCE) | Ankara | 0.25 | In person
  • MoI – Department of European Union Affairs and Foreign Relations (DIAB) | Ankara | 0.25 | In person
  • EU Delegation (donor) | Ankara | 0.25 | In person
  • IOM teams (group interview) | Ankara | 0.25 | In person or Remote
  • Output-1 EUSECTRA Karlsruhe training and ToT participants (FGD, 8-12 persons) | Online | 0.5 | Remote
  • Output-2 ToT participants (FGD, 8-12 persons) | Online | 0.5 | Remote
  • Output-1 Cascaded Training Participants (including customs officers) | Ankara, Istanbul, Izmir and Antalya | 2 | In person
  • Output-2 Cascaded Training Participants (including customs and IMCPC staff) | Ankara, Istanbul, Izmir and Antalya | 2 | In person
  • Output-2 Open-Source Intelligence Team (MoT and private sector) (FGD) | Online | 0.5 | Remote
  • Output-1 Radiation Control Plan stakeholders (NDK and TENMAK) | Ankara | 0.5 | In person
  • Express courier companies (FGD) | Online | 0.5 | Remote
  • Online surveys targeting 180 respondents from cascaded training participants and open-source intelligence team members | Online | N/A | Remote

ESTIMATED TOTAL: 7 person/days

The locations of partners and stakeholders do not rule out the possibility of a remote mission, if approved by the Commissioning Unit under exceptional circumstances. The city names indicate a potential sample, which must be agreed between the Evaluation Consultant and the Commissioning Unit.

  1. Evaluation Costs and Terms of Payment

The Evaluation Consultant's fee will be all-inclusive, covering all costs related to visa, international/domestic flights, local transportation, hotel accommodation, meals, field trips to the respective implementation sites, translation/interpretation, venues for focus groups, printing, communication, and any other expenses required to complete the evaluation. Payment of consultancy fees will be processed within 30 calendar days of IOM's approval of the following deliverables: 

Payment Schedule:

Deliverable(s) | % of Payment

  • Submission and approval of the Final Inception Report and Data Collection Toolkit | 25%
  • Submission and approval of the Draft Evaluation Report | 50%
  • Submission and approval of the Final Evaluation Report with Audit Trail + Presentation slide deck + Management Response Matrix + Evaluation Brief and Infographics | 25%

Total | 100%

Duration of the Contract 

The Final Evaluation is expected to be conducted tentatively between 4 May and 30 September 2026, with a required level of effort of a maximum of 28 working days, inclusive of all evaluation costs. This period covers all evaluation tasks, including preparation, data collection, analysis, and reporting. The Evaluation Consultant is expected to manage and complete these tasks efficiently within the agreed timeframe, ensuring high-quality deliverables. The specific dates will be identified at the contracting stage.

  1. Specific Duties of Evaluation Consultant

An international Evaluation Consultant will be engaged to conduct the final evaluation. The Evaluation Consultant should present a combination of technical expertise and experience in evaluation, with a focus on border management, customs enforcement and surveillance. The Evaluation Consultant should be familiar with approaches used to assess a project's contribution to policy-level changes, as well as theory-based approaches to project evaluation using both quantitative and qualitative analysis of existing secondary data and primary data sources. The Evaluation Consultant should have comprehensive knowledge of institutional capacity building and technical assistance modalities.

Evaluation Specialist

The Evaluation Specialist will be responsible for the overall management, methodological rigor, and timely delivery of the evaluation.

The duties and responsibilities of the Evaluation Specialist include:

  • Lead the overall design and implementation of the evaluation in accordance with the Terms of Reference (ToR).

  • Develop and finalize the inception report, including the evaluation methodology, evaluation matrix, data collection methods, sampling approach, and analytical framework.

  • Ensure that the evaluation design aligns with internationally recognized evaluation standards, including the United Nations Evaluation Group (UNEG) norms and standards.

  • Coordinate the development of data collection tools and guide their application during the evaluation process.

  • Lead the reconstruction and validation of the Theory of Change, if required.

  • Plan and conduct all data collection activities, including key informant interviews, focus group discussions, document reviews, and other relevant methods.

  • Ensure the quality, consistency, and reliability of collected data and analytical outputs.

  • Lead the analysis and synthesis of evaluation findings.

  • Draft and finalize the evaluation report, ensuring high analytical quality and clear presentation of findings, conclusions, and recommendations.

  • Ensure that all deliverables meet the required quality standards and are submitted in a timely manner.

  • Lead the preparation and presentation of evaluation findings to stakeholders, including the kick-off meeting, debriefings, and validation workshops.

  • Liaise with IOM for all evaluation-related communications and submission of deliverables.


 


[1]    “Target groups” are the groups/entities who will directly benefit from the action at the action purpose level.

[2]    “Final beneficiaries” are those who will benefit from the action in the long term at the level of the society or sector at large. 

[4] For more information, please see publications on evaluation methods by the Independent Evaluation Group of the World Bank as well as the United Nations Evaluation Group:  http://www.unevaluation.org/document/detail/2939, https://ieg.worldbankgroup.org/evaluation-international-development as well as Befani and Mayne (2014) “Process Tracing and Contribution Analysis: A Combined Approach to Generative Causal Inference for Impact Evaluation”. https://onlinelibrary.wiley.com/doi/abs/10.1111/1759-5436.12110 

[5] See guidance available within the international development evaluation community on selecting appropriate evaluation methods to answer different types of evaluation questions, such as https://www.betterevaluation.org/en/approaches or  https://www.bond.org.uk/resources/evaluation-methods-tool 

[9] Though IOM does not oblige the Evaluation Consultant to use the same reporting format, the Evaluation Consultant is expected to address all components outlined in the IOM Components Template and Template for Evaluation Final Report per the IOM M&E Guidelines (see p. 237).

[10]  IOM will provide an IOM template for the brief, which will be developed in Microsoft Publisher. The brief should provide a short (two-page) overview of the evaluation, including key project information, findings, conclusions, and recommendations.

[11]  IOM template for Management Response and Follow-up.

Qualifications

Experience and Qualification Requirements

Evaluation Specialist

I. Academic Qualifications:

Required:

  • Advanced university degree (Master’s or higher) in International Relations, Public Administration, Security Studies, Criminology, Customs Administration, Nuclear Security, Social Sciences, Engineering, or a related field. (5 points)

Asset:

  • Formal training or certification in evaluation methodologies (e.g., OECD DAC evaluation, RBM, or equivalent). (5 points)

II. Years of experience:

Required:

  • Minimum 7 years of overall professional experience in research design, fieldwork, and qualitative, quantitative, and mixed-method research strategies, including but not limited to focus groups, surveys, and interview techniques. (10 points) 
  • Minimum 5 years of relevant evaluation experience designing, conducting, and managing complex international/national humanitarian/development evaluations, either as a team leader, team member, or sole evaluator, applying Theory of Change-based mixed-methods approaches to a variety of modalities in development cooperation involving intergovernmental organizations and their government, non-governmental, and private sector counterparts. Evidence of and links to at least three evaluation reports/research outputs shall be submitted. (20 points)

III. Competencies:

Required:

  • Demonstrated experience conducting at least 3 evaluations/reviews/assessments of capacity-building programmes, including training, systems strengthening, and institutional reform. (15 points)
  • Minimum 2 years of professional experience in one or more of the following thematic areas: (i) Border management and customs enforcement, (ii) Counter-smuggling / illicit trade prevention, (iii) Security sector governance or law enforcement capacity development. (15 points)

Asset:

  • 3-5 years of progressive professional experience in one or more of the following thematic areas: (i) Border management and customs enforcement, (ii) Counter-smuggling / illicit trade prevention, (iii) Security sector governance or law enforcement capacity development. (10 points)
  • Experience in designing, implementing or evaluating UN-implemented or EU-funded interventions in Türkiye or comparable contexts. (10 points)
  • Experience in assessing or conducting multi-stakeholder coordination (e.g., ministries, law enforcement, regulatory bodies, private sector actors). (5 points)

IV. Language:

Required:

  • Excellent command of spoken and written English. (5 points)

Notes:

  • Internships (paid/unpaid) are not considered professional experience. 
  • Obligatory military service is not considered professional experience.
  • Professional experience gained in an international setting is considered international experience.
  • Experience gained prior to completion of undergraduate studies is not considered professional experience.

Required Competencies

IOM’s competency framework can be found at this link. Competencies will be assessed during the selection process.

Values - all IOM staff members must abide by and demonstrate these five values:

  • Inclusion and respect for diversity: Respects and promotes individual and cultural differences. Encourages diversity and inclusion.
  • Integrity and transparency: Maintains high ethical standards and acts in a manner consistent with organizational principles/rules and standards of conduct.
  • Professionalism: Demonstrates ability to work in a composed, competent and committed manner and exercises careful judgment in meeting day-to-day challenges.
  • Courage: Demonstrates willingness to take a stand on issues of importance.
  • Empathy: Shows compassion for others, makes people feel safe, respected and fairly treated.

Core Competencies – behavioural indicators

  • Teamwork: Develops and promotes effective collaboration within and across units to achieve shared goals and optimize results.
  • Delivering results: Produces and delivers quality results in a service-oriented and timely manner. Is action oriented and committed to achieving agreed outcomes.
  • Managing and sharing knowledge: Continuously seeks to learn, share knowledge and innovate.
  • Accountability: Takes ownership for achieving the Organization’s priorities and assumes responsibility for own actions and delegated work.
  • Communication: Encourages and contributes to clear and open communication. Explains complex matters in an informative, inspiring and motivational way.

Notes

IOM covers Consultants against occupational accidents and illnesses under the Compensation Plan (CP), free of charge, for the duration of the consultancy. IOM does not provide evacuation or medical insurance for reasons related to non-occupational accidents and illnesses. Consultants are responsible for their own medical insurance for non-occupational accident or illness and will be required to provide written proof of such coverage before commencing work. 

Any offer made to the candidate in relation to this vacancy notice is subject to funding confirmation.

Appointment will be subject to certification that the candidate is medically fit for appointment, and to accreditation, any residency or visa requirements, and security clearances.

IOM has a zero-tolerance policy on conduct that is incompatible with the aims and objectives of the United Nations and IOM, including sexual exploitation and abuse, sexual harassment, abuse of authority and discrimination based on gender, nationality, age, race, sexual orientation, religious or ethnic background or disabilities.

IOM does not charge a fee at any stage of its recruitment process (application, interview, processing, training or other fee). IOM does not request any information related to bank accounts.

IOM only accepts duly completed applications submitted through the IOM e-Recruitment system (for internal candidates link here). The online tool also allows candidates to track the status of their application.

No late applications will be accepted. Only shortlisted candidates will be contacted.

For further information and other job postings, you are welcome to visit our website: IOM Careers and Job Vacancies

Required Skills

Job info

Contract Type: Consultancy (Up to 11 months)
Initial Contract Duration: 6 months
Org Type: Country Office
Vacancy Type: Consultancy
Recruiting Type: Consultant
Grade: UG
Is this S/VN based in an L3 office or in support to an L3 emergency response?: No