JM 03.2/3


Rome, 17 September 2003

The Independence and Location of the Evaluation Service – Further analysis of options

Table of Contents

Annex 1: Summary of Main Features Relating to Independence of Evaluation among Selected United Nations Agencies


1. At their joint session in May 2003, the Programme and Finance Committees considered a paper submitted by the Secretariat entitled “Independence and Location of the Evaluation Service” (JM 03.1/3). The report of the Committee1 stated that “The Committee … agreed unanimously on the importance of the evaluation function as a management tool for ensuring overall the programme effectiveness of the Organization and enhancing transparency in governance.” … “Regarding the proposal of the External Auditor to merge AUD and PBEE, most speakers were not in favour of combining these two units. However, it was felt that with the level of information provided, it was not possible to reach a firm position.” … “In discussing the issue of independence, the Committee members agreed broadly that evaluation should be independent and that it was essential that it contribute to organizational learning and improvement as well as to accountability” … “The Committees … requested a paper be prepared by the Secretariat … expanding on the various issues noted … and identifying alternative scenarios for ensuring greater independence, including a fully independent unit, as well as strengthening independence further under the current organizational location of PBEE. The paper should also take into account practices and lessons of selected international organizations that would be relevant in considering alternative models of arrangements for independence.”

2. The present paper responds to the Committees’ request. In preparing the paper, the World Bank and selected UN organizations were consulted, including through a questionnaire survey, on their evaluation arrangements and experience.2


3. There is broad consensus on the functions of evaluation among the UN and other international development agencies, i.e. to contribute to:

  1. improvement in future policies, programmes and projects through feedback of findings and lessons;
  2. broad-based learning within organizations; and
  3. accountability and transparency, including dissemination of information to the public as well as to stakeholders. (It may be noted that accountability for the use of public funds in evaluation relates to the results and impact of programmes and policies, rather than to financial accounting and legal compliance, which are normally regarded as audit and inspection functions3.)

4. To be credible both within organizations and to Governing Bodies and other stakeholders, there needs to be confidence that evaluation is:

  1. independent of vested interest;
  2. analytically rigorous, working from a comprehensive information base; and
  3. transparent in methodology.

5. Evaluation also needs to be cost-effective in bringing about tangible improvements in the relevance of programmes to members’ needs and in their performance. It is in this respect that there is greatest value in the engagement of programme managers and implementers in the evaluation process so as to ensure the relevance and practicality of recommendations and their acceptability. While endorsement of evaluation recommendations by Governing Bodies can, to some extent, encourage compliance, this is seldom as effective as when managers themselves become convinced of the need for change and support the solutions that have been worked through with them. Thus, it is generally recognized that there is no single, universal solution to the trade-off between accountability and learning but that “optimal solutions should be sought to balance all of these requirements” and “the organizational arrangements and procedures should facilitate the linking of evaluation findings to programming and policy making.”4

6. As noted in document JM 03.1/3, the issue of independence of evaluation has been most comprehensively analysed by the DAC Group on Aid Effectiveness.5 Considerations found to form an essential part of enabling conditions for independence include: (a) organizational structure for separating evaluation from line functions with the evaluation unit reporting directly to senior management or a governing body; (b) access to and control over resources for evaluation work (both budget and staff); (c) authority over planning and conduct of evaluation as well as the process (including how opinions of management are reflected in final reports); and (d) linking evaluation findings and lessons to programme and policy making.


7. The FAO Evaluation Service (PBEE) is located in the Office of Programme, Budget and Evaluation (PBE) within the Office of the Director-General (ODG). The Director, PBE provides administrative support and management supervision of PBEE as well as the key interface between evaluation and corporate planning, programming and budgetary processes. The Service is responsible for maintaining an adequate evaluation system for the Organization, including methodologies and procedures, and is separate from the line management of programmes. Main functions of PBEE and its main clients (users of evaluations), can be grouped as described below. All these functions are essential to the Organization and there are important synergies among them.

Major programme and thematic evaluations for the Governing Bodies and senior management

8. These evaluations, which address organizational strategies, programmes and other cross-cutting issues such as decentralization in the context of FAO’s overall work, are planned in consultation with the Programme Committee, and are managed and conducted by the Evaluation Service. In addition to the Governing Bodies and Senior Management, the main users of the evaluations are the concerned Assistant Directors-General (ADGs) and Directors, as relates to the substance of the programme, and the Director, PBE and ADG, Technical Cooperation Department (TC) as relates to use of resources. This function occupies approximately 40 percent of the time of the Evaluation Service.

Extra-budgetary funded projects and programmes of the Organization

9. The Evaluation Service establishes the procedures and manages the system for the evaluation of the extra-budgetary-funded projects and programmes of the Organization. The evaluations themselves are normally built into the workplan and budget for the programmes or projects and are carried out by teams of consultants and/or Evaluation Service staff appointed jointly by the donor, the beneficiary country and FAO. The main clients are the recipient country and the representatives of the funding source, with the project and higher-level programme management also being key users. This function occupies approximately 30 percent of the time of the Evaluation Service.

Internal evaluations requested by managers and auto-evaluation

10. PBEE conducts ad hoc evaluations at the request of individual managers. A new major function is to develop and manage the recently-initiated internal auto-evaluation system, which requires managers to evaluate all programme entities at least once every six years. The auto-evaluation support function is integrated into the PBE-supported Programme Planning, Implementation, Reporting and Evaluation Support System – PIRES. This function is expected to take up approximately 20 percent of the time of the Evaluation Service.

Support to overall strategy development and defining/enhancing the results-based model for medium-term planning

11. This function provides an input to the Director’s Office of PBE and is both a part of the feedback process and a pre-requisite for improving the basis for evaluation against programme objectives and indicators. Some inputs are also made to the improvement of planning and formulation for extra-budgetary projects and programmes in TC. Other feedback activities include dissemination of evaluation results and lessons, as well as support to the development of staff training. These functions occupy approximately 10 percent of the time of the Evaluation Service.


12. The current situation regarding the independence of evaluation in the Organization was presented to the previous Joint Session (see paragraphs 23-29, document JM 03.1/3). This section carries the analysis into the level of each of the main evaluation functions.

Major programme and thematic evaluations for the Governing Bodies and Senior Management

13. A biennial rolling plan for these evaluations is prepared by PBEE in consultation with department managers and for the approval of the Director-General. The proposed plan is submitted to the Programme Committee for its views and advice. In practice, the Director-General has consistently respected the suggestions of the Committee on the programme of work.

14. Overall evaluation methods and terms of reference for individual evaluations are decided by the Evaluation Service, but the process involves wide consultation with the FAO units concerned with a view to both improving the terms of reference and ensuring ownership of the evaluation by the programme managers.

15. External expertise is used in all evaluations. The Evaluation Service decides on the need for such inputs, including external leadership, external peer reviewers and the make-up of the team, on the basis of the expertise required, the strategic importance of the evaluation and budgetary considerations. All evaluations that are not externally led are submitted for review by an external peer group, which prepares its own report, to be considered by management and presented to the Programme Committee.

16. Draft evaluation reports are reviewed and commented upon by the concerned programme managers and senior management. The Evaluation Service is totally free to accept or reject suggestions for changes. Management comments, which may include disagreements with the evaluation report, are presented separately to the Programme Committee. While PBEE reports to the Director-General through the Director, PBE, the latter functions as facilitator for communication with other senior managers and his substantive comments on evaluation reports are included in the management response.

17. Similarly, PBEE interacts with the Programme Committee through management, but in practice the Evaluation Service relates directly with the Programme Committee on a regular basis (i.e. twice per year), and the Committee has considerable influence over matters such as evaluation policy and methods. This close working relationship with the Committee has enhanced the status of evaluation in the Secretariat. In particular, feedback on the implementation of evaluation recommendations that have been accepted by management and the Governing Bodies is now reported regularly to the Programme Committee.

18. All evaluation reports in this category are public documents and are readily available on the FAO website.

Extra-budgetary-funded projects and programmes of the Organization

19. The methodologies and procedures for evaluation of extra-budgetary programmes and projects are established by the Evaluation Service in consultation with TC Department, taking into account the needs of the funding agencies. The FAO approach conforms to the standards used by the UN agencies, based on the principles of independent evaluation (by external consultants) and tripartite participation (the recipient country, the donor and FAO). The main clients are representatives of the recipient countries, donors and FAO, including the project managers and staff, who use evaluation results to modify activities and to decide upon the future of projects.

20. Projects with a budget of over US$ 1 million should normally be evaluated at least once during their lifetime, and the decision on when this should occur is taken primarily by the operating unit in consultation with the donor and the beneficiary country. The evaluation teams are fully independent with regard to their findings and recommendations, and participants in the mission are separately nominated by the three parties to the project. PBEE plays a quality-assurance role by clearing the draft terms of reference and the make-up of the evaluation team, and by reviewing individual reports. It also maintains a database for analysis of evaluation results and lessons.

21. Publication of extra-budgetary project/programme evaluations is subject to the Organization’s disclosure policies. In practice, this means that the first responsibility is accountability to the donors, beneficiary countries and FAO and immediate programme improvement. Thus:

Internal evaluations requested by managers and auto-evaluation

22. Internal evaluation work by the Evaluation Service is carried out at the request of programme managers, primarily for programme improvement. Recently this has been mainly for the Technical Cooperation Department (TC) where the Chief of the Evaluation Service communicates directly with the ADG, TC. These are client-determined evaluations and the terms of reference are naturally agreed with the requesting department. The Evaluation Service has full authority over what it accepts and the Service stipulates that it is to have the final decision on the selection of the evaluation team, the conduct of the evaluation and on the contents of the final evaluation report. The report then becomes the property of the client department, but effectively such reports have unlimited internal distribution.

23. With respect to auto-evaluation, following extensive discussion, basic procedures have been specified by the Director-General. The first evaluations are only now taking place and the individual evaluations will be organized by the responsible line managers. Basic procedures and minimum requirements for the involvement of external expertise, including peer review, are being put in place by the Evaluation Service in consultation with the technical departments. The reports are intended first and foremost for internal accountability and lesson-learning. However, a synthesis will be prepared for the Governing Bodies through the Programme Committee and the Evaluation Service will draw on auto-evaluation reports in preparing its evaluations.

24. The above analysis shows, as stated in paragraph 28 of document JM 03.1/3, that within the existing organizational arrangements, the principal evaluation functions are actually carried out in an independent manner. This is demonstrably so with respect to the key criteria for independence noted in paragraph 6 above: the authority of the Evaluation Service over the planning and conduct of evaluation, including the handling of comments of programme managers and senior management; transparency of the process; and the linking of evaluation findings and lessons to programme and policy decisions. The areas where the current arrangements are most lacking in terms of independence criteria are: (a) the need for more predictable access to funding of Major Programme and Thematic Evaluations; and (b) more explicit codification of the existing practices as institutionalized processes.


25. To respond adequately to the Committees’ instruction that options should be considered by examining advantages and disadvantages of alternative arrangements for the location of the Evaluation Service, the analysis below draws on the experience of the World Bank and several UN organizations, including implications of alternative arrangements on the current evaluation functions. Annex 1 provides a summary description and matrix analysis of evaluation in nine of these organizations (including FAO).

Option A: Separate Evaluation Service reporting directly to the Governing Bodies through the Programme Committee

26. This arrangement is typified by IFAD and the World Bank. The evaluation units (Operations Evaluation Department, headed by a Director-General, in the Bank and Office of Evaluation, headed by a Director, in IFAD) are organizationally separate within the secretariats and report directly to the Executive Boards (not to the Presidents). The Executive Boards have established sub-committees to oversee evaluation. The evaluation heads are appointed (for a fixed term) and can only be dismissed directly by the Board (World Bank) or only with explicit approval of the Board (IFAD). The evaluation units are separate from other oversight units and there is separate consideration of their budgets in the programme and budgeting processes. This structure is of long standing for the Bank, but was introduced in IFAD only this year.

27. The advantage of this arrangement is that evaluation is clearly an accountability function at the service of the member states. The organizational separation and governance of the evaluation function give it a status and independence difficult to parallel under other arrangements. Management is required to respond formally and directly to the Executive Boards on evaluation findings and recommendations. On the other hand, separating the evaluation unit in this way leads managers to regard evaluators as controllers rather than partners. Evaluation-related information is widely disseminated in the Bank; however, there is a concern that Bank staff feel a limited sense of ownership of the results. In IFAD, the working of the system has yet to be tested, but its evaluation practices have traditionally emphasized more participatory means of feedback and learning, involving stakeholders at the country level as well as inside IFAD (“Core Learning Partnership”). It has also been noted that the more the evaluation unit is seen as separate and in a control relationship, the more difficulty it has in accessing informal information and in obtaining pro-active cooperation.

28. A pre-requisite for this type of arrangement to work effectively is an active role by the Governing Bodies, with evident cost implications. IFAD’s Evaluation Committee meets 5-6 times per year, while the Bank’s Committee on Development Effectiveness meets some 20 times throughout the year. In FAO, no such separate evaluation committee exists and the Programme Committee meets only twice per year.

29. This arrangement may be suitable for an evaluation system focused on accountability, i.e. in the FAO context, the major programme and thematic evaluations aimed primarily at the Governing Bodies and management - the first of the functions listed above. It would not be suitable for the other functions presently carried out by PBEE, which primarily serve the needs of secretariat management in programme improvement and learning, with secondary emphasis on accountability. However, these evaluation functions are essential to internal management, and if Option A were adopted, alternative arrangements would have to be made to meet these needs. In the World Bank, for example, management has put in place a system for internal review/evaluation of projects, separate from the Operations Evaluation Department (OED), with line managers being responsible for project completion reports (i.e. a form of terminal evaluation).

Option B: A Separate Evaluation Service reporting to the Director-General or the Deputy Director-General

30. This arrangement is typified by UNDP, where the evaluation unit reports directly to the Administrator. In UNICEF, the evaluation unit reports to the Deputy Executive Director in charge of Programme and Strategic Planning (a situation similar to that in FAO, except that the programme and strategic planning function occupies a higher place in the corporate hierarchy). In both organizations, the evaluation unit is separate from other internal oversight units and, apart from its placement in the organizational hierarchy, the working arrangements closely parallel those in FAO, although in UNDP the Governing Bodies probably play a less active role than the FAO Programme Committee in determining the work programme. In UNICEF, the Board determines evaluation policy, and it was recently decided to establish an internal Evaluation Committee made up of senior staff to strengthen the linkage between evaluation and programmes; it will review evaluation plans as well as individual reports. In UNDP, management comments are not issued separately from evaluation reports, while in UNICEF they are issued separately to the Board. In UNICEF, dissemination of evaluation findings and lessons includes workshops on evaluation findings and feedback into staff training, including on programme and project formulation.

31. Such a change could, on the one hand, increase the status and authority of the FAO Evaluation Service. The functions elaborated above could probably be continued, except for “direct support and feedback in the strategic and medium-term planning processes”, where the Service’s input would be at a more formal level and external to the PBE planning process.

32. A separate Office of Evaluation independent from Programme and Budget would lose the advantage of the internal linkages which now exist and which have led, for example, to close cooperation on the conceptual model for results-based budgeting, a common IT system in the form of PIRES (where the ‘ES’ refers to Evaluation Support), implementation and monitoring (i.e. annual assessment) and the feedback from evaluations that might affect the allocation of resources.

Option C: Evaluation Unit together with Audit and Control in an Office of Internal Oversight

33. In the IAEA, UN and Unesco, the evaluation unit is located within an office of internal oversight whose Director reports directly to the Executive Head6. The evaluation units are separate and the independence of evaluation work is generally recognized, although their work programmes and approaches tend to be influenced by the control function. The evaluation work programme and reports form part of the oversight work, for which the primary recipient is the Executive Head. This is particularly so in the cases of IAEA and Unesco where evaluation is regarded as a tool for internal management and evaluation reports are issued to the Board only in summary form (annually or semi-annually). Individual reports are published as a general practice in the UN and Unesco, but they are treated as confidential in IAEA.

34. The experience of IAEA, the UN and Unesco indicates that integrating evaluation into one office combining oversight functions, often headed by a senior manager directly reporting to the Executive Head, has had the benefit of enhancing the status of evaluation and giving it greater authority, including actions on recommendations (at least in a compliance sense). At the same time, it has tended to down-grade evaluation vis-à-vis other oversight functions. This is partly because the major focus of the office is on functions related to audit and inspection and partly because such an Office of Internal Oversight is generally headed by managers with expertise in audit and accounting. It has even been reported that, in some cases, resources for evaluation were reduced.

35. Similarly, the merger has often led to a relative separation of evaluation staff from programme staff and processes, partly because of the mandate of the new office and partly due to the perception of programme staff. Gaining access to informal information and obtaining full cooperation is also difficult. In some cases, the oversight office stresses the confidentiality of information and reports, thus weakening the more transparent working methods employed by the pre-existing separate evaluation unit. While feedback in the compliance sense may improve, the contribution to learning is weakened: the audit function and a culture of shared learning are not easily compatible.

36. As under Option B above (paragraphs 31 and 32), the Evaluation Service’s functions to support strategy development and medium-term planning processes and in support to auto-evaluation would have to be transferred out of the Evaluation Service as a consequence of this change of institutional location.

Option D: Evaluation Service in the Office of Programme, Budget and Evaluation (PBEE)

37. This organizational arrangement (where the evaluation unit is located in a larger office in the secretariat) is shared by a majority of UN agencies. Apart from Option C, there are two broad patterns, each with its own priority purpose: the first places evaluation in an office responsible for technical cooperation (e.g. UNIDO and WTO); the second in an office responsible for programme planning and management (ILO, WFP and WHO). The latter arrangement is similar to the FAO situation, with emphasis on feedback from evaluation into programme improvements while keeping evaluation as independent as possible. Although their experience would be of interest in improving arrangements in FAO, the arrangements in WFP and WHO are both of very recent origin (started in 2003) and the central evaluation functions in ILO are still evolving. Nevertheless, the experience of these agencies, including WFP, indicates that their evaluation work is largely independent, especially vis-à-vis line management, although there are differences in the way management comments are communicated to the Governing Bodies.

38. This positioning of the Evaluation Service in FAO may have disadvantages in terms of organizational status, visibility and access to the Executive Head. However, as already seen above (paragraphs 12-24), the Service’s evaluation work, especially the major programme and thematic evaluations for the Governing Bodies and senior management, is carried out independently in practice. Thus, potential shortcomings in the organizational arrangements are more theoretical than real, and could be addressed by some specific remedial measures, as discussed below.

39. On the other hand, the removal of PBEE from this structure would weaken the support to strategy development and conceptual input to medium-term planning procedures within the context of FAO’s results-based approach to programming and budgeting. It could also weaken the effectiveness of direct feedback into the programme planning process and resource allocation processes. The transfer of current functions from the Evaluation Service, which arises in varying degrees with each of the options above, would reduce the synergies between various types of evaluation work currently performed in the Service. Transfer of staff with functions would also tend to reduce critical mass.

Staff resources and budget for evaluation

40. In all the organizations considered, staff terms and conditions of service are governed by the rules of the organization. The only major exceptions are the World Bank and IFAD, where the head of the evaluation office is appointed either directly by the Executive Board (World Bank) or with the explicit approval of the Board (IFAD); in both organizations, the evaluation head has authority over staff selection. This would be the practice most consistent with Option A. In all other organizations, including FAO, the head of the evaluation unit is appointed by the Executive Head of the organization, while for evaluation staff, the evaluation head proposes candidates for vacancies, with selection decided under the normal staff selection procedures.

41. The fully-funded FAO regular budget for evaluation has remained more or less steady during recent biennia at a level of approximately US$ 2.1 million. In addition, it recovers some of its staff costs by charging for the services it provides, for example, to Field Programme evaluations. However, it should be noted that there are structural changes underway which may adversely affect the overall level of resources available for evaluation. These include:

42. Hence, while the funding of evaluations has not so far generally been a problem, there is a trend towards strategic and thematic evaluations which will not result in reimbursements from other programmes. Taken together with the increasing use of external expertise, whether as consultants or as members of peer review groups, this means that resources could become a significant constraint on the evaluation process. It is for this reason that an additional US$ 250,000 is included under the RG scenario to support the cost of comprehensive use of peer review groups.

43. The above points should not, however, give a misleading impression as to the relative size of the evaluation budget in FAO versus other institutions. Recent comparative data are not available, but on the last occasion when the JIU reviewed the issue, it found FAO to have the largest evaluation unit and the highest expenditure on evaluation among the UN specialized agencies.7

44. The above description applies to the current situation, but it needs to be stressed that a change in the organizational location of the evaluation function would, in all likelihood, affect the structure of the Service and the allocation of functions. Thus, for example, a fully independent service reporting to the Governing Bodies could presumably not be responsible for the evaluation of extra-budgetary-funded projects, internal evaluations requested by management or the auto-evaluation process. Such functions would be transferred to other internal units along with the requisite resources, which may be quite significant (i.e. several posts). The precise impact cannot be determined until a clearer idea evolves of the direction the institution intends to take, but this aspect should be borne in mind.


45. The current FAO evaluation system and process can be judged de facto to be independent. It is consistent with the principles of DAC and the practices of UN agencies and other international organizations. PBEE is separate from line management, able to choose evaluation subjects, and free to express its judgement. It is protected from pressure by management, which is required to express its own response to evaluation reports and their recommendations separately. It can establish appropriate evaluation methodologies and procedures. The Programme Committee has also acknowledged the effectiveness of the evaluation system and has welcomed the quality, thoroughness, transparency and independence of evaluation reports, appreciating in particular their frank assessments, including the critical observations made in evaluations managed by the Evaluation Service. Considerable progress has been made possible by management’s decision to adopt more strategic and results-based management and by the initiatives of the Programme Committee to support and reinforce this trend.

46. This indicates that independence in a functional sense involves many factors other than organizational structure. These include trust in the evaluators in terms of integrity and professional quality, collaborative interaction with the Governing Bodies (especially the Programme Committee) and the overall attitude of management in making changes for improvement and creating an environment wherein evaluation is free to do its job in the best interests of the Organization’s programmes.

47. The above analysis can be summarized as follows:

  1. Option A, which envisages a fully independent Evaluation Service reporting to the Governing Bodies, strengthens the accountability function for the membership but detracts from evaluation as a management tool. In particular, it would weaken partnerships with programme managers aimed at learning how a programme can be improved. It would also diminish the synergy with programme planning (i.e. conceptual frameworks and feedback), would be unable to perform many of the useful functions currently carried out by PBEE, and would increase the overall cost of evaluation work for the Organization, perhaps significantly, given the need for senior-level positions and regular meetings of the Governing Body review mechanism (currently the Programme Committee);
  2. Option B, which involves a separate organizational unit reporting directly to the Director-General (or Deputy Director-General), would increase the standing of the Evaluation Service and its work, as well as its independence from the other unit in PBE (i.e. the Programme and Budget Service). The corollary, however, is that it would diminish the synergy with programme planning, as described under Option A above, and that the Service would no longer be able to provide internal advice and support to the strategic and medium-term planning processes. Given the centrality of PBE in programme and budget work, such a decoupling could also reduce interaction between evaluation and programme management at the most senior levels. This option also involves some additional cost;
  3. Option C, which requires the merger of the Evaluation Service with the Office of the Inspector-General, would tend to strengthen the independence of evaluation from management, but it mixes two conceptually different processes (control versus learning) and hence detracts from the effectiveness of evaluation. Again, some of the current evaluation functions could not be handled by the merged unit and would therefore need to be transferred to other units along with the related resources;
  4. Option D, which maintains the status quo in terms of location, has the disadvantage that some members perceive it as insufficiently independent. On the other hand, the Programme Committee has consistently acknowledged the effectiveness of the current evaluation system and has welcomed the quality, thoroughness, transparency and independence of evaluation reports, appreciating in particular their frank assessments, including the critical observations made. Departing from the status quo would entail the loss of synergies between the various evaluation functions currently performed in the Service, and between those functions and strategy and programme development; it would also require a transfer of staff from the Service and entail cost increases. The conclusion thus remains in support of the status quo.

48. The issue of independence needs, however, to be addressed regardless of institutional location and there are a number of changes which would improve this aspect. These include formalizing current practices, improving further the internal consultation processes and providing budgetary independence for the conduct of evaluations. It is thus recommended, for the consideration of the Committees, that:

  1. internal coordination on evaluation be improved with the technical departments, including the TC Department, through the establishment of an internal evaluation committee under the chairmanship of the Deputy Director-General and the vice-chairmanship of the Director, PBE. The committee would in particular review evaluation plans, methods and procedures with a view to strengthening feedback from evaluation into strategic planning and results-based management approaches. The committee might have a sub-committee dealing with evaluation work related to the TC Department, in particular that related to the Field Programme;
  2. the current arrangements for interaction and consultation between the Office of the Inspector-General (AUD) and the Evaluation Service be formalized and their work programmes be mutually discussed at least once every six months, particularly with a view to deciding where individual evaluation or audit studies would benefit from the inclusion of staff from PBEE or AUD respectively in the team for that study;
  3. a separate budget be established for major programme and thematic evaluations for the Governing Bodies and management which could initially be set at about the current level of actual expenditure (i.e. somewhat above the current budget) for this purpose;
  4. the Evaluation Service publish and keep updated its approaches and methods for the conduct of evaluations, in particular for the major programme and thematic evaluations; and
  5. current practices be institutionalized, in particular:
  1. the Chief, PBEE report administratively to the Director, PBE but exercise managerial independence, including communicating directly with concerned senior managers when necessary;
  2. PBEE decide upon proposals for the rolling biennial plan of evaluations for the Governing Bodies following consultation with all levels of management. That plan would then be definitively reviewed and approved as amended by the Programme Committee;
  3. similarly, for major programme and thematic evaluations for the Governing Bodies and senior management, the terms of reference, the composition of evaluation teams and the evaluation reports would be finalized by PBEE following consultation with all levels of management with no requirement for clearance. Where evaluations are externally led, the evaluation team leader would independently finalize the report;
  4. all evaluations to be submitted to the Governing Bodies be accompanied by a management response which would explicitly accept or reject recommendations, in the latter case stating the reasons for doing so;
  5. the current practice of a management report to the Programme Committee, after a suitable interval, on the progress made in implementing those evaluation recommendations accepted by management and endorsed by the Programme Committee be formalized, including quality control of the responses by the Evaluation Service.


Annex 1: Summary of Main Features Relating to Independence of Evaluation among Selected United Nations Agencies

For each agency, the summary below covers five aspects: Organizational Aspects; Evaluation Policy and Plans; Evaluation Reports; Budget and Resources; and Feedback and Learning.


IFAD

·  Separate office from IFAD management, and reporting directly to Executive Board (EB)

·  Director of Office appointed by the President with approval of Executive Board

·  Director at D-1/2 level for maximum of ten years

·  Policy to be specifically approved by EB

·  Plans to be directly approved by EB

·  Submitted to EB and the President simultaneously

·  Management comments presented separately to EB

·  Approved directly by EB and the Governing Council

·  Annual budget of US$ 3.6 million

·  Professional staff of six (plus 2 to 3 APOs), appointed by the Office’s Director

·  Feedback by the President on comments of EB

·  Implementation on evaluation follow-up reported annually to EB

·  Institutional learning through special measures such as ACP, CLP and publications of lessons and issues.


World Bank (OED)

·  Separate office from the World Bank (WB) management, and reporting directly to the Executive Board (EB)

·  Director of Office (Director-General) appointed directly by EB for fixed term

·  Policy specifically approved by EB

·  Evaluation plans directly approved by EB

·  Submitted to EB directly (without President seeing the draft earlier)

·  Management comments presented separately to EB

·  Approved specifically by EB

·  Annual budget of US$ 20 million plus extra-budgetary resources

·  Professional staff of 60 plus 120 consultants, appointed directly by OED’s Director

·  Feedback action by President on Evaluation Reports and comments of EB

·  Follow-up to evaluation reported to the EB

·  Learning through internal workshops, publication of lessons and inputs to staff training


UNDP

·  Separate office within UNDP secretariat

·  Reporting directly to the Administrator

·  Head of Office (D-2) appointed by the Administrator

·  Policy submitted to the EB for review

·  Plans submitted to EB

·  Individual reports submitted to the Administrator with their summary presented annually to EB

·  Management comments normally incorporated in final summary (seldom presented separately)

·  Approved by EB as part of the corporate budget

·  Annual budget of US$ 3.5 million

·  Professional staff of 10

·  Feedback through comments of Administrator and EB

·  Learning through workshops and dissemination on website


UNICEF

·  Separate office within the secretariat, reporting directly to Deputy Executive Director and ED

·  Appointment of Head of Office (D-1) by the Executive Director

·  Policy submitted to the EB for approval

·  Plans submitted to EB

·  Reports submitted annually to Executive Board by the Executive Director (without changes)

·  Management comments presented separately as required

·  Approved by the Governing Body as part of corporate budget

·  Annual budget of US$ 2.5 million

·  Professional staff of 8 (4 at Headquarters and 4 in Regional Offices)

·  Feedback through comments of Executive Director and the Board

·  Learning through workshops, inputs to staff training and dissemination of lessons on website


IAEA

·  Part of Office for Internal Oversight Services (OIOS) in the secretariat

·  Reporting directly to the Head of OIOS, through him to Director-General (DG) and to the Governing Body

·  Head of Evaluation Unit appointed by DG

·  Policy largely shaped by DG as part of OIOS’s functions

·  Plans defined as those of OIOS, and submitted to Governing Body for review

·  Reports submitted by the Head of OIOS to DG as confidential OIOS reports

·  Summary reports presented annually to the Governing Body

·  Management comments reflected in the final report and not presented separately

·  Approved as part of OIOS budget in the corporate budget process

·  Annual budget is US$ 1 million with additional US$ 0.7 million for evaluation of technical cooperation activities

·  Professional staff of 6

·  Feedback by management comments and instructions

·  Follow up on evaluation reports submitted to the Governing Body for information

·  Learning - no particular effort made

UN (Office of Internal Oversight Services)

·  Part of OIOS within the secretariat, reporting immediately to the Assistant Secretary-General (Head of OIOS) and through him to the Secretary-General

·  Head of Evaluation Unit appointed by the Secretary-General per the personnel policy

·  Policy specifically approved by the General Assembly (Committee of Programme and Coordination - CPC)

·  Plans also specifically approved by the CPC

·  Reports treated as those of OIOS to the Secretary-General

·  Reports submitted to the General Assembly through CPC with separate management comments

·  Approved as part of budget for OIOS

·  Annual budget of US$ 0.7 million

·  Professional staff of 4

·  Feedback through comments of Secretary-General and the General Assembly

·  Follow-up on evaluation recommendations monitored and reported to the GA

·  Learning - no particular effort made


Unesco

·  Part of Office for Internal Oversight Services (OIOS), reporting to Director-General (DG) through the Head of OIOS

·  Head of Evaluation Unit (P-5) appointed by the DG

·  Policy largely decided by DG and reported to the Governing Body as part of the policy for OIOS

·  Plans decided by DG as part of OIOS medium-term plans, presented to the Governing Body

·  Reports cleared by the Head of OIOS as part of OIOS reports for use by management

·  Reports submitted periodically to the Governing Body in summary form with management comment

·  Budget is approved as part of OIOS budget

·  Annual budget is US$ 2.5 million

·  Professional staff of 7

·  Feedback through management instruction and comments of Governing Body

·  Learning – apart from dissemination of evaluation results by website, no particular effort made


WFP

·  Part of Division of Results-based Management (OEDR), reporting to the Executive Director (ED) via Head of OEDR

·  Head of Evaluation Unit (D-1) appointed by the ED

·  Major policy changes submitted to the Executive Board (EB) for approval

·  Annual plans formulated by the Evaluation Office, and shared with the EB for information

·  Reports submitted to EB by the Executive Director without changes

·  Management comments normally incorporated in the reports but could be issued separately

·  Budget approved as part of the corporate process

·  Annual budget of US$ 1.4 million plus operational support costs

·  Professional staff of 7

·  Feedback through internal consultations and comments of EB

·  Follow-up action on evaluation is monitored and reported to EB

·  Learning – through workshops and periodic newsletters


FAO

·  Part of PBE, reporting to the DG through Director, PBE

·  Head of Evaluation Service (D-1) appointed by DG

·  Policy formulated by DG in consultation with Governing Bodies (Programme Committee – PC)

·  Rolling biennial plans for major programme evaluations set in consultation with PC

·   Reports finalized by PBEE after internal and external reviews

·  Management comments (including those of Director, PBE) issued separately

·  Reports submitted to PC through DG

·  Biennial Programme Evaluation Report with summaries of individual evaluations submitted to the Conference through the Council

·  Budget approved as part of corporate process

·  Annual budget of US$ 1.3 million plus additional resources (ad hoc) as necessary

·  Professional staff of 8 plus one APO

·  Feedback through evaluation process and response of management and PC

·  Follow-up actions on evaluations reported to PC

·  Learning through workshops and dissemination of results on Website, including periodic synthesis studies


1 CL 124/4, paras. 13-16.

2 These were selected as examples of the main types of evaluation arrangements, and included, besides the World Bank, IAEA, IFAD, UNDP, Unesco, UNICEF, UN (Office of Internal Oversight Services) and WFP.

3 See para. 5, Principles for Evaluation of Development Assistance, Working Party on Aid Evaluation of the Development Assistance Committee (DAC), OECD, 1991.

4 Principles for Evaluation of Development Assistance, DAC, OECD, para. 466.

5 See the DAC report “Review of the DAC Principles for Evaluation of Development Assistance”, OECD, 1998.

6 In this context, it may be noted that WFP recently decided to transfer evaluation from the internal oversight office to the Results-based Management Division (i.e. a function vested in PBE in FAO).

7 JIU/REP/95/2 Accountability, Management Improvement and Oversight in the UN System.