The views expressed in this presentation are the views of the author/s and do not necessarily reflect the views or policies of the Asian
Development Bank, or its Board of Governors, or the governments they represent. ADB does not guarantee the accuracy of the data included
in this presentation and accepts no responsibility for any consequence of their use. The countries listed in this presentation do not imply any
view on ADB's part as to sovereignty or independent status or necessarily conform to ADB's terminology.
Learning from Evaluation
Bruce Britton
and Olivier Serrat
2013
Define: Monitoring and Evaluation
• Monitoring is the systematic and continuous assessment of
progress of a piece of work over time which checks that things
are going according to plan and enables positive adjustments to
be made.
• Evaluation is the systematic and objective assessment of an
ongoing or completed project, program, or policy, its design and
implementation.
• The aim of evaluation is to determine the relevance and
fulfillment of objectives, effectiveness, efficiency, impact, and
sustainability.
• An evaluation should provide information that is credible and
useful, enabling the incorporation of lessons learned into
decision-making processes.
According to the Organisation for Economic Co-operation
and Development
The Planning, Monitoring, and
Evaluation Triangle
Planning, Monitoring, and Evaluation are linked:
• Plans show what needs to be monitored
• Plans show what to evaluate
• Monitoring revises plans during project implementation
• Monitoring provides data to be used in evaluation
• Evaluation highlights areas that need close monitoring
• Evaluation provides recommendations for future planning
Main Types of Evaluation
• Summative or Formative Evaluation
• Internal or External Evaluation
• Self- or Independent Evaluation
• Project Evaluation
• Program Evaluation (Geographic or Thematic)
• Real-Time Evaluation
• Impact Evaluation
A quality evaluation should provide credible and useful evidence to strengthen
accountability for results or contribute to learning processes, or both.
The Results Chain
Inputs → Activities → Outputs → Outcome → Impact
Outputs, Outcomes, Impacts
Outputs—The products, capital goods, and services that result from a project; they may also include changes resulting from the project that are relevant to the achievement of its outcome.
Outcomes—The likely or achieved short-term and medium-term effects of a project's outputs.
Impacts—The positive and negative, primary and secondary, long-term effects produced by a project, directly or indirectly, intended or unintended.
OECD-DAC Evaluation Criteria
Relevance—Examines the extent to which the objectives of a project matched the priorities or policies of major stakeholders (including beneficiaries)
Effectiveness—Examines whether outputs led to the achievement of the planned outcome
Efficiency—Assesses outputs in relation to inputs
Impact—Assesses what changes (intended and unintended) have occurred as a result of the work
Sustainability—Looks at how far changes are likely to continue in the longer term
The Results Chain and the OECD-DAC
Evaluation Criteria
Needs → Objective → Inputs → Activities → Outputs → Outcome → Impact
Relevance, efficiency, effectiveness, and sustainability each examine a different segment of this chain.
Challenges and Limits to
Management
Impact: What the project is expected to contribute to
Outcome: What the project can be expected to achieve and be accountable for
Outputs: What is within the direct control of the project's management
Activities and Inputs: Within the project's direct control
The degree of control decreases, and the challenge of monitoring and evaluation increases, moving from inputs and activities toward outcome and impact.
Indicators
An indicator is a quantitative or qualitative factor or variable that offers a means to measure accomplishment, reflects the changes connected with a project, and helps assess performance.
Indicators do not provide proof so much as a reliable sign that the desired changes are happening (or have happened).
It is important not to confuse indicators with outputs, outcomes, or impacts.
Achieving the expected change in the indicators should not become the main purpose of a project.
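As a minimal illustration of the distinction drawn above, an indicator can be recorded alongside its baseline and target and read as a sign of progress rather than as the change itself. The sketch below is hypothetical; the field names and figures are assumptions, not a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """A quantitative indicator: a sign that desired change is happening, not the change itself."""
    description: str
    baseline: float
    target: float
    observed: float

    def progress(self) -> float:
        """Share of the baseline-to-target distance covered so far (can exceed 1.0)."""
        span = self.target - self.baseline
        return (self.observed - self.baseline) / span if span else 0.0

# Hypothetical example: share of households reporting year-round food security
indicator = Indicator("Households food-secure year-round (%)", baseline=40, target=70, observed=55)
print(f"{indicator.progress():.0%} of the planned change observed so far")
```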
Planning and the Use of
Logic Models
• In development assistance, most projects are planned using
logic models such as the logical framework (logframe).
• Logic models provide a systematic, structured approach to the
design of projects.
• Logic models involve determining the strategic elements
(inputs, outputs, outcome, and impact) and their causal
relationships, indicators, and the assumptions or risks that
may influence success or failure.
• Logic models can facilitate the planning, implementation, and
evaluation of projects; however, they have significant
limitations that can affect the design of evaluation systems.
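To make the structure described above concrete, a logframe can be sketched as a chain of result levels, each with its statement, indicators, and assumptions. The classes and field names below are illustrative assumptions, not ADB's design and monitoring framework or any standard library.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ResultLevel:
    """One level of a logic model: inputs, activities, outputs, outcome, or impact."""
    name: str
    statement: str                                          # what the project intends at this level
    indicators: List[str] = field(default_factory=list)     # how achievement will be measured
    assumptions: List[str] = field(default_factory=list)    # risks/assumptions linking this level to the next

@dataclass
class Logframe:
    """A project logic model as an assumed linear cause-effect chain of result levels."""
    levels: List[ResultLevel]

    def causal_chain(self) -> str:
        return " -> ".join(level.name for level in self.levels)

# Hypothetical example
logframe = Logframe(levels=[
    ResultLevel("Outputs", "Extension workers trained", indicators=["Number of workers trained"]),
    ResultLevel("Outcome", "Farmers adopt improved practices", indicators=["% of farmers adopting"],
                assumptions=["Extension workers remain in post"]),
    ResultLevel("Impact", "Greater food security"),
])
print(logframe.causal_chain())  # Outputs -> Outcome -> Impact
```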
The Limitations of Logic Models
Usually assume simple, linear cause-effect development relationships
Overlook or undervalue unintended or unplanned outcomes
Do not make explicit the theory of change underlying the initiative
Do not cope well with multi-factor, multi-stakeholder processes
Undervalue creativity and experimentation in the pursuit of long-term,
sustainable impact (the "lockframe" problem)
Encourage fragmented rather than holistic thinking
Require a high level of planning capacity
Purposes of Evaluation
• Accountability: To provide a basis for accountability, including the provision of information to the public
• Learning: To improve the development effectiveness of future policies, strategies, and operations through feedback of lessons learned
Does Evaluation Have to Be Either/Or?
What is Accountability?
Accountability is the obligation to demonstrate
that work has been conducted in compliance with
agreed rules and standards or to report fairly and
accurately on performance results vis-à-vis
mandated roles and/or plans. This may require a
careful, even legally defensible, demonstration that
the work is consistent with the contract aims.
Accountability is about demonstrating to donors,
beneficiaries, and implementing partners that
expenditure, actions, and results are as agreed or
are as can reasonably be expected in a given
situation.
Evaluation for Accountability
• Relates to standards, roles, and plans
• Is shaped by reporting requirements
• Focuses on effectiveness and efficiency
• Measures outputs and outcomes against original intentions
• Has a limited focus on the relevance and quality of the project
• Overlooks unintended outcomes (positive and negative)
• Concerns mostly single-loop learning
What is Learning?
Data → Information → Knowledge → Wisdom
Know What → Know How → Know Why
Reductionist → Systemic
• Learning is the acquisition of knowledge or skills through
instruction, study, and experience.
• Learning is driven by organization, people, knowledge, and
technology working in harmony—urging better and faster
learning, and increasing the relevance of an organization.
• Learning is an integral part of knowledge management and its
ultimate end.
Evaluation for Learning
• Recognizes the difference an organization has made
• Understands how the organization has helped to make a difference
• Explores assumptions specific to each component of a project
• Shares the learning with a wide audience
The Experiential Learning Cycle
Evaluation for Accountability
and Evaluation for Learning
Basic Aim. For accountability, the basic aim is to find out about the past; for learning, it is to improve future performance.
Emphasis. For accountability, emphasis is on the degree of success or failure; for learning, on the reasons for success or failure.
Favored by. Accountability evaluation is favored by parliaments, treasuries, media, and pressure groups; learning evaluation by development agencies, developing countries, research institutions, and consultants.
Selection of Topics. For accountability, topics are selected based on random samples; for learning, topics are selected for their potential lessons.
Status of Evaluation. For accountability, evaluation is an end product; for learning, evaluation is part of the project cycle.
Evaluation for Accountability
and Evaluation for Learning
Status of Evaluators. For accountability, evaluators should be impartial and independent; for learning, evaluators usually include staff members of the aid agency.
Importance of Data from Evaluations. For accountability, data are only one consideration; for learning, data are highly valued for the planning and appraising of new development activities.
Importance of Feedback. For accountability, feedback is relatively unimportant; for learning, feedback is vitally important.
Both/And?
Programs Should Be Held
Accountable For
Asking difficult questions
Maintaining a focus on outcome
Identifying limitations, problems, and successes
Taking risks rather than "playing it safe"
Actively seeking evaluation and feedback
Actively challenging assumptions
Identifying shortcomings and how they might be rectified
Effectively planning and managing based on monitoring data
Acting on findings from evaluation
Generating learning that can be used by others
What is Feedback?
Evaluation feedback is a dynamic process that
involves the presentation and dissemination of
evaluation information in order to ensure its
application into new or existing projects.
Feedback, as distinct from dissemination of
evaluation findings, is the process of ensuring
that lessons learned are incorporated into new
operations.
Actions to Improve the Use
of Evaluation Feedback
Understand how learning happens within and outside an
organization
Identify obstacles to learning and overcome them
Assess how the relevance and timeliness of evaluation
feedback can be improved
Tailor feedback to the needs of different audiences
Involve stakeholders in the design and implementation of
evaluations and the use of feedback results
Who Can Learn from Evaluation?
The wider community
People who are or will be planning, managing, or executing similar
projects in the future
The people who contribute to the evaluation (including direct
stakeholders)
The people who conduct the evaluation
The people who commission the evaluation
The beneficiaries who are affected by the work being evaluated
The people whose work is being evaluated (including implementing
agencies)
Why We Need a Learning Approach
to Evaluation
Learning should be at the core of every organization to enable adaptability and resilience in the face of change. Evaluation provides unique opportunities to learn throughout the management cycle of a project. To reap these opportunities, evaluation must be designed, conducted, and followed up with learning in mind.
How Can Stakeholders Contribute
to Learning from Evaluation?
Help design the terms of reference for the evaluation
Be involved in the evaluation process (as part of the evaluation team or reference group, or as a source of information)
Discuss and respond to the analyses and findings
Discuss and respond to recommendations
Use findings to influence future practice or policy
Review the evaluation process
What is a "Lesson"?
Lessons learned are findings and conclusions
that can be generalized beyond the
evaluated project. In formulating lessons,
the evaluators are expected to examine the
project in a wider perspective and put it in
relation to current ideas about good and bad
practice.
What is Needed to Learn a "Lesson"?
Reflect: what happened?
Identify: was there a difference between what was planned and what actually happened?
Analyze: why was there a difference and what were its root causes?
Generalize: what can be learned from this and what could be done in the future to avoid the problem or repeat the success?
Triangulate: what other sources confirm the lesson?
At this point, we have a lesson identified
but not yet learned: to truly learn a lesson
one must take action.
What Influences Whether a Lesson is
Learned?
Political Factors
Inspired Leadership
The Quality of the Lesson
Access to the Lesson
Conventional Wisdom
Chance
Vested Interests
Risk Aversion
Bandwagons
Pressure to Spend
Bureaucratic Inertia
Quality Standards
for Evaluation Use and Learning
• The evaluation is designed, conducted, and reported to meet the needs of its intended users.
• Conclusions, recommendations, and lessons are clear, relevant, targeted, and actionable so that the evaluation can be used to achieve its intended accountability and learning objectives.
• The evaluation is delivered in time to ensure optimal use of results.
• Systematic storage, dissemination, and management of the evaluation report are ensured to provide easy access to all partners, reach target audiences, and maximize the benefits of the evaluation.
Monitoring and Evaluation Systems as
Institutionalized Learning
• Learning must be incorporated into the overall management cycle of a project through an effective feedback system.
• Information must be disseminated and available to potential users in order to become applied knowledge.
• Learning is also a key tool for management and, as such, the strategy for the application of evaluative knowledge is an important means of advancing towards outcomes.
A Learning Approach to Evaluation
In development assistance, the overarching
goal for evaluation is to foster a transparent,
inquisitive, and self-critical organization
culture across the whole international
development community so we can all learn
to do better.
Eight Challenges Facing
Learning-Oriented Evaluations
• The inflexibility of logic models
• The demands for accountability and impact
• The constraints created by rigid reporting frameworks
• The constraints of quantitative indicators
• Involving stakeholders
• Learning considered as a knowledge commodity
• Underinvestment in evaluation
• Underinvestment in the architecture of knowledge management and learning
Focus of the Terms of Reference for
an Evaluation
Evaluation Purpose
Project Background
Stakeholder Involvement
Evaluation Questions
Findings, Conclusions, and Recommendations
Methodology
Work Plan and Schedule
Deliverables
Building Learning into the Terms of
Reference for an Evaluation
• Make the drafting of the terms of reference a participatory activity: involve stakeholders if you can
• Consider the utilization of the evaluation from the outset: who else might benefit from it?
• Spend time getting the evaluation questions clear and include questions about unintended outcomes
• Ensure that the "deliverables" include learning points aimed at a wide audience
• Build in diverse reporting and dissemination methods for a range of audiences
• Ensure there is follow-up by assigning responsibilities for implementing recommendations
• Build in a review of the evaluation process
Why Questions Are the
Heart of Evaluation for Learning
• Learning is best stimulated by seeking answers to questions
• Questions cut through bureaucracy and provide a meaningful focus for evaluation
• Seeking answers to questions can motivate and energize
• Questions make it easier to design the evaluation: what data to gather, how, and from whom?
• Answers to questions can provide a structure for findings, conclusions, and recommendations
Criteria for Useful Evaluation
Questions
Data can be used to answer each question
There is more than one possible answer to each question: each
question is open and its wording does not pre-determine the
answer
The primary intended users want to answer the questions: they
care about the answers
The primary users want to answer the questions for themselves,
not just for someone else
The intended users have ideas about how they would use the
answers to the questions: they can specify the relevance of the
answers for future action
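During drafting of the terms of reference, candidate questions can be screened against these criteria with a simple checklist. The sketch below is one assumed way to record such a review; the criterion strings paraphrase the list above and the function name is hypothetical.

```python
from typing import List

# Criteria for useful evaluation questions, paraphrased from the list above
CRITERIA = [
    "Data can be used to answer the question",
    "The question is open: more than one answer is possible and the wording does not pre-determine it",
    "The primary intended users care about the answer",
    "The primary users want the answer for themselves, not just for someone else",
    "The intended users can say how they would use the answer in future action",
]

def screen_question(question: str, met: List[bool]) -> None:
    """Print which of the criteria a candidate evaluation question meets."""
    print(question)
    for criterion, ok in zip(CRITERIA, met):
        print(f"  [{'x' if ok else ' '}] {criterion}")

# Hypothetical example
screen_question(
    "What unintended changes, positive or negative, did participating farmers experience?",
    met=[True, True, True, True, False],
)
```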
Utilization-Focused Evaluation
Utilization-focused evaluation is done for and with
specific intended primary users for specific intended
uses.
It begins with the premise that evaluations should be
judged by their utility and actual use.
It concerns how real people in the real world apply
evaluation findings and experience the evaluation
process.
Therefore, the focus in utilization-focused evaluation
is intended use by intended users.
The Stages of Utilization-Focused
Evaluation
1. Identify primary intended users
2. Gain commitment and focus the evaluation
3. Decide on evaluation methods
4. Analyze and interpret findings, reach conclusions, and make recommendations
5. Disseminate evaluation findings
Potential Evaluation Audiences
Target audiences for evaluation feedback include program funders, board members, policy makers, program managers, program staff, program clients, NGOs, researchers, the media, and other agencies.
Typology of Evaluation Use
Conceptual Use of Evaluation
• Conceptual use is about generating knowledge in and
understanding of a given area. Then, people think
about the project in a different way.
• Over time and given changes in the contextual and
political circumstances surrounding the project,
conceptual use can lead to significant changes.
Genuine Learning
Instrumental Use of Evaluation
• The evaluation directly affects decision-making and
influences changes in the program under review.
• Evidence for this type of utilization involves decisions
and actions that arise from the evaluation, including
the implementation of recommendations.
Practical Application
Process Use of Evaluation
• Process use concerns how individuals and
organizations are impacted as a result of participating
in an evaluation. Being involved in an evaluation may
lead to changes in the thoughts and behaviors of
individuals which may then lead to beneficial cultural
and organizational change.
• Types of use that precede lessons learned include
learning to learn, creating shared understanding,
developing networks, strengthening projects, and
boosting morale.
Learning by Doing
Symbolic Use of Evaluation
• Symbolic use means that evaluations are undertaken
to signify the purported rationality of the agency in
question. Hence one can claim that good
management practices are in place.
Purposeful Non-Learning
Political Use of Evaluation
• Evaluation occurs after key decisions have been taken.
The evaluation is then used to justify the pre-existing
position, e.g., budget cuts to a program.
Learning is Irrelevant
Factors That Affect Utilization
• Relevance of the findings, conclusions, and recommendations
• Credibility of the evaluators
• Quality of the analysis
• Actual findings
• The evaluator's communication practices
• Timeliness of reporting
• The organizational setting
• The attitudes of key individuals towards the evaluation
• The organizational climate, e.g., decision-making, political, and financial
Obstacles to Learning from Evaluation
Individual
Defense
Mechanisms
• Immediate personal reaction to feedback
that threatens us tends to be defensive.
In addition, we tend to resist evidence
that does not accord with our own world
views.
• It takes a conscious effort to actively seek
feedback and hear evidence that may be
negative, or may not fit with our own
world view. It is hard to treat discordant
information as something that may help
us to improve, and to navigate that
information with curiosity rather than
suspicion.
Obstacles to Learning from Evaluation
Organizational
Dilemmas
• Do organizational culture, structures,
policies, or procedures help or hinder
learning? Do design processes build
ownership and accountability from the
outset? Do reporting and review
procedures foster honesty and trust? Do
procedures allow for flexibility and
change? Do staff collectively share their
experiences and insights about what
works? Are there incentives for learning?
Are there time and resource constraints?
Enhancing Learning from Evaluation
For Individual Evaluations
• Select topics that are relevant to your intended audiences and
their timeframes.
• Proactively plan for use from the start: this means intended
use by intended users. Identify supportive and influential
individuals who want evaluative information.
• Vigorously engage intended users throughout the evaluation
process, e.g., by means of advisory committees, help with
forming recommendations, data analysis. Learning is an
active, not passive, process.
Enhancing Learning from Evaluation
For Individual Evaluations
• The evaluation needs to be credible in the eyes of users and
of high quality. If your findings are controversial your evidence
needs to be of an even higher standard.
• Timely reporting supports immediate use although research
suggests evaluations have a useful life of 8–10 years.
• Reporting of results needs to make use of multiple formats,
e.g., written, verbal, and/or visual, while presenting 3–5 key
messages in a user-friendly format.
Enhancing Learning from Evaluation
For Individual Evaluations
• Good recommendations are technically, politically,
administratively, legally, and financially viable.
• Evaluation findings must be assertively disseminated in a
manner that supports audience engagement. Your evaluation
findings are competing against other sources of information.
• Think about learning and use from a change management
perspective. Resistance to change is to be expected. Potential
users benefit from support: technical; emotional; financial;
practical help and advice; etc.
Enhancing Learning from Evaluation
Institutionalizing Monitoring and Evaluation Systems
• To enhance the prospects of learning and use, there is a need
to link evaluation into mainstream processes such as policy
making, planning, budgeting, accountability and reporting,
managing for results, and organizational incentives
Monitoring and Evaluation:
Conventional and Narrative
Conventional
• Deductive—about expected
outcomes
• Indicators often determined
by senior staff
• Closed or specific questioning
• Analysis by management
• Based on numbers—no
context
• About "proving"
• Central tendencies
Narrative
• Inductive—about unexpected
outcomes
• Diversity of views (from staff
and beneficiaries)
• Open questioning
• Participatory analysis
• Contextual—"rich description"
• About learning and
improvement
• Outer edges of experience
What is Required of Today's
Evaluations
• Involve stakeholders
• Design evaluations to enhance use
• Focus on performance improvement
• Demonstrate transparency
• Show cultural competence
• Build evaluation capacity
Why Use a Narrative
(Story-Based) Approach?
Storytelling:
• Stories start with the lived experience of beneficiaries
• People tell stories naturally
• People remember stories
• Stories can convey difficult messages
• Stories provide a "rich picture" to decision-makers
• Stories provide a basis for discussion
What is Most Significant Change?
The Most Significant Change technique is a qualitative and participatory form of monitoring and evaluation based on the collection and systematic selection of stories of reported changes from projects. It does not use pre-designed indicators but allows these to emerge from the stories told by those affected by the project.
The Most Significant Change Cycle
Project → Changes in People's Lives → Stories → Learning → Action → Improved Project
The Core of the Most Significant
Change Technique
A question: "In your opinion, what was the most significant change that took place in … over the … months?" [Describe the change and explain why you think it is significant.]
Re-iteration of the same kind of question: "Decide which of the change stories collected describes the 'most significant' changes experienced by the respondents." [Describe the change and tell why you think it is significant.]
What Makes Most Significant Change
Different
The advantages of the Most Significant Change technique, compared to conventional approaches to monitoring and evaluation, are that
• Those participating have a choice about what sort of information to collect.
• The technique uses diverse rather than standard data.
• Information is analyzed by all participants, not simply by a central unit.
• Subjectivity is used rather than avoided.
The 10 Steps of the Most Significant
Change Technique
1. Get started, establish champions, and familiarize
2. Determine domains of change
3. Define the reporting period
4. Collect stories
5. Review and filter stories regularly
6. Discuss and communicate the results of the selection with stakeholders
7. Verify stories
8. Quantify
9. Conduct secondary analysis and meta-monitoring
10. Revise the process
Selecting Significant Change Stories
Staff read through and identify the most significant of all the
submitted significant change stories.
Selection criteria emerge through discussion of stories—
these criteria are noted.
Staff document (i) what significant change was selected, (ii)
why it was selected, (iii) the process used to make the
selection, and (iv) who was involved.
Subjectivity is made accountable through transparency.
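Because subjectivity in Most Significant Change is made accountable through transparency, the selection record itself is a key artifact. A minimal sketch of such a record follows; the class and field names are assumptions chosen to mirror the four documentation points above, not part of the technique's published guidance.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChangeStory:
    storyteller: str
    domain: str      # domain of change, e.g., "changes in people's lives"
    text: str

@dataclass
class SelectionRecord:
    """Documents one selection round so that the subjective choice remains transparent."""
    selected: ChangeStory
    reason_selected: str
    selection_process: str
    participants: List[str] = field(default_factory=list)
    emergent_criteria: List[str] = field(default_factory=list)   # criteria noted during discussion

# Hypothetical example
story = ChangeStory("Farmer, village A", "changes in people's lives",
                    "We now grow yams and no longer buy food during the lean season.")
record = SelectionRecord(
    selected=story,
    reason_selected="Shows a durable change in household food security, not just participation",
    selection_process="Staff read all submitted stories, discussed them, and voted",
    participants=["Project staff", "Extension workers"],
    emergent_criteria=["Durability of the change", "Link to household food security"],
)
```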
How to Use the Most Significant Change
Technique
• Not as a stand-alone method
• Alongside indicator-based systems
• To identify unexpected changes
• To engage people in analysis of change
• To involve a wide range of people
• To focus on outcomes rather than outputs
The Conventional Problem-Focused
Approach to Evaluation
Identify the issues or problems → Determine the root causes → Generate solutions → Develop action plans → Implement action plans
Problem-Focused Approach—
Assumptions
• There is some ideal way for things to be (usually determined
by the logic model).
• If a situation is not as we would like it to be, it is a "problem"
to be solved.
• Deviations from the plan (logic model) are automatically seen
as problems.
• The way to solve a problem is to break it into parts and
analyze it.
• If we find the broken part and fix it, the whole problem will be
solved.
Unintended Consequences
of Problem-Focused Approaches
• Fragmented responses—lack of holistic overview
• Necessary adaptations to plans viewed negatively
• Focus on single-loop learning—lack of creativity and
innovation; untested assumptions
• Reinforces negative vocabulary—drains energy; leads to
hopelessness and wish to simply get work completed
• Reinforces "blame culture"—undermines trust; increases risk
aversion; strains relationships
Appreciative Inquiry
Ap-pre'ci-ate, v.
• To recognize the quality, significance, or magnitude of
• To recognize the best in someone or something
• To be fully aware of or sensitive to
• To raise in value or price
In-quire', v.
• The act of exploration and discovery
• The process of gathering information for the purpose of learning and changing
• A close examination in a quest for truth
What is Appreciative Inquiry?
• Appreciative inquiry builds on learning from what is working
well rather than focusing on "fixing" problems.
• Appreciative inquiry brings positive experiences and successes
to everyone's awareness.
• Appreciative inquiry uses a process of collaborative inquiry
that collects and celebrates good news stories.
• Stories that emanate from appreciative inquiry generate
knowledge that strengthens the identity, spirit, and vision of
the team involved in the project and helps everyone learn
how to better guide its development.
Appreciative Inquiry and Evaluation
• Appreciative inquiry helps identify and value what is working
well in a project and builds on these good practices.
• Appreciative inquiry is better suited to formative evaluation or
monitoring than to summative evaluation.
• Appreciative inquiry can be used to guide questions during
development of the terms of reference for an evaluation and
at data collection stages.
Comparing Appreciative Inquiry with
Problem-Focused Approaches
What to grow
New grammar of the true,
good, better, possible
"Problem focus" implies
that there is an ideal
Expands vision of
preferred future
Creates new energy fast
Assumes organizations are
sources of capacity and
imagination
What to fix
Underlying grammar =
problem, symptoms,
causes, solutions, action
plan, intervention
Breaks things into pieces
and specialties,
guaranteeing fragmented
responses
Slow! It takes a lot of
positive emotion to make
real change
Assumes organizations are
constellations of problems
to be overcome
Appreciative Inquiry
Problem Focus
The Appreciative Inquiry Process
—The 5-Ds or 5-Is
1. Definition: Frame the inquiry (Initiate)
2. Discovery: What is good? What has worked? (Inquire)
3. Dream: What might be? (Imagine)
4. Design: What should be? (Innovate)
5. Destiny: How to make what should be happen? (Implement)
Example Starter Questions for
Appreciative Inquiry
• Think back on your time with this project. Describe a high
point or exceptional experience that demonstrates what the
project has been able to achieve.
• Describe a time when this project has been at its best—when
people were proud to be a part of it. What happened? What
made it possible for this highpoint to occur? What would
things look like if that example of excellence was the norm?
Good appreciative inquiry questions should illuminate in turn the five dimensions the
technique addresses.
Appreciative Inquiry Can Enrich
Evaluation When …
The organization is interested in using participatory and
collaborative evaluation approaches
The evaluation is happening part way through a project
(formative)
The evaluation includes a wide range of stakeholders with differing
views of "success"
The organization is genuinely interested in learning from
unintended as well as intended outcomes
The organization wants to use evaluation findings to guide change
There is a desire to build evaluation capacity
The Nature of Development
Complex: involves a mix of actors and factors
Indeterminate: independent of the duration of a project
Non-Linear: unexpected, emergent, discontinuous
Two-Way: results may change the project
Beyond Control: but subject to influence
Incremental, Cumulative: watersheds and tipping points
Challenges in Evaluating Development
Interventions
Establishing cause
and effect in open
systems
Measuring what
did not happen
Reporting on
emerging
objectives
Justifying
continuing
"successful"
projects
Timing—when to
evaluate
Encouraging
iterative learning
among partners
Clarifying values
Working in
"insecure"
situations
A Critical Look at Logic Models
Clarify objectives and how
they will be achieved
Make explicit the
assumptions about cause
and effect
Identify potential risks
Establish how progress will
be monitored
Lack of flexibility
Lack of attention to
relationships
Problem-focused approach
to planning
Insufficient attention to
outcomes
Oversimplifies monitoring
and evaluation
Inappropriate at program
and organizational levels
Outcome Mapping
Outcome mapping is an approach to planning, monitoring, and
evaluating social change initiatives.
Outcome mapping uses a set of tools and guidelines that steer
teams through a process to identify their project's desired
changes and to work collaboratively to bring about those
changes.
Results are measured by changes in the relationships,
behaviors, and actions of the individuals, groups, and
organizations the project is working directly with and seeking to
influence.
Outcome Mapping Can Help …
Understand and influence human well-being
Plan and measure social change
Foster social and organizational learning
Identify parties with whom one might work directly to
influence behavioral change
Bring stakeholders into the monitoring and evaluation process
Strengthen partnerships and alliances
Plan and monitor behavioral change
Monitor the internal practices of the project
Design an evaluation plan
The Three Key Concepts
of Outcome Mapping
• Sphere of Influence
• Boundary Partners
• Outcomes Understood as Changes in Behavior
Development is about people—it is about how they relate to one another and their
environment, and how they learn in doing so. Outcome mapping puts people and
learning first and accepts unexpected change as a source of innovation. It shifts the focus
from changes in state, viz. reduced poverty, to changes in behaviors, relationships,
actions, and activities.
—Olivier Serrat
There is a Limit to Our Influence
Project (sphere of control) → Partners (sphere of influence) → Beneficiaries (sphere of interest)
There is a Limit to Our Influence
Inputs and Outputs (sphere of control) → Outcomes (sphere of influence) → Impact (sphere of interest)
Focus of Outcome Mapping
Inputs → Activities → Outputs → Outcome → Impact
Outcome mapping focuses on the boundary partners that stand between the project and its beneficiaries and stakeholders.
Boundary Partner Example
• Participatory research on demonstration plots to reduce use of chemicals and introduce yam cultivation
• Farmers participate in field trials
• Participating farmers learn how to identify and use non-cultivated food sources and how to grow yams
• Extension workers visit demonstration farms
• Training of extension workers
• Documentation of effective farming practices
• Increased knowledge of traditional practices through food festivals and other social marketing
• Extension workers promote use of non-cultivated food sources, natural fertilizers, and adoption of yam cultivation
• Other farmers use non-cultivated food sources and grow yams
• Less dependency on market for food sources
• Greater food security
• Improved health and reduced poverty
The Problem with Impact
Impact Implies …
• Cause and effect
• Positive, intended results
• Focus on ultimate effects
• Credit goes to a single contributor
• Story ends when project obtains success
The Reality is …
• Open system
• Unexpected positive and negative results occur
• Upstream effects are important
• Multiple actors create results and deserve credit
• Change process never ends
The Principles of Outcome Mapping
• Actor-Centered Development and Behavioral Change
• Continuous Learning and Flexibility
• Participation and Accountability
• Non-Linearity and Contribution (not attribution and control)
Three Stages of Outcome Mapping
Intentional Design
• Step 1: Vision
• Step 2: Mission
• Step 3: Boundary Partners
• Step 4: Outcome Challenges
• Step 5: Progress Markers
• Step 6: Strategy Maps
• Step 7: Organizational Practices
Outcome and Performance
Monitoring
• Step 8: Monitoring Priorities
• Step 9: Outcome Journals
• Step 10: Strategy Journal
• Step 11: Performance Journal
Evaluation Planning
• Step 12: Evaluation Plan
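Step 5 (progress markers) and Step 9 (outcome journals) lend themselves to a simple monitoring record: graduated markers tracked per boundary partner, with dated observations of behavioral change. The sketch below is an assumed illustration; the "expect to see / like to see / love to see" grading follows common outcome mapping usage, while the class and field names are invented for this example.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Graduated progress markers, as commonly used in outcome mapping
GRADES = ("expect to see", "like to see", "love to see")

@dataclass
class OutcomeJournal:
    """Tracks observed behavioral change of one boundary partner against progress markers."""
    boundary_partner: str
    markers: Dict[str, List[str]]                            # grade -> marker statements
    observations: List[str] = field(default_factory=list)    # dated notes of observed change

    def add_observation(self, note: str) -> None:
        self.observations.append(note)

# Hypothetical example
journal = OutcomeJournal(
    boundary_partner="Agricultural extension workers",
    markers={
        "expect to see": ["Visit demonstration farms"],
        "like to see": ["Promote non-cultivated food sources to farmers"],
        "love to see": ["Run farmer field days without project prompting"],
    },
)
journal.add_observation("2013-06: two farmer field days organized by extension workers on their own initiative")
```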
When Does Outcome Mapping
Work Best?
When working in partnership
When building capacity
When a deeper understanding of social factors is critical
When promoting knowledge and influencing policy
When tackling complex problems
To embed reflection and dialogue
Tips for Introducing Outcome
Mapping
• Use it flexibly
• Foster capacities and mindsets
• Use it to encourage collaboration
• Use it to manage shifts in organizational culture
• Focus on timing
Learning and Project Failure
Preparation
• Failures of intelligence: not knowing enough at the early stages of project formulation, resulting in crucial aspects of the project's context being ignored.
• Failures of decision making: drawing false conclusions or making wrong choices from the data that are available, and underestimating the importance of key pieces of information.
Implementation
• Failures of implementation: bad or inadequate management of one or more important aspects of the project.
• Failures of reaction: inability or unwillingness to modify the project in response to new information or changes in conditions that come to light as the project proceeds.
Evaluation
• Failures of evaluation: not paying enough attention to the results.
• Failures of learning: not transferring the lessons into future plans and procedures.
Competencies for Knowledge
Management and Learning
• Strategy Development: A strategy is a long-term plan of action to achieve a particular goal.
• Management Techniques: Knowledge is a resource. It needs to be managed effectively, just like other resources such as financial and human resources.
• Collaboration Mechanisms: When working with others, efforts sometimes turn out to be less than the sum of the parts. Too often, not enough attention is paid to facilitating effective collaborative practices.
• Knowledge Sharing and Learning: Two-way communication that takes place simply and effectively builds knowledge.
• Knowledge Capture and Storage: Organizational memory is a key part of any knowledge management system. The knowledge needs to be retrievable to be useful.
Knowledge Solutions for Knowledge
Management and Learning
• Strategy Development: behavior and change; emergence and scenario thinking; institutional capacity and participation; knowledge assets; marketing; organizational learning; partnerships and networks of practice
• Management Techniques: branding and value; complexity and lateral thinking; linear thinking; organizational change; talent management
• Collaboration Mechanisms: collaborative tools; communities of practice and learning alliances; leadership; social innovations; teamwork
• Knowledge Sharing and Learning: creativity, innovation, and learning; learning and development; learning lessons; dissemination
• Knowledge Capture and Storage: knowledge harvesting; reporting; technology platforms
www.adb.org/site/knowledge-management/knowledge-solutions
Developing Evaluation Capacity
Capacity is the ability of people, organizations, and society to manage their affairs successfully. Capacity to undertake effective monitoring and evaluation is a determining factor of development effectiveness. Evaluation capacity development is the process of reinforcing or establishing the skills, resources, structures, and commitment to conduct and use monitoring and evaluation over time.
Why Develop Evaluation Capacity?
Stronger evaluation capacity will help development agencies
• Develop as a learning organization.
• Take ownership of their visions for poverty reduction, provided the evaluation vision is aligned with them.
• Profit more effectively from formal evaluations.
• Make self-evaluations an important part of their activities.
• Focus on quality improvement efforts.
• Increase the benefits and decrease the costs associated with their
operations.
• Augment their ability to change programming midstream and adapt
in a dynamic, unpredictable environment.
• Build evaluation equity by becoming better able to conduct more of their own self-evaluations instead of contracting them out.
• Shorten the learning cycle.
Using Knowledge Management
for Evaluation
Evaluation findings only add value when they are used, so:
• Make evaluation findings available when needed by decision
makers, in a user-friendly format, e.g., a searchable lessons
database system.
• Invest in knowledge management architecture.
• Make evaluation findings available in a range of knowledge
products, including web-based.
• Encourage collaboration between evaluation specialists and
knowledge management specialists.
• Improve targeted dissemination of evaluation findings.
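The searchable lessons database mentioned above can be as simple as a keyword search over structured lesson records; real systems would add indexing, tagging, and ranking. The sketch below is an assumed, minimal illustration, not a description of any particular ADB system.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Lesson:
    title: str
    sector: str
    year: int
    text: str

def search(lessons: List[Lesson], keyword: str) -> List[Lesson]:
    """Naive keyword search over lesson records."""
    keyword = keyword.lower()
    return [item for item in lessons if keyword in item.text.lower() or keyword in item.title.lower()]

# Hypothetical example
lessons = [
    Lesson("Engage intended users early", "agriculture", 2012,
           "Evaluations were used most where intended users helped draft the terms of reference."),
    Lesson("Report in time for decisions", "education", 2011,
           "Findings delivered after budget approval had little influence on planning."),
]
for hit in search(lessons, "terms of reference"):
    print(hit.year, hit.title)
```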
How to Share Findings
from Evaluations
To increase the chances evaluation findings will be used they must be
shared widely, so:
• Upload to public websites.
• Hold meetings with interested stakeholders.
• Incorporate findings into existing publications, e.g., annual reports,
newsletters.
• Present findings and learning points at annual meetings.
• Publish an article for a journal.
• Present a paper at a conference or seminar.
• Invite local researchers and academics to discuss evaluation
findings.
• Share findings and learning points through workshops and training.
• Share lessons through knowledge networks and communities of
practice.
Characteristics
of a Good Knowledge Product
A good knowledge product is
• Related to what users want
• Designed for a specific audience
• Relevant to decision-making needs
• Timely
• Written in clear and easily understandable language
• Based on evaluation information without bias
• If possible, developed through a participatory process and validated
through a quality assurance process with relevant stakeholders
• Easily accessible to target audience
• Consistent with other products, to enhance visibility and learning
Further Reading
• ADB. 2008. Output Accomplishment and the Design and
Monitoring Framework. Manila.
www.adb.org/publications/output-accomplishment-and-
design-and-monitoring-framework
• ——. 2008. Focusing on Project Metrics. Manila.
www.adb.org/publications/focusing-project-metrics
• ——. 2008. Outcome Mapping. Manila.
www.adb.org/publications/outcome-mapping
• ——. 2008. The Reframing Matrix. Manila.
www.adb.org/publications/reframing-matrix
• ——. 2008. Appreciative Inquiry. Manila.
www.adb.org/publications/appreciative-inquiry
Further Reading
• ADB. 2009. The Most Significant Change Technique. Manila.
www.adb.org/publications/most-significant-change-technique
• ——. 2009. Monthly Progress Notes. Manila.
www.adb.org/publications/monthly-progress-notes
• ——. 2009. Learning from Evaluation. Manila.
www.adb.org/publications/learning-evaluation
• ——. 2009. Asking Effective Questions. Manila.
www.adb.org/publications/learning-evaluation
• ——. 2010. Embracing Failure. Manila.
www.adb.org/publications/embracing-failure
Further Reading
• ADB. 2010. Harvesting Knowledge. Manila.
www.adb.org/publications/harvesting-knowledge
• ——. 2010. The Perils of Performance Measurement. Manila.
www.adb.org/publications/perils-performance-measurement
• ——. 2011. Learning Histories. Manila.
www.adb.org/publications/learning-histories
• Anne Acosta and Boru Douthwaite. 2005. Appreciative
Inquiry: An Approach for Learning and Change Based on Our
Own Best Practices. ILAC Brief 6.
• Ollie Bakewell. 2003. Sharpening the Development Process.
INTRAC.
Further Reading
• Scott Bayley. 2008. Maximizing the Use of Evaluation Findings.
Manila.
• Rick Davies and Jess Dart. 2004. The Most Significant Change
(MSC) Technique: A Guide to Its Use. Monitoring and
Evaluation News.
• Paul Engel and Charlotte Carlsson. 2002. Enhancing Learning
through Evaluation. ECDPM.
• Lucy Earle. 2003. Lost in the Matrix: The Logframe and the
Local Picture. INTRAC.
• Paul Engel, Charlotte Carlsson, and Arin Van Zee. 2003.
Making Evaluation Results Count: Internalizing Evidence by
Learning. ECDPM.
Further Reading
• Ollie Bakewell and Anne Garbutt. 2005. The Use and Abuse of
the Logical Framework Approach. Sida.
• Stephen Gill. 2009. Developing a Learning Culture in Nonprofit
Organizations. Sage.
• Harry Jones and Simon Hearn. 2009. Outcome Mapping: A
Realistic Alternative for Planning, Monitoring, and Evaluation.
Overseas Development Institute.
• OECD. 2001. Evaluation Feedback for Effective Learning and
Accountability.
• ——. 2010. DAC Quality Standards for Development
Evaluation.
Further Reading
• OECD. Undated. Evaluating Development Co-Operation:
Summary of Key Norms and Standards.
• Michael Quinn Patton. 2008. Utilization-Focused Evaluation.
Sage.
• Michael Quinn Patton and Douglas Horton. 2009. Utilization-
Focused Evaluation for Agricultural Innovation. CGIAR-ILAC.
• Burt Perrin. 2007. Towards a New View of Accountability. In
Marie-Louise Bemelmans-Videc, Jeremy Lonsdale, and Burt
Perrin (eds.). 2007. Making Accountability Work: Dilemmas for
Evaluation and for Audit. Transaction Publishers.
• Hallie Preskill and Rosalie Torres. 1999. Evaluative Inquiry for
Learning in Organizations. Sage.
Further Reading
• Sida. 2004. Looking Back, Moving Forward. Sida Evaluation
Manual. Sida.
• UNDP. 2009. Handbook on Planning Monitoring and
Evaluation for Development Results.
• Rob Vincent and Ailish Byrne. 2006. Enhancing Learning in
Development Partnerships. Development in Practice. Vol. 16,
No. 5, pp. 385-399.
• Eric Vogt, Juanita Brown, and David Isaacs. 2003. The Art of
Powerful Questions: Catalyzing Insight, Innovation, and
Action. Whole Systems Associates.
Further Reading
• Jim Woodhill. 2005. M&E as Learning: Re-Thinking the
Dominant Paradigm. In Monitoring and Evaluation of Soil
Conservation and Watershed Development Projects. World
Association of Soil and Water Conservation.
• World Bank. 2004. Ten Steps to a Results-Based Monitoring
and Evaluation System.
Videos
• Jess Dart. Most Significant Change, Part 1.
www.youtube.com/watch?v=H32FTygl-Zs&feature=related
• ——. Most Significant Change, Part 2.
www.youtube.com/watch?v=b-wpBoVPkc0&feature=related
• ——. Most Significant Change, Part 3.
www.youtube.com/watch?v=PazXICHBDDc&feature=related
• ——. Most Significant Change Part 4.
www.youtube.com/watch?v=8DmMXiJr1iw&feature=related
• ——. Most Significant Change Part 5 (Q&A).
www.youtube.com/watch?v=JuaGmstG8Kc&feature=related
• Sarah Earl. Introduction to Outcome Mapping, Part 1.
www.youtube.com/watch?v=fPL_KEUawnc
Videos
• Sarah Earl. Introduction to Outcome Mapping, Part 2.
www.youtube.com/watch?v=a9jmD-mC2lQ&NR=1
• ——. Introduction to Outcome Mapping, Part 3.
www.youtube.com/watch?v=ulXcE455pj4&feature=related
• ——. Utilization-Focused Evaluation.
www.youtube.com/watch?v=KY4krwHTWPU&feature=related
Quick Response Codes
@ADB
@ADB Sustainable
Development Timeline
@Academia.edu
@LinkedIn
@ResearchGate
@Scholar
@SlideShare
@Twitter
Reflections on a Multifactor Leadership Questionnaire 360 Leader's ReportReflections on a Multifactor Leadership Questionnaire 360 Leader's Report
Reflections on a Multifactor Leadership Questionnaire 360 Leader's Report
 
Ethics at the Movies: Erin Brockovich (2000)
Ethics at the Movies: Erin Brockovich (2000)Ethics at the Movies: Erin Brockovich (2000)
Ethics at the Movies: Erin Brockovich (2000)
 
The Servant Leadership of Gandhi
The Servant Leadership of GandhiThe Servant Leadership of Gandhi
The Servant Leadership of Gandhi
 
Idealized Design for Virtual Teaming
Idealized Design for Virtual TeamingIdealized Design for Virtual Teaming
Idealized Design for Virtual Teaming
 
Dell Inc.: A 2019 World's Most Ethical Companies Honoree
Dell Inc.: A 2019 World's Most Ethical Companies HonoreeDell Inc.: A 2019 World's Most Ethical Companies Honoree
Dell Inc.: A 2019 World's Most Ethical Companies Honoree
 
Perspectives on Work Teams
Perspectives on Work TeamsPerspectives on Work Teams
Perspectives on Work Teams
 
MediSys Corp.: The IntensCare Product Development Team
MediSys Corp.: The IntensCare Product Development TeamMediSys Corp.: The IntensCare Product Development Team
MediSys Corp.: The IntensCare Product Development Team
 
Independent Evaluation for Learning: Toward Systemic Change
Independent Evaluation for Learning: Toward Systemic ChangeIndependent Evaluation for Learning: Toward Systemic Change
Independent Evaluation for Learning: Toward Systemic Change
 
Evolving Corporate Culture
Evolving Corporate CultureEvolving Corporate Culture
Evolving Corporate Culture
 
Designing an Effective Knowledge Partnership Process
Designing an Effective Knowledge Partnership ProcessDesigning an Effective Knowledge Partnership Process
Designing an Effective Knowledge Partnership Process
 
Inferring from Data
Inferring from DataInferring from Data
Inferring from Data
 

Recently uploaded

Simplifying Complexity: How the Four-Field Matrix Reshapes Thinking
Simplifying Complexity: How the Four-Field Matrix Reshapes ThinkingSimplifying Complexity: How the Four-Field Matrix Reshapes Thinking
Simplifying Complexity: How the Four-Field Matrix Reshapes ThinkingCIToolkit
 
Shaping Organizational Culture Beyond Wishful Thinking
Shaping Organizational Culture Beyond Wishful ThinkingShaping Organizational Culture Beyond Wishful Thinking
Shaping Organizational Culture Beyond Wishful ThinkingGiuseppe De Simone
 
From Goals to Actions: Uncovering the Key Components of Improvement Roadmaps
From Goals to Actions: Uncovering the Key Components of Improvement RoadmapsFrom Goals to Actions: Uncovering the Key Components of Improvement Roadmaps
From Goals to Actions: Uncovering the Key Components of Improvement RoadmapsCIToolkit
 
Farmer Representative Organization in Lucknow | Rashtriya Kisan Manch
Farmer Representative Organization in Lucknow | Rashtriya Kisan ManchFarmer Representative Organization in Lucknow | Rashtriya Kisan Manch
Farmer Representative Organization in Lucknow | Rashtriya Kisan ManchRashtriya Kisan Manch
 
原版1:1复刻密西西比大学毕业证Mississippi毕业证留信学历认证
原版1:1复刻密西西比大学毕业证Mississippi毕业证留信学历认证原版1:1复刻密西西比大学毕业证Mississippi毕业证留信学历认证
原版1:1复刻密西西比大学毕业证Mississippi毕业证留信学历认证jdkhjh
 
Chapter 1 Performance Management HRM.ppt
Chapter 1 Performance Management HRM.pptChapter 1 Performance Management HRM.ppt
Chapter 1 Performance Management HRM.ppt2020102713
 
From Red to Green: Enhancing Decision-Making with Traffic Light Assessment
From Red to Green: Enhancing Decision-Making with Traffic Light AssessmentFrom Red to Green: Enhancing Decision-Making with Traffic Light Assessment
From Red to Green: Enhancing Decision-Making with Traffic Light AssessmentCIToolkit
 
Mind Mapping: A Visual Approach to Organize Ideas and Thoughts
Mind Mapping: A Visual Approach to Organize Ideas and ThoughtsMind Mapping: A Visual Approach to Organize Ideas and Thoughts
Mind Mapping: A Visual Approach to Organize Ideas and ThoughtsCIToolkit
 
Digital PR Summit - Leadership Lessons: Myths, Mistakes, & Toxic Traits
Digital PR Summit - Leadership Lessons: Myths, Mistakes, & Toxic TraitsDigital PR Summit - Leadership Lessons: Myths, Mistakes, & Toxic Traits
Digital PR Summit - Leadership Lessons: Myths, Mistakes, & Toxic TraitsHannah Smith
 
Choosing the best strategy qspm matrix.pptx
Choosing the best strategy qspm matrix.pptxChoosing the best strategy qspm matrix.pptx
Choosing the best strategy qspm matrix.pptxMadan Karki
 
The Final Activity in Project Management
The Final Activity in Project ManagementThe Final Activity in Project Management
The Final Activity in Project ManagementCIToolkit
 
How-How Diagram: A Practical Approach to Problem Resolution
How-How Diagram: A Practical Approach to Problem ResolutionHow-How Diagram: A Practical Approach to Problem Resolution
How-How Diagram: A Practical Approach to Problem ResolutionCIToolkit
 
THE LEADERSHIP TO CHANGE THE WOLRD THIS IS YOUR HOUR PURSUES YOUR GIFT, TALEN...
THE LEADERSHIP TO CHANGE THE WOLRD THIS IS YOUR HOUR PURSUES YOUR GIFT, TALEN...THE LEADERSHIP TO CHANGE THE WOLRD THIS IS YOUR HOUR PURSUES YOUR GIFT, TALEN...
THE LEADERSHIP TO CHANGE THE WOLRD THIS IS YOUR HOUR PURSUES YOUR GIFT, TALEN...PROF. PAUL ALLIEU KAMARA
 
Unlocking Productivity and Personal Growth through the Importance-Urgency Matrix
Unlocking Productivity and Personal Growth through the Importance-Urgency MatrixUnlocking Productivity and Personal Growth through the Importance-Urgency Matrix
Unlocking Productivity and Personal Growth through the Importance-Urgency MatrixCIToolkit
 
Effective learning in the Age of Hybrid Work - Agile Saturday Tallinn 2024
Effective learning in the Age of Hybrid Work - Agile Saturday Tallinn 2024Effective learning in the Age of Hybrid Work - Agile Saturday Tallinn 2024
Effective learning in the Age of Hybrid Work - Agile Saturday Tallinn 2024Giuseppe De Simone
 
Measuring True Process Yield using Robust Yield Metrics
Measuring True Process Yield using Robust Yield MetricsMeasuring True Process Yield using Robust Yield Metrics
Measuring True Process Yield using Robust Yield MetricsCIToolkit
 
Paired Comparison Analysis: A Practical Tool for Evaluating Options and Prior...
Paired Comparison Analysis: A Practical Tool for Evaluating Options and Prior...Paired Comparison Analysis: A Practical Tool for Evaluating Options and Prior...
Paired Comparison Analysis: A Practical Tool for Evaluating Options and Prior...CIToolkit
 
Beyond the Five Whys: Exploring the Hierarchical Causes with the Why-Why Diagram
Beyond the Five Whys: Exploring the Hierarchical Causes with the Why-Why DiagramBeyond the Five Whys: Exploring the Hierarchical Causes with the Why-Why Diagram
Beyond the Five Whys: Exploring the Hierarchical Causes with the Why-Why DiagramCIToolkit
 

Recently uploaded (18)

Simplifying Complexity: How the Four-Field Matrix Reshapes Thinking
Simplifying Complexity: How the Four-Field Matrix Reshapes ThinkingSimplifying Complexity: How the Four-Field Matrix Reshapes Thinking
Simplifying Complexity: How the Four-Field Matrix Reshapes Thinking
 
Shaping Organizational Culture Beyond Wishful Thinking
Shaping Organizational Culture Beyond Wishful ThinkingShaping Organizational Culture Beyond Wishful Thinking
Shaping Organizational Culture Beyond Wishful Thinking
 
From Goals to Actions: Uncovering the Key Components of Improvement Roadmaps
From Goals to Actions: Uncovering the Key Components of Improvement RoadmapsFrom Goals to Actions: Uncovering the Key Components of Improvement Roadmaps
From Goals to Actions: Uncovering the Key Components of Improvement Roadmaps
 
Farmer Representative Organization in Lucknow | Rashtriya Kisan Manch
Farmer Representative Organization in Lucknow | Rashtriya Kisan ManchFarmer Representative Organization in Lucknow | Rashtriya Kisan Manch
Farmer Representative Organization in Lucknow | Rashtriya Kisan Manch
 
原版1:1复刻密西西比大学毕业证Mississippi毕业证留信学历认证
原版1:1复刻密西西比大学毕业证Mississippi毕业证留信学历认证原版1:1复刻密西西比大学毕业证Mississippi毕业证留信学历认证
原版1:1复刻密西西比大学毕业证Mississippi毕业证留信学历认证
 
Chapter 1 Performance Management HRM.ppt
Chapter 1 Performance Management HRM.pptChapter 1 Performance Management HRM.ppt
Chapter 1 Performance Management HRM.ppt
 
From Red to Green: Enhancing Decision-Making with Traffic Light Assessment
From Red to Green: Enhancing Decision-Making with Traffic Light AssessmentFrom Red to Green: Enhancing Decision-Making with Traffic Light Assessment
From Red to Green: Enhancing Decision-Making with Traffic Light Assessment
 
Mind Mapping: A Visual Approach to Organize Ideas and Thoughts
Mind Mapping: A Visual Approach to Organize Ideas and ThoughtsMind Mapping: A Visual Approach to Organize Ideas and Thoughts
Mind Mapping: A Visual Approach to Organize Ideas and Thoughts
 
Digital PR Summit - Leadership Lessons: Myths, Mistakes, & Toxic Traits
Digital PR Summit - Leadership Lessons: Myths, Mistakes, & Toxic TraitsDigital PR Summit - Leadership Lessons: Myths, Mistakes, & Toxic Traits
Digital PR Summit - Leadership Lessons: Myths, Mistakes, & Toxic Traits
 
Choosing the best strategy qspm matrix.pptx
Choosing the best strategy qspm matrix.pptxChoosing the best strategy qspm matrix.pptx
Choosing the best strategy qspm matrix.pptx
 
The Final Activity in Project Management
The Final Activity in Project ManagementThe Final Activity in Project Management
The Final Activity in Project Management
 
How-How Diagram: A Practical Approach to Problem Resolution
How-How Diagram: A Practical Approach to Problem ResolutionHow-How Diagram: A Practical Approach to Problem Resolution
How-How Diagram: A Practical Approach to Problem Resolution
 
THE LEADERSHIP TO CHANGE THE WOLRD THIS IS YOUR HOUR PURSUES YOUR GIFT, TALEN...
THE LEADERSHIP TO CHANGE THE WOLRD THIS IS YOUR HOUR PURSUES YOUR GIFT, TALEN...THE LEADERSHIP TO CHANGE THE WOLRD THIS IS YOUR HOUR PURSUES YOUR GIFT, TALEN...
THE LEADERSHIP TO CHANGE THE WOLRD THIS IS YOUR HOUR PURSUES YOUR GIFT, TALEN...
 
Unlocking Productivity and Personal Growth through the Importance-Urgency Matrix
Unlocking Productivity and Personal Growth through the Importance-Urgency MatrixUnlocking Productivity and Personal Growth through the Importance-Urgency Matrix
Unlocking Productivity and Personal Growth through the Importance-Urgency Matrix
 
Effective learning in the Age of Hybrid Work - Agile Saturday Tallinn 2024
Effective learning in the Age of Hybrid Work - Agile Saturday Tallinn 2024Effective learning in the Age of Hybrid Work - Agile Saturday Tallinn 2024
Effective learning in the Age of Hybrid Work - Agile Saturday Tallinn 2024
 
Measuring True Process Yield using Robust Yield Metrics
Measuring True Process Yield using Robust Yield MetricsMeasuring True Process Yield using Robust Yield Metrics
Measuring True Process Yield using Robust Yield Metrics
 
Paired Comparison Analysis: A Practical Tool for Evaluating Options and Prior...
Paired Comparison Analysis: A Practical Tool for Evaluating Options and Prior...Paired Comparison Analysis: A Practical Tool for Evaluating Options and Prior...
Paired Comparison Analysis: A Practical Tool for Evaluating Options and Prior...
 
Beyond the Five Whys: Exploring the Hierarchical Causes with the Why-Why Diagram
Beyond the Five Whys: Exploring the Hierarchical Causes with the Why-Why DiagramBeyond the Five Whys: Exploring the Hierarchical Causes with the Why-Why Diagram
Beyond the Five Whys: Exploring the Hierarchical Causes with the Why-Why Diagram
 

Learning from Evaluation

  • 1. The views expressed in this presentation are the views of the author/s and do not necessarily reflect the views or policies of the Asian Development Bank, or its Board of Governors, or the governments they represent. ADB does not guarantee the accuracy of the data included in this presentation and accepts no responsibility for any consequence of their use. The countries listed in this presentation do not imply any view on ADB's part as to sovereignty or independent status or necessarily conform to ADB's terminology. Learning from Evaluation Bruce Britton and Olivier Serrat 2013
  • 2. Define: Monitoring and Evaluation • Monitoring is the systematic and continuous assessment of progress of a piece of work over time which checks that things are going according to plan and enables positive adjustments to be made. • Evaluation is the systematic and objective assessment of an ongoing or completed project, program, or policy, its design and implementation. • The aim of evaluation is to determine the relevance and fulfillment of objectives, effectiveness, efficiency, impact, and sustainability. • An evaluation should provide information that is credible and useful, enabling the incorporation of lessons learned into decision-making processes. According to the Organisation for Economic Co-operation and Development
  • 3. The Planning, Monitoring, and Evaluation Triangle Planning MonitoringEvaluation Recommendations for future planning Plans show what to evaluate Monitoring revises plans during project implementation Plans show what needs to be monitored Evaluation highlights areas that need close monitoring Monitoring provides data to be used in evaluation
  • 4. Main Types of Evaluation Summative or Formative Evaluation Internal or External Evaluation Self- or Independent Evaluation Project Evaluation Program Evaluation (Geographic or Thematic) Real-Time Evaluation Impact Evaluation A quality evaluation should provide credible and useful evidence to strengthen accountability for results or contribute to learning processes, or both.
  • 6. Outputs, Outcomes, Impacts Outputs—The products, capital goods, and services that result from a project; they may also include changes resulting from the project that are relevant to the achievement of its outcome. Outcomes—The likely or achieved short- term and medium-term effects of a project's outputs. Impacts—The positive and negative, primary and secondary, long- term effects produced by a project, directly or indirectly, intended or unintended.
  • 7. OECD-DAC Evaluation Criteria Relevance —Examines the extent to which the objectives of a project matched the priorities or policies of major stakeholders (including beneficiaries) Effectiveness —Examines whether outputs led to the achievement of the planned outcome Efficiency—Assesses outputs in relation to inputs Impact —Assesses what changes (intended and unintended) have occurred as a result of the work Sustainability—Looks at how far changes are likely to continue in the longer term
  • 8. The Results Chain and the OECD-DAC Evaluation Criteria Needs Objective Inputs Activities Outputs Outcome Impact Relevance Efficiency Effectiveness Sustainability
  • 9. Challenges and Limits to Management Logic Degree of Control Challenge of Monitoring and Evaluation Impact What the project is expected to contribute to Outcome What the project can be expected to achieve and be accountable for Outputs What is within the direct control of the project's management Activities Inputs DecreasingControl IncreasingDifficulty
  • 10. Indicators An indicator is a quantitative or qualitative factor or variable that offers a means to measure accomplishment, reflects the changes connected with a project, and helps assess performance. Indicators do not provide proof so much as a reliable sign that the desired changes are happening (or have happened). It is important not to confuse indicators with outputs, outcomes, or impacts. Achieving the expected change in the indicators should not become the main purpose of a project.
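To make the distinction concrete, the sketch below (Python, purely illustrative; the Indicator class and the well-counting example are hypothetical, not part of the presentation) treats an indicator as a variable with a baseline, a target, and a latest observed value. The progress figure it prints is a sign of change, not proof of it, echoing the caution above.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """Illustrative indicator record: a variable with a baseline and a target
    that offers a means to track change, not proof that change occurred."""
    name: str
    kind: str          # "quantitative" or "qualitative"
    baseline: float
    target: float
    latest: float

    def progress(self) -> float:
        """Share of the planned change achieved so far (can exceed 1.0)."""
        planned = self.target - self.baseline
        return (self.latest - self.baseline) / planned if planned else 0.0

# Example: an output-level indicator tracked during monitoring.
wells = Indicator("Functioning wells in district", "quantitative",
                  baseline=12, target=40, latest=25)
print(f"{wells.name}: {wells.progress():.0%} of planned change")
```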
  • 11. Planning and the Use of Logic Models • In development assistance, most projects are planned using logic models such as the logical framework (logframe). • Logic models provide a systematic, structured approach to the design of projects. • Logic models involve determining the strategic elements (inputs, outputs, outcome, and impact) and their causal relationships, indicators, and the assumptions or risks that may influence success or failure. • Logic models can facilitate the planning, implementation, and evaluation of projects; however, they have significant limitations that can affect the design of evaluation systems.
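As a rough illustration of the strategic elements a logframe ties together, the sketch below uses a hypothetical rural water project (the levels, statements, indicators, and assumptions are invented for the example); it is one possible way to hold a logframe's causal chain in code, not a standard representation.

```python
# Hypothetical, simplified logframe for a rural water project. Real logframes
# are richer (means of verification, several indicators per level, and so on).
logframe = {
    "impact":     {"statement": "Improved health in target villages",
                   "indicators": ["Incidence of waterborne disease"],
                   "assumptions": ["No major drought or displacement"]},
    "outcome":    {"statement": "Households use safe water sources",
                   "indicators": ["% of households using protected sources"],
                   "assumptions": ["Water committees maintain pumps"]},
    "outputs":    {"statement": "Wells constructed and committees trained",
                   "indicators": ["Wells completed", "Committees trained"],
                   "assumptions": ["Spare parts remain available locally"]},
    "activities": {"statement": "Drill wells; train water committees",
                   "indicators": ["Work plan milestones met"],
                   "assumptions": ["Contractors deliver on schedule"]},
    "inputs":     {"statement": "Budget, staff, drilling equipment",
                   "indicators": ["Funds disbursed vs. plan"],
                   "assumptions": ["Counterpart funding released on time"]},
}

# The causal chain the logic model assumes:
# inputs -> activities -> outputs -> outcome -> impact.
for level in ["inputs", "activities", "outputs", "outcome", "impact"]:
    print(level, "->", logframe[level]["statement"])
```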
  • 12. The Limitations of Logic Models Usually assume simple, linear cause-effect development relationships Overlook or undervalue unintended or unplanned outcomes Do not make explicit the theory of change underlying the initiative Do not cope well with multi-factor, multi-stakeholder processes Undervalue creativity and experimentation in the pursuit of long-term, sustainable impact (the "lockframe" problem) Encourage fragmented rather than holistic thinking Require a high level of planning capacity
  • 13. Purposes of Evaluation • To provide a basis for accountability, including the provision of information to the public Accountability • To improve the development effectiveness of future policies, strategies, and operations through feedback of lessons learned Learning
  • 14. Does Evaluation Have to Be Either/Or?
  • 15. What is Accountability? Accountability is the obligation to demonstrate that work has been conducted in compliance with agreed rules and standards or to report fairly and accurately on performance results vis-à-vis mandated roles and/or plans. This may require a careful, even legally defensible, demonstration that the work is consistent with the contract aims. Accountability is about demonstrating to donors, beneficiaries, and implementing partners that expenditure, actions, and results are as agreed or are as can reasonably be expected in a given situation.
  • 16. Evaluation for Accountability Relates to standards, roles, and plans Is shaped by reporting requirements Focuses on effectiveness and efficiency Measures outputs and outcomes against original intentions Has a limited focus on the relevance and quality of the project Overlooks unintended outcomes (positive and negative) Concerns mostly single-loop learning
  • 17. What is Learning? [Diagram: Data → Information → Knowledge → Wisdom; Know What → Know How → Know Why; Reductionist → Systemic] • Learning is the acquisition of knowledge or skills through instruction, study, and experience. • Learning is driven by organization, people, knowledge, and technology working in harmony—urging better and faster learning, and increasing the relevance of an organization. • Learning is an integral part of knowledge management and its ultimate end.
  • 18. Evaluation for Learning Recognizes the difference an organization has made Understands how the organization has helped to make a difference Explores assumptions specific to each component of a project Shares the learning with a wide audience
  • 20. Evaluation for Accountability and Evaluation for Learning
    Basic aim: For accountability, the basic aim is to find out about the past; for learning, it is to improve future performance.
    Emphasis: For accountability, emphasis is on the degree of success or failure; for learning, it is on the reasons for success or failure.
    Favored by: Accountability is favored by parliaments, treasuries, media, and pressure groups; learning is favored by development agencies, developing countries, research institutions, and consultants.
    Selection of topics: For accountability, topics are selected based on random samples; for learning, topics are selected for their potential lessons.
    Status of evaluation: For accountability, evaluation is an end product; for learning, evaluation is part of the project cycle.
  • 21. Evaluation for Accountability and Evaluation for Learning (continued)
    Status of evaluators: For accountability, evaluators should be impartial and independent; for learning, evaluators usually include staff members of the aid agency.
    Importance of data from evaluations: For accountability, data are only one consideration; for learning, data are highly valued for the planning and appraising of new development activities.
    Importance of feedback: For accountability, feedback is relatively unimportant; for learning, feedback is vitally important.
  • 23. Programs Should Be Held Accountable For Asking difficult questions Maintaining a focus on outcome Identifying limitations, problems, and successes Taking risks rather than "playing it safe" Actively seeking evaluation and feedback Actively challenging assumptions Identifying shortcomings and how they might be rectified Effectively planning and managing based on monitoring data Acting on findings from evaluation Generating learning that can be used by others
  • 24. What is Feedback? Evaluation feedback is a dynamic process that involves the presentation and dissemination of evaluation information in order to ensure its application into new or existing projects. Feedback, as distinct from dissemination of evaluation findings, is the process of ensuring that lessons learned are incorporated into new operations.
  • 25. Actions to Improve the Use of Evaluation Feedback Understand how learning happens within and outside an organization Identify obstacles to learning and overcome them Assess how the relevance and timeliness of evaluation feedback can be improved Tailor feedback to the needs of different audiences Involve stakeholders in the design and implementation of evaluations and the use of feedback results
  • 26. Who Can Learn from Evaluation? The wider community People who are or will be planning, managing, or executing similar projects in the future The people who contribute to the evaluation (including direct stakeholders) The people who conduct the evaluation The people who commission the evaluation The beneficiaries who are affected by the work being evaluated The people whose work is being evaluated (including implementing agencies)
  • 27. Why We Need a Learning Approach to Evaluation To reap these opportunities, evaluation must be designed, conducted, and followed-up with learning in mind. Evaluation provides unique opportunities to learn throughout the management cycle of a project. Learning should be at the core of every organization to enable adaptability and resilience in the face of change.
  • 28. How Can Stakeholders Contribute to Learning from Evaluation? Help design the terms of reference for the evaluation Be involved in the evaluation process (as part of the evaluation team or reference group, or as a source of information) Discuss and respond to the analyses and findings Discuss and respond to recommendations Use findings to influence future practice or policy Review the evaluation process
  • 29. What is a "Lesson"? Lessons learned are findings and conclusions that can be generalized beyond the evaluated project. In formulating lessons, the evaluators are expected to examine the project in a wider perspective and put it in relation to current ideas about good and bad practice.
  • 30. What is Needed to Learn a "Lesson"? Reflect: what happened? Identify: was there a difference between what was planned and what actually happened? Analyze: why was there a difference and what were its root causes? Generalize: what can be learned from this and what could be done in the future to avoid the problem or repeat the success? Triangulate: what other sources confirm the lesson? At this point, we have a lesson identified but not yet learned: to truly learn a lesson one must take action.
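A minimal sketch of those steps as a single record, with hypothetical field names: the point it illustrates is the last one above, that a lesson counts as learned only once it is triangulated and followed by action.

```python
from dataclasses import dataclass, field

@dataclass
class Lesson:
    """Illustrative record for the reflect, identify, analyze, generalize,
    and triangulate steps; field names are hypothetical."""
    what_happened: str
    planned_vs_actual: str
    root_causes: list
    generalization: str
    corroborating_sources: list = field(default_factory=list)
    follow_up_actions: list = field(default_factory=list)

    @property
    def learned(self) -> bool:
        # A lesson is only "learned" once it is triangulated and acted on.
        return bool(self.corroborating_sources) and bool(self.follow_up_actions)

lesson = Lesson(
    what_happened="Extension visits dropped sharply in the rainy season",
    planned_vs_actual="Plan assumed year-round field access; roads were impassable",
    root_causes=["Seasonal access not considered at design"],
    generalization="Schedule field-dependent activities around seasonal access",
    corroborating_sources=["Partner monitoring reports", "Farmer interviews"],
)
print("Learned yet?", lesson.learned)   # False until an action is recorded
lesson.follow_up_actions.append("Revise annual work plan calendar")
print("Learned yet?", lesson.learned)   # True
```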
  • 31. What Influences Whether a Lesson is Learned? Political Factors Inspired Leadership The Quality of the Lesson Access to the Lesson Conventional Wisdom Chance Vested Interests Risk Aversion Bandwagons Pressure to Spend Bureaucratic Inertia
  • 32. Quality Standards for Evaluation Use and Learning The evaluation is designed, conducted, and reported to meet the needs of its intended users. Conclusions, recommendations, and lessons are clear, relevant, targeted, and actionable so that the evaluation can be used to achieve its intended accountability and learning objectives. The evaluation is delivered in time to ensure optimal use of results. Systematic storage, dissemination, and management of the evaluation report are ensured to provide easy access to all partners, reach target audiences, and maximize the benefits of the evaluation.
  • 33. Monitoring and Evaluation Systems as Institutionalized Learning Learning is also a key tool for management and, as such, the strategy for the application of evaluative knowledge is an important means of advancing towards outcomes. Information must be disseminated and available to potential users in order to become applied knowledge. Learning must be incorporated into the overall management cycle of a project through an effective feedback system.
  • 34. A Learning Approach to Evaluation In development assistance, the overarching goal for evaluation is to foster a transparent, inquisitive, and self-critical organization culture across the whole international development community so we can all learn to do better.
  • 35. Eight Challenges Facing Learning-Oriented Evaluations The inflexibility of logic models The demands for accountability and impact The constraints created by rigid reporting frameworks The constraints of quantitative indicators Involving stakeholders Learning considered as a knowledge commodity Underinvestment in evaluation Underinvestment in the architecture of knowledge management and learning
  • 36. Focus of the Terms of Reference for an Evaluation Evaluation Purpose Project Background Stakeholder Involvement Evaluation Questions Findings, Conclusions, and Recommendations Methodology Work Plan and Schedule Deliverables
  • 37. Building Learning into the Terms of Reference for an Evaluation Make the drafting of the terms of reference a participatory activity—involve stakeholders if you can Consider the utilization of the evaluation from the outset—who else might benefit from it? Spend time getting the evaluation questions clear and include questions about unintended outcomes Ensure that the "deliverables" include learning points aimed at a wide audience Build in diverse reporting and dissemination methods for a range of audiences Ensure there is follow-up by assigning responsibilities for implementing recommendations Build in a review of the evaluation process
  • 38. Why Questions Are the Heart of Evaluation for Learning Learning is best stimulated by seeking answers to questions Questions cut through bureaucracy and provide a meaningful focus for evaluation Seeking answers to questions can motivate and energize Questions make it easier to design the evaluation: what data to gather, how, and from whom? Answers to questions can provide a structure for findings, conclusions, and recommendations
  • 39. Criteria for Useful Evaluation Questions Data can be used to answer each question There is more than one possible answer to each question: each question is open and its wording does not pre-determine the answer The primary intended users want to answer the questions: they care about the answers The primary users want to answer the questions for themselves, not just for someone else The intended users have ideas about how they would use the answers to the questions: they can specify the relevance of the answers for future action
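The checklist below is an illustrative way to apply those criteria when screening candidate evaluation questions; the function name and the example question are invented, and in practice the judgments would come from a discussion with the primary intended users rather than a script.

```python
# Hypothetical screening checklist based on the criteria above; the yes/no
# answers would come from a workshop with primary intended users.
CRITERIA = [
    "Data can realistically be gathered to answer it",
    "It is open: the wording does not pre-determine the answer",
    "Primary intended users care about the answer",
    "Users want the answer for themselves, not only for someone else",
    "Users can say how they would act on the answer",
]

def screen_question(question: str, checks: list) -> bool:
    """Return True only if the question meets every criterion."""
    print("Question:", question)
    for criterion, ok in zip(CRITERIA, checks):
        print(("PASS  " if ok else "FAIL  ") + criterion)
    return all(checks)

usable = screen_question(
    "What changed for women farmers, intended or not, and why?",
    checks=[True, True, True, True, True],
)
print("Keep question:", usable)
```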
  • 40. Utilization-Focused Evaluation Utilization-focused evaluation is done for and with specific intended primary users for specific intended uses. It begins with the premise that evaluations should be judged by their utility and actual use. It concerns how real people in the real world apply evaluation findings and experience the evaluation process. Therefore, the focus in utilization-focused evaluation is intended use by intended users.
  • 41. The Stages of Utilization-Focused Evaluation 1. Identify primary intended users 2. Gain commitment and focus the evaluation 3. Decide on evaluation methods 4. Analyze and interpret findings, reach conclusions, and make recommendations 5. Disseminate evaluation findings
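A small sketch, assuming a hypothetical UFEProcess class, of how the five stages can be tracked while keeping the intended users and intended uses in view; it is illustrative only and enforces nothing beyond stage order.

```python
# Hypothetical tracker for the five utilization-focused evaluation stages,
# keeping intended users and intended uses visible at every step.
UFE_STAGES = [
    "Identify primary intended users",
    "Gain commitment and focus the evaluation",
    "Decide on evaluation methods",
    "Analyze, interpret, conclude, recommend",
    "Disseminate findings",
]

class UFEProcess:
    def __init__(self, intended_users, intended_uses):
        self.intended_users = intended_users
        self.intended_uses = intended_uses
        self.completed = []

    def complete_stage(self, stage: str) -> None:
        # Stages are completed in order; intended use stays the reference point.
        expected = UFE_STAGES[len(self.completed)]
        assert stage == expected, f"Next stage should be: {expected}"
        self.completed.append(stage)

ufe = UFEProcess(["Program manager", "Country partner"],
                 ["Decide whether to scale up the pilot"])
ufe.complete_stage("Identify primary intended users")
print(f"{len(ufe.completed)}/{len(UFE_STAGES)} stages done for uses: {ufe.intended_uses}")
```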
  • 45. Conceptual Use of Evaluation • Conceptual use is about generating knowledge in and understanding of a given area. Then, people think about the project in a different way. • Over time and given changes in the contextual and political circumstances surrounding the project, conceptual use can lead to significant changes. Genuine Learning
  • 46. Instrumental Use of Evaluation • The evaluation directly affects decision-making and influences changes in the program under review. • Evidence for this type of utilization involves decisions and actions that arise from the evaluation, including the implementation of recommendations. Practical Application
  • 47. Process Use of Evaluation • Process use concerns how individuals and organizations are impacted as a result of participating in an evaluation. Being involved in an evaluation may lead to changes in the thoughts and behaviors of individuals which may then lead to beneficial cultural and organizational change. • Types of use that precede lessons learned include learning to learn, creating shared understanding, developing networks, strengthening projects, and boosting morale. Learning by Doing
  • 48. Symbolic Use of Evaluation • Symbolic use means that evaluations are undertaken to signify the purported rationality of the agency in question. Hence one can claim that good management practices are in place. Purposeful Non-Learning
  • 49. Political Use of Evaluation • Evaluation occurs after key decisions have been taken. The evaluation is then used to justify the pre-existing position, e.g., budget cuts to a program. Learning is Irrelevant
  • 50. Factors That Affect Utilization Relevance of the findings, conclusions, and recommendations Credibility of the evaluators Quality of the analysis Actual findings The evaluator's communication practices Timeliness of reporting The organizational setting The attitudes of key individuals towards the evaluation The organizational climate, e.g., decision-making, political, and financial
  • 51. Obstacles to Learning from Evaluation Individual Defense Mechanisms • Immediate personal reaction to feedback that threatens us tends to be defensive. In addition, we tend to resist evidence that does not accord with our own world views. • It takes a conscious effort to actively seek feedback and hear evidence that may be negative, or may not fit with our own world view. It is hard to treat discordant information as something that may help us to improve, and to navigate that information with curiosity rather than suspicion.
  • 52. Obstacles to Learning from Evaluation Organizational Dilemmas • Do organizational culture, structures, policies, or procedures help or hinder learning? Do design processes build ownership and accountability from the outset? Do reporting and review procedures foster honesty and trust? Do procedures allow for flexibility and change? Do staff collectively share their experiences and insights about what works? Are there incentives for learning? Are there time and resource constraints?
  • 53. Enhancing Learning from Evaluation For Individual Evaluations • Select topics that are relevant to your intended audiences and their timeframes. • Proactively plan for use from the start: this means intended use by intended users. Identify supportive and influential individuals who want evaluative information. • Vigorously engage intended users throughout the evaluation process, e.g., by means of advisory committees, help with forming recommendations, data analysis. Learning is an active, not passive, process.
  • 54. Enhancing Learning from Evaluation For Individual Evaluations • The evaluation needs to be credible in the eyes of users and of high quality. If your findings are controversial your evidence needs to be of an even higher standard. • Timely reporting supports immediate use although research suggests evaluations have a useful life of 8–10 years. • Reporting of results needs to make use of multiple formats, e.g., written, verbal, and/or visual, while presenting 3–5 key messages in a user-friendly format.
  • 55. Enhancing Learning from Evaluation For Individual Evaluations • Good recommendations are technically, politically, administratively, legally, and financially viable. • Evaluation findings must be assertively disseminated in a manner that supports audience engagement. Your evaluation findings are competing against other sources of information. • Think about learning and use from a change management perspective. Resistance to change is to be expected. Potential users benefit from support: technical; emotional; financial; practical help and advice; etc.
  • 56. Enhancing Learning from Evaluation Institutionalizing Monitoring and Evaluation Systems • To enhance the prospects of learning and use, there is a need to link evaluation into mainstream processes such as policy making, planning, budgeting, accountability and reporting, managing for results, and organizational incentives
  • 57. Monitoring and Evaluation: Conventional and Narrative Conventional • Deductive—about expected outcomes • Indicators often determined by senior staff • Closed or specific questioning • Analysis by management • Based on numbers—no context • About "proving" • Central tendencies Narrative • Inductive—about unexpected outcomes • Diversity of views (from staff and beneficiaries) • Open questioning • Participatory analysis • Contextual—"rich description" • About learning and improvement • Outer edges of experience
  • 58. What is Required of Today's Evaluations Involve Stakeholders Design Evaluations to Enhance Use Focus on Performance Improvement Demonstrate Transparency Show Cultural Competence Build Evaluation Capacity
  • 59. Why Use a Narrative (Story-Based) Approach? Storytelling Stories start with the lived experience of beneficiaries People tell stories naturally People remember stories Stories can convey difficult messages Stories provide a "rich picture" to decision- makers Stories provide a basis for discussion
  • 60. What is Most Significant Change? The Most Significant Change technique is a qualitative and participatory form of monitoring and evaluation based on the collection and systematic selection of stories of reported changes from projects. The Most Significant Change technique does not use pre- designed indicators but allows these to emerge from the stories told by those affected by the project.
  • 61. The Most Significant Change Cycle Project Changes in Peoples' Lives Stories Learning Action Improved Project
  • 62. The Core of the Most Significant Change Technique A question: "In your opinion, what was the most significant change that took place in … over the … months?" [Describe the change and explain why you think it is significant.] Re-iteration of the same kind of question: "Decide which of the change stories collected describes the 'most significant' changes experienced by the respondents." [Describe the change and tell why you think it is significant.]
  • 63. What Makes Most Significant Change Different The advantages of the Most Significant Change technique, compared to conventional approaches to monitoring and evaluation, are that Those participating have a choice about what sort of information to collect. The technique uses diverse rather than standard data. Information is analyzed by all participants, not simply by a central unit. Subjectivity is used rather than avoided.
  • 64. The 10 Steps of the Most Significant Change Technique 1. Get started, establish champions, and familiarize 2. Determine domains of change 3. Define the reporting period 4. Collect stories 5. Review and filter stories regularly 6. Discuss and communicate the results of the selection with stakeholders 7. Verify stories 8. Quantify 9. Conduct secondary analysis and meta- monitoring 10. Revise the process
  • 65. Selecting Significant Change Stories Staff read through and identify the most significant of all the submitted significant change stories. Selection criteria emerge through discussion of stories— these criteria are noted. Staff document (i) what significant change was selected, (ii) why it was selected, (iii) the process used to make the selection, and (iv) who was involved. Subjectivity is made accountable through transparency.
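The sketch below illustrates that documentation step with hypothetical story records and a placeholder selection rule (a real selection is made through panel discussion, not a formula); the point is that the choice, the criteria, the process, and the participants are all recorded, so subjectivity stays transparent.

```python
from dataclasses import dataclass

@dataclass
class ChangeStory:
    """Illustrative Most Significant Change story record (hypothetical fields)."""
    storyteller: str
    domain: str            # e.g., "Changes in people's lives"
    period: str
    text: str

def select_most_significant(stories, criteria, panel):
    """Sketch of the selection step: document what was selected, why,
    how, and by whom."""
    chosen = max(stories, key=lambda s: len(s.text))  # placeholder for a panel discussion
    return {
        "selected_story": chosen,
        "why_selected": criteria,   # criteria that emerged during discussion
        "process": "Panel read all stories aloud, discussed, then voted",
        "who_was_involved": panel,
    }

stories = [
    ChangeStory("Farmer A", "Changes in people's lives", "Jan-Jun",
                "We now grow yams and no longer buy food in the lean season."),
    ChangeStory("Farmer B", "Changes in people's lives", "Jan-Jun",
                "Extension visits helped, but little has changed so far."),
]
record = select_most_significant(stories, ["Depth of change", "Durability"],
                                 ["Field staff", "Program manager"])
print(record["selected_story"].storyteller, "|", record["why_selected"])
```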
  • 66. How to Use the Most Significant Change Technique Not as a stand-alone method Alongside indicator- based systems To identify unexpected changes To engage people in analysis of change To involve a wide range of people To focus on outcomes rather than outputs
  • 67. The Conventional Problem-Focused Approach to Evaluation Identify the issues or problems Determine the root causes Generate solutions Develop action plans Implement action plans
  • 68. Problem-Focused Approach— Assumptions • There is some ideal way for things to be (usually determined by the logic model). • If a situation is not as we would like it to be, it is a "problem" to be solved. • Deviations from the plan (logic model) are automatically seen as problems. • The way to solve a problem is to break it into parts and analyze it. • If we find the broken part and fix it, the whole problem will be solved.
  • 69. Unintended Consequences of Problem-Focused Approaches • Fragmented responses—lack of holistic overview • Necessary adaptations to plans viewed negatively • Focus on single-loop learning—lack of creativity and innovation; untested assumptions • Reinforces negative vocabulary—drains energy; leads to hopelessness and wish to simply get work completed • Reinforces "blame culture"—undermines trust; increases risk aversion; strains relationships
  • 70. Appreciative Inquiry • Recognize the quality, significance or magnitude of • Recognizing the best in someone or something • To be fully aware of or sensitive to • To raise in value or price Ap-pre'ci-ate, v. • The act of exploration and discovery • The process of gathering information for the purpose of learning and changing • A close examination in a quest for truth In-quire', v.
  • 71. What is Appreciative Inquiry? • Appreciative inquiry builds on learning from what is working well rather than focusing on "fixing" problems. • Appreciative inquiry brings positive experiences and successes to everyone's awareness. • Appreciative inquiry uses a process of collaborative inquiry that collects and celebrates good news stories. • Stories that emanate from appreciative inquiry generate knowledge that strengthens the identity, spirit, and vision of the team involved in the project and helps everyone learn how to better guide its development.
  • 72. Appreciative Inquiry and Evaluation • Appreciative inquiry helps identify and value what is working well in a project and builds on these good practices. • Appreciative inquiry is better suited to formative evaluation or monitoring than to summative evaluation. • Appreciative inquiry can be used to guide questions during development of the terms of reference for an evaluation and at data collection stages.
  • 73. Comparing Appreciative Inquiry with Problem-Focused Approaches
    Appreciative Inquiry: What to grow; new grammar of the true, good, better, possible; expands vision of preferred future; creates new energy fast; assumes organizations are sources of capacity and imagination.
    Problem Focus: What to fix; underlying grammar = problem, symptoms, causes, solutions, action plan, intervention; "problem focus" implies that there is an ideal; breaks things into pieces and specialties, guaranteeing fragmented responses; slow! it takes a lot of positive emotion to make real change; assumes organizations are constellations of problems to be overcome.
  • 74. The Appreciative Inquiry Process —The 5-Ds or 5-Is 1. Definition: Frame the inquiry (Initiate) 2. Discovery: What is good? What has worked? (Inquire) 3. Dream: What might be? (Imagine) 4. Design: What should be? (Innovate) 5. Destiny: How to make what should be happen? (Implement)
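One illustrative way to turn the 5-D (or 5-I) cycle into an interview guide is sketched below; the prompts are invented for the example and would normally be crafted with stakeholders, as the starter questions on the next slide suggest.

```python
# Hypothetical interview guide keyed to the 5-D (or 5-I) appreciative inquiry
# cycle; the prompts are illustrative, not a prescribed instrument.
APPRECIATIVE_INQUIRY_GUIDE = {
    "Definition (Initiate)": "What do we want to learn about, framed affirmatively?",
    "Discovery (Inquire)":   "Describe a time the project was at its best. What made it possible?",
    "Dream (Imagine)":       "If that high point were the norm, what would the project look like?",
    "Design (Innovate)":     "What should we put in place to make that picture real?",
    "Destiny (Implement)":   "What will each of us do next, and how will we keep learning?",
}

for phase, prompt in APPRECIATIVE_INQUIRY_GUIDE.items():
    print(f"{phase}: {prompt}")
```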
  • 75. Example Starter Questions for Appreciative Inquiry • Think back on your time with this project. Describe a high point or exceptional experience that demonstrates what the project has been able to achieve. • Describe a time when this project has been at its best—when people were proud to be a part of it. What happened? What made it possible for this highpoint to occur? What would things look like if that example of excellence was the norm? Good appreciative inquiry questions should illuminate in turn the five dimensions the technique addresses.
  • 76. Appreciative Inquiry Can Enrich Evaluation When … The organization is interested in using participatory and collaborative evaluation approaches The evaluation is happening part way through a project (formative) The evaluation includes a wide range of stakeholders with differing views of "success" The organization is genuinely interested in learning from unintended as well as intended outcomes The organization wants to use evaluation findings to guide change There is a desire to build evaluation capacity
  • 77. The Nature of Development Complex: involves a mix of actors and factors Indeterminate: independent of the duration of a project Non-Linear: unexpected, emergent, discontinuous Two-Way: results may change the project Beyond Control: but subject to influence Incremental, Cumulative: watersheds and tipping points
  • 78. Challenges in Evaluating Development Interventions Establishing cause and effect in open systems Measuring what did not happen Reporting on emerging objectives Justifying continuing "successful" projects Timing—when to evaluate Encouraging iterative learning among partners Clarifying values Working in "insecure" situations
  • 79. A Critical Look at Logic Models Clarify objectives and how they will be achieved Make explicit the assumptions about cause and effect Identify potential risks Establish how progress will be monitored Lack of flexibility Lack of attention to relationships Problem-focused approach to planning Insufficient attention to outcomes Oversimplifies monitoring and evaluation Inappropriate at program and organizational levels
  • 80. Outcome Mapping Outcome mapping is an approach to planning, monitoring, and evaluating social change initiatives. Outcome mapping uses a set of tools and guidelines that steer teams through a process to identify their project's desired changes and to work collaboratively to bring about those changes. Results are measured by changes in the relationships, behaviors, and actions of the individuals, groups, and organizations the project is working directly with and seeking to influence.
  • 81. Outcome Mapping Can Help … Understand and influence human well-being Plan and measure social change Foster social and organizational learning Identify parties with whom one might work directly to influence behavioral change Bring stakeholders into the monitoring and evaluation process Strengthen partnerships and alliances Plan and monitor behavioral change Monitor the internal practices of the project Design an evaluation plan
  • 82. The Three Key Concepts of Outcome Mapping Sphere of Influence Boundary Partners Outcomes Understood as Changes in Behavior Development is about people—it is about how they relate to one another and their environment, and how they learn in doing so. Outcome mapping puts people and learning first and accepts unexpected change as a source of innovation. It shifts the focus from changes in state, viz. reduced poverty, to changes in behaviors, relationships, actions, and activities. —Olivier Serrat
  • 83. There is a Limit to Our Influence Project Partners Beneficiaries Sphere of Control Sphere of Influence Sphere of Interest
  • 84. There is a Limit to Our Influence Inputs Outcomes ImpactOutputs Sphere of Control Sphere of Influence Sphere of Interest
  • 85. Focus of Outcome Mapping Outcome Mapping Inputs Activities Outputs Outcome Impact
  • 87. Boundary Partner Example Participatory research on demonstration plots to reduce use of chemicals and introduce yam cultivation Farmers participate in field trials Participating farmers learn how to identify and use non-cultivated food sources and how to grow yams Extension workers visit demonstration farms Training of extension workers Documentation of effective farming practices Increased knowledge of traditional practices through food festivals and other social marketing Extension workers promote use of non- cultivated food sources, natural fertilizers, and adoption of yam cultivation Other farmers use non- cultivated food sources and grow yams Less dependency on market for food sources Greater food security Improved health and reduced poverty
  • 88. The Problem with Impact Impact Implies … • Cause and effect • Positive, intended results • Focus on ultimate effects • Credit goes to a single contributor • Story ends when project obtains success The Reality is … • Open system • Unexpected positive and negative results occur • Upstream effects are important • Multiple actors create results and deserve credit • Change process never ends
  • 89. The Principles of Outcome Mapping Actor-Centered Development and Behavioral Change Continuous Learning and Flexibility Participation and Accountability Non-Linearity and Contribution (not attribution and control)
  • 90. Three Stages of Outcome Mapping Intentional Design • Step 1: Vision • Step 2: Mission • Step 3: Boundary Partners • Step 4: Outcome Challenges • Step 5: Progress Markers • Step 6: Strategy Maps • Step 7: Organizational Practices Outcome and Performance Monitoring • Step 8: Monitoring Priorities • Step 9: Outcome Journals • Step 10: Strategy Journal • Step 11: Performance Journal Evaluation Planning • Step 12: Evaluation Plan
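The sketch below, with hypothetical classes and field names, illustrates how an intentional-design record for one boundary partner and an outcome journal entry might look; it shows the shift from tracking outputs to tracking observed changes in behavior, which is the heart of Steps 3 through 9.

```python
from dataclasses import dataclass, field

@dataclass
class BoundaryPartner:
    """Illustrative intentional-design record for one boundary partner."""
    name: str
    outcome_challenge: str          # the behavior change sought
    progress_markers: dict = field(default_factory=dict)  # "expect/like/love to see"

@dataclass
class OutcomeJournalEntry:
    """Illustrative monitoring record (Step 9): observed behavior change, not just outputs."""
    partner: str
    period: str
    markers_observed: list
    unexpected_changes: list
    contributing_factors: list

extension = BoundaryPartner(
    name="District extension workers",
    outcome_challenge="Routinely promote non-cultivated food sources and yam cultivation",
    progress_markers={
        "expect to see": ["Visit demonstration farms"],
        "like to see":   ["Run farmer field days on yam cultivation"],
        "love to see":   ["Adapt advice to local conditions without project prompting"],
    },
)
entry = OutcomeJournalEntry(
    partner=extension.name, period="Q2",
    markers_observed=["Visit demonstration farms"],
    unexpected_changes=["Two workers started a seed exchange"],
    contributing_factors=["Support from district agriculture office"],
)
print(entry.partner, "-", entry.markers_observed)
```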
  • 91. When Does Outcome Mapping Work Best? When working in partnership When building capacity When a deeper understanding of social factors is critical When promoting knowledge and influencing policy When tackling complex problems To embed reflection and dialogue
  • 92. Tips for Introducing Outcome Mapping Use it flexibly Foster capacities and mindsets Use it to encourage collaboration Use it to manage shifts in organizational culture Focus on timing
  • 93. Learning and Project Failure
    Preparation stage: Failures of intelligence (not knowing enough at the early stages of project formulation, resulting in crucial aspects of the project's context being ignored); failures of decision making (drawing false conclusions or making wrong choices from the data that are available, and underestimating the importance of key pieces of information).
    Implementation stage: Failures of implementation (bad or inadequate management of one or more important aspects of the project); failures of reaction (inability or unwillingness to modify the project in response to new information or changes in conditions that come to light as the project proceeds).
    Evaluation stage: Failures of evaluation (not paying enough attention to the results); failures of learning (not transferring the lessons into future plans and procedures).
  • 94. Competencies for Knowledge Management and Learning Strategy Development A strategy is a long-term plan of action to achieve a particular goal. Management Techniques Knowledge is a resource. It needs to be managed effectively, just like other resources such as financial and human resources. Collaboration Mechanisms When working with others, efforts sometimes turn out to be less than the sum of the parts. Too often, not enough attention is paid to facilitating effective collaborative practices. Knowledge Sharing and Learning Two-way communication that takes place simply and effectively builds knowledge. Knowledge Capture and Storage Organizational memory is a key part of any knowledge management system. The knowledge needs to be retrievable to be useful.
  • 95. Knowledge Solutions for Knowledge Management and Learning • Behavior and change; emergence and scenario thinking; institutional capacity and participation; knowledge assets; marketing; organizational learning; partnerships and networks of practice Strategy Development • Branding and value; complexity and lateral thinking; linear thinking; organizational change; talent management Management Techniques • Collaborative tools; communities of practice and learning alliances; leadership; social innovations; teamwork Collaboration Mechanisms • Creativity, innovation, and learning; learning and development; learning lessons; dissemination Knowledge Sharing and Learning • Knowledge harvesting; reporting; technology platforms Knowledge Capture and Storage www.adb.org/site/knowledge-management/knowledge-solutions
  • 96. Developing Evaluation Capacity Capacity is the ability of people, organizations, and society to manage their affairs successfully. Capacity to undertake effective monitoring and evaluation is a determining factor of development effectiveness. Evaluation capacity development is the process of reinforcing or establishing the skills, resources, structures, and commitment to conduct and use monitoring and evaluation over time.
  • 97. Why Develop Evaluation Capacity? Stronger evaluation capacity will help development agencies • Develop as a learning organization. • Take ownership of their visions for poverty reduction, if the evaluation vision is aligned with that. • Profit more effectively from formal evaluations. • Make self-evaluations an important part of their activities. • Focus on quality improvement efforts. • Increase the benefits and decrease the costs associated with their operations. • Augment their ability to change programming midstream and adapt in a dynamic, unpredictable environment. • Build evaluation equity, if they are then better able to conduct more of their own self-evaluation, instead of hiring them out. • Shorten the learning cycle.
  • 98. Using Knowledge Management for Evaluation Evaluation findings only add value when they are used, so: • Make evaluation findings available when needed by decision makers, in a user-friendly format, e.g., a searchable lessons database system. • Invest in knowledge management architecture. • Make evaluation findings available in a range of knowledge products, including web-based. • Encourage collaboration between evaluation specialists and knowledge management specialists. • Improve targeted dissemination of evaluation findings.
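A minimal sketch of the searchable lessons idea mentioned above, with invented lesson entries and a naive keyword search; a real system would rest on proper knowledge management architecture, curation, and targeted dissemination.

```python
# Minimal sketch of a searchable lessons store; a real system would sit on a
# database and a proper search index, with curation and access control.
lessons = [
    {"title": "Plan around seasonal road access",
     "tags": ["agriculture", "planning", "logistics"],
     "source_evaluation": "Midterm evaluation, 2012"},
    {"title": "Involve water committees before construction starts",
     "tags": ["water", "participation", "sustainability"],
     "source_evaluation": "Project completion report, 2011"},
]

def search_lessons(query: str):
    """Naive keyword search over titles and tags."""
    q = query.lower()
    return [entry for entry in lessons
            if q in entry["title"].lower() or any(q in tag for tag in entry["tags"])]

for hit in search_lessons("participation"):
    print(hit["title"], "-", hit["source_evaluation"])
```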
  • 99. How to Share Findings from Evaluations To increase the chances evaluation findings will be used they must be shared widely, so: • Upload to public websites. • Hold meetings with interested stakeholders. • Incorporate findings into existing publications, e.g., annual reports, newsletters. • Present findings and learning points at annual meetings. • Publish an article for a journal. • Present a paper at a conference or seminar. • Invite local researchers and academics to discuss evaluation findings. • Share findings and learning points through workshops and training. • Share lessons through knowledge networks and communities of practice.
  • 100. Characteristics of a Good Knowledge Product A good knowledge product is • Related to what users want • Designed for a specific audience • Relevant to decision-making needs • Timely • Written in clear and easily understandable language • Based on evaluation information without bias • If possible, developed through a participatory process and validated through a quality assurance process with relevant stakeholders • Easily accessible to target audience • Consistent with other products that enhance visibility and learning
  • 101. Further Reading • ADB. 2008. Output Accomplishment and the Design and Monitoring Framework. Manila. www.adb.org/publications/output-accomplishment-and- design-and-monitoring-framework • ——. 2008. Focusing on Project Metrics. Manila. www.adb.org/publications/focusing-project-metrics • ——. 2008. Outcome Mapping. Manila. www.adb.org/publications/outcome-mapping • ——. 2008. The Reframing Matrix. Manila. www.adb.org/publications/reframing-matrix • ——. 2008. Appreciative Inquiry. Manila. www.adb.org/publications/appreciative-inquiry
  • 102. Further Reading • ADB. 2009. The Most Significant Change Technique. Manila. www.adb.org/publications/most-significant-change-technique • ——. 2009. Monthly Progress Notes. Manila. www.adb.org/publications/monthly-progress-notes • ——. 2009. Learning from Evaluation. Manila. www.adb.org/publications/learning-evaluation • ——. 2009. Asking Effective Questions. Manila. www.adb.org/publications/learning-evaluation • ——. 2010. Embracing Failure. Manila. www.adb.org/publications/embracing-failure
  • 103. Further Reading • ADB. 2010. Harvesting Knowledge. Manila. www.adb.org/publications/harvesting-knowledge • ——. 2010. The Perils of Performance Measurement. Manila. www.adb.org/publications/perils-performance-measurement • ——. 2011. Learning Histories. Manila. www.adb.org/publications/learning-histories • Anne Acosta and Boru Douthwaite. 2005. Appreciative Inquiry: An Approach for Learning and Change Based on Our Own Best Practices. ILAC Brief 6. • Ollie Bakewell. 2003. Sharpening the Development Process. INTRAC.
  • 104. Further Reading • Scott Bayley. 2008. Maximizing the Use of Evaluation Findings. Manila. • Rick Davies and Jess Dart. 2004. The Most Significant Change (MSC) Technique: A Guide to Its Use. Monitoring and Evaluation News. • Paul Engel and Charlotte Carlsson. 2002. Enhancing Learning through Evaluation. ECDPM. • Lucy Earle. 2003. Lost in the Matrix: The Logframe and the Local Picture. INTRAC. • Paul Engel, Charlotte Carlsson, and Arin Van Zee. 2003. Making Evaluation Results Count: Internalizing Evidence by Learning. ECDPM.
  • 105. Further Reading • Ollie Bakewell and Anne Garbutt. 2005. The Use and Abuse of the Logical Framework Approach. Sida. • Stephen Gill. 2009. Developing a Learning Culture in Nonprofit Organizations. Sage. • Harry Jones and Simon Hearn. 2009. Outcome Mapping: A Realistic Alternative for Planning, Monitoring, and Evaluation. Overseas Development Institute. • OECD. 2001. Evaluation Feedback for Effective Learning and Accountability. • ——. 2010. DAC Quality Standards for Development Evaluation.
  • 106. Further Reading • OECD. Undated. Evaluating Development Co-Operation: Summary of Key Norms and Standards. • Michael Quinn Patton. 2008. Utilization-Focused Evaluation. Sage. • Michael Quinn Patton and Douglas Horton. 2009. Utilization-Focused Evaluation for Agricultural Innovation. CGIAR-ILAC. • Burt Perrin. 2007. Towards a New View of Accountability. In Marie-Louise Bemelmans-Videc, Jeremy Lonsdale, and Burt Perrin (eds.). 2007. Making Accountability Work: Dilemmas for Evaluation and for Audit. Transaction Publishers. • Hallie Preskill and Rosalie Torres. 1999. Evaluative Inquiry for Learning in Organizations. Sage.
  • 107. Further Reading • Sida. 2004. Looking Back, Moving Forward. Sida Evaluation Manual. Sida. • UNDP. 2009. Handbook on Planning, Monitoring and Evaluating for Development Results. • Rob Vincent and Ailish Byrne. 2006. Enhancing Learning in Development Partnerships. Development in Practice. Vol. 16, No. 5, pp. 385-399. • Eric Vogt, Juanita Brown, and David Isaacs. 2003. The Art of Powerful Questions: Catalyzing Insight, Innovation, and Action. Whole Systems Associates.
  • 108. Further Reading • Jim Woodhill. 2005. M&E as Learning: Re-Thinking the Dominant Paradigm. In Monitoring and Evaluation of Soil Conservation and Watershed Development Projects. World Association of Soil and Water Conservation. • World Bank. 2004. Ten Steps to a Results-Based Monitoring and Evaluation System.
  • 109. Videos • Jess Dart. Most Significant Change, Part 1. www.youtube.com/watch?v=H32FTygl-Zs&feature=related • ——. Most Significant Change, Part 2. www.youtube.com/watch?v=b-wpBoVPkc0&feature=related • ——. Most Significant Change, Part 3. www.youtube.com/watch?v=PazXICHBDDc&feature=related • ——. Most Significant Change Part 4. www.youtube.com/watch?v=8DmMXiJr1iw&feature=related • ——. Most Significant Change Part 5 (Q&A). www.youtube.com/watch?v=JuaGmstG8Kc&feature=related • Sarah Earl. Introduction to Outcome Mapping, Part 1. www.youtube.com/watch?v=fPL_KEUawnc
  • 110. Videos • Sarah Earl. Introduction to Outcome Mapping, Part 2. www.youtube.com/watch?v=a9jmD-mC2lQ&NR=1 • ——. Introduction to Outcome Mapping, Part 3. www.youtube.com/watch?v=ulXcE455pj4&feature=related • ——. Utilization-Focused Evaluation. www.youtube.com/watch?v=KY4krwHTWPU&feature=related
  • 111. Quick Response Codes @ADB @ADB Sustainable Development Timeline @Academia.edu @LinkedIn @ResearchGate @Scholar @SlideShare @Twitter