Program Evaluation: Methods and Case Studies
Emil J. Posavac and Raymond G. Carey
7th Edition. 2007. New Jersey: Pearson, Prentice Hall.
Aung Thu Nyein
DA-8020 Policy Studies
Content
• About the authors
• Chapter 1: Program Evaluation: An Overview
• Chapter 3: Selecting Criteria and Setting Standards
About the authors
• Emil J. Posavac
Ph.D., University of Illinois; Professor Emeritus of Psychology at Loyola University of Chicago; Director of the applied social psychology graduate program. Awarded the Myrdal Award by the American Evaluation Association.
• Raymond G. Carey
Ph.D., Loyola University of Chicago; principal of R. G. Carey Associates. Widely published in the field of health services and quality assurance.
An Overview
• Evaluation is a natural, routine activity.
• “Program evaluation is a collection of methods, skills, and sensitivities necessary to determine whether a human service is needed and likely to be used, whether the service is sufficiently intensive to meet the unmet needs identified, whether the service is offered as planned, and whether the service actually does help people in need at a reasonable cost without unacceptable side effects.”
An Overview… Contd.
But program evaluation differs from natural, automatic evaluation.
• First, organizational efforts are carried out by teams. This specialization means that responsibility for program evaluation is diffused among many people.
• Second, most programs attempt to achieve objectives that can only be observed sometime in the future rather than in a matter of minutes, which raises the question of which criteria to choose.
• Third, when evaluating our own ongoing work, a single individual fills many roles: worker, evaluator, beneficiary, recipient of the feedback, etc.
• Last, programs are usually paid for by parties other than the clients of the program.
Evaluation tasks that need to be done
Program evaluation is designed to assist some audience in assessing a program’s merit or worth.
• Verify that resources are devoted to meeting unmet needs
• Verify that implemented programs do provide services
• Examine the outcomes
• Determine which programs produce the most favorable outcomes
• Select the programs that offer the most needed types of services
• Provide information to maintain and improve quality
• Watch for unplanned side effects
Common Types of Program Evaluation
• Assess the needs of the program participants
  – Identify and measure the level of unmet needs
  – Consider some alternatives
• Examine the process of meeting the needs
  – Extent of the implementation
  – The nature of the people being served
  – The degree to which the program operates as planned
• Measure the outcomes of the program
  – Who received what?
  – Does the program’s service make changes for the better?
  – Do people hold different opinions about the outcomes?
• Integrate the needs, costs, and outcomes
  – Cost-effectiveness
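The last step, integrating needs, costs, and outcomes, can be illustrated with a simple cost-effectiveness calculation. A minimal sketch with hypothetical figures (the programs, costs, and outcome counts are invented for illustration):

```python
# Hypothetical data: total cost and units of outcome for two programs.
programs = {
    "Program A": {"cost": 120_000, "outcome_units": 300},  # e.g. clients helped
    "Program B": {"cost": 90_000, "outcome_units": 200},
}

def cost_effectiveness(cost, outcome_units):
    """Cost per unit of outcome achieved: lower is more cost-effective."""
    return cost / outcome_units

for name, p in programs.items():
    ratio = cost_effectiveness(p["cost"], p["outcome_units"])
    print(f"{name}: ${ratio:.2f} per unit of outcome")
```

Here the cheaper program is not automatically the better buy: Program A costs more in total but delivers each unit of outcome at a lower cost.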
Activities often confused with program evaluation
• Basic research
• Individual assessment
• Program audit

Although these activities are valuable, program evaluation is different and more difficult to carry out.
Different Types of Evaluations for Different Kinds of Programs
• No “one size fits all” approach.
• Organizations needing program evaluations
  – Health care
  – Criminal justice
  – Business and industry
  – Government
• Time frame of needs
  – Short-term needs
  – Long-term needs
  – Potential needs
Extensiveness of the programs
• Some programs are offered to small groups of people with similar needs, but others are developed for use at many sites throughout the country.
• Different complexities are involved.
Purpose of program evaluation
• The overall purpose of program evaluation is to contribute to the provision of quality services to people in need.
• Feedback mechanism: formative evaluations, summative evaluations, or evaluation for knowledge.
• A feedback loop
The roles of evaluators
• A variety of work settings
  – Internal evaluators
  – External evaluators: of governmental or regulatory agencies
  – Private research firms
Comparison of internal and external evaluators
• Factors related to competence
  – Access and advantages
  – Technical expertise
• Personal qualities
  – An evaluator should be objective, fair, and trustworthy.
• Factors related to the purpose of an evaluation
  – Formative, summative, or quality assurance evaluation?
Evaluation and service
• The role of the social scientist, concerned with theory, the design of research, and the analysis of data.
• The role of the practitioner, dealing with people in need.
Evaluation and related activities of organizations
• Research
• Education and staff development
• Auditing
• Planning
• Human resources
Chapter 3:
Selecting Criteria and
Setting Standards
Useful criteria and standards
Research design is important, but so are criteria and standards.
• Criteria that reflect a program’s purposes
  – Immediate short-term effects, but marginal long-term ones.
• Criteria that the staff can influence
  – An evaluation could meet with resistance if the program staff feel that their program will be judged on criteria that they cannot affect.
• Criteria that can be measured reliably and validly
  – Repeated observations should give the same values.
• Criteria that stakeholders participate in selecting
  – Chosen in consultation between the evaluator and the stakeholders.
Developing Goals and Objectives
• How much agreement on goals is needed?
  – A number of issues to be addressed.
• Different types of goals
  – Implementation goals
  – Intermediate goals
  – Outcome goals
• Goals that apply to all programs
  – Treating the subjects with respect
  – Personal exposure to the program
  – Depending on surveys and records to provide evaluations, etc.
Evaluation criteria and evaluation questions
• Does the program or plan match the values of the stakeholders?
• Does the program or plan match the needs of the people to be served?
• Does the program as implemented fulfill the plans?
• Do the outcomes achieved match the goals?
Using Program Theory
• Why is a program theory helpful?
• How is a program theory developed?
• Implausible program theories

• Every program embodies a conception of the structure, functions, and procedures appropriate to attain its goals.
• This conception constitutes the “logic” or plan of the program, which is called the “program theory.”

Peter H. Rossi, Howard E. Freeman & Mark W. Lipsey. 1998. Evaluation: A Systematic Approach, 6th Ed., SAGE Publications, Inc., London.
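The “logic” of a program can be made explicit. A minimal sketch representing a program theory as a logic model in a data structure, with a simple check that every stated outcome traces back to an activity; the program and all its elements are hypothetical:

```python
# Hypothetical logic model for an adult literacy program.
logic_model = {
    "inputs": ["funding", "trained tutors", "classroom space"],
    "activities": ["recruit learners", "weekly tutoring sessions"],
    "outputs": ["sessions delivered", "learners enrolled"],
    "outcomes": ["improved reading level", "higher employment rate"],
    # The activity each outcome is assumed to flow from.
    "links": {
        "improved reading level": "weekly tutoring sessions",
        "higher employment rate": "weekly tutoring sessions",
    },
}

def unlinked_outcomes(model):
    """Outcomes with no stated causal link back to an activity --
    one sign of an implausible or incomplete program theory."""
    return [o for o in model["outcomes"]
            if model["links"].get(o) not in model["activities"]]

print(unlinked_outcomes(logic_model))  # an empty list: every outcome is linked
```

Writing the model down this way makes gaps visible: an outcome that no activity plausibly produces shows up immediately, which is exactly what the plausibility questions on the next slide probe.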
Assessing program theory
Framework for assessing program theory
• In relation to social needs
• Assessment of logic and plausibility
  – Are the program goals and objectives well defined?
  – Are the program goals and objectives feasible?
  – Is the change process presumed in the program theory plausible?
  – Are the program procedures for identifying members of the target population, delivering service to them, and sustaining that service through completion well defined and sufficient?
  – Are the constituent components, activities, and functions of the program well defined and sufficient?
  – Are the resources allocated to the program and its various components and activities adequate?
• Assessment through comparison with research and practice
• Assessment via preliminary observation
Assessing program theory (2)
• Program theory can be assessed in relation to the support for critical assumptions found in research or in documented program practice elsewhere. Sometimes findings are available for similar programs.
• Assessment of program theory yields findings that can help improve the conceptualization of a program or affirm its basic design.

Source: Peter H. Rossi, Howard E. Freeman & Mark W. Lipsey. 1998. Evaluation: A Systematic Approach, 6th Ed., SAGE Publications, Inc., London.
More questions…
• Is the program accepted?
• Are the resources devoted to the program being expended appropriately?
  – Using program costs in the planning phase
  – Is offering the program fair to all stakeholders?
  – Is this the way the funds are supposed to be spent?
  – Do the outcomes justify the resources spent?
  – Has the evaluation plan allowed for the development of criteria that are sensitive to undesirable side effects?
Example: Program Theory (diagram slides)
Example: Program Theory and theory failure (diagram slide)
Some practical limitations in selecting evaluation criteria
• Evaluation budget: evaluation is not free.
• Time available for the project
• Criteria that are credible to the stakeholders
Overlap in terminology in program evaluation, by Jane T. Bertrand

Bertrand, Jane T. 2005. “Understanding the Overlap in Programme Evaluation Terminology.” The Communication Initiative Network, May 2005.
Thanks for your attention.