This document discusses various approaches to program evaluation including objective-oriented, expertise-oriented, participant-oriented, and consumer-oriented approaches. It provides examples of each approach and how they may be applied. Strengths and weaknesses of each approach are considered. The document also discusses evaluation methods such as surveys, interviews, and mixed methods. References are provided on related research and examples of evaluation studies.
7. Management-Oriented Evaluation
Context, Input, Process, Product (CIPP) Model
UCLA Model
Strengths & Limitations: rational and orderly; fuels high-level decision makers; costly and complex; stability vs. need for adjustment
8. Consumer-Oriented Approach
Typically a summative evaluation approach
Advocates consumer education and independent reviews of products
Scriven's contributions grew out of the groundswell of federally funded educational programs in the 1960s
Differentiated between formative and summative evaluation
9. What is a consumer-oriented evaluation approach?
An approach in which independent agencies, governmental agencies, and individuals compile information on education or other human-services products for the consumer.
Goal: to help consumers become more knowledgeable about products
10. For what purposes is it applied?
Typically applied to educational products and programs by governmental agencies, independent consumer groups, and the Educational Products Information Exchange
Purpose: to represent the voice and concerns of consumers
11. How is it generally applied?
By creating and using stringent checklists and criteria
Contributors: Michael Scriven, the Educational Products Information Exchange, and the U.S. Dept. of Education Program Effectiveness Panel
Criteria cover processes, content, transportability, and effectiveness
12. Consumer-Oriented Checklist
Need
Market
Performance:
- True field trials [tests in a "real" setting]
- True consumer tests [tests with real users]
- Critical comparisons [comparative data]
- Long term [effects over the long term]
- Side effects [unintended outcomes]
- Process [product use fits its descriptions]
- Causation [experimental study]
- Statistical significance [supports product effectiveness]
- Educational significance
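As a toy illustration of applying stringent, pre-specified criteria, the checklist above can be sketched as a simple data structure that scores a product by the fraction of criteria it meets. The criterion names come from the slide; the met/not-met scoring scheme and the function name are illustrative assumptions, not part of Scriven's actual method.

```python
# Hypothetical sketch: the consumer-oriented checklist as a data
# structure. Criterion names follow the slide; the binary scoring
# scheme is an illustrative assumption, not Scriven's method.
CHECKLIST = [
    "Need",
    "Market",
    "True field trials",
    "True consumer tests",
    "Critical comparisons",
    "Long term",
    "Side effects",
    "Process",
    "Causation",
    "Statistical significance",
    "Educational significance",
]

def score_product(results: dict) -> float:
    """Fraction of checklist criteria the product satisfies.

    `results` maps criterion name -> True/False; criteria missing
    from the dict count as not met.
    """
    met = sum(1 for criterion in CHECKLIST if results.get(criterion, False))
    return met / len(CHECKLIST)

# Example: a product that passes only its field trials and consumer tests.
print(round(score_product({"True field trials": True,
                           "True consumer tests": True}), 2))
```

In practice a real review would weight criteria and record evidence for each judgment; the point here is only that the checklist makes the criteria explicit and the evaluation repeatable.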
14. Increases consumers' knowledge about using criteria and standards to objectively and effectively evaluate educational and human-services products
24. Approaches
Formal Professional Review System
Informal Professional Review System
Ad Hoc Panel Review
Ad Hoc Individual Review
Educational Connoisseurship and Criticism
25. Formal Professional Review System
A structure or organization established to conduct periodic reviews of educational endeavors
Published standards
Pre-specified schedule
Opinions of several experts
Impact on the status of that which is reviewed
28. Uses
Institutional accreditation
Specialized or program accreditation
Doctoral exams, board reviews, reappointment/tenure reviews, etc.
49. What's going on in the field?
Educational Preparation: http://www.duq.edu/program-evaluation/
TEA: http://www.tea.state.tx.us/index2.aspx?id=2934&menu_id=949
50. What's going on in the field?
Rockwood School District
Clear Creek ISD
Educational link posted by Austin ISD
Houston ISD
Austin ISD
54. District Initiatives
Houston ISD, "Real Men Read": http://www.houstonisd.org/portal/site/ResearchAccountability/menuitem.b977c784200de597c2dd5010e041f76a/?vgnextoid=159920bb4375a210VgnVCM10000028147fa6RCRD&vgnextchannel=297a1d3c1f9ef010VgnVCM10000028147fa6RCRD
Alvin ISD, "MHS (Manvel HS) Reading Initiative Program": http://www.alvinisd.net/education/staff/staff.php?sectionid=245
55. What does the research say?
"Rossman and Salzman (1995) have proposed a classification system for organizing and comparing evaluations of inclusive school programs. They suggest that evaluations be described according to their program features (purpose, complexity, scope, target population, and duration) and features of the evaluation (design, methods, instrumentation, and sample)."
Dymond, S. (2001). A Participatory Action Research Approach to Evaluating Inclusive School Programs. Focus on Autism & Other Developmental Disabilities, 16, 54-63.
56. What does the research say? “Twenty-eight school counselors from a large Southwestern school district participated in a program evaluation training workshop designed to help them develop evaluation skills necessary for demonstrating program accountability. The majority of participants expressed high levels of interest in evaluating their programs but believed they needed more training in evaluation procedures.”
57. What does the research say?
Group Interview Questions
"Graduate research assistants conducted group interviews in Grades 2-5 during the final weeks of the school year. We obtained parent permission by asking teachers to distribute informed consent forms to students in their classes, which invited the students to participate in the group interviews. We received informed consent forms from at least 3 students (the criterion number for a group interview at the school) for 21 schools (66% participation rate). If participation rates were high enough, the research assistants conducted separate interviews for Grades 2-3 and 4-5; the assistants conducted 23 interviews. The research assistants tape recorded all interviews, which averaged about 25 min, for data analysis. The interviewer encouraged responses from all group members. Four questions guided the group interviews."
Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.
58. What does the research say?
Survey Collection
"A survey collected teachers' self-reports of the frequency with which they implemented selected literacy activities and the amount of time in minutes that they used the literacy activities. Teachers also reported their level of satisfaction with the literacy resources available to them."
Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.
59. Tips on making a survey
Keep the survey response time to around 20 minutes.
Make the survey easy to answer.
Avoid vague questions. Consider: "What changes should the school board make in its policies regarding the placement of computers in elementary school?" Is this question effective or vague?
Survey questions should clarify the time period: "During the past year, has computer use by the average child in your classroom increased, decreased, or stayed the same?"
Avoid double- (or triple-, or quadruple-) barreled questions and responses, e.g., "My classroom aide performed his/her tasks carefully, impartially, thoroughly, and on time."
Langbein, L. (2006). Public Program Evaluation: A Statistical Guide. New York: M.E. Sharpe, Inc.
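A minimal sketch of how a couple of these tips could be turned into an automatic check over draft survey items. The heuristics below (counting conjunctions and commas to flag possibly double-barreled items, and looking for a stated time period in frequency questions) are my own illustrative assumptions, not rules from Langbein.

```python
# Hypothetical survey-item linter based on the tips above. The cue
# lists and thresholds are illustrative assumptions, not standards.
DOUBLE_BARREL_CUES = ("and", "or")
TIME_CUES = ("past year", "past month", "past week", "during", "last")

def lint_item(question: str) -> list:
    """Return a list of warnings for one draft survey question."""
    warnings = []
    text = question.lower()
    words = text.rstrip("?.").split()

    # Double-barreled check: several conjunctions or a list of commas
    # often means the item asks about more than one thing at once.
    conjunctions = sum(words.count(cue) for cue in DOUBLE_BARREL_CUES)
    if conjunctions >= 2 or question.count(",") >= 2:
        warnings.append("possibly double-barreled: asks about multiple things")

    # Time-frame check: frequency/change questions should state a period.
    if "how often" in text or "increased" in text:
        if not any(cue in text for cue in TIME_CUES):
            warnings.append("no time period specified")
    return warnings

# The double-barreled example from the slide trips the first check:
print(lint_item("My classroom aide performed his/her tasks "
                "carefully, impartially, thoroughly, and on time."))
```

A pass like this cannot judge whether a question is vague, so it complements rather than replaces the kind of human review the tips describe.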
60. Examples of Evaluation Methods
One study uses a mixed-methods approach combining objective-oriented, expertise-oriented, and participant-oriented approaches. Evaluations were based on the models provided in "Program Evaluation: Alternative Approaches and Practical Guidelines."
62. Cont.
The purpose of this report is to illustrate the procedures necessary to complete an evaluation of the Naval Aviation Survival Training Program (NASTP). It was written by Anthony R. Artino Jr., a program manager and instructor within the NASTP for eight years.
"In very few instances have we adhered to any particular 'model' of evaluation. Rather, we find we can ensure a better fit by snipping and sewing bits and pieces off the more traditional ready-made approaches and even weaving a bit of homespun, if necessary, rather than by pulling any existing approach off the shelf. Tailoring works" (Worthen, Sanders, & Fitzpatrick, p. 183).
63. Cont.
Objective-oriented evaluation was used because (a) the NASTP has a number of well-written objectives; (b) it would be relatively easy to measure student attainment of those objectives using pre- and post-assessments; and (c) the program sponsor, the Chief of Naval Operations (CNO), would be very interested to know whether the objectives he approves are in fact being met.
64. Cont.
Expertise-oriented evaluation: an outside opinion was sought from an aviation survival training subject-matter expert (SME), someone very familiar with the topics being taught and with the current research literature in survival training.
65. Cont.
Participant-oriented evaluation: it was important for the evaluators and the SME to be totally immersed in the training environment. This included a focus on audience concerns and issues (i.e., those of managers, instructors, and students) and an examination of the program in situ, without any attempt to manipulate or control it (Worthen, Sanders, & Fitzpatrick, 1997).
69. References
Astramovich, R., Coker, J., & Hoskins, W. (2005). Professional School Counseling. The Journal of Educational Research, 9, 49-54.
Dymond, S. (2001). A Participatory Action Research Approach to Evaluating Inclusive School Programs. Focus on Autism & Other Developmental Disabilities, 16, 54-63.
Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.
Langbein, L. (2006). Public Program Evaluation: A Statistical Guide. New York: M.E. Sharpe, Inc.
Worthen, B., Sanders, J., & Fitzpatrick, J. (1997). Program Evaluation: Alternative Approaches and Practical Guidelines.