Qualitative Methods Course: Moving from Afterthought to Forethought
1. Qualitative Methods Course: Moving from Afterthought to Forethought
Carolina Mejia, PhD, MPH
Jessica Fehringer, PhD, MPH
MEASURE Evaluation
University of North Carolina
AEA Conference
Washington, D.C.
Nov. 11, 2017
2. Acknowledgments
Co-Authors:
Pilar Torres
Phyllis Dako-Gyeke
Liz Archer
Sunil George
Hemali Kulatilaka
Liz Millar
Emily Bobrow
Special thanks to Jen Curran, who assisted in initial course development
3. Session Objectives
1. Provide an overview of a short course on qualitative evaluations and share innovative content examples.
2. Describe the key challenges encountered in developing the course, with examples from the case studies and group activities.
3. Generate a discussion with the audience on how evaluators can improve the teaching of qualitative methods courses.
4. Qualitative Methods Course for Rigorous Evaluation: Innovative Content
Carolina Mejia, PhD, MPH
MEASURE Evaluation
University of North Carolina
AEA Conference
Washington, D.C.
Nov. 11, 2017
5. Rigorous Evaluations: Definition
Follow a clearly specified protocol
Adhere to recognized scientific standards
Include formative evaluations, process evaluations, outcome evaluations, and impact evaluations
6. Qualitative Evaluations
Fulfill an important role in rigorous evaluation
of programs
May be used to complement quantitative data or to answer questions not accessible quantitatively: the “why” behind program successes or challenges
Illuminate the uniquely human side of health
programming and bring to light important
contextual factors
7. Rationale
A need for a course emerged from:
• MEASURE Evaluation’s impact evaluation course feedback
• MEASURE Evaluation’s field experience and familiarity with the literature
o Limited focus on the quality of qualitative methods; everybody thinks they can do a few focus group discussions (FGDs); an “afterthought” or “add-on”
• Global Evaluation and Monitoring Network for Health (GEMNet-Health) demand analysis
8. About the Course
Enhance participants’ capacity to conceptualize, design, develop, govern, and manage qualitative methods in evaluation, and to use the information generated for improved public health practice and service delivery
The course contextualizes qualitative methods within rigorous evaluation, rather than offering the basics of a qualitative approach
9. Audience and Prerequisites
Designed for participants who have basic
knowledge of program evaluation and
qualitative methods
Intended audience: monitoring and evaluation professionals in the health and development fields
Prior experience with qualitative methods
and public health program evaluation is
strongly encouraged
10. Course Competencies
Categories
1. Concepts, approaches, and purposes of qualitative methods in
evaluation
2. Creating and conceptualizing evaluation questions
3. Troubleshooting selected qualitative methods
for evaluation
4. Choosing an appropriate qualitative method
5. Developing data collection tools
6. Qualitative data analysis techniques
7. Fieldwork considerations
8. Presentation and dissemination of data
9. Quality standards for qualitative inquiry/trustworthiness
10. Ethical principles for qualitative evaluation
11. Course Content
Eleven sessions covering the key aspects of
rigorous qualitative evaluation
The original course is 40 hours, delivered over seven days of in-person instruction, including time for practical application
• One day will be added based on pilot feedback, bringing the total to about 56 hours
Content tailored to address issues
faced by evaluators in low- and
middle-income countries
12. Sessions (1)
1. Introduction to Qualitative Methods in
Evaluation: Discussion and Use of Paradigms
in Study Design and the Emergent Nature of
Qualitative Evaluation
2. Creating and Conceptualizing Qualitative
Evaluation Questions
3. Troubleshooting in Select Qualitative
Methods
4. Developing Data Collection Tools
5. Sampling Strategies and Saturation
6. Qualitative Data Analysis: Techniques
and Planning
13. Sessions (2)
7. Qualitative Data Analysis: Hands-on
8. Quality Evaluation Standards for Qualitative
Inquiry
9. Developing a Fieldwork Plan for Qualitative
Evaluation
10. Data Presentation and Dissemination
11. Key Ethical Principles and Gender Integration in Qualitative Evaluation
Note: Gender is also integrated throughout the
course.
14. Teaching Methods
Course delivery is based on adult
learning principles
Each session includes varied teaching
approaches for its activities
Teaching methods include facilitated discussion, presentations, storytelling, group work, debates, thematic analysis, and a case study from the region of the world relevant to the workshop location
15. Course Activities
Debates (paradigm debate)
Case study used across all sessions
Trustworthiness
Audience matters (role play as
presenter for different audiences)
Group work on pre-selected projects
(development of short protocol)
16. Activity Examples (1)
The Third Wave
• Positivist
• Constructivist/Interpretivist
• Critical/Emancipatory
• Pragmatist
17. Activity Examples (2): Group Activity
Split into four groups, each representing one of the four major paradigms
Design an evaluation project around the topic
• Develop a particular evaluation question and expand on the context
• Develop your group’s evaluation concept
• 20 minutes
Present your plan to the class
• 5 minutes each
Class discussion
• 20 minutes combined, after all groups present
18. Activity Examples (3)
Putting Quality First
Split into three groups
Use the template provided and indicate (30 minutes):
• How you would address the different components of trustworthiness
• Practical implementation
o E.g., how would you conduct member checks?
Present your plan to the class
• 10 minutes each
19. Activity Examples (3)
Template: Component of Trustworthiness | Aspects Addressed | Application (real-world operationalization)

Dependability and Confirmability
Aspects addressed: evaluation process, methodology, analysis
o Audit trail: storing and cataloging raw data to be useful in the future
o Careful documentation of the analytic and interpretation process, including code/theme definitions
o Keep “field diaries” to note the theoretical or philosophical approach of evaluators, which may affect the evaluation
o Piloting and refining data collection tools so they are appropriate to the study population

Credibility
Aspects addressed: study design, analysis, confidence in the study outcomes, value of the findings
o Appropriate selection of the interviewer (e.g., female interviewers for women’s FGDs)
o Field notes: was anyone else present during interviews/FGDs?
o Consistency between the data presented and the study’s findings
o Work to establish inter-coder reliability during analysis
o Participants provide feedback on preliminary findings (e.g., bring findings to women’s/men’s community group meetings to receive feedback)

Transferability
Aspects addressed: sampling, context, methodology
o Using maximum variation sampling to capture different tribal and religious backgrounds in communities
o A culturally appropriate approach to recruiting participants
o Data analysis that captures varying perspectives among the sample
o Using illustrative quotes in reports/presentations to capture participants’ voices and illustrate themes
o Thorough field notes to capture important details about the study
21. Evaluation Methods (1)
Measuring Success
Student evaluation
• Pretest and posttest covering all 11 sessions
• Assessment of the final group project
Course evaluation
• Daily participant evaluation form for facilitators to review, covering the following:
o Was the content clear?
o Were the facilitators prepared and organized in conducting the session?
o Overall impression of the day (on a scale)
22. Evaluation Methods (2)
Measuring Success
Final course evaluation, stressing
the following:
• Overall impressions
• Comments on specific module
presentation
• Group comments and ranking
• What worked best; what did not work
• Suggestions for improvement
(general and specific suggestions)
23. This presentation was produced with the support of the United
States Agency for International Development (USAID) under the
terms of MEASURE Evaluation cooperative agreement AID-OAA-L-
14-00004. MEASURE Evaluation is implemented by the Carolina
Population Center, University of North Carolina at Chapel Hill in
partnership with ICF International; John Snow, Inc.; Management
Sciences for Health; Palladium; and Tulane University. Views
expressed are not necessarily those of USAID or the United States
government.
www.measureevaluation.org
24. Learning as Evaluators and Trainers: The Development of a Short Course in Intermediate Qualitative Methods in Evaluation
Jessica Fehringer, PhD, MPH
MEASURE Evaluation
University of North Carolina
AEA Conference
Washington, D.C.
Nov. 11, 2017
25. Curriculum Development
Formed a Curriculum Advisory Committee (CAC)
• Composed of experts in qualitative evaluation in health, nominated by GEMNet-Health institutions
CAC member institutions:
• Public Health Foundation of India
• University of Pretoria, South Africa
• University of Ghana, Ghana
• National Institute of Public Health, Mexico
• MEASURE Evaluation, a USAID-funded project at the University of North Carolina, U.S.
26. Curriculum Development
Competency example:
Discuss major concepts, approaches, and types of qualitative methods in evaluation, including the purpose of using qualitative methods in evaluation and the use of mixed methods.
LO1: Understand and compare the four major paradigms of evaluation
LO2: Compare and contrast the use of qualitative methods for evaluation with other approaches
LO3: Establish the appropriateness of the use of mixed methods in evaluation
27. Curriculum Development: Review and Piloting
Training-of-trainers and curriculum review meeting in February 2017
• GEMNet-Health faculty
First full pilot workshop in October 2017 in Ghana
• 28 participants from 10 countries
28. Participant Selection
Mix of locations to offer opportunities broadly
Prioritizing academic applicants, so they can pass the training on to others
Funding
29. Practical Component
Specific program evaluation proposals were submitted; five were selected
Each group works on one real evaluation
Groups work over the next six days to develop a protocol
• Last session of each day
• Each day, groups answer questions relevant to the topics presented that day
Protocols are presented on Day 7 for feedback
31. Case Study
Originally used multiple case studies for session activities, for topic and contextual variety
TOT/review meeting feedback asked for one case study throughout
Developed a gender-based violence evaluation case study set in Tanzania
32. Challenges (1): Balance
Making the course affordable for participants in developing countries, while also providing high-quality teaching and accommodation
• Limit length vs. what you can cover, and in what depth, in 7 days
• Limit hotel costs vs. comfort
Balancing theory and practical instruction
33. Challenges (2)
Teaching participants with a variety of qualitative skills and experiences
Group work with participants of different skill levels, cultures, and varying personal investment (e.g., if the program under evaluation was submitted by you)
35. Challenges (4)
Integrating gender throughout and having it “stick”
• Rather than having a stand-alone session, we tried to integrate it throughout
• At the end, retention was minimal
• Likely adding at least a short stand-alone session
36. Pilot Evaluation Results (1)
++ Content
++ Facilitation
• Appreciated the single case study throughout
• All levels left with something
38. Next Steps
Listserv for participants
Small revisions based on participant feedback
Complete the facilitator’s notes so that external trainers who want to teach the course can do so
Post the course online to make it available to a wide audience
• For use in teaching; not designed as a self-taught course
Another workshop next year?
39. Any Questions/Input?
For additional information,
contact:
Jessica Fehringer, PhD, MPH
fehringe@email.unc.edu
Carolina Mejia, PhD, MPH
cmejia@unc.edu
Hemali Kulatilaka
hkulatil@email.unc.edu