USAID’s Office of Population and Reproductive Health developed a list of High Impact Practices (HIPs), which are essentially best practices in FP. A technical advisory group (TAG) was formed to provide guidance on the HIPs, and within the TAG an M&E of HIPs working group was formed (sometime in 2010) to look at scale-up of the HIPs. Several existing resources address M&E to varying degrees:
ExpandNet framework: Provides guidance on using data collected from M&E to assess a pilot, develop a scaling-up strategy, and carefully manage scale-up, as well as guidance on indicator development and mixed qualitative and quantitative methodology.
IBP’s Fostering Change Guide: Does not provide a monitoring step.
Improvement Collaborative Approach (Healthcare Improvement Project): Monitoring is noted as an essential phase; the approach is mainly focused on quality improvement.
Management Systems International (MSI) Scaling Up Management Framework for Practitioners: The last task, Track Performance and Maintain Momentum, states the importance of assessing outcomes as well as monitoring progress, so that monitoring can be a catalyst for maintaining momentum and accountability and for keeping the scaling-up process on track.
The WG recognized that a lot had been written about scale-up, but most of it focused on how to take something to scale or how to evaluate whether your scale-up worked. There was a gap in resources on monitoring and evaluating the process of scale-up, to ensure fidelity to the model/pilot. The WG identified a need to fill this gap with an M&E of scale-up tool and decided it should focus on a HIP with a field application. The MEASURE Evaluation PRH project was approached about writing this tool. The field application was scale-up of the integration of FP into HIV comprehensive care centers (CCCs) in Kenya. This scale-up was already underway; in other words, it was not a pilot. The pilot activity was FHI 360’s PROGRESS project monitoring the process of scale-up. The problem was that FP/HIV integration is not actually a defined HIP according to the Population Office. Therefore, after several discussions, the thinking shifted: a scale-up guide should be developed for global application, with the field application serving as a case study. In addition, it should be widely applicable to any health practice or intervention. Futures Group was brought in at this point to serve in a technical advisory role.
From discussions with USAID directly, as well as feedback from the Monitoring of Scale Up Community of Practice and the HIP Working Group, two things became evident: there would not be a whole lot of guidance on the content of the document, but there would be some clear parameters.
This is the result. The goal is to assist country stakeholders with identifying whether scale-up is happening as intended and whether scale-up efforts can be sustained to achieve the desired impact. This resource is not meant to be a tool for measuring the achievement of outcomes, such as changes in knowledge, attitudes, beliefs, and behaviors among scale-up beneficiaries, nor is it meant to be a guide for conducting an outcome evaluation. It is meant to help monitor the scale-up processes of practices for which scale-up is already underway, even if the scale-up was not initially well planned or monitored. The idea is that this guide will help to unlock that “black box” and assist with understanding how you got to where you are in the process of scaling up. In other words, there may have been a baseline and an endline, but if you only have A and E, and never took the time to see what was happening at B, C, and D, then you won’t have a causal pathway to help explain what’s happening at E.
These pieces will be highlighted more in future slides.
Expanding or replicating the practice includes moving it from one geographic area to more areas, or from one level of health service to other levels. Expanding involves training; increasing staffing or service delivery providers; task-shifting and/or task-sharing; additional technical assistance and supervision; modifying or upgrading facilities; procuring supplies, equipment, and commodities; distributing materials; and adapting data collection methods or tools, among other steps. Expansion/replication is also referred to as “horizontal scale-up.”
Institutionalization is more the “systems” piece of scale-up and is sometimes referred to as “vertical scale-up.” It includes buy-in from leaders and stakeholders, which translates into legal, political, and institutional changes. This piece includes norms and procedures, training curricula, supervision, health information systems, supply distribution, and budget lines. It is generally more time-consuming, with long-term results. This is the heart of scale-up, which is not really technical but is more of an organizational, political, and managerial exercise. This is why systematic, well-planned scale-up is not quick; to do it well, these more challenging pieces must be monitored consistently to make sure they are working, momentum is maintained, and the process keeps moving ahead.
The institutional aspect of scale-up is more top-down, and the expansion/replication piece is more bottom-up. It’s these different levels and angles that need to be monitored. Scale-up is very dynamic and messy. In a changing environment, you need to monitor your scale-up closely to make sure it remains focused on the objectives and isn’t all over the map.
In any given country, there are typically multiple health practices in the process of being scaled up. However, not all of these practices are high priority or have an equal chance of becoming sustainable practices fully integrated into existing health programs. The Readiness Assessment checklist will help determine which practices being scaled up would benefit from increased monitoring. Thus, the following criteria will assist in systematically reviewing the practices currently being brought to scale in a particular country or context to identify which one(s) to focus monitoring efforts on for success in scale-up and long-term sustainability:
Evidence exists that the scaled-up practice will likely have a positive impact on health outcomes
Experience from an evaluation of a local pilot is available
The scale-up adheres, to the extent possible, to existing health system structures, processes, and practices
The costs of scaling up have been identified and can be monitored
There is a strong level of interest/commitment
There is a commitment to monitoring the scale-up process
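As a purely illustrative sketch (not part of the guide), the readiness criteria above could be captured in a simple checklist-scoring routine; the criterion strings and the yes/no scoring scheme here are assumptions for demonstration, not the guide's actual Readiness Assessment instrument.

```python
# Illustrative sketch only: scoring one practice against the readiness criteria.
# The criterion wording mirrors the checklist above; the binary scoring scheme
# is an assumption for demonstration.

READINESS_CRITERIA = [
    "Evidence of likely positive impact on health outcomes",
    "Evaluation experience from a local pilot is available",
    "Adheres to existing health system structures, processes, and practices",
    "Costs of scaling up identified and can be monitored",
    "Strong level of stakeholder interest/commitment",
    "Commitment to monitoring the scale-up process",
]

def readiness_score(answers):
    """answers: dict mapping criterion -> bool; returns (criteria met, total)."""
    met = sum(1 for c in READINESS_CRITERIA if answers.get(c, False))
    return met, len(READINESS_CRITERIA)

# Hypothetical responses for one candidate practice
answers = {c: True for c in READINESS_CRITERIA[:4]}
met, total = readiness_score(answers)
print(f"{met}/{total} readiness criteria met")  # -> 4/6 readiness criteria met
```

A real assessment would likely weight the criteria and record qualitative notes rather than simple yes/no answers.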
Although program implementers may be familiar with the steps of monitoring, drawing from various M&E resources, the emphasis here is on monitoring through a scale-up lens, addressing the challenges and specific considerations that come with this process. These were initially called “Ten Steps to Monitoring Scale Up,” but in recognition of the fact that the steps are not always linear, they are now called “considerations.”
How will you know that your scale-up is working? You need to come up with both long-term and short-term objectives (or “benchmarks”). There’s a worksheet for defining:
Levels of facilities or providers
Target population
Key stakeholders
Key activities
Monitoring activities
Timeline
Many frameworks and approaches for scaling up health interventions have been developed and tested in recent years. In the guide we present some of those used in FP and maternal and child health and describe how each framework addresses M&E. (MSI: Management Systems International)
Stakeholders are another key human resource and are critical for scaling up and for the monitoring agenda and activities. Stakeholders include everyone who will be involved in the scale-up process at the national (e.g., ministries), subnational (e.g., provinces, districts), and program levels (e.g., program managers, administrators, and service providers). These are the individuals whose commitment and cooperation are required to monitor a scale-up.
Costs include:
information systems (data collection, processing, and analysis);
information dissemination and use (technology and communications, editing and printing reports, holding dissemination meetings or workshops, presenting at conferences);
data quality control system (staff time and communication and transportation costs to verify data quality); and
coordination and capacity building (the human resources, infrastructure, and technology to support staff and stakeholder engagement).
The illustrative indicators for monitoring scale-up are organized by domain as well as by level (input, process, etc.). The table is actually two pages long in the guide, but for the purposes of this presentation, about half of the indicators have been removed. The indicators should measure both the institutionalization and the expansion/replication of the practice, and should assist with determining whether essential elements of the practice have been lost during scaling up (i.e., the degree of fidelity to the model), whether quality is being maintained, and whether there are any unexpected results. The intention is that the illustrative list of indicators for monitoring scale-up will be modified and expanded according to the practice of interest, scale-up objectives, stakeholder priorities, and country context.
MEASURE Evaluation PRH is in the process of developing a short (perhaps 10-page) GIS supplement on using geospatial data to measure the expansion/replication aspect of scale-up.
How will the findings be presented (e.g., summary reports, graphs, maps)?
How will the findings be disseminated (e.g., meetings, workshops, publications)?
Who are the internal (within the scale-up project) and external (stakeholders) audiences for dissemination?
Who will be using the data to develop solutions for redirecting or galvanizing the aspects of scale-up that have been derailed?
How is scale-up progressing at the various levels, from the national level on a macro scale to the individual level on a micro scale?
How well are the various domains being addressed?
Is the number of service delivery points currently implementing the practice on pace with what was planned?
Are providers being adequately trained, and is there supportive supervision?
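One of the questions above, whether the number of service delivery points (SDPs) implementing the practice is on pace with what was planned, lends itself to a simple plan-versus-actual comparison. The quarterly figures and the 10% tolerance below are invented for illustration, not drawn from the guide.

```python
# Hypothetical quarterly targets vs. actual counts of service delivery points
# (SDPs) implementing the practice; all numbers are invented for illustration.
planned = {"Q1": 10, "Q2": 25, "Q3": 45, "Q4": 70}
actual = {"Q1": 10, "Q2": 23, "Q3": 30}  # Q4 not yet reported

def on_pace(planned, actual, tolerance=0.10):
    """Flag quarters where the actual count falls more than `tolerance`
    (as a fraction of the target) below plan."""
    flags = {}
    for quarter, target in planned.items():
        if quarter not in actual:
            continue  # no data reported yet for this quarter
        shortfall = (target - actual[quarter]) / target
        flags[quarter] = "on track" if shortfall <= tolerance else "behind"
    return flags

print(on_pace(planned, actual))
# -> {'Q1': 'on track', 'Q2': 'on track', 'Q3': 'behind'}
```

Flagging a quarter as "behind" would then prompt the kind of corrective discussion described under consideration 9.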
Identify where successes have been achieved and how they can be used as opportunities to create momentum;
Discuss challenges and potential solutions to getting the scale-up back on track, or change strategies;
Further specify, craft, and prioritize these solutions to respond to problems and challenges (within given financial, human resource, infrastructure, and time parameters); and
Develop an action plan for implementing each of these solutions.
The PROGRESS Project case study focuses on data collection methods and the completed framework worksheet. IRH is conducting a five-year (2007–2012) prospective study to assess and document the process and effects of large-scale integration and scale-up of SDM in FP and reproductive health systems in the DRC, Guatemala, India, Mali, and Rwanda; the FAM case study thus looks more at the process of how they are monitoring the scale-up. The Nutrition Communication Project (NCP) Vitamin A Promotion Project was implemented in Tahoua Region from 1991 to 1994. It was USAID-funded and implemented by AED and Helen Keller International. Although the MoH had actively distributed vitamin A capsules, it had no experience with behavior change programs encouraging the consumption of vitamin A-rich foods. A two-year pilot study was successfully conducted and reached approximately 26,000 people. The project was then scaled up and reached approximately 250,000 people. This case study looks at key monitoring issues and lessons learned during that scale-up in a difficult and unique environment.
Next steps: The guide is currently being reviewed and revised. It will be finalized by the end of December. Starting in the new year, it will be piloted in the field. The RESPOND Project and MCHIP have expressed interest in piloting the guide.
Monitoring Scale-up of Health Practices and Interventions
MEASURE Evaluation Population and
Reproductive Health (PRH) Project
Provide background information on the
monitoring scale-up initiative
Introduce the result (the guide)
Present what information is included in the guide
Highlight key components of the guide
Succinct, but also including enough background
narrative, examples, and explanations
Very user-friendly and applicable to a broad
audience, yet technical enough to be helpful
FP-focused, but easily applicable to other health
practices and interventions
Informed by ExpandNet’s work, but not
necessarily dictated by it
This practical “how to” resource includes:
Rationale for monitoring scale-up
Ten considerations to monitoring scale-up
Three monitoring scale-up case studies
“Defining the Innovation” piece
Selected frameworks and approaches for scaling-up
- Using GIS to monitor scale-up expansion
Successful scale-up involves two key elements:
institutionalizing the practice and expanding or replicating it
10 Considerations to Monitoring Scale-up
1. Define objectives and scope of the scale-up plan
2. Create a framework
3. Identify necessary resources to implement monitoring plan
4. Select indicators
5. Establish data sources and reporting systems
6. Develop a data use and dissemination plan
7. Collect data
8. Analyze data and determine if scale-up is progressing on track
9. Make program adjustments based on findings and recommendations
10. Continue the M&E process
1: Define objectives and scope of
the scale-up plan
What do you want or need to know in order to
say your scale-up is working?
Is the scale-up fulfilling the objectives of the
original program or pilot?
How will you know if there are problems or that
your scale-up is not achieving its pre-determined
objectives?
3: Identify necessary resources to
implement monitoring plan
4: Select indicators
Level | Indicators (by health domain) | Data Sources
process | The practice is included in the current year annual operating plan | Operational plan; budget
process | The budget is linked to the current year annual operational plan | Budgets; chart of accounts; operational plan
ACCESS TO PHYSICAL RESOURCES
process | Percent of facilities with available equipment to implement the practice | Health facility audit; GIS data; specialized survey
process | Availability of job aids and service protocols for the given practice | Health facility audit; specialized survey
output | Percent of facilities or community-based providers that experienced a stock out of a given commodity at any point during a specified time period | Site visits, physical inventories, stock records; logistics management information system records; supervision records; GIS data
process | Existence of a strategic plan for expanding coverage of the practice | Review of strategic plan or strategy documents; interviews with key staff (e.g., managers)
output | Examples of practice being included and/or supported in national or subnational policy, strategy, guidelines, curricula, or related documents and communications | Review of national or subnational policy, strategy, guidelines, curricula, or other related documents
HEALTH MANAGEMENT INFORMATION SYSTEMS
input | HMIS adapted to capture and report on key information on the implementation of the practice | Meeting notes; HMIS
output | Key information on the implementation of the practice routinely reported in the HMIS | HMIS
process | Number of providers trained in the practice, by type of personnel | Training records
output | Number or percent of providers competent to provide specific services upon completion of training | Competency tests (such as a checklist administered by the trainers and/or external expert observer)
output | Percent of trained providers performing the core components of the practice | National guidelines/standards for service delivery; checklists and notes of an expert observer
outcome | Number of clients receiving the practice of interest in a given timeframe | Facility records
outcome | Percent of sites implementing the practice with fidelity to the model, based on pre-determined criteria and definition of the practice/innovation | Facility records; facility audits; program records (for community-based services); interviews with key staff
output | Community participation in and support for the practice | Program records; focus groups; in-depth interviews
output | Percent of audience with a favorable attitude toward the practice | Sample surveys with members of the intended audience; focus groups; in-depth interviews
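To make the indicator table concrete, here is a minimal sketch (ours, not the guide's) of how two of the illustrative indicators, percent of facilities with available equipment and percent of sites implementing the practice with fidelity to the model, might be computed from facility-audit records. The record fields, the fidelity threshold, and all values are hypothetical.

```python
# Hypothetical facility-audit records; field names and values are illustrative.
facilities = [
    {"id": "F01", "has_equipment": True, "fidelity_criteria_met": 5},
    {"id": "F02", "has_equipment": True, "fidelity_criteria_met": 3},
    {"id": "F03", "has_equipment": False, "fidelity_criteria_met": 5},
    {"id": "F04", "has_equipment": True, "fidelity_criteria_met": 4},
]

# Assumed cut-off: number of pre-determined criteria a site must meet to
# count as implementing "with fidelity to the model".
FIDELITY_THRESHOLD = 4

def percent(numerator, denominator):
    """Percentage rounded to one decimal; 0.0 if the denominator is empty."""
    return round(100.0 * numerator / denominator, 1) if denominator else 0.0

pct_equipped = percent(sum(f["has_equipment"] for f in facilities), len(facilities))
pct_fidelity = percent(
    sum(f["fidelity_criteria_met"] >= FIDELITY_THRESHOLD for f in facilities),
    len(facilities),
)
print(f"Percent of facilities with equipment: {pct_equipped}%")    # -> 75.0%
print(f"Percent of sites with fidelity to model: {pct_fidelity}%") # -> 75.0%
```

In practice these numerators and denominators would come from the data sources listed in the table (facility audits, program records, etc.) rather than an in-memory list.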
5: Establish data sources and reporting systems
6: Develop a data use and dissemination plan
Who will be responsible for analyzing and
interpreting the data?
How will the findings be presented and disseminated?
Who are the internal and external audiences for dissemination?
Who will be using the data?
8: Analyze Data and Determine if
Scale-up Is Progressing On Track
How is scale up progressing at the various
levels and within the different domains?
Do the service delivery points have the
necessary commodities for the practice to be
implemented?
Is the scale-up maintaining fidelity to the model?
9: Make program adjustments based
on findings & recommendations
Discuss challenges and potential solutions to getting
the scale-up back on track or change strategies
Develop an action plan for implementing each of
these solutions
IRH’s Defining the Innovation, with
Selected frameworks and approaches for
scale-up of health interventions
- PROGRESS Project: Monitoring the scale-up of
FP integration into comprehensive care centers
- FAM Project: Monitoring the scale-up of
standard days method in five countries
- NCP Project: Monitoring vitamin A promotion in Tahoua Region
Using GIS to monitor scale-up expansion
"Measurement is the first step that leads to
control and eventually to improvement. If
you can't measure something, you can't
understand it. If you can't understand it, you
can't control it. If you can't control it, you
can't improve it."
- H. James Harrington
MEASURE Evaluation PRH is a MEASURE project funded by
the United States Agency for International Development
(USAID) through Cooperative Agreement GHA-A-00-08-00003-
00 and is implemented by the Carolina Population Center at
the University of North Carolina at Chapel Hill in partnership
with Futures Group International, Management Sciences for
Health, and Tulane University. Views expressed in this
presentation do not necessarily reflect the views of USAID or
the U.S. Government. MEASURE Evaluation PRH supports
improvements in monitoring and evaluation in population,
health and nutrition worldwide.