Assessing the performance of an integrated disease surveillance and response system in the context of varying malaria transmission: A case study from Madagascar
1. Background
[Map of malaria transmission in Madagascar. Source: http://www.mara.org]
Malaria poses a public health challenge in Madagascar, where the entire population is at risk, especially in epidemic-prone areas. Madagascar has an estimated annual
malaria death rate of 27 per 100,000 people1 and an under-5 mortality rate of 72 per 1,000 live births.2 Because malaria cases and deaths reported through the national
Health Management Information System (HMIS) fell between 2003 and 2013, Madagascar is considering pre-elimination strategies. These require an effective
surveillance system to monitor cases, detect potential epidemics, and investigate cases and foci as appropriate. In October 2015, with technical assistance from the USAID-funded
MEASURE Evaluation project and support from the President's Malaria Initiative (PMI), the Ministry of Health of Madagascar conducted a comprehensive
assessment of the routine health information system and the integrated disease surveillance and response (IDSR) system. The assessment was implemented to better
understand and document the challenges in surveillance.
1 WHO, World Health Statistics 2015
2 Demographic and Health Survey (DHS) 2009
Methods
Type of study: Cross-sectional, descriptive study.
Study population and sampling: At the national level, six key malaria partners were purposively sampled. At the subnational level, 93 health facilities were randomly selected in four geographical areas: the East Coast, the West Coast, the Central Highlands, and the sub-arid South. These areas cover the two operational zones based on malaria epidemiology: (1) the endemic (high-transmission) zone, including the East Coast and the West Coast; and (2) the non-endemic (low-transmission, epidemic-prone) zone, covering the Central Highlands and the sub-arid South. We assessed the 44 districts and 10 regions corresponding to the health facilities. All of the country's 19 sentinel sites were included, along with 218 community health workers (CHWs) from health facilities in those districts.
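A stratified random draw like the one described (93 facilities spread across four geographical areas) can be sketched as follows. The sampling frame, stratum sizes, and facility IDs below are illustrative only; the study's actual frame and allocation are not given in the poster.

```python
import random

# Hypothetical sampling frame: facility IDs grouped by geographical area.
# Stratum sizes are illustrative; the study drew 93 facilities in total.
frame = {
    "East Coast":        [f"EC-{i:03d}" for i in range(1, 201)],
    "West Coast":        [f"WC-{i:03d}" for i in range(1, 151)],
    "Central Highlands": [f"CH-{i:03d}" for i in range(1, 181)],
    "Sub-arid South":    [f"SS-{i:03d}" for i in range(1, 121)],
}

def stratified_sample(frame, n_total, seed=2015):
    """Allocate the total sample across strata proportionally to stratum
    size, then draw a simple random sample within each stratum.
    (Proportional rounding can miss the exact total in general; with
    these illustrative stratum sizes it comes out to exactly n_total.)"""
    rng = random.Random(seed)
    pop = sum(len(v) for v in frame.values())
    sample = {}
    for area, facilities in frame.items():
        n = round(n_total * len(facilities) / pop)
        sample[area] = rng.sample(facilities, n)
    return sample

sample = stratified_sample(frame, 93)
print({area: len(ids) for area, ids in sample.items()})
```

Fixing the seed makes the draw reproducible, which is useful when the facility list must be shared with field teams in advance.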
Table 2. Data quality
Table 3. Management: Training in disease surveillance
Jean-Marie NGbichi1, Moussa Ly2, Jean-Claude Andrianirinarison3, Theo Lippeveld2, Yazoume Ye1
1 MEASURE Evaluation, ICF International; 2 MEASURE Evaluation, JSI; 3 CONSULTUS Antananarivo, Madagascar
Approach: The MEASURE Evaluation PRISM (Performance of
Routine Information System Management) approach was
used to evaluate the performance of the system in terms of
data quality (data accuracy, timeliness, and completeness),
management of the system, and data use. Data quality was
assessed for the period January–March 2015.
Data accuracy was assessed by comparing data reported to
data recounted and re-aggregated from health facilities'
registers. Timeliness of reporting: number of reports
submitted on time versus total reports expected.
Completeness of reporting: number of reports submitted
versus total reports expected.
Tools: The World Health Organization IDSR assessment tools
were adapted to Madagascar's specific context for the
quantitative assessment, along with interview guides for
the qualitative component.
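The three data-quality indicators defined above reduce to simple ratios. A minimal sketch of their computation, using hypothetical facility records (the record type, names, and numbers below are illustrative, not the study's data; the exact-match rule for accuracy is an assumption, as the poster does not state a tolerance):

```python
from dataclasses import dataclass

# Hypothetical record of one facility's weekly IDSR reporting over the
# 12-week assessment window (January-March 2015).
@dataclass
class FacilityReports:
    facility_type: str
    expected: int   # weekly reports expected (12 per facility)
    submitted: int  # reports actually received
    on_time: int    # reports received by the weekly deadline

def timeliness(facilities):
    """Number of reports submitted on time / total reports expected."""
    expected = sum(f.expected for f in facilities)
    return 100 * sum(f.on_time for f in facilities) / expected

def completeness(facilities):
    """Number of reports submitted / total reports expected."""
    expected = sum(f.expected for f in facilities)
    return 100 * sum(f.submitted for f in facilities) / expected

def reported_accurately(reported: int, recounted: int) -> bool:
    """A facility 'reported accurate data' when the count it reported
    upward matches the count re-aggregated from its registers
    (exact match assumed here)."""
    return reported == recounted

# Example: three hypothetical CSB2 facilities over 12 weeks each.
csb2 = [
    FacilityReports("CSB2", 12, 10, 8),
    FacilityReports("CSB2", 12, 12, 7),
    FacilityReports("CSB2", 12, 6, 6),
]
print(round(timeliness(csb2), 1))    # 58.3
print(round(completeness(csb2), 1))  # 77.8
```

The shared denominator (12 weekly reports × number of facilities) matches the footnote under Table 2.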
Type of facility | Accurate data reported (%) | Overall timeliness (%)** | Facilities with 0 reports on time (%) | Facilities with all reports on time (%) | Overall completeness (%)** | Facilities with 0 reports submitted (%) | Facilities with all reports submitted (%)
CSB1 (n=15) | 80 | 36.1 | 40.0 | 13.3 | 53.3 | 33.3 | 40.0
CSB2 (n=45) | 87 | 59.6 | 16.7 | 45.0 | 77.4 | 13.3 | 68.3
Hospital (n=5) | 4/5* | 60 | 1/5* | 4/5 | 80 | 2/5* | 3/5*
Sentinel sites (n=19) | 80 | 69.7 | 10.5 | 57.8 | 80.3 | 10.5 | 68.4
Private health centers (n=30) | 77 | 29.7 | 56.7 | 20.0 | 33.3 | 56.7 | 26.7
All health centers (n=110)* | 81 | 48.3 | 30.9 | 33.6 | 62.2 | 28.2 | 53.6
Community health workers (n=105) | – | 5.2 | 92.7 | 3.7 | 8.6 | 88.9 | 7.3
*: percentages not calculated (limited number, n=5); **: number of weekly reports submitted / total expected (12 × number of centers per type)
Figure 1. Management: Availability of key IDSR tools (% of facilities, by type)
Tool | CSB1 (n=15) | CSB2 (n=45) | Sentinel sites (n=19) | Private health centers (n=30)
IDSR guidelines | 27 | 20 | 26 | 7
Baseline data collection tools (registers) | 93 | 96 | 100 | 80
Official list of diseases under surveillance | 33 | 56 | 32 | 27
Weekly reporting form | 40 | 62 | 58 | 57
Rapid alert form | 60 | 71 | 68 | 60
Indicator | CSB1 (n=15) | CSB2 (n=45) | Sentinel sites (n=19)
Held meetings with the community to inform/discuss surveillance data in the last 6 months (%) | 53.3 | 73 | 63
Have reports of meetings held with the community to discuss surveillance data (%) | 20 | 33 | 42
Performed disease prevention/control activities, based on surveillance data, in the last 12 months (%) | 40 | 56 | 74
 | Number of districts | %
Have not received alerts based on surveillance data | 17/44 | 39
Have received alerts based on surveillance data | 27/44 | 61
  Alert for malaria outbreak | 12/27 | 27
  Alert for poliomyelitis cases | 19/27 | 43
  Alert for measles outbreak | 11/27 | 25
  Alert for case of human rabies | 10/27 | 23
  Alert for case of plague | 9/27 | 20.5
  Collective foodborne infection (TIAC/ICAM*) | 7/44 | 16
Have investigated all alerts | 11/27 | 40.7
*: collective foodborne infection, including seafood.
Table 4. Data use at district level
Figure 2. Data use at health center level
Level / type of unit | Units planned | Units assessed | Achieved (%)
National | 2 | 2 | 100
Region | 10 | 10 | 100
District | 45 | 44 | 98
Health centers | 112 | 110 | 98
  Public hospitals | 5 | 5 | 100
  CSB I (public) | 16 | 15 | 100
  CSB II (public) | 45 | 45 | 100
  Sentinel sites | 19 | 19 | 100
  Private centers* | 30 | 30 | 100
Community health workers | 224 | 218 | 97
Total | 393 | 384 | 98
*: 4 private centers are sentinel sites
Table 1. Study population
Results
Data accuracy: Overall, 81% of all health facilities reported accurate data. CSB2 had the highest data accuracy rate, at 87%.
Timeliness of reporting: Overall timeliness was 48% for all health facilities, with 36% for CSB1 and 30% for private facilities. Timeliness at the
community level was 5%. Completeness of reporting: Overall completeness was 62% for all health facilities. The highest rates were observed in CSB2 and
sentinel sites (77% and 80%, respectively); the lowest in CSB1 (53%) and private facilities (33%). Thirty-three
percent of CSB1 and 57% of private centers did not send any report for the evaluation period. Completeness of reporting was 7.3%
at the community level, and nearly 90% of CHWs had not submitted any reports.
Few health facilities had the key IDSR management tools, such as the IDSR guidelines (<30% of all
facilities), the official list of diseases under surveillance (27%–56%, including 33% of CSB1), and the
weekly reporting form (40%–62% of facilities).
Seventy-seven percent of districts and 6 of 10 regions had persons trained
in disease surveillance within the district or regional team. Nearly 50% of
technical staff involved in disease surveillance at the district level and
57% at the regional level were trained in surveillance.
 | Districts: number (%) | Regions: number (%)
Districts/regions with staff trained in disease surveillance | 35/44 (77) | 6/10 *
Persons trained in surveillance among technical staff involved in surveillance activities | 68/137 (49.6) | 20/35 (57.1)
Seventy-four percent of sentinel sites, 56% of CSB2, and 40% of CSB1 reported use of
surveillance data to conduct prevention and control activities in the last 12 months. Fifty-three
percent of CSB1, 73% of CSB2, and 63% of sentinel sites reported having
discussed surveillance data with the community; 20%–42% of these facilities were able to
produce reports of such discussions with the community.
Sixty-one percent of districts reported alerts based on IDSR
data in the last 12 months. The alerts were mainly for polio
cases (43%), malaria outbreaks (27%), and human plague
(21%). Only 40% of districts were able to investigate all alerts.
Conclusions
Overall, the assessment in Madagascar showed low data quality in terms of data accuracy, timeliness, and completeness,
especially at lower-level facilities and at the community level. System management is weak as well: key tools such as the IDSR
guidelines and weekly reporting forms were available in few facilities. Not all districts and regions had technical staff trained in
disease surveillance, and overall few technical staff involved in surveillance activities were trained in surveillance. Data use was
limited at health facilities and at the district level, with few districts able to investigate all epidemic alerts in the past 12 months.
These findings suggest the need in Madagascar to develop and implement a comprehensive IDSR-strengthening strategy,
including data quality assurance procedures, use of new technologies, capacity building of staff, coordination among partners,
and use of data for disease-control response.
Acknowledgments
This publication has been supported by the President's Malaria Initiative (PMI) through the United States Agency
for International Development (USAID) under the terms of MEASURE Evaluation cooperative agreement
AID-OAA-L-14-00004. MEASURE Evaluation is implemented by the Carolina Population Center at the University of North
Carolina at Chapel Hill, in partnership with ICF International; John Snow, Inc.; Management Sciences for Health;
Palladium; and Tulane University. Views expressed are not necessarily those of PMI, USAID, or the United States
government.