Assessment Report: SBA Pass Rates vs. HSPE/MSP Pass Rates
Abstract
The State of Washington expected (and found) a drop in SmarterBalanced test scores as compared to
the previous graduation assessments. Despite low overall scores, the Mount Adams School District
saw scores drop less than the state average for 8 of 12 exams and drop less than local districts with
similar demographics for 5 of 12 exams. The highest pass rates occurred in grade levels that had
spent more instructional hours with the Common Core State Standards than with the Washington State EALRs
or GLEs, and the lowest pass rates correlated with grade levels that experienced outlier events.
Background
On August 3rd, the Center for Education Effectiveness (CEE) and the Office of Superintendent of Public
Instruction (OSPI) met with the member districts of ESD 105 to compare the 4-year historic HSPE and MSP
pass rates to the new SmarterBalanced Assessment (SBA) pass rates.
Most of the districts present had already received their SBA results, and the 4-year HSPE/MSP pass rates were
a matter of public record, having been posted annually on the OSPI School Report Card site. What wasn’t yet
public knowledge were the SBA pass rates for individual districts or the average pass rates for all grade levels
across the state, so this meeting was an opportunity for districts to calculate how they did in comparison to
their neighbors and prepare for the inevitable inquiries from the press and local communities.
Concerning the Mount Adams School District, the immediate question was how the district fared on the new
assessments compared to its neighboring districts with similar socioeconomic and racial demographics. This
analysis was driven by the following questions:
1. Did pass rates on the SBA rise or drop when compared to the pass rates on the HSPE/MSP?
2. For individual schools or districts, how did their pass rates differ from those of the state average or
their local cohort?
3. Given a district/school’s drop or rise in test scores, what could be determined about that district/school’s
“readiness” for the SmarterBalanced exams?
Possible Outcomes
A. The pass rates on the SmarterBalanced exams will be lower than the 4-year average pass rates on the
HSPE/MSP.
This outcome seems likely, as a drop in scores was widely expected due to the increased rigor of the Common
Core State Standards and the SBA assessments. The HSPE and MSP assessments had been based on the
less rigorous Washington State Standards (EALRs and GLEs) and were not considered to be challenging
assessments. Given the higher expectations of the Common Core and the variety and complexity of the material
on the SBA exams, a drop in scores is not unlikely.
B. The pass rates on the SmarterBalanced exams will be higher than the 4-year average pass rates on
the HSPE/MSP.
This is a far less likely outcome for the state as a whole, but for specific districts, schools, or grade levels within a
school, it is very likely that someone came through SmarterBalanced testing with a higher score than their previous
HSPE or MSP average. It just isn’t likely that this is true for the majority, for reasons previously stated.
C. The change in pass rate should be fairly equal (on average) for all districts and/or schools, barring any
factors that could be considered outliers.
Given the number of years districts had to prepare for both the CCSS and the SBA, it could be assumed
that all districts should be equally prepared and that no matter what skills or skill deficits their students brought
with them into the new testing environment, a similar level of performance shift should be expected. However,
it should be noted that some districts were early adopters of the Common Core and had a greater amount of
time to ramp up their instruction to meet the standards, while other districts were involved in SBA pilot
programs and had a year or more to be “hands on” with the exam.
For the sake of argument we will not consider these factors as outliers, as these opportunities were widely
available and the decision not to take advantage of them is not the same as not having access to those
opportunities. It may be interesting to see if those districts that were early adopters of the CCSS and/or helped
pilot the SBA outperformed those districts that did not, but that is outside the scope of this particular analysis.
D. When excluding factors that could be considered outliers, any disparity in pass rate change between
districts and/or schools would be the result of their readiness and preparation for the Common Core
State Standards and/or the SmarterBalanced exams.
The key factor here lies in what we would consider to be an outlier. If School A has 30 students absent on the
day of the test while School B has perfect attendance and School A has a far lower pass rate than School B
due to those students not taking the exam (and therefore being counted against the school’s scores), we would
consider attendance to be an outlier if we were trying to determine whether or not the students of School A
were as academically prepared to take the exam as the students of School B. Considering the disparity
between districts (and even between schools in the same district) in the ESD 105 cohort, it is very likely that
some schools or groups of students had scores that were in part due to some outside factor that had little to do
with the content and rigor of the test. More on this later.
Results
Based on state averages, it appears that all grade levels demonstrated a measurable drop in their SBA pass
rates as compared to the historic HSPE/MSP average, with the largest drops occurring on the 3rd, 4th, and 6th
grade ELA exams and on the 5th and 6th grade Math exams.
The following table notes the percentage of students that scored a 3 or a 4 on the SBA exam. It does not take
into account the new cut score of 2.5 for 10th grade. When that cut score is taken into account, pass rates are
likely to increase. There is no score for the HS Math SBA, as that test was only taken by 11th graders and the
End of Course Exam is only taken by 9th graders, giving most districts no clearly comparable data.
Table 4 Local Cohort Pass Rates
Table 5 Local Cohort vs. State Drop Rates
Tables 4 and 5 detail the scores of several school districts in the Lower Valley region of the Yakima Valley.
These districts were selected because of similar socioeconomic and racial demographics, namely high levels
of poverty and a student population that is mainly Hispanic or Native American. The Mount Adams School
District currently has 98% of its student population living below the federal poverty line and has a student
enrollment that is roughly 55% Native American (mostly Yakama), 40% Hispanic, and 5% various ethnic
groups. Though several of the above districts have larger enrollments than MASD, they generally serve
similar student populations with similar economic resources.
Given those details, Table 5 is quite compelling. When the drop in pass rates for each district is examined and
compared to the average drop in rates for the state (Table 1), we see that not only did the pass rates for Mount
Adams not drop as severely as they did for the state on average, but for many exams and grade levels the
district also saw pass rates drop less severely than did their neighboring districts. Of special note we again
have 3rd grade Math, where the district saw scores rise by 2% (8% more of the district’s 3rd graders passed
than did the state on average), as compared to Mabton with 50% fewer students (than the state) passing.
Similarly, Wapato and Granger had 12% and 18% fewer students (than the state) passing the 3rd grade Math SBA, respectively.
For the ELA SBA, 4% more Mount Adams 3rd graders passed the exam than the state average, while
Toppenish, Granger, and Mabton saw 13%, 24%, and 28% fewer students pass the exam. For the HS ELA
exam (with scores of 10th and 11th graders combined), 2% fewer students passed the exam for Mount Adams,
while the remaining districts saw far larger drops in pass rates: Wapato, 11%; Toppenish, 36%; Sunnyside,
23%; Granger, 17%; Mabton, 30%. This trend continues for other grade levels as detailed in Table 6.
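The district-vs-state comparison behind Table 5 reduces to simple subtraction. The sketch below illustrates the calculation with placeholder figures (the function names and numbers are illustrative, not values from Tables 1-5):

```python
def drop(historic_avg, sba_rate):
    """Percentage-point drop from the 4-year HSPE/MSP average to the SBA pass rate."""
    return historic_avg - sba_rate

def drop_vs_state(district_drop, state_drop):
    """Negative means the district's pass rate fell less than the state's did."""
    return district_drop - state_drop

# Placeholder figures, not values from the actual tables:
state_drop = drop(60, 40)      # the state's pass rate fell 20 points
district_drop = drop(35, 20)   # the district's fell 15 points
print(drop_vs_state(district_drop, state_drop))  # -5: a smaller drop than the state's
```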
Data Analysis
At this point it is apparent that Outcome A has been verified as correct and Outcome B has been
partially verified for specific grade levels in some districts. As for Outcomes C and D…
Why did Mount Adams experience a smaller drop in pass rates for most exams than its local cohort? It could
be due to the smaller n (number of students testing), though due to historic issues with attendance in the
district that would more likely have led to even fewer students passing. It is more likely due to a confluence of
factors, such as the fact that the district had been preparing for the Common Core State Standards for several
years and that the district had volunteered to pilot the SBA exams in 2013-2014. The first of these is probably
more responsible than the second, as the district adopted the CCSS around 2010 to 2012 and the students
with the highest pass rates (3rd and 5th grade) would have started Kindergarten or 1st grade with the new
standards at the core of their instruction. The later grades wouldn’t have met the standards until middle school
and would have had to adapt to the new level of rigor after years of being used to a certain level of instruction,
with the likely result of fewer students passing the exams.
Without knowing more about how the other districts in the Lower Valley cohort prepared for the
SmarterBalanced exams, it is difficult to identify what outliers, if any, may have led to the lower pass rates in
specific grade levels as compared to either the state or their neighboring districts. We can, however, explore a
few outliers for Mount Adams that could have greatly affected pass rates.
4th Grade ELA: This cohort of students experienced a teacher leaving mid-year and spent the remainder of
the year under a long-term substitute who was eventually hired full time for the position. A change of staff mid-year
could very easily be disruptive for the students and could explain why there was a dramatic drop in pass rates
for ELA but not a similar drop in Math.
HS ELA/Math: This group had an interesting test experience thanks to two factors: One, several students who
had previously passed the HSPE did not take the SBA, as their parents had chosen to opt out. Considering
these students were predicted to perform well on the SBA, losing that cohort of students most likely caused the
overall pass rate to drop. Two, about 30% of the remaining students taking the exam staged a walkout in
protest over taking a new standardized test after being required to pass the HSPE the year before in order to
meet the state’s graduation requirement. While those students did return to their test locations, many of them
refused to complete the test and scored a Level 1 despite having scored 400+ (Level 3 or 4) on the HSPE
previously. Those two factors could very well explain the low number of students who passed the exam in both
areas.
Absences: The state counts any student who does not complete the exam as a fail (Level 1) for the school.
There is no standalone score level for students who fail to appear for or complete the exam. Considering these
assessments are intended to measure student academic proficiency, giving them a Level 1 for failing to take
the assessment seems to defeat the point. The state has an expectation that all students test, but lumping
absent students in with failing students and then using that data to claim a school is not serving the academic
needs of those students seems counterproductive. Given that MASD has struggled with attendance rates for
several years, it is likely that several Level 1 scores in all grade levels are due to absenteeism rather than
academic performance.
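To illustrate how counting non-completers as Level 1 depresses a pass rate, consider a hypothetical cohort (all numbers below are illustrative, not district data):

```python
def pass_rate_with_absences(passed, tested, absent):
    """State method: students who do not complete the exam count as Level 1 failures."""
    return 100.0 * passed / (tested + absent)

def pass_rate_tested_only(passed, tested):
    """Alternative: pass rate among students who actually completed the exam."""
    return 100.0 * passed / tested

# Hypothetical cohort: 50 students completed the exam, 20 passed, 10 were absent.
print(round(pass_rate_with_absences(20, 50, 10)))  # 33
print(round(pass_rate_tested_only(20, 50)))        # 40
```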
Small n: Population size can have a dramatic impact on pass rates, especially when the population is as small
as it is for Mount Adams. For example, the High School ELA exam had scores for 68 students. Of those 68,
21 students (31%) scored a 3 or higher. When the state added the new cut score of 2.5, 7 additional students achieved
a passing score, bringing the total pass rate up to 41%. That such a small number of students could create
such a dramatic shift in pass rates is something that cannot be ignored. A few more students absent and that
pass rate could drop, a few more present and it could have risen. For a testing cohort size of 68, each student
represents a possible 1.5% pass rate increase or decrease. By contrast, a school with a testing cohort of 300
students would only see a 0.33% pass rate change for each student that passed, failed, or did not attempt the
exam.
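The arithmetic above can be checked directly. The cohort sizes and pass counts come from the text; the helper functions themselves are just illustrative:

```python
def pass_rate(passed, cohort_size):
    """Pass rate as a percentage of the testing cohort."""
    return 100.0 * passed / cohort_size

def per_student_impact(cohort_size):
    """Percentage-point swing in pass rate attributable to a single student."""
    return 100.0 / cohort_size

# HS ELA cohort of 68 students: 21 at Level 3+, plus 7 more under the 2.5 cut score.
print(round(pass_rate(21, 68)))           # 31
print(round(pass_rate(21 + 7, 68)))       # 41
print(round(per_student_impact(68), 1))   # 1.5 points per student
print(round(per_student_impact(300), 2))  # 0.33 points per student
```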
These possible outliers could explain why scores dropped for specific grade levels and exams. They might
also help illuminate why scores for the Mount Adams School District are low in general. As noted in Table 3, no
grade level had more than 30% of its students pass their SBA ELA or Math exams, and the 4-year historic
average for HSPE/MSP scores was around 30-40% as well. It’s possible that several factors, such as the
small n and high rate of absences, could have severely impacted the overall pass rate for each grade level.
Conclusions
While the overall test scores are low, with no grade receiving over a 30% pass rate for the ELA or Math exam,
scores for the Mount Adams School District did not drop as severely as they did for the state of Washington or
the local cohort of Lower Yakima Valley school districts. Given that the district adopted the Common Core
State Standards several years before the exams were held and saw the highest pass rates and lowest drop
in pass rates compared to the state in the grade levels that had the longest experience with the CCSS as
their core curriculum, it may be possible to conclude that the Mount Adams School District’s efforts in
preparing for the CCSS and the SBA had positive results.
Next Steps
More needs to be done to increase test scores without sacrificing the quality of education students receive.
This can be done by better adherence to the expected standards and content of the courses that overlap with
the skill bands assessed by the SBA, by ensuring students learn under the consistent stewardship of qualified
teachers, and by working with the local community to ensure students attend school consistently in general
and on test days in particular. As it is possible that these factors may have contributed to the overall
low score levels as well as the more dramatic score decreases in certain grade levels, resolving them should
help the district see an increase in test scores and overall academic performance.