Why doesn’t achieving high learning gains seem to guarantee good state test results?
In the previous blog post of September 13, 2016, we established that there is a very strong relationship between STAR results and state assessment results. That post ended with this conclusion:
- The strength of the relationships reported in the Imagine study and in all of the state-specific Renaissance studies supports the Imagine study’s conclusion that STAR can be used as a proxy for any of the state tests. This means that by using STAR results to drive instruction, at both the classroom level and the individual-student intervention level, and to monitor the progress of students, classrooms, grade levels, and schools, the school’s performance on the state assessments will be optimized as well.
We have been asked, however, why there does not seem to be a good relationship between learning gains based on STAR and state proficiency scores.
The basic reason is that these two values (school learning gains and proportion proficient) measure two very different things. A single administration of an assessment instrument (such as STAR or a state test) measures the academic status of a student at a particular point in time (proficiency, percentile rank, etc.) but contains no information about how that student got to that level. For example, Student A may have become proficient by making a great amount of growth to barely clear the proficiency bar, while Student B may have become proficient by starting the year with enough knowledge to already be considered proficient and then making very little academic progress during the year.
Learning gains (growth between two points in time) measure the academic growth over the period between those two points, but do not tell us where the student ended up in terms of academic status. In the example above, both students would be proficient, yet Student A would have a high learning gain (well above 1.0) and Student B would have a low learning gain (well below 1.0).
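The two-student example can be sketched numerically. All scores, the proficiency cut, and the expected-growth figure below are invented for illustration, and the gain ratio (actual growth divided by expected growth, with 1.0 representing a typical year of growth) is an assumption about how a learning-gain index of this kind is computed; the post does not give the exact STAR formula.

```python
# Hypothetical illustration: two students with the same proficiency outcome
# but very different learning gains. All numbers are invented.

PROFICIENCY_CUT = 500   # assumed scale-score cut for "proficient"
EXPECTED_GROWTH = 40    # assumed typical Fall-to-Spring growth in scale points

def learning_gain(fall_score, spring_score, expected=EXPECTED_GROWTH):
    """Gain ratio: actual growth over expected growth (1.0 = one typical year)."""
    return (spring_score - fall_score) / expected

def is_proficient(score, cut=PROFICIENCY_CUT):
    return score >= cut

# Student A: starts far below the cut, grows a lot, barely clears it.
a_fall, a_spring = 430, 502
# Student B: starts above the cut, grows very little.
b_fall, b_spring = 510, 515

print(is_proficient(a_spring), learning_gain(a_fall, a_spring))  # True 1.8
print(is_proficient(b_spring), learning_gain(b_fall, b_spring))  # True 0.125
```

A status report shows both students as proficient and hides the difference; a gains report shows 1.8 versus 0.125 and hides the fact that both ended up proficient.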
Summary measures of academic status at the school level (e.g., proportion proficient) are highly correlated with the demographic makeup of the student population, while summary measures of learning gains at the same level are highly correlated with the academic success of the school, particularly when the two points in time span the period in which the school can have an impact (typically Fall to Spring). Therefore, a school whose entering students are very low-performing can have high learning gains yet a low proportion of proficient students, and a school whose entering students are very high-performing can have low learning gains yet still a high proportion of proficient students. We have found over the years that there is very little correlation between learning-gains measures and status measures across schools.
It is the difference in the points in time used to measure growth (learning gains) that explains the frequent differences between the state growth results and our learning-gains calculations. The state growth calculations use a Spring-to-Spring period that includes the summer break, a period over which the school has much less (or no) influence. This means that there can be a significant difference between the Spring-to-Spring measure and the Fall-to-Spring measure. Summer loss can leave a student who made a good Fall-to-Spring gain entering the following Fall at the same place they entered the previous Fall. Schools with a large Spring-to-Fall drop would benefit from investigating ways to help students and families overcome the summer slump.
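The arithmetic of summer loss is worth making explicit. The scale scores below are invented: a student gains 40 points during the school year, loses those 40 points over the summer, and so shows a solid Fall-to-Spring gain but zero Spring-to-Spring growth.

```python
# Hypothetical illustration of summer loss; all scale scores are invented.
# The student gains 40 points Fall-to-Spring, then drops 40 over the summer,
# so the Spring-to-Spring window (used by the state growth model) shows nothing.

fall_y1, spring_y1 = 440, 480   # year 1: entering and end-of-year scores
fall_y2, spring_y2 = 440, 480   # year 2: re-enters the Fall where they started

fall_to_spring_gain = spring_y1 - fall_y1      # the school's impact window
summer_drop = spring_y1 - fall_y2              # lost over the break
spring_to_spring_gain = spring_y2 - spring_y1  # what the state growth model sees

print(fall_to_spring_gain, summer_drop, spring_to_spring_gain)  # 40 40 0
```

The same instruction looks successful in a Fall-to-Spring measure and flat in a Spring-to-Spring one; the difference is entirely the summer window.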
A school can exhibit high learning gains yet not do well on the state tests because its entering students come in so far below the proficiency level that they cannot reach proficiency even after making outstanding gains. This can be exacerbated by the summer drop, which can reduce (or eliminate) the gains made during the academic year.
It is an obvious truth that for students below proficiency, the only route to proficiency is through academic growth (learning gains). Note that this does not apply to students who are already well above the proficiency level. Imagine Schools has chosen to focus on Fall-to-Spring learning gains for the evaluation of schools, teachers, and programs because learning gains measure the academic success of the school rather than the demographic makeup of the incoming student body, as proficiency measures do. Learning gains also involve ALL students, not just those near the proficiency cut score. Finally, the Fall-to-Spring period best matches the time during which the school, teacher, or program has significant influence.