We’ve written a lot at the Show-Me Institute lately about A–F letter grades for public schools. The Missouri Department of Elementary and Secondary Education (DESE) will soon begin assigning these grades to all schools and districts under an executive order from Governor Kehoe. Legislation to codify the order may follow, depending on how the 2026 session unfolds.
A central component of these letter grades is student growth. Growth measures how much students learn over the course of a year, based on state assessments. To estimate growth, the state uses a statistical model to generate a “predicted” level of progress for each student. Schools and districts are then evaluated based on how their students perform relative to those predictions. In simple terms, high-growth schools are those where students consistently outperform expectations. You can read more about the Missouri Growth Model here.
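To make the idea concrete, here is a rough illustration of a simple value-added calculation. This is a minimal sketch, not DESE's actual model (state growth models are considerably more sophisticated): it predicts each student's current score from a prior-year score with a one-variable regression, then averages the actual-minus-predicted gaps by school.

```python
def growth_scores(prior, current, school):
    """Toy growth measure: each school's average actual-minus-predicted score.

    prior/current: one test score per student; school: school id per student.
    Hypothetical example only; real models use many predictors and controls.
    """
    n = len(prior)
    mp = sum(prior) / n
    mc = sum(current) / n
    # Ordinary least squares: predict current score from prior score
    slope = sum((p - mp) * (c - mc) for p, c in zip(prior, current)) / \
            sum((p - mp) ** 2 for p in prior)
    intercept = mc - slope * mp
    # Residual = how far each student lands above or below expectation
    residuals = [c - (intercept + slope * p) for p, c in zip(prior, current)]
    by_school = {}
    for s, r in zip(school, residuals):
        by_school.setdefault(s, []).append(r)
    # A high-growth school is one whose students beat expectations on average
    return {s: sum(rs) / len(rs) for s, rs in by_school.items()}
```

Under this sketch, a school whose students consistently score above their predicted levels posts a positive growth score, and one whose students fall short posts a negative one, regardless of where the students started.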
I’ve studied academic growth extensively and believe it is the most accurate indicator of school effectiveness we have. No other measure comes close.
New evidence in support of this view comes from a study by researchers at MIT. The study compares test-score growth to a popular alternative for evaluating school quality: schoolwide surveys. The authors assess how well growth-based and survey-based measures predict important student outcomes, including high school graduation, graduating with distinction, and college enrollment and persistence.
The MIT study was conducted in New York City, where the district administers surveys to students, families, teachers, and staff. The surveys are designed to capture school climate across several domains: rigorous instruction, teacher collaboration, supportive environments, and trust. School surveys are intuitively appealing, especially to those who are skeptical of standardized tests. But how do they stack up against growth when it comes to identifying schools that produce strong outcomes for students?
The answer: not very well. The surveys are a little better at predicting high school graduation, but much worse at predicting more meaningful and differentiated outcomes, including graduating from high school with an advanced diploma, enrolling in college, and persisting in college. The authors conclude: “From the point of view of parents seeking to boost their children’s odds of going to college, test information is most valuable.”
The research evidence on the value of student growth as an indicator of school quality is overwhelming. This is just the newest study to add to the list. School surveys are nice, but when it comes to identifying effective schools, objectively measured growth is far superior.