Quality counts? It depends on who is counting

Published January 26, 2009 5:00am ET

Education Week, which calls itself American education’s newspaper of record, publishes an annual report called “Quality Counts” that ranks the states according to education policies and outcomes. This year Maryland came out on top overall, and the self-congratulation has been in full bloom, some of it well-deserved. But a closer look at the data shows that student achievement doesn’t match the hoopla.

To measure student achievement, the researchers at Education Week used the percentage of public school students who scored “proficient” in 4th grade reading and 8th grade mathematics on the latest National Assessment of Educational Progress (NAEP, http://nces.ed.gov/nationsreportcard/about/).

Ranked by Ed Week’s own measure of student achievement, Maryland was far from number one. It didn’t even make the top 10. In fact, Maryland ranked 17th in reading and 14th in math. To be fair, some of the underlying NAEP scores are so close that the differences between some states are negligible, but the best that can be said is that Maryland’s public school students scored higher, to a statistically significant degree, than their counterparts in only about half of the 50 states.

So how can Maryland be first in education when student performance is just above average? The folks at Education Week say student achievement is among the 35 factors that go into their overall rankings.

It’s helpful to separate those 35 factors into two broad categories: inputs and outcomes. Inputs are the variables that may affect outcomes; outcomes are the results, such as measures of student achievement.

Aside from the NAEP results in math and reading, the only other factor in the Education Week rankings that could be considered an outcome of the public school system is the percentage of public high school students who graduate with a diploma. On that measure, Maryland tied for 23rd place.

All of the other factors measure inputs, and that’s how Maryland made the grade. Some factors, like family income, parental education, and parental employment, are not under the direct control of states or school systems. Others measure state financing or how well school systems smooth the transitions from pre-K to college; these are important factors that policymakers do control. Still others grade the states on indicators of debatable merit, such as whether college prep is a prerequisite for a high school diploma (should it be?) and whether the state has written a “formal definition for school readiness” in addition to having established standards.

If the factors Ed Week used to rank the states really do create better outcomes for students, and Maryland is the model, why aren’t Maryland’s outcomes better?

Maybe it’s because Ed Week didn’t even consider some pretty meaningful inputs, like the quality of the curriculum, for which Maryland earned a grade of C+ in the Fordham Institute’s most recent State of the Standards report.

The disparity between Maryland’s headline-grabbing rank and actual student achievement may have a variety of causes. But if I lived in Massachusetts, which had the top scores in reading and math by a good margin, I’d demand a recount.

It’s fine for Education Week to put out a report like “Quality Counts,” but student achievement should carry a lot more weight in the rankings or else be ranked in a separate report. That way, policymakers, parents, and other taxpayers can distinguish between what goes into the nation’s public schools and what comes out. And we can be sure that the investments made in education create outcomes worth crowing about.

Cindy Mumby is a public school parent and advocate living in Harford County who writes about education issues. This article was first published in the Dagger.