Education

There’s New Data On Last Year’s Student Learning Loss. Let’s Not Draw The Wrong Conclusions.


This week, both NWEA (a testing company) and McKinsey & Company (a consulting firm) released reports examining “learning loss” (now being rebranded as “unfinished learning,” a marginally more accurate term). The bottom line is that test scores in math and reading went up over the course of last year, but not as much as in previous years.

Both Chalkbeat (a publication that regularly covers education) and the New York Times (which did not assign an education reporter to the story) were there with coverage this morning. Both studies looked at student test score gains from last year and compared them to other years, finding that students did raise their scores between the beginning and the end of the year, but not as much as students in previous years. They further noted that the wealthier the district, the closer students came to previous years’ gains.

As this story works its way through the media, there are a couple of important points to remember.

First, NWEA and McKinsey are both businesses with products to market in the education sector. The NYT chose to use the language of marketers, framing the results as showing students “months behind,” a framing designed to alarm parents (“There’s a race going on and your kid is behind!! Quick, buy something to fix it!”).

Second, always remember that no matter how these results are framed, we are talking about the results of a standardized test of reading and math, not a full measure of all learning. Researchers have repeatedly demonstrated that test results are predictors of demographics, so don’t assume they mean what some folks—the same folks who use them to drive business—tell you they mean.

No, it’s not good news that test scores were lower than in the past, and certainly the last year did not provide the full normal year’s school experience for most students. But in trying to search out specific causes, we need to avoid coming to dangerous, unhelpful conclusions.

Poring over the data, folks are trying to sort out links to distance learning, teachers being stretched over multiple hybrid modes, students caught on the wrong side of the digital divide, and students who had far bigger concerns than taking a no-stakes standardized test. All of these are clearly factors. But another major factor here is a year without test prep, a year in which teachers spent far less time getting students ready to take the Big Standardized Test. That’s not the full explanation, but it cannot be dismissed as a factor.

If we fail to look at the full picture, the full breadth and depth of student experience and education in the past year, we will see students—particularly those in poorer districts—inundated with a barrage of intensive test-centered reading and math. This is never a good idea, but in a year in which students have so many other pressing pandemic needs, it’s an even worse idea. If we insist on using these standardized math and reading tests as the measure of all learning, we risk falling into Campbell’s Law territory, where a measure used for high-stakes decisions ceases to be a good measure and corrupts the very activity it is meant to monitor. Students will have many, many needs to be met this fall; “get a good score on the Big Standardized Test” should not be anywhere near the top of the list. We should not let tales of “dramatic proof” of “learning loss” panic us; educational practices that would be a mistake in any other year will be an even worse choice in this one.
