From NPR Ed
The latest results of the test known as the Nation's Report Card are in. They cover high school seniors, who took the test in math and reading last year. The numbers are unlikely to give fodder to either educational cheerleaders or alarmists: The average score in both subjects was just one point lower in 2015 than the last time the test was given, in 2013. This tiny downtick was statistically significant in mathematics, but not in reading.
But even though the changes are small, chances are you’re going to be hearing about them in a lot of places.
Why is this test so widely reported on, widely cited and widely debated? And how does it line up with common-sense yardsticks of how students are doing? Let’s take a closer look.
The National Assessment of Educational Progress, or NAEP, has become a standardized test that even some critics of standardized tests rely upon. One big reason: It’s a research project conducted by the U.S. Department of Education, not a state accountability test.
Unlike state tests, which have been shifting year by year with the adoption of the Common Core, NAEP scores are comparable across decades — back to 2005 for math and all the way back to 1992 for reading.
“In our era of incredibly volatile state and local testing practices, it is our North Star,” says Andrew Ho, a measurement expert at the Harvard Graduate School of Education who sits on NAEP’s bipartisan governing board.
A large sample of high school seniors nationwide, in both public and private schools, took the tests last year — 18,700 students in reading and 13,200 in mathematics. This allows direct comparisons across states and cities.
And the absence of consequences for schools or teachers means students are not typically prepped or drilled to take the test, which potentially makes it a more useful measurement of student achievement than some state tests.
But what do NAEP scores mean? On the 12th-grade test in particular, Ho says, research shows that NAEP maps well with estimates of college and career readiness from Common Core-aligned tests, the SAT and the ACT.
According to research by Ho and others, just under 40 percent of students score at college and career ready levels on NAEP.
“College and career ready” means these scores strongly predict that students will be able to succeed doing college-level academics, or with on-the-job training in a position requiring only a high school diploma.
That seems clear enough.
Except when you realize a couple of things.
One is that in 2015 the nationwide high school graduation rate was 82 percent, not 40 percent. That leaves a potentially large group of kids who got diplomas but who weren’t ready to succeed in college.
Who is right: their high schools or NAEP?
“I think the charitable view is that graduation is not just reading and math,” says Ho, meaning that high school diplomas also include things like “social studies, science, the arts, PE and showing up.” In other words, the diploma potentially captures achievements over time, rather than the ability to do well on a short, mostly multiple-choice test taken on a single day.
On the other hand, he says, “the less-than-charitable view would be that graduation is just a lower standard than college readiness. If you get right down to it, the reading and math required by NAEP, the ACT, the SAT, colleges and careers is much greater than what high schools are saying is sufficient.”
And there’s a second issue. Most standardized state tests have a single passing score, known as the cut score. NAEP slices its results into four categories: below basic, basic, proficient and advanced.
But what Ho calls “college and career ready” doesn’t line up perfectly with either the “proficient” or “basic” standard on the NAEP 12th-grade test. Instead, it falls somewhere in between.
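The difference between a single cut score and NAEP's four-level scheme can be made concrete with a short sketch. This is purely illustrative: the numeric thresholds below are hypothetical placeholders, not the real NAEP cut scores, and the function name is invented for this example.

```python
def naep_category(score, cuts):
    """Classify a NAEP-style scale score into one of four
    achievement levels: below basic, basic, proficient, advanced.

    `cuts` maps each level name to its minimum score. A state test
    with a single passing cut would need only one comparison; NAEP's
    scheme requires three.
    """
    if score >= cuts["advanced"]:
        return "advanced"
    if score >= cuts["proficient"]:
        return "proficient"
    if score >= cuts["basic"]:
        return "basic"
    return "below basic"


# Hypothetical cut scores, for illustration only.
CUTS = {"basic": 140, "proficient": 175, "advanced": 215}

# A score can clear the "basic" cut without clearing "proficient" --
# the in-between region where Ho's "college and career ready"
# benchmark falls.
print(naep_category(155, CUTS))  # prints "basic"
```

The point of the sketch is that "college and career ready" would be a fourth threshold sitting between the "basic" and "proficient" cuts, which is exactly why it doesn't line up with either official level.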
“This is the fundamental problem of standards,” says Ho. “You can come up with a different and seemingly defensible standard every day over coffee.”
And this leads us to a third mystery of NAEP. Why even have the “basic” standard at all?
Ho calls the basic standard a “useful signaling device.” It focuses more attention on the lowest performing students — who are all too numerous, it seems.
While overall results are barely changed, it seems that the nation’s struggling students in particular are doing slightly worse than they were two years ago, while higher achievers are doing slightly better.