we don’t know whether students know more or less about civics

(Arlington, VA) It would be nice to know whether civic knowledge–defined in any way you choose–has been rising or falling. But I don’t think we know the trend at all.

Public survey data don’t help much. The questions that pollsters have repeated over time are ones like “Who is the Vice President?” Knowing the answer is an important correlate of voting and news interest, but it doesn’t reliably measure general knowledge or changes in knowledge. After all, the share of correct responses would rise simply because the sitting Vice President became more prominent.

The best official source is the federal NAEP Civics Assessment, which I am in Virginia to help plan and have been involved with since 2008. I cannot emphasize enough how dedicated and careful the NAEP’s staff and contractors are. They make every decision with extreme scrupulousness. That includes all the decisions–large and small–that contribute to making each iteration of the NAEP as comparable as possible to previous years. And therefore, I don’t think anyone could produce more reliable trend lines than these (taken from the NAEP Civics Report Card):

[Figure: NAEP Civics trend lines, 1998–2010]

The asterisks mark changes that are statistically significant: if there were no real change, the probability of seeing a difference that large due to random variation in who took the test would be small (less than 5%).
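As a rough illustration of what that kind of test involves, here is a minimal sketch of a two-sample z-test on mean scale scores. Every number in it is invented for the example; none of it is NAEP data, and NAEP’s actual procedures account for its complex sample design and are more involved than this.

```python
# Minimal sketch of the kind of significance test involved: a two-sample
# z-test for a change in mean scale scores between two assessment years.
# All numbers are hypothetical, chosen only to illustrate the mechanics.
import math

mean_1998, se_1998 = 150.0, 0.9   # hypothetical mean score and standard error
mean_2010, se_2010 = 153.1, 0.8   # hypothetical later administration

diff = mean_2010 - mean_1998
se_diff = math.sqrt(se_1998**2 + se_2010**2)  # independent samples: SEs combine in quadrature
z = diff / se_diff

# Two-sided p-value under the standard normal distribution.
p = math.erfc(abs(z) / math.sqrt(2))

print(f"difference = {diff:+.1f}, z = {z:.2f}, p = {p:.4f}")
# A change would be flagged with an asterisk when p < 0.05.
```

With these made-up numbers, the 3.1-point gain yields p ≈ 0.01, so it would earn an asterisk; a smaller gain or a larger standard error would not.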

But there are reasons to be wary of these as trend lines:

  • Connecting the dots with straight lines suggests that American kids’ knowledge followed that path over time. But we really have just three measurements taken at three moments over 12 years. The actual trend could have bounced up and down in between.
  • Civics involves understanding the world around you, and the world changed from 1998 to 2010. NAEP generally tries to keep its constructs and even some of its questions identical over time, in order to preserve the validity of the trend. But the same construct (e.g., “petitioning the government”) meant something very different under Clinton than it does under Obama.
  • Despite Herculean efforts to make each test comparable to the previous one, the linking process itself introduces error that the statistical significance test does not capture.
  • The population of kids in K-12 schools has changed dramatically over that period. For instance, more students are reaching 12th grade.
  • NAEP is not, and does not pretend to be, a comprehensive assessment of civic knowledge. For instance, it omits current events (in part because the assessment takes three years to design and field). NAEP scores could remain flat even if knowledge of current events soared–or plummeted.

Overall, I think we can guess that American kids’ civic knowledge has been pretty flat since the 1990s. Even rougher comparisons to the earlier Civics NAEPs suggest that knowledge has been pretty flat since the 1970s. Dramatic changes would likely have been noticed. But I don’t think we can reliably say that knowledge has ticked up or down. For that, we would need much more regular assessments.