
The strange case of the disappearing NAEP


The long-term trend assessment of the National Assessment of Educational Progress (LTT NAEP) is the longest-running test of student achievement that provides a scientifically valid estimate of what American students have learned. For over four decades, beginning with a science assessment in 1969, the LTT NAEP has tested randomly selected groups of students at ages 9, 13, and 17 in several school subjects. Reading and mathematics have been assessed the most: the reading test began in 1971 and the math test in 1973.[i]

The last LTT NAEP was given in 2012. It was scheduled to be given in 2016, but that assessment was cancelled because of budget cuts. It was then scheduled for 2020, but earlier this year, again for budgetary reasons, that assessment was also cancelled. Currently, the next LTT NAEP is scheduled for 2024. If it is indeed administered then (and there is no guarantee that it will be), twelve years will have passed between LTT NAEP tests. Up until 2012, the longest interval without an LTT NAEP was five years.

Researchers have questioned the wisdom of delaying the LTT NAEP for such a long period of time.[ii] Scores on the main NAEP (the other, better-known NAEP assessment) have stagnated in recent years. In 2016, the LTT NAEP could have provided another authoritative measure of national achievement at a time when Common Core and other education reforms are changing U.S. schooling. In fact, when the main NAEP was introduced in the 1990s, it was designated the national assessment that would change periodically to reflect changes in curricular fashions. The LTT was designated the national assessment that would remain unchanged by shifting sentiments.

NAEP’s schedule is set by the National Assessment Governing Board (NAGB). At its November 2015 meeting, the board issued a press release explaining that “budget constraints and priorities for the NAEP program” necessitated several cutbacks, including postponement of the LTT NAEP.[iii]

Three of NAGB’s top priorities are questionable:

Digital assessments

All NAEP assessments are to be digitally based by 2017. Digital assessments have merits, but NAGB has yet to release a cost-benefit study showing that exclusively using digital platforms is worth the cost.[iv] In addition, given the rapid advance of assessment technology, it is quite likely that whatever NAGB decides to embrace today will be obsolete in a few years. The two key questions are: How much will going all-digital cost, not just now but in the years immediately ahead? And what benefits beyond paper-and-pencil tests justify the exclusive use of digital assessments? The danger is that NAEP, with a severely constrained budget, is now on the hook for escalating costs that will produce negligible or ephemeral benefits.

Trial Urban District Assessment (TUDA)

Administered every two years in math and reading, the main NAEP oversamples in 21 large urban districts and releases their scores separately as TUDA. Is every two years necessary? The districts don't depend on NAEP to tell them how their students are doing; they already take part in annual state assessments. Moreover, even without TUDA, it's possible to compare big-city students' NAEP scores from state to state. You can compare the performance of big-city students in New York state to the performance of big-city students in California. But you can't disaggregate by city, to compare, for example, New York City students' reading performance to that of students in Los Angeles. So the only true benefit of TUDA is allowing its 21 districts to compare themselves to other TUDA districts that are not in their own state. That seems like a pretty slim reward. Doesn't it make sense to administer TUDA less frequently and to redeploy those funds so that we can compare U.S. students' reading performance in 2016 to their performance in 1971?

Technology and Engineering Literacy (TEL)

TEL is NAEP’s newest assessment. After several years in development, it was first given to eighth graders in 2014. The word “literacy” should set off alarm bells. As educational jargon, the term extends far beyond the traditional definition of being able to read and write. When appended to academic subjects other than reading, “literacy” de-emphasizes disciplinary knowledge in favor of applications and problem-solving. NAEP’s Technology and Engineering Literacy assessment is a perfect example.

The test is completely computer-based. It seeks to tap “21st Century learning skills” such as collaboration and communication. Students are presented with scenarios and asked to solve particular problems. NAEP has produced a series of videos to explain TEL. In one scenario, students are asked to play the role of an engineer and help a remote village where the water well is no longer working. Viewers are told, “The student is not expected to have any prior knowledge of wells or hand pumps but is challenged to use the information provided to get to the root of the problem.” The narrator goes on to explain, “The point of a task like this is to measure how students apply real-world trouble-shooting skills.”

The scenario is beautifully crafted and engaging. Students will enjoy the task. But what is really being measured here? Does the item assess key knowledge and skills of engineering? NAEP already has a lot of problem-solving items in its math and science tests, so for the TEL to generate unique information it needs to target the non-mathematical, non-scientific side of technology and engineering. Are engineers who don’t know math and science in demand? Moreover, few eighth graders have actually taken a formal course in technology or engineering, so background information that real engineers would consider elementary must either be provided to the student or avoided altogether.

Questions concerning transfer always come up with items like the water pump problem. For students who do well on the item, does it mean they possess skill at solving a wide variety of engineering problems that students who do poorly do not possess? Or does it merely mean they could figure out the solution to this particular problem because the right information was given to them? The former may be important; the latter clearly is not, and it is not worth monitoring with a national test. Solving problems without content knowledge is a dubious activity.[v]

Conclusion

NAGB has postponed the LTT NAEP until 2024 because of budgetary constraints. In the meantime, it has pursued other projects, among them making NAEP all-digital by 2017, continuing the TUDA assessment every two years, and launching the TEL assessment. The LTT NAEP is a national treasure. It is the only test that measures trends in American academic achievement back to the early 1970s. It is also NAEP's original reason for existence. But some question its value.[vi] If NAGB has decided that the LTT NAEP is no longer worthy of funding at all, that the LTT should be abandoned permanently, it has been silent on the rationale for that decision.


The three priorities that have taken precedence over the LTT NAEP incur significant costs and may produce benefits that fall short of justifying the expense. Properly evaluating NAGB's priorities requires evidence. But NAGB hasn't released figures on how much the new projects cost, how much the LTT NAEP costs, what each project is expected to cost in upcoming fiscal years, or what informational benefits each project is expected to yield. Cost-benefit analyses are a conventional component of policy deliberations.

This is an election year. With a new administration and a new Congress coming to power in January, NAGB can begin to address its budgetary problems by releasing information that Congressional committees will want to see. Why NAEP has abandoned its foundational assessment and embarked on its current agenda should be the central question in next year's request for funding.


[i] https://nces.ed.gov/nationsreportcard/ltt/interpreting_results.aspx

[ii] See Kristin Blagg and Matthew M. Chingos, “Varsity Blues: Are High School Students Being Left Behind?” (Urban Institute, 2016), and Nat Malkus, “Taking too long on the NAEP long term trend assessments” (American Enterprise Institute, 2015).

[iii] Also see the resolution urging increased NAEP funding adopted in August 2015: https://www.nagb.org/content/nagb/assets/documents/policies/NAEP%20Funding%20Resolution%20Approved%208.8.15.pdf

[iv] NCES offers a website devoted to NAEP’s transition to digitally based assessments: https://nces.ed.gov/nationsreportcard/dba/

[v] The question of whether problem-solving skills can be divorced from content knowledge has been debated for decades. For a discussion regarding mathematics, see Jamin Carson, “A Problem with Problem Solving: Teaching Thinking without Teaching Knowledge,” The Mathematics Educator, vol. 17, no. 2 (2007), pp. 7-14.

[vi] In an Education Week story on the LTT NAEP, University of Illinois professor Sarah Lubienski described the LTT NAEP as a test of basic skills and dismissed the assessment’s importance.
