
Commentary

Cutting research funding would make education less effective and efficient

Cara Jackson, President - Association for Education Finance and Policy
Daphna Bassok
Beth Boulay, President-elect - Society for Research on Educational Effectiveness
Michal Kurlaender, Professor - University of California, Davis
Lindsay Page, Associate Professor - Brown University
Elizabeth Tipton, Professor - Northwestern University

February 24, 2025


  • The Trump administration cancelled nearly all education research contracts, claiming that these cuts to data collection, assessments, and program evaluations would boost efficiency.
  • Research and data help policymakers create a more efficient education system; cutting them reduces efficiency by undermining their ability to understand which investments pay off and which do not.
  • Federal investments are essential for identifying interventions to support more effective education systems.
The U.S. Department of Education is seen on December 6, 2024 in Washington, D.C. The U.S. Department of Education is one of the many federal agencies and programs in the crosshairs of President-elect Donald Trump as he looks to cut the federal budget when in office. Samuel Corum/Sipa USA
Editor's note:

This is part of the “Why we have and need a US Department of Education” series, which seeks to examine the role of the U.S. Department of Education at a time when the president of the United States has called for the Department’s demise. It considers what the Department does to shape education policy and practice in the United States. It also addresses misconceptions about the Department’s role and the president’s authority to dismantle it.

On February 10, as part of its mission to “maximize government efficiency,” the Trump administration announced its cancellation of roughly $1 billion in federal contracts for education research. These contracts were held under the Institute of Education Sciences (IES)—the research, statistics, and evaluation arm of the U.S. Department of Education, which was created as part of the Education Sciences Reform Act (ESRA) of 2002. The contracts affect organizations and individuals all across the United States and impact our ability to monitor the global competitiveness of our students.

The cuts were widespread both in terms of content affected and regions impacted. Contracts covered program evaluations, technical assistance, national databases such as the Common Core of Data, and international assessments including the Trends in International Mathematics and Science Study (TIMSS) and the Program for International Student Assessment (PISA). More recent reporting suggests the cuts might also include parts of the National Assessment of Educational Progress (colloquially known as “the Nation’s Report Card”).

While there are many legitimate debates regarding publicly funded education, we can all agree that efficiency is an important consideration. We all want the highest quality education opportunities for our children and to foster those opportunities with as little money wasted as possible.

So, how likely is it that last week’s cuts will improve efficiency?

First, it’s important to note that the cost savings from these cancellations were much less than the administration claimed. As Chalkbeat, Hechinger Report, and others have pointed out, much of the funding for terminated contracts and grants had already been disbursed. This means contractors had to stop work on partially complete projects—after money had been spent but before the projects yielded results that could benefit the public.

But the stated goal wasn’t just saving money. It was efficiency. It’s worth spending some time on what “efficiency” means. Based on recent actions, the Department of Government Efficiency (DOGE) appears to be focused primarily on cost cutting. Cost cutting does not guarantee efficiency. Rather, efficiency is inherently about return on investment, which reflects the tradeoff between money spent and the impact of the work that money supported. The federal government can save money by cutting spending on education research, but these cuts aren’t efficient if education leaders lose access to data and research that help them make more informed, cost-effective decisions about resource allocation.

Abruptly ending near-complete projects does not drive efficiency. Instead, it ensures that the public sees no benefit from the investments already made, converting what was productive spending into waste.

Ironically, helping policymakers across the country maximize government efficiency by providing access to high-quality data and evidence is actually the core focus of IES, which was severely damaged by last week’s cuts. From its inception, IES has funded research to understand “What Works, for Whom, and Under What Conditions.” Its purpose has been to reduce the waste—in time, money, and other resources—that comes from using programs, curricula, and policies that do not improve student learning. 

To maintain America’s global competitiveness and ensure that taxpayers see a return on their investment in education, we need data to understand the problems, navigate investments in innovative programs and practices, evaluate the impact of new programs, and share knowledge. This is exactly what IES does.

Let’s be clear about what is—and isn’t—efficient when it comes to education.

First, trying to improve education without data isn’t efficient. To solve problems, we need to understand problems. To understand problems, we need trustworthy data. Data on children’s literacy and math learning and on college-going are essential for ensuring that children are getting the support they need and we are getting returns on education investments. Basic data collection and descriptive analysis can help diagnose problems for practitioners and policymakers to address and track over time. IES data such as High School and Beyond, the School Survey on Crime and Safety, the Beginning Teacher Longitudinal Study, and the Common Core of Data have allowed us to track children’s, teachers’, and schools’ outcomes over time in the United States. And datasets such as the Trends in International Mathematics and Science Study and Program for International Student Assessment have allowed us to track progress relative to other nations.

Second, pouring money into untested interventions isn’t efficient. The appeal of going with our guts on what works may be strong, but far too often our guts are wrong. For years, the primary approach to reading instruction in the United States was “balanced literacy” or “whole language,” an approach focused on understanding meaning through context and shared book reading rather than breaking down words into individual sounds. The approach appealed to educators and parents, who may have lacked information about the evidence for different instructional strategies and been susceptible to sleek advertising campaigns. Yet, decades of scientific research proved there were better options.

Following Emily Hanford’s Sold a Story podcast, parents nationwide demanded improvements in reading instruction. Such demands rested on a decades-long federal investment in the scientific study of reading, from the work of the National Reading Panel in 1997 to hundreds of intervention reports produced by IES. Last week, “science of reading” efforts took a hit when a practice guide on K-3 literacy and the Multi-tiered Systems of Support for Reading were among the IES cuts. Both were recently completed, but because the results were not yet released, millions of dollars of investment in supporting children, teachers, and families have been wasted. Education researchers—many of them funded by IES—have scientific toolkits to test whether education interventions produce their intended impacts. Without the resources to scientifically test whether programs work, we are sure to make costly mistakes.

Third, ceding authority to developers and businesses without any oversight isn’t efficient. Education leaders are constantly being pitched programs, services, and curricula that haven’t been rigorously tested. There is no regulatory body—like the Food and Drug Administration—to ensure that only effective programs make it to schools, and, as a result, new programs and curricula can be marketed and sold to schools based upon flimsy promises. Since the pandemic, more and more technology and AI have been integrated into the classroom, which makes evaluation more urgent than ever. The independent impact evaluations funded by IES have been essential for understanding if federal, state, and local money should be invested in new programs, curricula, and policies.

Fourth, failing to leverage collective learning isn’t efficient. Publicly available education research and research-informed materials can improve efficiency by dramatically reducing duplication of effort across the roughly 130,000 schools and 13,000 districts in the United States. Educators, instructional coaches, and school leaders across the nation can benefit from using practice guides (which are among IES’s most popular products) and related resources developed with federal government funding. Regional Education Laboratories (RELs) have partnered with states and local districts to provide training, coaching, and technical support to educators and policymakers and to share information about “what works” to improve learner outcomes. Yet, practice guides and RELs were among the cuts. In their absence, schools and districts may make their own small-scale investments in research, but the redundancies in effort will be less efficient overall.

Efficiency is about the payoff for an investment, not just about the price paid. Relative to other federal investments, the IES investment has been small, but its payoff has been mighty. For example, former state education commissioners credit education research with yielding huge academic gains for students in Tennessee and Mississippi. If we want to support our children’s learning and have a productive workforce in a changing economy, we need data and research to support the efficient use of resources.

Each author is a past president, current president, or presidential candidate from either the Association for Education Finance & Policy (Bassok, Jackson, and Kurlaender) or the Society for Research on Educational Effectiveness (Boulay, Page, and Tipton). Some of these authors, like many others in the education research community, have projects affected by the recent cuts.


The Brookings Institution is committed to quality, independence, and impact.
We are supported by a diverse array of funders. In line with our values and policies, each Brookings publication represents the sole views of its author(s).