
The National Association of Scholars (NAS) has urged the Trump Administration to spare the Institute of Education Sciences (IES), the research and data-collection wing of the U.S. Department of Education (ED), from its planned dismemberment of that department. In a March 31, 2025, article, the NAS wrote:
Much of what ED does is useless or counterproductive. IES actually produces useful material. The National Center for Education Statistics’ (NCES) National Assessment of Educational Progress (NAEP) is particularly useful. Whatever the ultimate disposition of ED, we believe that much of IES ought to be preserved in some administrative home.
If the U.S. Education Department dissolves and the National Center for Education Statistics (NCES) must move, the U.S. Census Bureau would serve as a more logical destination than the Bureau of Economic Analysis, which the NAS suggests. NCES and the Census Bureau have often worked together in the past, with the Census Bureau responsible for much of the data collection on which NCES relies. Though many may be unaware, the Census Bureau does most of the actual survey work—i.e., sampling, interviewing, data collecting, and cleaning—for the Education Department’s K-12 data.[1]
I agree with NAS that most non-political and non-ideological data collection efforts should continue. But the IES is more than just the NCES. Its other components manage more subjective work.
[RELATED: Division, Extremism, and Ideology at the National Science Foundation]
For example, IES funds dozens of research consortia, or “Comprehensive Centers.” The “Regional Laboratories” are consultative clearinghouses of resources serving designated geographic areas; the “Content Centers” are topically focused and conduct original research and evaluation projects.
The several Regional Laboratories (e.g., West, Southwest, Southeast, etc.) are meant to help educators and educational institutions with applied research needs. The former Mississippi state education superintendent credits her regional laboratory—the Southeast Lab, headquartered at Florida State University—with crucial assistance in developing and implementing the state’s highly acclaimed early-grades reform of reading instruction. The lab provided adjunct labor, expertise, and other resources that the state could not afford to purchase independently.
Many regard the results of Mississippi’s reading reform as positive, especially those on the “traditionalist” side of curriculum and instruction debates. Some on the other side—the progressive education side—feel differently about what transpired there.
Probably few would disagree that the progressive side currently dominates U.S. education policy as it is interpreted in graduate schools of education and the many professional associations of educators. Those progressives have equal access to the free resources available at the regional laboratories to serve their interests, too.
Historically, regional laboratories were run by regional institutions. In more recent years, however, most of the regional lab contracts have been awarded to the usual suspects, “Beltway Bandits” (i.e., federal government contractors), whether or not they are located within the region. These several large research contractors have settled in as essential ED appendages over the years (e.g., Abt, AIR, Mathematica, MDRC, RTI, SRI, Westat, WestEd).[2]
The second type of ED-funded research center, the “Content Centers,” focuses on topic areas, for which they conduct primary research and dissemination as their principals see fit. The past several decades have seen research centers devoted to, for example, Special Education, Migrant Education, Longitudinal Data Analysis, and Testing and Evaluation. The four most recently funded Content Centers cover Fiscal Equity, the Educator Workforce, Early Childhood Education, and English Language Learners.
The Content Centers are given millions of dollars, over periods ranging from a few years to many, to spend with little oversight or accountability. For some, the hugely disproportionate power and freedom granted by Center status and resources prove too tempting. The Centers are supposed to serve society and grow the knowledge bases in their subject fields. Unfortunately, some mainly serve themselves, focusing on advancing the careers of Center members and friends. In short, federal funding of Content Centers concentrates far too much power in too few hands.
Indeed, some Content Centers diminish the knowledge base in their subject areas through a variety of “subtractive scholarship” techniques. These include ignoring other scholars’ research; misrepresenting the work of rival scholars, who have no opportunity to defend it; or even declaring all previous research nonexistent, all while spending taxpayer dollars to disseminate and publicize their own work widely.
IES also provides substantial funding for international education data collections, both directly and with in-kind services. Two organizations dominate this space. The International Association for the Evaluation of Educational Achievement (IEA) is a transnational cooperative that manages the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS), administered to 8- to 13-year-olds. The Organisation for Economic Co-operation and Development (OECD) manages the Programme for International Student Assessment (PISA), administered to 15-year-olds.
[RELATED: Indirect Costs Make Science a Revenue Game Not a Discovery Quest]
Though often considered equivalent, the tests of the two organizations could hardly be more different. TIMSS and PIRLS are straight-up achievement tests, designed to measure how much of the knowledge taught in schools has been retained by students at test time. PISA, by contrast, attempts to “[measure] 15-year-olds’ ability to use their reading, mathematics and science knowledge and skills to meet real-life challenges,” and is more like an aptitude test that measures some IQ components. TIMSS and PIRLS are the “traditional education” tests, and PISA the “progressive education” test. Whether PISA has ever been validated is a subject of some dispute.
Another contrast: the IEA has its cooperative hands full with its examinations and does little beyond them, whereas the much better-resourced OECD Directorate for Education and Skills inserts itself into almost every nook and cranny of curriculum and instruction debates, usually on the progressive side. For example, it prefers Finland’s model over that of the East Asian nations, despite the latter’s markedly better results. The German head of the Paris-based directorate even sat on the U.S. Common Core Validation Committee, casting his vote in favor of the slow and fuzzy, but progressive, standards.
Overly ambitious OECD initiatives seem to roam anywhere and everywhere in search of the latest education fashions, grasping for all that glitters with programs such as: “innovative pedagogies for powerful learning,” “innovative learning environments,” “socio emotional learning,” “young peoples’ environmental sustainability competence,” “assessment of personality characteristics,” “a future ready teaching profession,” “deeper learning,” “education for human flourishing,” and “the PISA Happy Life Dashboard: Visualising Key Indicators on Student Well-Being from The PISA Survey.” Whereas the IEA wishes simply to measure academic achievement, the OECD wants to give you advice on pretty much anything educational.
Finally, I would vote against moving any part of the IES devoted to K-12 issues, which is most of it, to the National Science Foundation (NSF), as the NAS proposes. Progressives long ago captured the K-12 STEM operation at NSF and became responsible for a long series of detrimental math and science fads and textbooks over the years.[3] Traditionalists and their research, and indeed most relevant STEM research, have been ignored, dismissed, and shut out at the NSF for decades.
[1] E.g., Annual Survey of School System Finances; Local Education Agency (School District) Finance Survey (F-33); National Public Education Finance Survey; National Household Education Survey; Current Population Survey; Government Units Survey; Boundary and Annexation Survey; National Survey of College Graduates; National Teacher and Principal Survey; Principal Follow-Up Survey; Private School Universe Survey; School Crime Supplement; School District Review Program; School Pulse Panel; School Survey on Crime and Safety; Statistics in Schools; Teacher Follow-Up Survey.
[2] Ironically, Denver-based MCREL, whose acronym derives from “Mid-Continent Regional Laboratory,” was most recently awarded the regional lab contract for the Pacific West Region—i.e., the Territories of Guam, Palau, Micronesia, and the Northern Mariana Islands—while Princeton- and DC-based Mathematica was awarded the contract for the Central region, which includes Colorado and the Plains states.
[3] The NSF has provided many grants for the development and dissemination of fuzzy math programs. For example, here is a listing of some of the NSF grants that supported the Connected Mathematics Project (CMP): #9986372, Connected Mathematics Phase II; #9980760, Adapting and Implementing Conceptually-Based Mathematics Instructional Materials for Developmental-Level Students; #9950679, Preparing Elementary Mathematics Teachers for Success: Implementing a Research-Based Mathematics Curricula; #9911849, Teaching Reflectively: Extending and Sustaining Use of Reforms in the Mathematics Classroom; #9714999, Show-Me Project: A National Center for Standards-based Middle School Mathematics Curriculum Dissemination and Implementation; #9619033, The Austin Collaborative for Mathematics Education; #9150217, Connected Mathematics Project.
Image of U.S. Department of Education by Anne Meadows on Flickr