Meet New NCES Commissioner Jack Buckley
Sean P. “Jack” Buckley is commissioner of the National Center for Education Statistics. Previously, he served as deputy commissioner of NCES from 2006 to 2008. Buckley has taught statistics and education policy at Georgetown University, Boston College, and the State University of New York at Stony Brook and spent five years in the U.S. Navy as a surface warfare officer and nuclear reactor engineer. He holds an AB in government from Harvard and an MA and PhD in political science from SUNY Stony Brook.
What about this position appealed to you?
As an academic, my primary research interests have been applied statistics and education policy. NCES, as the federal statistical agency for education, is the perfect nexus of both. In addition, I was fortunate enough to serve as deputy commissioner from 2006 to 2008, so I knew a lot about both our outstanding staff and our fascinating mission to collect data on all aspects of the condition of education in the United States.
Although the Department of Education only dates back to 1980, a federal office by various names has collected education statistics since 1867 under different Cabinet agencies. It is a great privilege to play a role in maintaining this proud tradition.
Describe the top 2–3 priorities you have for NCES.
First, the integrity of NCES and the data we collect must be paramount. The center has a solid record for honesty and accuracy, and our data are generally regarded as apolitical, nonpartisan, and nonideological. I am keenly aware that I have inherited this reputation from my predecessors, and I know that my staff and I must pass it on to our successors. There can be no compromise with respect to our integrity.
Second, NCES must strive to retain the relevance of our data and reporting. It does us, and the nation, little good if we accurately and thoroughly measure aspects of education that are irrelevant to policymakers and the public while failing to report on areas central to current policy debates.
A second dimension of relevance is timeliness. The most accurate answers to exactly the right questions are nevertheless irrelevant if they arrive too late. Timeliness is always in tension with quality; it is our responsibility at NCES to let neither win at the expense of the other.
NCES at a glance: Part of the Institute of Education Sciences at the Department of Education. Fiscal year 2010 budget: $264 million. Staff size: 112.
My third priority is rigor. As a federal statistical agency, our reporting and methodology must be at the leading edge of many fields, including measurement, psychometrics, statistical computing, survey sampling, instrument design, field data collection, and confidentiality and statistical disclosure limitation. And yet we need rigor without mortis; the requirement for scientific excellence must not stifle new ideas. A key part of maintaining this balance in the next few years will be the completion of the ongoing review and revision of our internal statistical standards and the expansion of opportunities for our staff to receive training in methodological advances.
Finally, NCES must remain innovative. Our history is one of spectacular innovation: large-scale, nationally representative, multi-level surveys and national, state, and international assessments pushing the boundaries of psychometrics are all the more impressive because their methods were devised to operate when computers were slow and processing time expensive. In the few months I have been back, I have been pleased to observe innovative work across the center.
What do you see as your biggest challenge(s) for NCES?
In the last 10 years, there has been a vast increase in the amount of administrative data in U.S. education, especially in grades K–12, driven at least in part by changes in federal policy. Although NCES has been involved in this revolution, particularly through our State Longitudinal Data Systems grant program, we have not yet realized the full potential of these data for both reporting purposes and improving survey and assessment operations.
A particular challenge with respect to these data is fulfilling our mandate to create voluntary common education data standards for state longitudinal education data systems. We're about 7–8 years behind here, and we must move quickly.
How can the statistical community help you?
We’re very fortunate at NCES to have the assistance of several communities: the fantastically strong statistics and survey research world here in DC, the small but essential community of psychometricians and measurement specialists, and the broader group of empirical education researchers. People from all three groups have, historically, donated their time and invaluable expertise to our data collections. Examples include John Tukey’s role on the technical advisory panel to the National Assessment of Educational Progress (NAEP) in 1965 and James Coleman’s pioneering longitudinal study designs in the 1970s.
My hope is that we can ensure this partnership remains vital, and I will be seeking the input of statisticians and statistically minded social scientists to advise us on all aspects of our work.
Prior to your tenure, what do you see as the biggest recent accomplishment of the agency?
The No Child Left Behind Act of 2001 required several substantial changes to the National Assessment of Educational Progress that greatly improved the program. NCLB requires state participation in reading and mathematics assessments in grades 4 and 8 every two years (formerly every four years). NCES redesigned NAEP to permit reporting within six months (formerly 12–18 months), with reports targeted to the general public.
To permit comparisons among the states for reading and mathematics, the sample sizes for grades 4 and 8 combined increased from around 15,000–30,000 to more than 300,000 students assessed every two years in these subjects alone. This increase in scope, combined with the rapid reporting schedule, required significant improvements in all areas of operations.
What makes NCES unique in the federal statistical community?
Like most other statistical agencies, NCES has a portfolio of universe and sample survey data collections and a growing focus on administrative data. What sets us apart, however, is our role in educational assessment. In any given year, through programs like NAEP and the comparative international assessments (PISA, TIMSS, PIRLS), NCES is assessing the cognitive skills of hundreds of thousands of U.S. students at a wide range of grade levels—even the adult population through the OECD’s Programme for the International Assessment of Adult Competencies (PIAAC) and, historically, the National Assessment of Adult Literacy (NAAL).
These assessments pose a unique set of challenges in terms of item design, psychometrics, and complex sampling. Results from these data collections allow NCES to measure where American students stand in subject areas ranging from reading and mathematics to civics and the arts, and they make it possible for policymakers to compare academic achievement and adult literacy among states and across nations.