
Meet James Lynch, Director of BJS

1 April 2011
Amstat News invited new Bureau of Justice Statistics Director James Lynch to respond to the following questions so readers could learn more about him and the agency he directs. Look for more interviews with new statistical agency heads in forthcoming issues.

James P. Lynch was confirmed as the director of the Bureau of Justice Statistics by the Senate in June 2010. Lynch is on leave from John Jay College in New York, where he is a distinguished professor. He earned his PhD in sociology from The University of Chicago and has published three books, 25 refereed articles, and more than 40 book chapters and other publications. Lynch also has chaired the American Statistical Association’s Committee on Law and Justice Statistics.

What about this position appeals to you?

I have worked with crime statistics for more than 30 years. The Bureau of Justice Statistics (BJS) is the principal source of statistical data on crime and criminal justice matters in the United States, so being the director gives me a chance to implement many of the changes and improvements in these statistics that I (and others) have advocated for many years. It also gives me the opportunity to work with some very committed and talented colleagues within the agency.

BJS Fast Facts

BJS is part of the Office of Justice Programs in the Department of Justice
Website: http://bjs.ojp.usdoj.gov
Fiscal year 2010 budget: $69 million
Staff size: 60

Describe the top two or three priorities you have for the Bureau of Justice Statistics.

The biggest challenges for BJS involve building statistical infrastructure, including developing systems to describe those parts of the criminal justice system that are not well covered and improving the quality of existing data series. More specifically, I am committed to restoring the National Crime Victimization Survey (NCVS) to its former levels of precision and quality and to getting more useful information out of it on a routine basis. I also am committed to exploring ways to change the design and organization of the survey to produce more useful estimates for subnational areas such as states and large cities, improving the quality of estimates of sexual violence, and getting better estimates of crime against juveniles. These goals all involve research and development work that is under way.

Beyond the NCVS, I am eager to take advantage of the increase in the quality of operational and administrative records in the criminal justice system to give us better data on offenses known to the police, arrest information, and recidivism. Line agencies in every corner of the justice system are using automated data and the exchange of these data to improve service. The federal government has spent a great deal of money improving these operational systems, and we should explore their potential for statistical uses. Exploiting these data poses a number of logistical, statistical, policy, and legal challenges that we are working through with pilot programs.

What do you see as your biggest challenge(s) for BJS?

One of the biggest challenges is to maintain the level of funding that will ensure the development and maintenance of high-quality statistical systems. For almost two decades, BJS was essentially flat funded while the demands on the agency increased. This flat funding resulted in the degeneration of the NCVS to the point it could not serve its legally mandated function and failed to expand coverage of the criminal justice system. In 2009, the agency received more resources, which we are using to restore and improve the NCVS and to expand the coverage of the criminal justice system. This level of funding must be sustained for us to deliver on these promises.

A second big challenge is to successfully negotiate the institutional arrangements necessary to use operational data for statistical purposes while maintaining promises of confidentiality and the integrity and independence of a statistical agency. This is the future of crime statistics.

Finally, there is a need to shore up the independence of BJS as a federal statistical agency. Maintaining a strong position of independence is a fundamental principle for a federal statistical agency to help ensure its credibility and objectivity.

How can the statistical community help you?

The statistical community includes a number of groups with different interests and abilities. It is easier for me to respond to this question if I can focus on specific entities in this broader community, such as the Inter-Agency Council on Statistical Policy (ICSP) and the American Statistical Association (ASA), represented by the Committee on Law and Justice Statistics.

The ICSP brings together the family of federal statistical agencies, which have a unique perspective on the federal statistical system, while the Committee on Law and Justice Statistics includes a much broader group sharing a general set of skills and knowledge. The ICSP has become, for me, a helpful group for sharing approaches to common problems, including strategies for preserving independence, recruiting staff, vetting statistical products, and routinely communicating with constituencies and consumers. This kind of cooperation could blossom into more ambitious collaborative efforts, such as pooling resources to secure the assistance of world-class statisticians that no single agency could sustain on its own, in terms of either worthy problems or money.

The Committee on Law and Justice Statistics can provide advice to the agency on a number of statistical issues, including the following:

1. Disclosure policies for microdata
2. Assessing the relative merits of general variance formula estimates of standard errors versus those from direct estimates using empirical variance estimation procedures
3. Balancing the reporting obligations of federal statistical agencies with the confidentiality rights of agencies supplying administrative record data
4. Approaches to imputation: When is imputation appropriate, inappropriate, and essential?
5. Advantages and disadvantages of different approaches to subnational estimation with the NCVS
6. Estimating standard errors for multi-year aggregations of the NCVS for both rate estimation and multivariate modeling
7. Methods for assessing data quality, scope of coverage, and record linkage in using administrative records, and addressing the consent and disclosure issues resulting from the linkage of administrative records with survey data
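Item 2 above contrasts two ways of attaching a standard error to a survey estimate: reading it off a fitted general variance formula (a generalized variance function, or GVF, of the form var(x) = a·x² + b·x for an estimated total x) versus computing it directly from the sample. The sketch below illustrates the distinction only; the sample size, victimization rate, and GVF parameters are all invented for illustration and are not BJS or NCVS values, and the direct estimate uses a simple binomial formula rather than a design-based procedure.

```python
import math
import random

random.seed(0)

# Hypothetical victimization indicator data: 1 = victimized, 0 = not.
# Sample size and true rate are made up for this illustration.
n = 5000
p_true = 0.03
sample = [1 if random.random() < p_true else 0 for _ in range(n)]
estimate = sum(sample) / n  # estimated victimization rate

# Direct standard error of the rate from the sample itself
# (simple-random-sampling binomial formula).
se_direct = math.sqrt(estimate * (1 - estimate) / n)

# GVF approach: a curve var(x) = a*x^2 + b*x fitted across many survey
# estimates of totals x; the parameters a and b here are invented.
a, b = -0.00003, 1.1
total = estimate * n                       # estimated total victims
var_gvf = max(a * total**2 + b * total, 0.0)
se_gvf_rate = math.sqrt(var_gvf) / n       # convert back to the rate scale

print(f"rate estimate: {estimate:.4f}")
print(f"direct SE:     {se_direct:.5f}")
print(f"GVF SE:        {se_gvf_rate:.5f}")
```

The trade-off the committee would weigh: a GVF gives users one compact formula covering every published estimate, while direct estimation reflects the actual design and data behind each particular estimate.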

Prior to your tenure, what do you see as the biggest recent accomplishment of the agency?

Actions taken by prior directors to highlight the negative effects of decades of flat funding, and to use that insight to secure more appropriate levels of funding, are a real accomplishment that makes other achievements possible. The Committee on National Statistics’ (CNSTAT) review of BJS programs was central to this process. For the first time in its 30-year history, BJS’ programs, policies, and organizational structure were reviewed by members of the statistical and constituent communities and, while this review was overwhelmingly positive, it also provided a useful blueprint for the future of the agency.

The budget for BJS was increased by nearly a third in fiscal year 2010 to improve the National Crime Victimization Survey (NCVS). Please describe the progress to improve the NCVS and what remains to be done.

The increased budget for the NCVS was divided into two components: funds for restoring the core survey and funds for redesigning the survey to enhance its utility. The CNSTAT panel was not critical of the basic design of the NCVS, but it focused on the accommodations made to cope with the flat funding of BJS and the survey. Some of these accommodations had large negative consequences for the quality of the survey data. With additional funding for the core, exploration of these accommodations was given less priority and efforts were directed at restoring the sample size and quality controls that had gradually, but substantially, eroded over the last two decades.

The first increment in sample was introduced this past October, and the first interviewer training will be conducted in April. Another increment in sample will be introduced in January.

Restoring the survey is complicated by the need to avoid another break in series for the victimization statistics such as occurred in 2007, when changes to the survey were introduced without a plan for understanding their effects on victimization rates. All the actions taken to restore the NCVS must be done in a manner that lets us understand and take account of effects on the statistical series.

Efforts to redesign the survey to enhance its utility are focused on three major areas: improving subnational estimates of victimization, measurement of sexual violence, and the victimization of juveniles, including those under 12. In 2010, we began to assess alternative strategies for providing routine estimates for states and large cities. These alternatives include direct estimation with the current sample and with enhanced and reallocated samples; indirect estimates, including blended estimates from low-cost boosts to the current sample; and low-cost options for free-standing local surveys. The initial results from the direct estimates work will be available in the next month or two.

The NCVS has long been criticized for its measurement of rape and sexual assault. Substantial improvements were made in this area in 1992, but alternative methodologies have been used since that time that provide very different estimates of the level and change in level of these crimes. These conflicting estimates raise damaging doubts about self-report surveys of victimization. We must confront these difficult measurement issues, determine the optimum set of procedures for measuring sexual violence, and decide whether the NCVS can be altered to accommodate those procedures or a different vehicle is required. We are negotiating an agreement with CNSTAT to begin this process.

The measurement of juvenile victimization is complicated by a parallel set of methodological issues, and we must decide if the NCVS is the appropriate vehicle for addressing this population and, if so, what changes to the design are required.

Describe your interactions with other components of the U.S. Department of Justice and the role of BJS within the department.

Part of BJS’ role in DOJ is the same as its role with respect to the public: to routinely provide high-quality statistics on crime and the criminal justice system at the federal, state, and local levels. Increasingly, however, various components of the department have come to rely on BJS for all things statistical and ask for statistical analyses in support of their missions. While we try to satisfy these requests with existing reports, they often result in special analyses of available statistical series. This kind of responsiveness is important for establishing the relevance of statistics, but it also can detract from the maintenance of statistical infrastructure and the production of routine reports.

With John Laub as director of the National Institute of Justice (NIJ), the research arm of the DOJ, we are looking for ways that BJS and NIJ can work together more closely. We are considering ways of sharing information at an early point in the planning and budget process to determine how our programs can inform each other. In planning its research agenda, for example, NIJ may be able to use routinely collected data from BJS to determine if a specific solicitation is necessary or whether the assumptions on which the research program is based are accurate. In turn, many of the measurement issues of concern to BJS can be informed by some of the research funded by NIJ. Both John and I look forward to making these and other exchanges more formal and systematic.
