
An Update to the American Community Survey Program

1 January 2015
James Treat, American Community Survey Office Division Chief


    It has been a decade since the U.S. Census Bureau launched the American Community Survey (ACS). What is the status of the program in 2015, and what are the significant developments, challenges, and achievements that have marked the ACS in the last decade?

    Benchmarks in ACS Development, 2005–2014

    Sample Design

    Each year from 2005 through 2010, we selected approximately 2.9 million housing unit (HU) addresses in the United States and 36,000 HU addresses in Puerto Rico. Beginning in 2011, we implemented the following four key changes:

    1. We increased the sample selected to 3.54 million addresses
    2. We added several new HU sampling rates that better control the allocation of the sample and improve estimate reliability for small areas
    3. We increased the follow-up sample to 100% in select geographic areas
    4. Starting in 2013, we restricted the assignment of the group quarters sample for college dorms to non-summer months (January–April and September–December)

    The increase in the follow-up sample was made to improve the reliability of the ACS estimates for certain well-defined geographic areas: Hawaiian Homelands, Remote Alaska (all or parts of 14 Alaskan boroughs where access to the communities and fishing villages is difficult), Alaska Native Village Statistical Areas, and all American Indian areas where at least 10% of the population identified as American Indian or Alaska Native in the 2010 Decennial Census.

    Data Collection

    From 2005 through 2012, the Census Bureau’s data collection for HUs consisted of three modes (mail, telephone, and personal visit) spread over a three-month period. Based on the findings of two experiments conducted in 2011, the Census Bureau changed the self-response options for the 2013 ACS by adding an Internet response option and a new mailing strategy. Research on the 2013 self-response check-in rates (the proportion of all mailed cases that returned a questionnaire by mail or Internet) showed that these rates were significantly higher than in 2012, when an Internet option was not available to respondents (see “The Effects of Adding an Internet Response Option to the American Community Survey”).

    Data Products and Data User Education

    The ACS data release schedule for the 2013 ACS estimates is typical of release schedules in previous years. The ACS one-year estimates were released first, in September 2014, followed by the three-year and then the five-year estimates. The ACS Public Use Microdata Sample and the estimates for the Puerto Rico Community Survey always follow the release of the five-year estimates. Three non-overlapping three-year estimates were available in October 2014 (2005–2007, 2008–2010, and 2011–2013). In 2015, two non-overlapping five-year estimates will become available (2005–2009 and 2010–2014). These products will enable data users to compare ACS estimates across time in ways not previously possible and to better explore trends in population and housing characteristics.

    In 2012, with the support of advisory groups and professional organizations, the Census Bureau launched the ACS Data Users Group (ACS DUG), an online community created to support the needs of ACS stakeholders, with the assistance of Sabre Systems, Inc. and the Population Reference Bureau (PRB). Read more.

    New interactive tools developed to access ACS and other Census Bureau data sets are available here. An application programming interface (API) also is available that lets developers create custom apps based on ACS statistics to help businesses and local governments foster local economic development, promote job creation, or plan for disaster recovery.
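
    As a rough illustration of how the API can be used (a minimal sketch, not official Census Bureau sample code), the Python snippet below requests one ACS estimate for every state. The endpoint path and the variable code B01003_001E (total population) are assumptions made here for illustration; consult the Census Bureau’s API documentation for current paths, variable codes, and key registration.

        # Minimal sketch of a query against the Census Bureau data API.
        # Assumptions: the endpoint path below and the variable code
        # B01003_001E (total population); verify both against the API docs.
        import json
        from urllib.request import urlopen
        from urllib.parse import urlencode

        BASE_URL = "https://api.census.gov/data/2013/acs1"  # assumed 2013 ACS 1-year endpoint

        params = {
            "get": "NAME,B01003_001E",  # geography name and total-population estimate
            "for": "state:*",           # one row per state
            # "key": "YOUR_API_KEY",    # add a registered key for production use
        }

        with urlopen(BASE_URL + "?" + urlencode(params)) as response:
            rows = json.load(response)  # a header row followed by one row per state

        header, data = rows[0], rows[1:]
        for name, population, state_fips in data[:5]:
            print(name, state_fips, population)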

    ACS Program Review

    In 2011, following the release of the first ACS five-year estimates, the director of the Census Bureau commissioned a team to plan and implement a comprehensive assessment of the ACS program to ensure it was meeting the needs of data users as effectively as possible. The review examined the program to (1) ensure its products were meeting stakeholder needs, (2) ensure the survey methodology and program management were technically sound and efficient, (3) examine and address concerns raised by survey respondents about their participation in the survey, and (4) identify and reduce program risks. A key challenge was that the program’s infrastructure had not kept pace with the growth in size and stature of the survey within the federal statistical system. Accomplishments of the review include overhauling the governance structure of the ACS program to make it more efficient and strengthening the research and evaluation program. Read the final report for the review.

    ACS Content Review

    The ACS was launched as the replacement for the decennial census long form. As was the case for the long form, the value of each question on the ACS has been confirmed each year with the federal agencies that sponsor it. Inventories of federal uses of ACS data have identified hundreds of legal, required, and programmatic uses of the data. With the assessment opportunity afforded by the ACS Program Review, the Census Bureau, working with the Office of Management and Budget (OMB), decided to launch an examination and confirmation of the value of each question as part of the most comprehensive effort ever undertaken to review ACS content. The purpose of the 2014 Content Review, which is under way, is to identify questions for possible removal or modification while continuing to provide information to meet the nation’s needs.

    In 2012, the OMB and Census Bureau chartered the Interagency Council on Statistical Policy (ICSP) Subcommittee for the ACS to provide advice about how the ACS can provide the most useful information with the least amount of burden. The charter also directed the subcommittee to conduct regular periodic reviews of ACS content to ensure that each question has a clear and specific authority and justification, that the ACS is the appropriate vehicle for collecting the information, that respondent burden is minimized, and that the quality of the data is appropriate for its intended use.

    The subcommittee established two analysis factors: benefit, defined as the level of usefulness, and cost, defined as the level of respondent burden or difficulty in obtaining the data. Federal agencies were asked to document the justification for each question’s use; its mandatory, regulatory, and programmatic uses; the lowest level of geography required; the frequency of use; the funds distributed based on the question; and the characteristics of the population. They also were asked to identify alternative data sources to the ACS and whether any ACS questions are used in creating another survey’s sampling frame. Census Bureau subject matter experts examined the coefficient of variation associated with each question’s estimates at the county level, providing insight into the quality of the measure by geography, and they also computed the interquartile ranges of those county-level estimates.
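
    For readers unfamiliar with these reliability measures, the short Python sketch below shows the kind of calculation involved. It assumes that ACS margins of error are published at the 90 percent confidence level (so the standard error is the margin of error divided by 1.645); the county figures themselves are made up for illustration.

        # Sketch: coefficient of variation (CV) and interquartile range (IQR)
        # for county-level estimates of one hypothetical question.
        # Assumes 90 percent-level margins of error (SE = MOE / 1.645).
        import statistics

        def coefficient_of_variation(estimate, moe_90):
            """CV (%) = 100 * standard error / estimate."""
            standard_error = moe_90 / 1.645
            return 100 * standard_error / estimate

        county_estimates = [1200, 850, 4300, 610, 2750]   # hypothetical estimates
        county_moes = [310, 240, 520, 290, 410]           # hypothetical 90% MOEs

        cvs = [coefficient_of_variation(est, moe)
               for est, moe in zip(county_estimates, county_moes)]
        print("Median county CV (%):", round(statistics.median(cvs), 1))

        # Interquartile range of the county-level estimates.
        q1, _, q3 = statistics.quantiles(county_estimates, n=4)
        print("IQR of county estimates:", q3 - q1)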

    Four data sets reflecting measures of cost or burden were collected. First, ACS interviewers were surveyed to identify which questions respondents find cognitively burdensome or sensitive and which questions are the most difficult for respondents to answer. Second, response times via the automated modes were measured to determine how long it took respondents to answer each question. Third, allocation rates by question were computed to determine which questions were most often left blank and therefore required imputation of the missing information. Finally, complaints about the ACS received by email, letter, or telephone were examined so that a count of complaints could be obtained for each individual question.
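
    The allocation (imputation) rate mentioned above is simply the share of responses for which a question was left blank and a value had to be assigned. A minimal sketch, using hypothetical response records and question names, follows.

        # Sketch: item allocation rate -- the fraction of responses where a
        # question was left blank and the value therefore had to be imputed.
        # The records and question names below are hypothetical.
        def allocation_rate(responses, question):
            missing = sum(1 for r in responses if r.get(question) is None)
            return missing / len(responses)

        responses = [
            {"times_married": 1, "year_last_married": 1998},
            {"times_married": None, "year_last_married": None},  # left blank
            {"times_married": 2, "year_last_married": 2005},
            {"times_married": 1, "year_last_married": None},     # partially blank
        ]
        for question in ("times_married", "year_last_married"):
            print(question, format(allocation_rate(responses, question), ".0%"))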

    Based on the analysis of information relating to a question’s benefits and costs, each question received between 0 and 100 points for benefit and between 0 and 100 points for cost. The points were used to create four categories: (1) high benefit, low cost; (2) high benefit, high cost; (3) low benefit, low cost; and (4) low benefit, high cost. The 21 questions that fell into either of the low-benefit categories were then reviewed further. This review involved identifying which of those questions the Department of Commerce Office of General Counsel designated as (1) NOT Mandatory and (2) NOT Required (i.e., regulatory) with a sub-state use; questions outside those two categories were excluded from further consideration for removal from the ACS.
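
    The four-way classification described above can be pictured with a small sketch. The 50-point cutoff separating “high” from “low” is an assumption made here for illustration; the article does not state the actual thresholds used.

        # Sketch: assigning a question to one of the four benefit/cost
        # categories. The 50-point threshold is an assumed cutoff for
        # illustration only.
        def classify(benefit_points, cost_points, threshold=50):
            benefit = "high benefit" if benefit_points >= threshold else "low benefit"
            cost = "high cost" if cost_points >= threshold else "low cost"
            return benefit + ", " + cost

        print(classify(benefit_points=82, cost_points=30))  # high benefit, low cost
        print(classify(benefit_points=35, cost_points=65))  # low benefit, high cost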

    Of the 21 questions that fell into the two low-benefit categories and underwent this further analysis, seven remained. These questions, listed below with their wording from the 2014 questionnaire, are slated for removal from the ACS, subject to the results of the Federal Register notice and further review by the OMB.

    • Housing Question No. 6 – Business/Medical Office on Property
      Is there a business (such as a store or barber shop) or a medical office on this property?
    • Person Question No. 12 – Undergraduate Field of Degree
      This question focuses on this person’s Bachelor’s Degree. Please print below the specific major(s) of any Bachelor’s Degrees this person has received.
    • Person Question No. 21a – Get Married
      In the past 12 months did this person get – Married?
    • Person Question No. 21b – Get Widowed
      In the past 12 months did this person get – Widowed?
    • Person Question No. 21c – Get Divorced
      In the past 12 months did this person get – Divorced?
    • Person Question No. 22 – Times Married
      How many times has this person been married?
    • Person Question No. 23 – Year Last Married
      In what year did this person last get married?

    An ACS Federal Register notice of October 31, 2014, invited comments by December 30, 2014, about the analysis described above. The Census Bureau will assess all comments received in making a final recommendation to OMB by early spring 2015 on whether to modify the content of the ACS. The OMB will make the final determination on the Census Bureau’s recommendation and provide approval by early summer 2015.

    Conclusion

    Other developments will shape the ACS program’s future as the Census Bureau prepares for the next decennial census and leverages the ACS to the extent possible to aid that preparation (e.g., ACS estimates were used to develop a 2020 Census planning database released in 2014). The Census Bureau will submit to Congress the topics for the ACS and the 2020 Census in 2017 and the final questions for both in 2018.

    As leveraging the strengths of programs across the Census Bureau is a two-way process, the ACS program will benefit from many of the preparations for the 2020 Census, including improvements to the Census Bureau’s Master Address File.

    The ACS has come a long way since it was implemented in 2005. Count on further initiatives to make the program as efficient, cost-effective, and innovative as possible. In the meantime, enjoy the bounty of data the ACS provides and join the Census Bureau and its stakeholders in helping ensure communities across our nation have the information they need.

    Read more about the ACS.

