Minutes of the June 5, 2009 COPAFS Meeting
Executive Director's Report.
Ed Spar, COPAFS.
Executive Director Ed Spar reported that the confirmation of Bob Groves as Census Bureau Director is on hold. This is recent news, and Spar did not have information on who initiated the hold and for what reason. He will e-mail COPAFS reps with significant developments.
Spar expressed caution, but noted that the 2010 budget numbers look quite good. If BEA gets the requested amount, they will reinstitute county employment data, and backfill to years for which data were not provided. And if NCHS gets its requested amount, they will be able to return the Health Interview Survey to full sample. Spar also reported that CNSTAT recently released a report on the National Crime Victimization Survey, noting that there seems to be congressional support for added funding, and that BJS is working on creative ways to improve the system. Spar had no observations on the Census budget, but commented that LEHD (the Longitudinal Employer-Household Dynamics program) may be the most interesting thing going on at the Census Bureau. LEHD not only tracks employment at the local level, but produces synthetic data – an approach that many surveys may adopt in response to disclosure concerns. A presentation on LEHD is scheduled for the next COPAFS meeting.
Remaining 2009 meetings are September 11 and December 4.
Upcoming Changes in American Community Survey Products Between 2007 and 2008.
David Johnson, U.S. Census Bureau.
Scott Boggess, U.S. Census Bureau.
David Johnson noted that he is not with the ACS office, and that his giving an ACS presentation is evidence of how wide ACS involvement is at the Census Bureau. He explained that changes in ACS products are in response to recommendations and feedback from OMB, Census, and the inter-agency council, and with input from data users. The goal is to incorporate changes while minimizing disruptions to data consistency.
Scott Boggess said many of the ACS product changes were necessitated by changes in the questionnaire, and the addition of questions on new topics, including marital history, health insurance coverage, and service-connected disability. But more than half of the changes relate to changes in the general disability questions, and none of the disability tables will be comparable to previous years – including the most basic count of persons with a disability. Many disability tables are being deleted and replaced by new tables that will look like the originals, but will provide fundamentally different measures, such as percentages calculated with different denominators. The Census Bureau is just now getting a look at data collected from the new questions, and will release relatively few tables until they become more familiar with the new data. Also, because of the differences introduced in 2008, there will be no disability data in the 2006-2008 3-year ACS data products, and the first 5-year disability data will be those reflecting data collected 2008-2012.
Changes to other ACS topics include employment, relationship, educational attainment, year of naturalization, and food stamp amounts. The ancestry tables will look the same, but the data will be different due to the correction of some errors in processing. For example, it has been discovered that the inclusion of Irish-Scotch with Scotch-Irish is incorrect. Some tables will reflect new industry codes, and there will be two new tables for multigenerational households. Boggess pointed out that there might also be discontinuities in employment estimates due to changes in the questions to make them more comparable with those on the CPS. If the new questions provide data more consistent with the CPS, the Census Bureau will likely expand the number of ACS employment tables provided.
Despite the long list, Boggess noted that the 2008 changes are actually small compared with those of previous years. He said we can expect more changes next year, with the expansion of the disability, health insurance, employment, and group quarters data product packages, the addition of race-iterated data for disability, health insurance, and marital history, and a new topic, field of degree.
Toward a Health Care Satellite Account.
Ana Aizcorbe, Bureau of Economic Analysis.
Aizcorbe noted that BEA does not gather data the way the Census Bureau does, but is engaged in an accounting or measurement of expenditures and other economic activities. They do a lot of research on their measures, and health care is an area of recent focus. The goal is to improve measures of health care in the national accounts, and Aizcorbe explained some of the challenges associated with this goal.
The problem is how to best measure the services provided by the health care sector, and the extent to which health care expenditures are increasing or decreasing. To illustrate the challenge, and the different ways of looking at health care expenditures, Aizcorbe used the example of depression. The typical BLS approach would be to measure expenditures for specific services or treatments, while health economists tend to focus on expenditures by disease. In the case of depression, treatment once was limited primarily to the costly option of “talking therapy,” but when drug therapy became an option, people tended to substitute that less expensive option. Thus, from the perspective of spending by treatment, the cost of treating depression has increased (e.g., the cost of talk therapy has increased), but from the “treatment of disease” perspective, the cost has decreased because of the substitution of the less expensive treatment.
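The contrast between the two perspectives can be made concrete with a small numeric sketch. All figures below are invented for illustration; they are not from the presentation.

```python
# Hypothetical illustration of treatment-based vs. disease-based cost measures
# for depression. All prices and case counts are invented.

# Period 1: everyone in treatment uses talk therapy.
talk_price_1 = 100.0
cases_talk_1 = 100

# Period 2: talk therapy has become pricier, but a cheaper drug option has
# appeared and most patients have substituted toward it.
talk_price_2, drug_price_2 = 120.0, 50.0
cases_talk_2, cases_drug_2 = 30, 70

# Treatment-based view: track the price of each specific service.
# The price of talk therapy rose 20%, so this measure shows cost growth.
talk_inflation = talk_price_2 / talk_price_1 - 1.0  # 0.20

# Disease-based view: track the average cost of treating a case of the disease,
# regardless of which treatment is used. Substitution pulls the average down.
cost_per_case_1 = talk_price_1  # 100.0
cost_per_case_2 = (cases_talk_2 * talk_price_2 +
                   cases_drug_2 * drug_price_2) / (cases_talk_2 + cases_drug_2)
# (30 * 120 + 70 * 50) / 100 = 71.0
disease_change = cost_per_case_2 / cost_per_case_1 - 1.0  # -0.29
```

With these invented numbers, the treatment-based measure shows a 20 percent cost increase while the disease-based measure shows a 29 percent decrease, which is the sign reversal described above.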
The point is that one has to consider whether the treatments available for specific diseases have changed recently, and also whether the total number of people being treated has changed (with lower cost options, more people might seek treatment). BEA is trying to identify diseases where this difference (spending by treatment vs. spending by disease) is important. The work is still preliminary, and is further complicated by the tendency for some diseases to occur in groups. Still, the early findings (based on an extensive list of diseases) indicate that treatment-based measures generally show higher growth in health care costs than disease-based measures do.
Population Estimates Testing and Evaluation Research.
Victoria Velkoff, U.S. Census Bureau.
Jason Devine, U.S. Census Bureau.
Tori Velkoff described the Census Bureau’s plans for evaluating their population estimates against the results of the 2010 census (a program they call Estimates Evaluation E2). The objective is to evaluate not only the methods they currently use, but a number of alternative methods.
Currently, the Census Bureau uses the administrative records method (ADREC) for county estimates and the housing unit method (HU) for subcounty estimates. Velkoff recalled the HUBERT (Housing Unit Based Estimates Research Team) project, undertaken in response to recommendations that they consider HU methods for the county estimates. HUBERT found that the ADREC method has produced more accurate estimates than HU methods in most counties. However, some HU method proponents are concerned that ADREC showed no advantage with respect to bias, so the Bureau is taking a further look at HU methods. The objective is to document the ultimate decision following the 2010 census: either to stay with the ADREC method or to switch to HU or other methods.
Phase one of the evaluation is to develop estimation principles and select accuracy measures. Phase two is to select alternative methods to test, develop official estimates and alternative estimates, and then carry out the evaluation. The timeline calls for the completion of phase one in April 2009 (already done), the determination of alternative methods by July 2009, the production of evaluation estimates by December 2010, the completion of evaluations by October 2011, and decisions on post-2010 methodology by spring 2012. Evaluations will cover total population as well as demographic characteristics, and the Census Bureau will look to external researchers to help evaluate specific methods, such as ratio correlation.
Jason Devine described the underlying principles of the estimates (things not subject to change). Among these are that the population estimates will reflect the census concept of usual residence, that the most recent census counts will serve as the estimates base, and that, because the estimates are used in distributing funds, priority is given to lack of bias. The estimates must be produced within deadlines by Census Bureau staff, estimates within a vintage must sum consistently to others of that vintage, and each vintage must include a time series extending from the last census.
Methodological principles call for soundness (solid reasoning), accountability (understandable by many parties), availability of data (for all areas of US), availability of resources, robustness (insensitive to small departures from assumptions), comparability, adaptability, parsimony, and reasonableness (in terms of accuracy, demographic appropriateness, and external comparisons).
Accuracy will be defined as the degree of closeness to the 2010 census values, and Devine listed four properties of “good” estimates, drawn from a 1970s review of the Census estimates program by the Committee on National Statistics. These include low average numeric error, low average percent error, few extreme percent errors, and the absence of bias for subgroups.
Devine also identified five selected measures of accuracy – chosen after a review of 19 different measures.
1. Average numeric error (root mean square error).
2. Average percent error (MAPE).
3. Extreme percent error (N greater than an established threshold).
4. Bias (MALPE).
5. Accuracy of shares of total population (absolute error of shares).
These measures for 2000 county population estimates were as follows for the ADREC and HU methods.

Measure                          ADREC        HU
Root mean square error           10,341       13,464
MAPE                             3.2          5.9
Extreme percent errors (>10%)    109          502
MALPE                            -1.5         1.4
Accuracy of shares               6,574,332    12,173,807
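For readers unfamiliar with the acronyms, the five measures can be sketched in a few lines of Python. The formulas follow their common definitions in the estimates literature; the Census Bureau's exact specifications may differ, and the person-scaled share error shown here is an assumption based on the magnitude of the published figures.

```python
import math

def accuracy_measures(estimates, census, threshold=10.0):
    """Compare a set of area estimates against census counts using the five
    measures listed above. Formulas are the common ones; the Bureau's exact
    definitions may differ in detail."""
    n = len(estimates)
    errors = [e - c for e, c in zip(estimates, census)]
    pct_errors = [100.0 * (e - c) / c for e, c in zip(estimates, census)]

    # 1. Average numeric error: root mean square error.
    rmse = math.sqrt(sum(err ** 2 for err in errors) / n)
    # 2. Average percent error: mean absolute percent error (MAPE).
    mape = sum(abs(p) for p in pct_errors) / n
    # 3. Extreme percent errors: count of areas beyond the threshold.
    extreme = sum(1 for p in pct_errors if abs(p) > threshold)
    # 4. Bias: mean algebraic percent error (MALPE), which keeps the signs,
    # so offsetting over- and under-estimates cancel out.
    malpe = sum(pct_errors) / n
    # 5. Accuracy of shares: absolute error of each area's share of the total,
    # expressed here in persons (an assumed normalization).
    est_total, cen_total = sum(estimates), sum(census)
    share_error = sum(abs(e / est_total - c / cen_total)
                      for e, c in zip(estimates, census)) * cen_total

    return {"rmse": rmse, "mape": mape, "extreme": extreme,
            "malpe": malpe, "share_error": share_error}
```

Note how MAPE and MALPE differ: a method that overestimates half the counties by 10% and underestimates the other half by 10% has a MAPE of 10 but a MALPE of 0, which is why MALPE serves as the bias measure.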
Devine concluded by identifying key points for the upcoming evaluations. These include a focus on the most promising alternatives, the use of production requirements and underlying principles to guide decisions, providing the most comprehensive evaluation possible with limited resources, and providing datasets that will allow others to assess the accuracy of the estimates.
Developing Criteria for Delineating Urban Areas.
Michael Ratcliffe, U.S. Census Bureau.
Chris Henrie, U.S. Census Bureau.
Ratcliffe stressed that the presentation did not describe proposed criteria, but rather ideas intended to generate feedback for use in preparing proposed criteria for the definition of Urban Areas.
He noted that the Census Bureau has officially defined urban areas since 1910, that urban areas are delineated after each census, and that the areas are defined for statistical, not programmatic, purposes. The criteria are reviewed and revised before each census, but a common theme over the decades has been to identify the built-up, densely settled urban landscape. The population threshold of 2,500 has been an enduring part of the changing definitions.
Adopted in 1950, the density-based Urbanized Area (UA) concept allowed for the inclusion of more than just incorporated areas, and even non-contiguous areas. The UA concept was largely unchanged over the following decades, and the delineation of areas required a labor-intensive, interactive paper map-based process that involved some amount of subjectivity – for example, in deciding whether a nearby densely settled area was part of a UA. The inclusion of nearby areas often requires the use of “jumps” or “hops” over unsettled or less dense areas. Although whimsical-sounding, these are technical terms, with jumps generally limited to a single jump (in a given direction) of no more than 1.5 miles, but hops permitting multiple spans of up to 2.5 miles.
For 1990, the Census Bureau for the first time published UA criteria in a Federal Register notice, and used interactive delineation software. The process was more automated, but still required subjectivity – as in the application of “hops” which were first used in 1990. For 2000, the UA concept was supplemented with urban clusters (UCs) – the same concept, but applied to smaller areas. UAs have populations of 50,000 or more, and UCs have populations of 2,500 up to 50,000.
Ratcliffe described how the use of automated software has shifted the focus from which areas to include in UAs to questions of where to split large urban areas. If fully automated, the process would do things like split urban areas in the middle of a downtown business district, or identify a single continuous UA from Wilmington, DE to Springfield, MA – and by rule, call it “New York.”
Issues for 2010 include the identification of nonresidential urban land uses on the fringe of UAs, establishing objective methods for splitting large urban agglomerations, the comprehensive identification of undevelopable territory (such as wetlands and steep slopes), and the delineation of UAs in Puerto Rico and the Island Areas. Looking beyond 2010, a challenge will be how to incorporate annual data from the ACS (intercensal UA and UC updates?), and the possible definition of “suburban” and “exurban” areas.
Chris Henrie reinforced two major points: the continued use of the 2,500 population threshold, and the fact that the threshold no longer relates to incorporated areas. Henrie then filled in with helpful details on the application of jumps and hops, the exemption of areas such as parks, and other issues in the inclusion or exclusion of discontiguous areas. He also described options being explored, including the greater use of census tracts (allowing for greater use of commuting data) and the use of data on impervious (paved) land surfaces and steep slopes. They are also reconsidering the importance of the concept of the central place to the realistic delineation of urban areas. Above all, Henrie stressed that the Census Bureau remains committed to an objective, equitable, and consistent nationwide delineation of urban areas.
Concerns From COPAFS Constituencies
No concerns were raised, and the meeting was adjourned.