COPAFS
 

Minutes of the March 7, 2008 COPAFS Meeting

COPAFS Chair Ralph Rector started the meeting, and introduced Ed Spar for his Executive Director’s report.

Spar described the Senate hearing earlier in the week in which Census officials were grilled over problems with the program to use handheld computers for non-response follow-up and other 2010 census field operations. Reports of these problems have become public, and the problems may be severe enough to force a return to a paper-based census – a move that would require significant additional funding from an already irritated Congress. In response, an internal task force at the Census Bureau has identified four options, ranging from having Harris Corporation (the contractor) proceed with development while the Bureau prepares paper-based methods as a backup, to varying combinations of handheld and paper-based methods. For example, the census could revert to paper-based non-response follow-up, but retain handhelds for other operations. By next week, the internal task force is to report its recommendations to a special committee convened by the Commerce Department consisting of Vince Barabba, Ken Prewitt, Dennis Hastert, and an IT expert. By the end of March, this committee is to report its recommendations for how the census should proceed.

Next, Spar drew our attention to a draft document he has prepared on the FY 2009 budget situation. One item of note is that BLS has decided to eliminate its time use study as a cost-cutting measure. But the study is popular, and some users are advocating for its continuation. In more positive news, Spar noted that for the first time in many years, NCHS is getting a significant increase in funding, which will allow it to enhance survey sample bases and maintain basic data products. Spar remarked that this increase is especially impressive given the proposed cut in the overall CDC budget. For other budget news, Spar referred us to the draft.

For those interested in Census Designated Places and Census County Divisions, Spar referred to a recent Federal Register Notice describing how these geographic areas will be retained.

Spar wrapped up with a plug for next week’s incentives seminar (with 170 attendees already registered), and reminded us that the remaining 2008 COPAFS meeting dates are June 6, September 12, and December 5.

Innovation Measurement: Tracking the State of Innovation in the American Economy.

Cynthia Glassman. U.S. Department of Commerce.

Cynthia Glassman, undersecretary for economic affairs at Commerce, was joined by chief economist Joe Kennedy and Steve Landefeld of BEA, to describe the work of the department’s Advisory Committee on Measuring Innovation in the 21st Century Economy.

Charged with identifying how to measure innovation, the committee was established in September 2006, and includes corporate CEOs, researchers, and academic experts. At the committee’s first public meeting in February 2007, it became apparent that there is no magic number or single index that captures the elusive notion of innovation. When asked how they measure innovation in their companies, CEOs mentioned things like increasing market share in a growing market and the development of new products and services, while the academic members had their own views. A second public meeting was held in September 2007, and a report was released in January 2008 (available online at www.innovationmetrics.gov).

The report recommends that innovation measures should be practical, use information that companies already have, and minimize burden. It recommends further that government create a stronger framework for measuring innovation, better leverage existing data (such as data on services), increase access to relevant data, and promote greater “data synchronization” – aka data sharing. The report recommends that businesses create industrial and firm level innovation measures, participate in innovation research activities, and make their own innovation findings available to researchers. The importance of collaboration was a recurring theme, and was reflected in the secretary’s call for a series of forums on innovation and the impediments to innovation. The first forum is scheduled for March 17 in Kansas City.

Steve Landefeld noted that BEA’s traditional economic measures account for only a portion of economic growth. The missing portion has been attributed to growth in knowledge and productivity, but it remains something of a puzzle and could be considered a residual measure of innovation. Arguing that residual measures can be misleading, Landefeld commented that direct measures are needed, and he described work BEA is doing with BLS to investigate and reconcile differences between the two agencies’ data on the performance of the economy. Data synchronization is a major part of this effort.
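
As a rough illustration of what a “residual” means in this context, growth accounting attributes to a residual whatever output growth is not explained by measured capital and labor inputs; that remainder is often read as productivity or innovation. The sketch below uses hypothetical growth rates and an assumed capital share, not BEA figures:

    # Illustrative growth-accounting residual (hypothetical numbers, not BEA data)
    output_growth = 0.030   # 3.0% growth in real output (assumed)
    capital_growth = 0.040  # 4.0% growth in capital services (assumed)
    labor_growth = 0.010    # 1.0% growth in labor hours (assumed)
    capital_share = 0.3     # capital's share of income (assumed)

    residual = (output_growth
                - capital_share * capital_growth
                - (1 - capital_share) * labor_growth)
    print(f"Unexplained (residual) growth: {residual:.3%}")  # about 1.1%
    # It is this unexplained remainder that Landefeld cautions against
    # reading directly as a measure of innovation.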

Joe Kennedy described proposals that would create incentives for data sharing while imposing penalties for unauthorized disclosure. Specifically, he described proposals to share and reconcile differences between the BEA and BLS business lists. After meeting with congressional staff, the agencies are crafting a proposal that they hope will pass on the Hill. The proposal calls for BEA to gain access to non-corporate tax data – the thinking being that small (i.e., non-corporate) companies account for a disproportionate share of the services sector associated with innovation. The proposal also would allow the Census Bureau to share its federal tax information to promote the reconciliation of differences with BEA data. Kennedy expressed hope that the proposal will pass, and soon – within the next few months.

In the Q&A session, it was noted that most industrialized countries have conducted innovation surveys that might guide the early efforts in the U.S. Glassman also took the opportunity to describe the committee’s definition of innovation as very broad – including anything that is new to a business and adds value. But there is concern as to where to draw the line. Citing China as an example, she noted that some innovations might be new to the country conducting the survey, but not new to the world. Should such activity count as innovation?

The National Opinion Research Center Data Enclave.

Chet Bowie. National Opinion Research Center.

Bowie described the NORC Data Enclave as a “virtual collaboratory,” and explained that it is a system providing remote access to sensitive microdata files in a confidentiality-protected environment. Opened in July 2007, the enclave currently provides access to data from the National Institute of Standards and Technology (NIST), the Kauffman Foundation, and the Economic Research Service at the US Department of Agriculture. The program is a response to the increasing demand for data access and the growing concern for confidentiality and disclosure avoidance.

Among current data access “modalities,” Bowie described public use files as providing high quality, but lacking timeliness and access to linked microdata. Licensing arrangements raise security issues and likewise offer no access to linked data. Research data centers are expensive to establish and inconvenient for most users, and many users are not eligible for access. As Bowie described it, the “ideal system” would be secure, flexible, low cost, and able to meet replication standards (providing metadata and other resources necessary to enable the replication of results). It is this ideal that NORC is striving for in the data enclave.

Project director Tim Mulcahy described the security concerns and how they are addressed. He listed some of the confidential data collected by NORC to illustrate their variety, and to make the point that no single approach is ideal for all of them; instead, they take a portfolio approach. Physical data protection is achieved with VPN technology and layers of authentication. When an authorized user dials up the enclave, the user’s computer becomes a “dumb terminal” with normal capabilities (such as print, cut, and paste) disabled. Software also tracks what researchers are doing and detects anything inappropriate. Security protection has legal, statistical, operational, and educational elements as well, and the protections can be applied flexibly – different levels of protection for different data files, established in collaboration with the data producer. Users are required to sign confidentiality agreements, and there are heavy fines for violations.

Mulcahy described the importance of user education, and summarized the goals of the program as adding value to the data, reducing the costs of analysis, facilitating the publication and visibility of results, and creating value for future research (user feedback to data producers). He identified software applications available in the enclave (e.g., Microsoft Office and SAS) so users do not need to know technical languages, and he described a “data custodian” function providing technical support and answers to user questions. The overall objective is to get users into the system and have them make productive use of the data.

Asked who can have access to the data enclave, Mulcahy explained that researchers must submit proposals that are reviewed by NORC with producer input. However, the hope is that eventually, the enclave will become more widely available to the public. Asked how much they charge for access, Mulcahy reported that it is $100 per week per user, which is about the marginal cost of having an additional user on the system. The objective is not to make money.

Measuring National and International Remittances.

Elizabeth Grieco. U.S. Census Bureau.

Grieco started by describing a Conference of European Statisticians (CES) Work Plan to Improve Migration Statistics, and an expert group meeting on measuring remittances and migration. The topic was how best to measure remittances, a term not formally defined in the presentation, but one that clearly refers to transfers of money from immigrants back to their countries of origin.

The Conference of European Statisticians is responsible for guiding the work of the UN Economic Commission for Europe (UNECE), which holds work sessions about every two years, and addresses issues related to the consistency and collection of migration statistics. One specific CES objective is to harmonize concepts and definitions of remittances between balance of payments and household surveys. The most recent UNECE work session on this topic was November 2006 in Edinburgh.

Grieco noted the increased interest in remittances and migration data, with specific questions concerning the differences between estimates of remittances based on balance of payments data (drawn primarily from bank data) and those based on household surveys.

Grieco then described a January 2008 expert group meeting on the measurement of remittances using household surveys. The meeting in Suitland, MD was hosted by the Census Bureau, UNECE and the World Bank, and included participants from universities and international organizations. The topics covered included the types of remittance data needed by users, the use of household surveys to measure remittances in sending and receiving countries, and technical data collection issues (sampling, weighting, etc.). The group proposed the formation of a task force focusing on the measurement of remittances and migration in household surveys. The US will chair the task force, and host the first meeting, with members including UNECE member and non-member countries, international organizations, and other experts. The objectives are to advance methodological work on the measurement of remittances through household surveys, and to develop international recommendations on how best to collect such data. The task force also will work to develop a list of best practices, a research agenda, and standards for data tabulation, publication, and dissemination.

Grieco also reported that the Census Bureau is hoping to add a question on remittances to the CPS, but still needs to get it approved. Asked if such a question would be better placed on the Consumer Expenditure Survey or the ACS, Grieco acknowledged the logic of those suggestions, but noted that it is much easier to get a question added to the CPS. When asked what exactly constitutes a remittance, she indicated that it includes personal transfers from migrants back to their country of origin, where the recipient could be a person, church, business, or other entity. Other questions concerned money sent to the origin country in exchange for something sent in return, or cash brought (and left) by the migrant on a visit to their country of origin. Clearly, there are some definitional and measurement challenges.

 

Housing Unit Based Estimates Research Team: A Progress Report.

Jason Devine, Manisha Sengupta. U.S. Census Bureau.

Devine explained that the Census Bureau’s Housing Unit Based Estimates Research Team (HUBERT) was established in response to the 2006 conference and congressional hearing devoted to the Census Bureau’s estimates program. Among the recommendations from these events was that the Census Bureau consider alternative methods based on housing unit data. This presentation was a review of HUBERT’s work to date comparing housing based estimates to those produced with the current administrative records (AdRec) component method, which tracks migration with matched IRS returns.

The basic approach is to use the two methods to produce estimates for 2000 and evaluate them against the 2000 census counts. The housing based estimates start from the Census Bureau’s current housing unit estimates (those used as ACS housing controls), which draw on building permits, mobile home data, and estimates of housing loss. HUBERT’s work includes a closer look at the permit data and housing loss estimates, and research into the potential contribution of counts from the Master Address File. The MAF is complicated, however, and HUBERT seeks to identify which of several alternative extracts provides the best basis for housing based estimates.

The results suggest that, for 2000, county population estimates produced with the current AdRec method perform better than those based on housing data. The mean absolute percent error (MAPE) was 3.2 percent for the AdRec method and 5.9 percent for the housing based estimates; estimates based on the average of the two methods had a MAPE of 3.8 percent. Although housing based estimates for large counties were about as accurate as AdRec estimates, the AdRec advantage holds across many types of counties, and its estimates were more accurate in 70 percent of counties (containing 61 percent of the US population).
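
For reference, the MAPE compares each method’s county estimates with the census counts by averaging the absolute percent errors across counties. A minimal sketch of the calculation, using made-up counties rather than actual HUBERT data, follows:

    # Minimal MAPE sketch (made-up counties, not HUBERT data)
    census = {"County A": 50000, "County B": 120000, "County C": 8000}
    adrec_est = {"County A": 51000, "County B": 118000, "County C": 8300}
    housing_est = {"County A": 53000, "County B": 112000, "County C": 8600}

    def mape(estimates, actuals):
        """Mean absolute percent error across counties."""
        errors = [abs(estimates[c] - actuals[c]) / actuals[c] for c in actuals]
        return 100 * sum(errors) / len(errors)

    print(f"AdRec MAPE:   {mape(adrec_est, census):.1f} percent")
    print(f"Housing MAPE: {mape(housing_est, census):.1f} percent")
    # Averaging the two sets of estimates county by county yields a third
    # set, which is evaluated against the census counts the same way.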

The housing unit method starts with housing totals, but it requires estimates of occupancy, persons per household, and persons in group quarters to build the estimate of total population, so there is concern that errors in estimating these components may inflate the errors of the housing based estimates. Persons per household (PPH) is a particular concern, and HUBERT has used “uncontrolled” PPH estimates from the ACS because the PPH estimates reported in ACS products can be volatile, reflecting the relationship between sometimes inconsistent population and housing controls rather than the PPH of sample households. Devine described work under way to improve the PPH estimates, including contract work with external researchers. However, evaluations suggest that even a 50 percent improvement in PPH accuracy, combined with improvements to the housing and occupancy estimates, would only bring the accuracy of the housing based estimates to about the level already achieved by the AdRec method.
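
The arithmetic behind the housing unit method, as described above, can be sketched with hypothetical county inputs (these are not HUBERT’s figures):

    # Housing unit method sketch (hypothetical county inputs, not HUBERT figures)
    housing_units = 40000      # housing unit estimate (permits, mobile homes, housing loss)
    occupancy_rate = 0.92      # estimated share of units that are occupied
    pph = 2.55                 # persons per household, the component of most concern
    group_quarters_pop = 1500  # persons in group quarters (dormitories, prisons, etc.)

    household_pop = housing_units * occupancy_rate * pph
    total_pop = household_pop + group_quarters_pop
    print(f"Estimated total population: {total_pop:,.0f}")
    # Errors in occupancy and especially PPH feed directly into the total,
    # which is why improving the PPH estimates is a HUBERT priority.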

Looking ahead, Devine described a HUBERT/estimates conference scheduled for June, where further results will be presented and scrutinized. Attendance will be by invitation. The Q&A discussion that followed touched on the relative merits of the AdRec and housing based methods, the estimates challenge program, and the relative importance of PPH, occupancy rates, and the basic housing numbers. The discussion was lively, and suggests that the June conference will be a good one.

Concerns of COPAFS Constituents

No concerns were raised, and the meeting was adjourned.