Acting chair Maurine Haver started the meeting with a reminder that Executive Director Ed Spar is retiring at the end of the year. She noted that the Board had selected a search firm, and that an announcement for the position would be distributed. The Board’s Search Committee will narrow the field to a few candidates, and the full Board will make the final selection. Haver asked attendees for their help in identifying candidates so COPAFS can make the best possible appointment.

Executive Director’s Report. Ed Spar

Spar described the success of recent conferences, including meetings with the Census Bureau’s Governments Division. The budget numbers are not encouraging: most are at best stable after accounting for inflation, and most agencies are in the mode of holding onto what they have. One exception is NCHS, which is asking for a sizable increase and is looking at projects including enhancements to NHANES. Spar also noted that a large response from the data user community seems to have saved the National Longitudinal Survey, at least for now; BLS had planned to eliminate the survey.

The big issue at the Census Bureau now is proposed legislation to make response to the ACS voluntary. Previous research suggests a voluntary ACS would experience about a 20 percent drop in response rate, a sharp increase in costs, and a likely reduction in data quality. A congressional briefing on the topic was scheduled for March 19. In other Census news, the 2010 Census Coverage Measurement numbers are due out soon, and in contrast to 2000, the 2010 measures will report net coverage down to the state level. The exact date of the release is not known.   

Spar also noted the nomination of Erica Groshen to be the next BLS Commissioner. He recalled Groshen from the 2010 Census Advisory Committee and recommended her highly.

National Crime Victimization Survey: Core and Redesign

James Lynch. Bureau of Justice Statistics

Lynch explained that the National Crime Survey (NCS) was initiated by the Census Bureau in 1972 and has been through many redesigns. For example, a 1976 NAS report that was sharply critical of crime statistics led to a 1978-1985 redesign that was eventually introduced in 1992. Another NAS report in 2008 made recommendations that led to the National Crime Victimization Survey (NCVS) redesign that has run from 2009 to the present. By 2006/2007, the survey had been flat-funded for 20 years and was in bad shape. Assuming that its recommendations to “fund the survey” would be ignored, the NAS panel identified design changes that would reduce costs and increase the utility of the survey – for example, recommendations for state- and city-level estimates and improved measures of rape and sexual assault. Since then, funding for the core survey has been increased, and that has given rise to the redesign.

A goal of the redesign is to restore the core survey – the thinking being that the basic design was sound, but that execution had deteriorated. Specific goals are to restore the sample size, re-establish training, improve performance monitoring, and establish a program of ongoing methodological research.    

Another objective is cost containment. The major cost is finding crime victims, as interviewers must contact about 10 people to find one victim. Lynch explained that recent reductions in crime rates are a problem for the survey, as they make it more difficult to find a large enough sample of victims. Research into cost reduction has considered expanding the reference period and increasing the survey’s scope. Counting crimes experienced over a longer period of time increases the sample of victims, but also reduces data quality due to recall issues. Increasing the scope to include more than street crime also reduces costs by finding more victims per interview, and can be further justified as a response to real-world changes (likened to changing the CPI market basket). Further savings could be achieved by interviewing only one or two persons per household, rather than the current practice of interviewing all persons age 12 and over.

Lynch noted the bleak prospects for household surveys in general, and the push for blended options using administrative records. But he cautioned that administrative data reflect the criminal justice definition of crime. The police, he explained, define crime as an event at a point in time, while the survey takes a public health focus on the harm experienced by victims, so there is a continued need to get reports directly from households.

Lynch described the program for sub-national estimates – noting that the relative rarity of violent crime makes it a challenge to get a large enough sample of victims for sub-national areas. He also noted that, because the NCVS covers a broad “market basket” of crimes, it is not an optimal design for measuring specific violent crimes such as rape and sexual assault; one would need to specifically target the populations affected by these crimes.

Lynch returned to the difference between the criminal justice view of crime as events at a point in time and the public health view, which is concerned with conditions and experiences after the event. Just as criminal justice data do not meet the needs of the survey, public health data do not meet the needs of the criminal justice system.

The measurement of juvenile victimization is another challenge. Response rates are low, and the cognitive capacity of the respondents is an issue. Also, there are victims younger than the age-12 minimum for interviews, and the legal responsibility to protect child victims poses an issue. Lynch said they are looking into how best to measure juvenile victimization (household-based surveys? school-based surveys?) and are seeking recommendations.

Research Developments at the Census Bureau

Roderick Little. US Census Bureau

Little explained that he was recruited by Census Director Robert Groves to establish a new Research Directorate at the Census Bureau. Groves and Little believe that a strong research program is more important than ever as the Census Bureau faces increased demand for data products at a time when surveys and censuses are increasingly expensive and challenging to mount. Research is seen as the key to reconciling these competing pressures.

The Research and Methodology Directorate is headed by Little, and consists of five centers – centers for economic studies, statistical research and methodology, survey measurement, disclosure avoidance research, and administrative records research and applications. 

Little then described the directorate’s strategic objectives. Objective 1 is to build a research and methodology directorate that fosters innovation and plays a strategic role in Census Bureau activities; part of this objective is to develop existing staff and recruit new talent to be thought leaders in research. Objective 2 is to transfer research on new products and processes to the program areas – especially the more methodologically complex products and processes. As Little put it, good ideas do not implement themselves. Objective 3 is to establish more robust collaborations with external research organizations and agencies – reflecting the view that the Census Bureau has been too inward-looking. Objective 4 is to increase the statistical literacy of the users of Census Bureau data – for example, with respect to small-area ACS data.

Next, Little gave a synopsis of the NSF Census Research Network (NCRN) – a set of research nodes conducting interdisciplinary research and education activities on methodological questions of interest and significance to both the research community and the federal statistical system. Key among the NCRN goals is to advance the development of innovative methods and models for the collection, analysis and dissemination of data. Current research and methodology topics include 2020 census design, multiple mode data collection, disclosure avoidance, record linkage/administrative records, modeling to enable more granular estimates, and innovative data products.      

Challenges to such research and methodology work include recruiting the best researchers (citizenship issues are one obstacle), building better links between research and production, and institutionalizing research excellence. As Little summed it up – “We need your help.”

Dynamics of Small and Young Businesses

Ron Jarmin. US Census Bureau

In a follow-up to his December 2009 COPAFS presentation, and citing data from the Census Bureau’s Business Dynamics Statistics, Jarmin presented further findings from his work on the role of small, large, young, and established businesses in the US economy.

Jarmin started with a look at the size distribution of businesses, noting that there are about 21 million “nonemployers” – entities that claim to be a business, but have no employees (such as a person selling Tupperware). There are 4.9 million firms with 1 to 499 employees (5.4 million establishments and 54.8 million total employees), and 30,000 firms with 500 or more employees (1.1 million establishments and 54.5 million employees). The Small Business Administration defines firms with fewer than 500 employees as “small.”    

Among businesses with employees, many have from one to four, and the numbers drop sharply from there. For example, there are few firms with 10,000 or more employees, but many employees in those firms. Next, Jarmin presented the distribution of firms by age. There are relatively few firms at each age from 0 (startups) through 5 years, and somewhat more in the five-year span of 6 to 10 years. The oldest category, firms born prior to 1976, accounts for 45 percent of total employment. Jarmin elaborated that the age of a firm is defined as the age of its oldest establishment when the firm is first seen in the data. A firm created by a merger would not count as “new,” and new establishments – such as when a retailer opens a new store – do not count as a new business.
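
As a rough illustration of that firm-age rule, the sketch below derives firm age from establishment records; the column names and tiny dataset are hypothetical, not the Census Bureau’s actual files or processing.

```python
# Hypothetical sketch of the firm-age rule: a firm's birth year is the birth
# year of its oldest establishment, so opening new establishments (or merging
# existing ones) does not make a firm "new." Data are illustrative only.
import pandas as pd

estabs = pd.DataFrame({
    "firm_id":        ["A", "A", "B", "C"],
    "estab_id":       [1, 2, 3, 4],
    "estab_birth_yr": [1990, 2005, 2011, 1970],  # year each establishment first appears
})

current_year = 2012

# Firm birth year = earliest establishment birth year within the firm.
firm_birth_yr = estabs.groupby("firm_id")["estab_birth_yr"].min()
firm_age = current_year - firm_birth_yr
print(firm_age)
# Firm A's new 2005 store does not reset its age (22 years here); only firm B,
# whose oldest establishment dates to 2011, would count as young.
```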

Turning to the question of jobs, Jarmin defined job creation as the increase in the number of employees at new and growing businesses between two points in time, and job destruction as the decrease in the number of employees at shrinking businesses. When people looked at the data only by size of firm (and not by age of firm), they saw an inverse relationship between firm size and job creation rates, and this was the source of claims that small businesses are the engine of job creation. However, if one controls for age of firm, the relationship largely goes away. There is still a lot of job creation in small firms, but that is because many small firms are new enough that they are still in the mode of creating jobs, and are not (yet) destroying jobs.
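
For reference, these definitions can be written out explicitly. The sketch below follows the standard Davis-Haltiwanger-style convention commonly associated with the Business Dynamics Statistics; the normalization by average employment is my assumption, not something stated in the presentation.

$$
JC_t = \sum_{e:\,\Delta E_{e,t} > 0} \Delta E_{e,t}, \qquad
JD_t = \sum_{e:\,\Delta E_{e,t} < 0} \left|\Delta E_{e,t}\right|, \qquad
\Delta E_{e,t} = E_{e,t} - E_{e,t-1},
$$

where $E_{e,t}$ is employment at business unit $e$ at time $t$. Rates are typically formed by dividing by average employment, $\tfrac{1}{2}(E_t + E_{t-1})$, which treats entrants and exits symmetrically. Tabulating $JC_t$ and $JD_t$ by firm size alone, versus by size and age jointly, is what produces the contrast Jarmin described.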

Jarmin explained that, in fact, businesses create jobs roughly in proportion to their share of total employment, and that productivity is another key element. Young businesses that survive are those with high levels of productivity, while companies (young and mature) that exit are those with low productivity. Surviving firms grow rapidly when they are young, but this growth (and job creation) levels off. Job destruction also starts high for young businesses, and then levels off as the least productive exit the economy. The bottom line seems to be that young businesses are volatile, with disproportionately high levels of both job creation and job destruction.

Looking at trends in private sector job creation and destruction since 1980, Jarmin noted a generally downward trend in both. But what really drops off after the recent economic crisis is job creation. In other words, the economic downturn is less about job destruction than about a lack of job creation.     

Jarmin reported that the share of employment accounted for by young firms has been dropping in all states, but the rate of decline varies by state. These state variations suggest that numerous factors might be involved. Demographic factors could include changing age structures and immigration rates, which can affect the number of potential entrepreneurs and workers. The business environment also plays a role, through the structure of financial markets and variations in regulatory environments.

Asked whether these data reflect jobs created overseas, Jarmin commented that if offshore job creation were a key element, the rates would differ by industry – for example, manufacturing versus retail. He also commented that while the churn in jobs is costly, it brings the benefit of increased productivity.

Jarmin closed by recapping his major conclusions. The idea that most jobs are created in small businesses is an oversimplification. One cannot understand a complex situation with a single, simple source of data, and adding just one dimension to the data (such as age of firm) can make a huge difference in the conclusions it suggests.

Estimating Mental Illness in an Ongoing National Survey

Joe Gfroerer. Substance Abuse and Mental Health Services Administration

Gfroerer started with a summary of the National Survey on Drug Use and Health (NSDUH), which is sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). Conducted since 1971, the survey estimates the prevalence and correlates of substance use in the US.

The survey is based on a sample of about 68,000 respondents per year, with an oversample of persons age 12-25. About 88 percent of selected households complete the screener, and about 74 percent of the individuals selected complete the interview. A $30 incentive is provided.
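
Composing the two rates gives a rough sense of the overall yield (a back-of-the-envelope calculation, not a figure reported at the meeting):

$$
0.88 \times 0.74 \approx 0.65,
$$

that is, roughly 65 percent of originally sampled persons end up with a completed interview.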

The questionnaire, which takes about an hour to complete, has questions related to the use of alcohol, tobacco and illicit drugs, substance use disorders, substance use and mental health treatment, health conditions and service utilization, and demographics. 

A mental health surveillance study (MHSS) was added in response to legislation requiring SAMHSA to produce methods to estimate serious mental illness (SMI) among adults and serious emotional disturbance (SED) among children. The MHSS was implemented in the 2008 NSDUH.

Serious mental illness among adults is defined as any DSM-IV mental disorder (other than developmental and substance use disorders) with serious functional impairment, both occurring within the past year. A complete diagnostic assessment is not feasible in the NSDUH interview, so as an alternative, a clinical interview is administered to a subsample of respondents (to diagnose SMI), and a regression model developed from that subsample is applied to the main sample to predict SMI for each respondent.

The measurement of impairment in the main NSDUH sample is based on a version of the WHO Disability Assessment Schedule (WHODAS), which assesses functional abilities with respect to daily activities. At the end of each NSDUH interview, respondents selected for the clinical follow-up are asked to complete a second interview on mental health – and another $30 incentive is offered. That interview is conducted by telephone by a trained clinical interviewer.
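
A minimal sketch of the two-stage idea described above appears below, assuming a simple logistic model with distress- and impairment-type scores as predictors; the data are synthetic, and the predictors, cut-point handling, and weighting in the actual MHSS model are more elaborate.

```python
# Sketch of the two-stage estimation idea: fit a model on the clinical-interview
# subsample (where an SMI determination is observed), then apply it to the full
# sample to predict SMI for every respondent. Data are synthetic; the real
# NSDUH/MHSS model uses different predictors and a calibrated cut point.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Predictors available for the full main-interview sample
# (illustrative distress- and WHODAS-style impairment scores).
n_full = 20_000
distress = rng.integers(0, 25, n_full)
impairment = rng.integers(0, 25, n_full)
X_full = np.column_stack([distress, impairment])

# Clinical follow-up subsample: the SMI determination is observed only here.
sub_idx = rng.choice(n_full, size=1_500, replace=False)
propensity = 1 / (1 + np.exp(-(0.25 * distress[sub_idx]
                               + 0.20 * impairment[sub_idx] - 8)))
smi_clinical = rng.binomial(1, propensity)  # synthetic "clinical" outcome

# Stage 1: fit the prediction model on the subsample.
model = LogisticRegression(max_iter=1000).fit(X_full[sub_idx], smi_clinical)

# Stage 2: predict SMI for every respondent in the main sample and summarize
# (survey weights omitted for brevity).
p_smi = model.predict_proba(X_full)[:, 1]
print(f"Estimated SMI prevalence (unweighted): {p_smi.mean():.1%}")
```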

Gfroerer described some of the estimation steps, and reported some findings including categories extending beyond serious mental illness (SMI) – such as low/mild mental illness (LMI), moderate mental illness (MMI) and total/any mental illness (AMI). With respect to the prevalence of mental illness among adults 18+, the 2010 results estimated 46 million with any mental illness, 11 million with serious mental illness, 15 million with major depressive episode, and 9 million with serious thoughts of suicide. Gfroerer noted that the most seriously mentally ill are not covered because they are unable to do the interview. Otherwise, the SMI model produced results that compare favorably with other sources.      

Gfroerer wrapped up with a discussion of some methodological issues, such as non-response bias and how often to update the model. For now the intent is to continue accumulating data and evaluating the model, and to update the model when there is evidence that the estimates can be substantially improved. Gfroerer’s conclusion was that the MHSS provides the only current data on trends in mental illness and its co-occurrence with substance use. The data are being widely cited and used, but more work is needed to refine the models and methods. 

Concerns From COPAFS Constituencies

No concerns were raised, and the meeting was adjourned.