COPAFS
 

Minutes of the March 5, 2010 COPAFS Meeting

Executive Director’s Report
Ed Spar.  COPAFS.

Spar started with an update on Kathy Wallman’s recovery from a recent stroke.   Her rehab is going well, and she is expected to make a full recovery.  

A budget sheet was distributed, but Spar offered no comments since the numbers are so preliminary.  Instead, he described the new COPAFS website, which will have a “soft release” soon.  The new site is still a work in progress, but the low-key release will give members an opportunity to see the major changes.  A more formal release is planned for the next quarterly meeting.

COPAFS hosted a recent conference on Demographic Analysis (DA).  Spar reported that the Census Bureau is now working hard to produce a series of DA estimates, and plans to hold another meeting later this year to explain the details.  Spar also noted that COPAFS has hosted six workshops for the Census Bureau’s Governments Division.

Turning to the ACS, Spar noted that the first five-year estimates are due later this year.  He explained that users will see discontinuities, as the first set of five-year data will be consistent with the 2000 census, while the second set will be consistent with the 2010 census.  The handling of this transition is to be a topic at the upcoming advisory committee meetings.

Spar then noted that data from the new National Household Travel Survey have been released, and described a report on a new supplementary poverty measure.  

Dates for this year’s remaining quarterly meetings are June 4, September 24 and December 3.  

COPAFS Chair Judie Mopsik then introduced Board member Seth Grimes, who described an initiative to increase COPAFS membership.   Grimes distributed a brief questionnaire that asks respondents to identify other organizations that might benefit from COPAFS membership.   Attendees were asked to complete and return the questionnaire by the end of the meeting if possible.   

Data Required for the Prison Rape Elimination Act (PREA)
Allen Beck.  Bureau of Justice Statistics.

The Prison Rape Elimination Act of 2003 (PREA) requires the Bureau of Justice Statistics (BJS) to collect, report, and analyze data on prison rape every year.  Beck described the challenge: BJS must sample not less than 10 percent of all federal, state, and county prisons, and report results not later than June 30 of each year.  The report is to include a listing of institutions ranked according to the incidence of prison rape, and the identification of those that do not cooperate with the survey.  Data are collected for statistical, oversight, monitoring, and policy purposes.  Adding to the challenge is the fact that surveys deal in alleged, rather than substantiated, events.  Data are reported at the facility level, and because some facilities are small, confidentiality is another challenge.

Beck listed the complaints often leveled at the survey, including the following.  

  • The report took too long to complete.

  • The report focused on large facilities, and did not include all facilities and youth.  

  • The rates are too low.   There must be something wrong with the survey.

  • The report fails to make it clear that the data are based on allegations.

  • Victimization is under-reported because victims are reluctant to report.

  • The report overstates the severity of allegations.   Not all incidents involve rape.

  • Prevalence rates are based on small numbers.

  • The report is too complex.

  • The report does not provide enough detail.

  • The report did not offer policy recommendations.   

  • BJS failed to follow mandatory reporting requirements for abuse and neglect.

  • The report slanders staff and destroys what facilities are trying to do.  

At this point, Beck clarified that the survey relates only to kids in juvenile facilities, and reviewed some basic findings.   For example, 10.3 percent of kids reported some kind of staff sexual misconduct.   Only some incidents involved the use of force, but as Beck noted, in the juvenile setting nothing is voluntary.   Contrary to common perceptions, 95 percent of incidents involved boys with female staff.   So the findings cast a different light on the nature of sexual misconduct in correctional facilities.    

PREA requires the survey to select all large facilities (non-state facilities with 150 or more youth and state facilities with 90 or more youth), plus a sample of state facilities with 10-89 youth.  All inmates are sampled in small and mid-size facilities (fewer than 240 youth).  In large facilities, males are selected with equal probability, and all females are selected.  A total of 195 facilities participated in the most recent survey (three refused), and data were collected from 10,263 respondents.
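
To make these selection rules concrete, the following is a minimal sketch of the facility- and youth-level logic in Python.  The function names, data layout, and the within-stratum sampling rates are hypothetical illustrations, not BJS code; the actual design is more complex.

```python
# Hypothetical sketch of the PREA facility/youth selection rules described
# above; names, data layout, and sampling rates are illustrative only.
import random

def select_facility(facility):
    """Certainty selection for large facilities; sampling for small state ones."""
    if facility["state"] and facility["youth"] >= 90:
        return True                    # large state facility: taken with certainty
    if not facility["state"] and facility["youth"] >= 150:
        return True                    # large non-state facility: taken with certainty
    if facility["state"] and 10 <= facility["youth"] <= 89:
        return random.random() < 0.5   # sampled; rate here is illustrative, not BJS's
    return False

def select_youth(facility, roster, male_rate=0.5):
    """Within-facility selection: everyone in small/mid-size facilities
    (fewer than 240 youth); in large facilities, all females plus an
    equal-probability sample of males."""
    if facility["youth"] < 240:
        return list(roster)
    return [y for y in roster
            if y["sex"] == "F" or random.random() < male_rate]
```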

Measuring victimization is a challenge, and requires a complex series of questions, administered in a private setting with computer technology that enhances confidentiality.   At one point, Beck noted that the survey is explicit in what it asks – “enough to make a sailor blush.”     

Addressing the question of whether the allegation-based findings should be believed, Beck noted that responses are checked for extreme and inconsistent patterns, and are supported by covariation with substantiated data from other BJS sources.  Another indicator of consistency is that some facilities in the “high rate” category have a documented history of problems.

Beck concluded by returning to the list of criticisms and pointing to the strong responses BJS has to each.  But he noted the ongoing work needed in this area, in particular on the issue of false positives and false negatives in the reporting of sexual misconduct.

 

Understanding Event History Calendars 
David Johnson.  U.S. Census Bureau.
Jason Fields.  U.S. Census Bureau.

 

Johnson provided a brief introduction to the Survey of Income and Program Participation (SIPP), including recent controversies and efforts to rally user support.  SIPP is a panel survey that needs to reflect the detailed dynamics of income and program participation as experienced over time.  Efforts are underway to re-engineer SIPP, and part of the effort involves the development of Event History Calendars (EHC).

Jason Fields picked up the presentation, noting the panel nature of SIPP and touting the ability of event history calendars to capture an entire year of data in one interview.  EHC interviewing is designed to exploit people’s memories and the links between events, and Fields described the generally positive results of EHC methods evaluations.  EHC work started with a paper-and-pencil proof-of-concept test in 2008 and is being followed by a 2010 CAPI test that is in the field now.  Additional tests are planned for 2012.

The basic goal is to determine whether EHC interviews can collect data of quality comparable to the current SIPP.  Findings so far suggest three patterns.  First, EHC and standard SIPP month-to-month data are similar for many programs (Fields showed SSI data as an example).  Second, for programs such as Medicare, Social Security, WIC, and Food Stamps, EHC estimates run below SIPP estimates throughout the year; the patterns are similar, but the EHC data are consistently lower.  Third, in some cases (such as Food Stamps in Texas), EHC falls below SIPP only early in the year.  Comparisons with administrative data suggest that EHC data are, as Fields put it, in the ballpark, and the generally high level of EHC-SIPP agreement is viewed as a successful proof of concept.

The 2010 field test seeks to determine how well EHC data can be collected with CAPI (as opposed to paper-and-pencil) methods.  The test includes about 400 to 500 interviews in high-poverty areas.  Fields did a demo of the CAPI interview process.  The two-hour interview asks respondents to identify “landmark” events, then collects information including residence history, marital history, and participation in programs.  For example, “Food Stamps” is a line on the EHC, and a respondent might report that they started Food Stamps in April and stopped in October.  The CAPI instrument then follows up with the more detailed SIPP questions on Food Stamp participation.
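
As a rough illustration of the structure this interview flow implies, the sketch below represents a calendar line as a set of spells that drive the detailed follow-up questions.  The names and layout are hypothetical, not the Census Bureau’s CAPI instrument.

```python
# Hypothetical sketch of an event history calendar line as a list of spells;
# illustrative only, not the Census Bureau's actual CAPI instrument.
from dataclasses import dataclass, field

MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

@dataclass
class Spell:
    domain: str   # e.g. "Food Stamps", "Residence", "Job"
    start: str    # month the spell began
    stop: str     # month the spell ended (or "Dec" if ongoing)

@dataclass
class Calendar:
    landmarks: list = field(default_factory=list)  # memorable events anchoring recall
    spells: list = field(default_factory=list)

    def months_covered(self, spell):
        """Expand a spell into the months it covers, so the instrument can
        ask the detailed follow-up questions for each month."""
        i, j = MONTHS.index(spell.start), MONTHS.index(spell.stop)
        return MONTHS[i:j + 1]

# A respondent reports starting Food Stamps in April and stopping in October;
# the detailed SIPP questions would then be asked for each covered month.
cal = Calendar(landmarks=["moved in June"])
cal.spells.append(Spell("Food Stamps", "Apr", "Oct"))
print(cal.months_covered(cal.spells[0]))  # ['Apr', 'May', ..., 'Oct']
```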

Fields explained that EHC data collection requires a lot of training, as interviewers need to know both the SIPP content and conversational interviewing techniques.  Interviewers also are trained to look for apparent inconsistencies – for example, between move dates and the start of a new job.  Despite the challenges, the interview process is going well.

Ed Spar asked if there is any evidence that the EHC process might keep people from dropping off the SIPP panels.   Fields said that it is a good question, but testing has not been extensive enough to answer it.  

Fields wrapped up by showing what a completed event history calendar looks like, and showed a mockup for the 2011 CAPI instrument.   He also noted that additional EHC material can be found on the re-engineering area of the SIPP website.  

COPAFS Board

After lunch, Ralph Rector announced the slate for the COPAFS Board for the coming year.   The slate included Don Muff and Ken Hodges as at-large Board members, Linda Jacobsen as Secretary and Bob Parker as Treasurer.   A motion to approve the new Board was seconded and approved.     

ACS Methods Panel Update
Jennifer Tancreto.  U.S. Census Bureau.

 

Tancreto explained that the American Community Survey Methods Panel is a yearly research program for testing improvements to ACS content and data collection methods.

The Methods Panel is currently completing two tests from 2009.  One tests the ability of additional mailings to improve ACS response rates.  Results confirm that sending either a postcard or an additional questionnaire package improves response.  The full questionnaire mailing boosted response slightly more than the postcard (22.3 percent vs. 21.3 percent).  However, the postcard increased response enough to cover its mailing cost, while the more expensive questionnaire mailing did not.  The other 2009 test examined the ability of a multi-language brochure to assist non-English-speaking households in responding to the ACS.  Results from this test are due soon.
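
The cost point can be made concrete with a back-of-the-envelope comparison.  The unit costs below are invented for illustration (the presentation did not report them), and the 21.3 and 22.3 percent figures are read here as response boosts.

```python
# Back-of-the-envelope cost-per-added-response comparison for the two
# additional mailings.  Unit costs are hypothetical; only the response
# boosts (21.3 and 22.3 percentage points) come from the test results.
def cost_per_added_response(unit_cost, boost_pct, units=1000):
    added_responses = units * boost_pct / 100  # extra completed forms
    return unit_cost * units / added_responses

postcard = cost_per_added_response(0.50, 21.3)  # assume $0.50 per postcard
package  = cost_per_added_response(3.00, 22.3)  # assume $3.00 per package
print(f"postcard: ${postcard:.2f} per added response")  # about $2.35
print(f"package:  ${package:.2f} per added response")   # about $13.45
```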

The 2010 Methods Panel agenda includes a content test, similar to the 2006 ACS content test.   New questions would gather data on computer ownership and Internet access (requested by the FCC) and parental place of birth (requested by the Census Bureau).   Also to be tested are revisions to existing questions on Food Stamps, veterans identification and period of service, public assistance income, and wage/salary/etc. and interest/dividends/etc. income.  

Field tests involve a split-panel design, with some panels receiving test versions of the form and others a control version.  The initial sample is 35,000 housing units per panel; non-response follow-up is by telephone and personal visit, and there is a content follow-up re-interview.  Field tests are scheduled for September through November 2010, with final changes to ACS production targeted for 2013.

Looking to the future, Tancreto described a 2011 test of an ACS Internet response option.  The test will explore different ways of offering the option, ranging from prominent mention to a small note on the questionnaire.  The sample will include targeted and non-targeted strata – targeted being those most likely to use Internet response.  There is concern about “response mode paralysis,” the reduction in response observed when people are offered too many response mode options.  With such issues in mind, Internet response is designed to be consistent with traditional modes, while taking advantage of the new technology.  For example, Tancreto showed screens from the Internet option, pointing out the “Help” links that respondents can click for assistance with each question.  There is also concern that some respondents could get lost in the Internet option, and some details are still being worked out.  For example, respondents can “Save & Logout,” but need a new password to log back in.  The Internet response test is planned for March 2011, with results expected later that year.

Looking deeper into the future, Tancreto described an ACS content re-interview survey that will study response error for ACS questions and help identify items for content testing.  Timing for this test is to be determined, but the aim is 2012.

Bureau of Labor Statistics Update
Keith Hall.  Bureau of Labor Statistics. 

 

BLS Commissioner Hall started with the day’s high-profile numbers, announced by BLS earlier that morning: unemployment unchanged at 9.7 percent and a relatively modest loss of 36,000 jobs.  A big question was the impact of recent snow storms, which, Hall noted, would not affect the unemployment rate but could affect the number of people working.  For example, 1 million people reported not going to work at all (due to snow) during the reference week, and many others had their hours reduced.  Less clear is the impact of snow on the number of payroll jobs reported by establishments, but as Hall explained, any payroll growth that did not show up due to the snow will show up next month.

Hall then proceeded with a series of slides showing trends in BLS data, starting with civilian unemployment 1990-2010 and the change in the number of payroll jobs – where sharp decreases have moderated to small losses in recent months.  The moderation in job losses has been fairly uniform across sectors, and service-providing jobs actually show small gains now.  Trends in the number of establishments adding or shedding jobs confirm the severity of this recession, but also that job losses have peaked.  Employment in temporary help services, where decreases led the recession, is now showing increases that might foretell a rebound in total non-farm employment.

Data on industrial production, equipment and software investment, and manufacturing employment show a similar pattern: a notably deep dive followed by recovery in recent months.  Early in the recession, the big losses in construction employment were in the residential area; they are now in non-residential construction.  Motor vehicle and parts employment has experienced long-term declines that have accelerated in recent years (down 49.6 percent since January 2000).  Average hourly earnings show declines typical of a recession, and the downward trend continues.  Looking at unemployment rates, Hall noted that education and race still matter, with Blacks, Hispanics, and those with less than a high school education having the highest unemployment rates.  Data also show that teens and recent graduates are “really taking a beating” in this recession.

Hall described the various measures of “labor underutilization” provided by BLS, including the “U-6” measure, which includes discouraged workers and is now over 16 percent.  He expressed amusement at those who criticize the unemployment data for not including discouraged workers, and then cite BLS data in telling BLS what it is missing.  Hall also pointed to the sharp rise in long-term unemployment (more than six months).  Because the recent expansion did not bring strong job growth, long-term unemployment started high and has more than doubled in the current recession.

Hall then showed data comparing the current and previous recessions, noting that this one has been especially severe with respect to job loss (nothing like it going back to 1953).  In previous recessions, we saw little or no loss of service sector jobs, but this time the service sector has taken big hits.  This recession also breaks the pattern of slow recovery from lengthy shallow recessions and sharp recovery from deep but brief recessions.  As Hall described it (with employment data overlaid for several recessions), this recession started like a normal one until the financial crisis hit.  The data now confirm a deep recession with only gradual job recovery.  The question Hall posed is whether the important distinction is between shallow and deep recessions, or between old and new recessions.

Hall will provide his slides to Ed Spar, who will post them on the COPAFS website.  

Hoping to end the session and the day on a more positive note, Spar asked Hall to comment on the BLS budget.  Hall commented that the increases in the 2010 budget were in part a reflection of how bad previous budgets had been.  He noted that BLS is looking for another increase for 2011 to support some new programs and, he hopes, sample size increases for the Consumer Expenditure Survey and the Consumer Price Index.

Concerns from COPAFS Constituents

No concerns were raised, and the meeting was adjourned.