Decomposing Differences in Impacts on Survey- and Administrative-Measured Earnings from a Job Training Voucher Experiment

Publisher: Evaluation Review (online ahead of print, subscription required)
Oct 16, 2018
Quinn Moore, Irma Perez-Johnson, and Robert Santillano

Background. Differences between earnings measured in survey data and earnings measured in administrative data raise the question of which source is preferable for program impact evaluations. This is especially true when the population of interest varies in its propensity to be represented in each source.

Objectives. We study differences in impacts on earnings from a job training voucher experiment in order to demonstrate which source is most appropriate for interpreting findings.

Research design. Using study participants with survey-reported earnings, we decompose mean earnings differences across sources into two components: (1) differences in reported employment and (2) differences in reported earnings for those who are employed in both sources. We study factors related to each component and demonstrate how impact estimates change when adjusting for them.
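The two-part decomposition described above can be sketched in code. The example below is a minimal illustration on hypothetical data, not the study's actual estimator: mean survey-minus-administrative earnings are split into an employment component (earnings for people recorded as employed in only one source) and an earnings component (the gap among people employed in both sources).

```python
import numpy as np

def decompose(survey, admin):
    """Split mean(survey) - mean(admin) earnings into two components.

    survey, admin: arrays of individual earnings (0 = not employed
    in that source). Returns (employment_component, earnings_component),
    which sum to the overall mean difference.
    """
    survey = np.asarray(survey, dtype=float)
    admin = np.asarray(admin, dtype=float)
    n = len(survey)
    emp_s = survey > 0          # employed per survey
    emp_a = admin > 0           # employed per administrative (UI) records
    both = emp_s & emp_a

    # (1) employment component: earnings counted in only one source
    employment = (survey[emp_s & ~emp_a].sum()
                  - admin[emp_a & ~emp_s].sum()) / n
    # (2) earnings component: gap among those employed in both sources
    earnings = (survey[both] - admin[both]).sum() / n

    total = survey.mean() - admin.mean()
    assert np.isclose(total, employment + earnings)  # exact decomposition
    return employment, earnings

# Hypothetical example: four participants
survey_earn = [100.0, 0.0, 50.0, 0.0]   # person 3 employed only per survey
admin_earn  = [ 80.0, 40.0, 0.0, 0.0]   # person 2 employed only per UI data
emp_part, earn_part = decompose(survey_earn, admin_earn)
```

In this toy example the overall mean gap of 7.5 splits into 2.5 from employment differences and 5.0 from earnings differences among the jointly employed, mirroring how the paper separates the two margins.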

Results. We find that differences in mean earnings are driven by differences in reported employment, but that differences in impacts are driven by differences in reported earnings for those employed in both data sources. Employment and worker characteristics explain much of the research group differences in earnings among the employed. Out-of-state employment, self-employment, and employment in occupations with low unemployment insurance (UI) coverage are important contributors to research group differences in survey- and UI-based employment levels. Holding more than one job contributes to research group differences in earnings among the employed. All of these factors contribute substantially to the difference between survey- and UI-based earnings impact estimates.

Conclusion. Findings underscore the relevance of UI coverage to estimated earnings impacts and suggest assessing employment impacts using both UI- and survey-based measures.
