1. What was the question that prompted this research?
Matthew: “How can job training programs for disadvantaged workers be more effective?” This was our leading question going into this analysis. There is a mixed history on the effectiveness of job training programs. But one promising factor Career Pathways had was that it embraced demand-driven training and curricula. Most successful programs have this; it means that they are creating programs and training people for jobs and with skills for which there is local demand.
Of course, although we were excited by the potential success of the program, we also understood that there would likely be challenges along the way.
2. What were these potential challenges?
Gabriella: One challenge that Career Pathways faced early on had to do with its demand-driven nature. Program leaders had to carefully coordinate their training and curricula with local industry partners to make sure that the training they offered was relevant—that is, that the program planners understood New Orleans’ existing job market and were open and responsive to new opportunities that might arise.
Matthew: Another challenge that program leaders needed to attend to was training attrition. This can be a problem for all workforce training programs, but especially for those serving low-income students, who can have particular difficulty taking significant time out of their lives to go through unpaid training. To help lessen this burden, the city devised screening mechanisms that sought to identify training candidates who were most likely to attend and complete training. It also offered a stipend to subsidize the costs of books, supplies, and transportation to the training site.
3. How did you assess Career Pathways?
Gabriella: We conducted the study in three parts. The first was an implementation analysis. In this part, we assessed how the program was designed and how the many partners involved in the training program worked together. This meant that we needed to look at how the Office of Workforce Development, the employers who helped design the training and who needed talent in the region, the trainee-screening organizations, and the training providers supported the goal of the program and complemented one another in their efforts. We also wanted to capture trainees’ and partners’ perspectives on how well the program was doing. We aimed to document what worked well and facilitated implementation, and what hindered it or created barriers.
This information was pivotal in understanding which pieces of the program supported the program’s success and which needed modification.
Matthew: The second part of the study was an outcome analysis. This used a randomized controlled trial design: From the pool of people who had expressed interest in the program and passed the initial screening, we randomly selected half to be invited to training. The other half did not receive the training during that study period. From the data collected for this analysis, we ran simple regressions—basically, comparisons—of outcomes between the two groups, with statistical adjustments for demographics and baseline characteristics to improve the analysis. I need to note here that people who did not receive training for this part of the study had the opportunity to reapply for the program later.
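The comparison Matthew describes can be sketched as a covariate-adjusted regression on simulated data. Everything below is illustrative: the variable names, sample size, covariates, and the built-in $800-per-quarter effect are assumptions for the sketch, not the study’s actual data or model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trial data: treated = 1 if randomly invited to training
n = 500
treated = rng.integers(0, 2, n)              # random assignment
age = rng.normal(35, 10, n)                  # baseline covariate (assumed)
baseline_earn = rng.normal(3000, 800, n)     # prior quarterly earnings (assumed)

# Simulated outcome with a built-in $800/quarter treatment effect plus noise
earnings = (3200 + 800 * treated + 5 * age
            + 0.1 * baseline_earn + rng.normal(0, 500, n))

# OLS regression of earnings on treatment status, adjusting for baselines
X = np.column_stack([np.ones(n), treated, age, baseline_earn])
coef, *_ = np.linalg.lstsq(X, earnings, rcond=None)
effect = coef[1]  # estimated treatment effect on quarterly earnings
print(round(effect, 1))
```

Because assignment is random, the treatment coefficient estimates the causal effect; the baseline covariates mainly tighten the estimate rather than remove bias.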
Finally, we conducted a cost–benefit analysis. This allowed us to estimate the benefits of the program for participants and the city government. We used calculations from the outcome analysis for the benefits and, for estimates of program costs, conducted interviews with trainees, participating businesses, and city and program officials.
4. What did the analysis reveal?
Matthew: Along with the challenges we described earlier, there were some positive, promising findings. First, we found significantly greater earnings for those assigned to the training programs than for those not assigned. Those assigned to training earned roughly $800 more per quarter, which is about a 25-percent increase in earnings over the control group.
Another positive finding was that the largest improvements in employment probability and earnings were for those who entered the program without jobs and with the lowest annual earnings. The probability of holding a job in the quarters after training ended was greater for those who were not working when they entered training than for those who entered with jobs. They also earned over $1,700 more per quarter than their peers in the control group. These are very large shifts, which is encouraging for other regions considering building up programs like this one.
Finally, our cost analysis revealed some positive findings as well. We found that, by the fifth year after training ends, the increase in tax revenue plus the savings created by the decrease in social assistance payments would exceed the cost of implementing the program. For society more generally (when we include the gains to the participants), this would take about three years. The annualized return on investment for society came to just over 7 percent, for both the three-year (short-term) and 30-year (long-term) horizons.
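The break-even logic Matthew describes can be illustrated with a small calculation. The per-participant cost and annual benefit figures below are assumptions chosen for the sketch, not the study’s estimates, and the return formula is a simple undiscounted annualization rather than the study’s full method.

```python
# Hypothetical per-participant figures (illustrative, not the study's numbers)
program_cost = 10000.0       # one-time training cost
annual_benefit = 3800.0      # yearly earnings gains plus tax/assistance savings

# Break-even: first year in which cumulative benefits exceed the cost
cumulative, year = 0.0, 0
while cumulative < program_cost:
    year += 1
    cumulative += annual_benefit
print(year)  # → 3 under these assumed figures

# Simple undiscounted annualized return over a given horizon
def annualized_roi(cost, benefit_per_year, years):
    total = benefit_per_year * years
    return (total / cost) ** (1 / years) - 1

print(round(annualized_roi(program_cost, annual_benefit, 3), 3))
```

A fuller analysis would discount future benefits and separate the government’s perspective (taxes and assistance savings only) from society’s (which also counts participants’ earnings gains), which is why the two break-even horizons in the study differ.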
5. What was your most surprising finding?
Matthew: There were two for me. The first was how much the program improved over time. The outcome analysis showed that the first few cohorts had no gains in earnings but that later cohorts had large returns. The second surprising finding had to do with the large impact this program had on the most-disadvantaged populations—those who entered the program not working and those earning the lowest incomes.
Gabriella: I was surprised by those findings as well. Another surprise for me came from the implementation analysis. It was interesting to see just how much strong partnerships and a willingness to modify programming made a clear difference in how effective the program became. There was a clear connection between adjustments to design features that made the program more demand-driven and how well it met the needs of the local labor market. It’s quite rare to see such a strong, direct positive association!
6. What do you think should be the next steps for New Orleans and other places?
Matthew: The next step is to continue to run these kinds of job training programs. Our research suggests that this is a very important way that cities can use their resources to benefit the residents with the greatest need. Also, New Orleans should continue to work with industry partners to determine the best areas for future training. This is a good lesson for other cities as well: To develop and maintain a relevant training program like Career Pathways, program designers need to stay connected with local industry leaders and trends.
What we need to study more is how training facilitators can better connect participants with local employers after the end of training. The program never promised trainees that they would get a job instantly. But one of the intentions of the Career Pathways program was that they would be connected with interviews and even supported with government-subsidized on-the-job training once they graduated. This did not happen as often as intended. But we would be really interested to know how these kinds of activities would change the effectiveness of the program.
Gabriella: We also would like to look at the outcomes from Career Pathways over a longer timeline. We’d be better able to see the effects that training had on participants, say, five or ten years after training. A longer timeline for study would help us understand whether the effects on earnings persist, grow, or disappear. Early evidence suggests that, at worst, they simply persist in size and do not shrink.
We would also be better able to look at secondary outcomes. We investigated the impact on arrests and saw interesting suggestions that, for men, the training might be associated with lower arrest rates. However, the short follow-up period allowed by the study did not give us sufficient statistical power to come to any real conclusions. A longer follow-up period would allow us to reexamine the effect on arrests with more power.