Summer Career Pathways Evaluation

An apprentice works on a drill in a factory. Photo by HighwayStarz/Fotolia

Health care. Advanced manufacturing. Information technology. There is increasing demand for skilled workers in these and other growing technical fields as our economy evolves. Yet across the nation, many employers are struggling to fill these positions. Moreover, workers who don’t have needed technical skills face a shrinking pool of rewarding job opportunities.

To resolve the mismatch between available talent and the skills in demand in the region, the Office of Workforce Development in New Orleans, Louisiana, implemented an innovative program in 2015 to increase the local talent pool and help lower-skilled, unemployed, and underemployed individuals train for work in growing fields. This program, Career Pathways, was funded by a Workforce Innovation Fund grant from the U.S. Department of Labor.

The RAND Corporation evaluated the program to understand how effective it was for both workers and hiring firms. For this part of the study, the researchers used a randomized controlled trial to compare the outcomes of participants and non-participants. They also examined how the program was run and assessed its return on investment through a cost-benefit study.

  • What Works for Job Training Programs for Disadvantaged Workers (Oct 17, 2019)

    The New Orleans Career Pathways program was designed to increase the local talent pool. Did the program succeed in its mission to help trainees learn industry-valued skills and find related jobs? How can workforce development stakeholders benefit from the lessons learned in New Orleans?

Key Research Questions and Answers

1. Did participants in the Summer Career Pathways program go on to receive higher earnings?

They did; overall, participants saw an approximately 25 percent increase in their earnings. The program most helped those entering without employment and those with the lowest earnings before training. For example, those who were not working before training earned $1,716 more per quarter than they would have without training, or 75 percent more than their counterparts in the control group who did not receive training.

2. How did the program participants fare in other areas that the study measured?

Some areas showed little or no improvement. For example, the results show that participating in the program did not increase the likelihood of employment or affect how long trainees, once hired, stayed on the job. There were also no statistically significant differences between trainees and the control group in the likelihood of being arrested.

On the other hand, program participants experienced more overall job satisfaction than non-participants, and in the case of IT training, a higher likelihood of working in the target industry. The job satisfaction finding, however, is based on a survey with low response rates.

3. Overall, was the program worth the investment?

From the perspective of both participants and the public, yes, although each realizes benefits at a different point in the program's time frame. Program participants invest time in their training and receive the program's benefits almost immediately in the form of increased earnings. For the public, it will likely take five years for decreased social assistance transfers and increased tax revenues to make up the implementation costs. For society, the combination of the program participants and the government, it takes three years for benefits to exceed costs. The 30-year annualized return on investment for society is over 7 percent. Taken together, these benefits make the intervention favorable with respect to return on investment.

It should be noted here that the first two years of the program did not generate any benefits for the government. Early in the program, resources were needed to get programs up, running, and “right.”
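For readers unfamiliar with annualized returns, the 7 percent figure above can be read through the standard annualization formula. This is a generic illustration only, not the study's actual cost-benefit model, which discounts streams of benefits and costs over time. With total benefits B, total costs C, and a horizon of n years:

```latex
r_{\text{annualized}} = \left(\frac{B}{C}\right)^{1/n} - 1
```

Under this reading, an annualized return above 7 percent sustained over a 30-year horizon implies cumulative benefits several times larger than costs.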

4. What are some of the elements that made the program work?

RAND found that partnerships were key. New Orleans' Office of Workforce Development, training providers, and employers separately noted that partnering with one another created strong and valuable connections, which were critical to selecting training focuses and curricula. Screening candidates on adult basic literacy and numeracy also showed some promise in identifying better-suited training candidates.

5. What lessons were learned along the way?

Job counseling and employer engagement needed strengthening. Only half of all training participants reported receiving job opportunity information. Similarly, only 40 percent received resume development and job-readiness services. Also, the availability of supplementary program benefits could have been communicated more strongly at the outset of the program. Each trainee was allocated $6,000 for books, equipment, transportation costs, and potential on-the-job training subsidization. Many trainees noted that they were unaware of stipends, materials, and other benefits until late in their programs, if at all.

Recommendations

For workforce investment boards, employers, training organizations, and other stakeholders

Target medium-length job training programs to unemployed and low-income individuals, but screen for literacy and numeracy.
Increased earnings were most prominent among these populations. The research team also found that the Test of Adult Basic Education was the most promising screening mechanism for choosing good candidates for training.

Incorporate hands-on practice and classroom instruction when feasible.
Trainees in advanced manufacturing reported a desire for more hands-on practice, while information technology trainees engaged in an online-only program, which might not have been the most effective format for a vulnerable population that could have limited professional experience. Blended approaches to instruction might be a better option.

Be able to respond to an evolving job market.
The program was initially designed to develop workers for the energy sector as well as advanced manufacturing. However, decreasing oil prices in 2015 and 2016 reduced demand for energy workers. Through its communications with local employers, the Office of Workforce Development learned about this decreased demand and shifted attention to other areas (information technology and health care) where demand remained. Any job training program should be flexible enough to respond to local demand shifts in a timely manner.

Create strong and sustainable partnerships between government and nongovernment entities.
The ability to build such partnerships is affected by funding constraints and changes in the economic and political contexts in which the partnerships are embedded.

Ensure that training programs are connected to local demand and that there are strong industry partnerships.
Employer input on training topics and curricula was likely critical to the program's success, but more could have been done to connect trainees with local firms after they completed training. Managers of such programs should consider forming stronger industry partnerships that allow for post-training introductions and support, as well as potential on-the-job training.

Evaluate both the nonemployment and employment benefits of a program.
Most evaluations of job training programs are limited to employment and earnings outcomes. These evaluations would have missed the gains in job satisfaction.

Take time to get it right.
If the program had been evaluated only on its initial cohorts, the study would have shown Career Pathways to be an ineffective intervention. The city and training providers needed time to learn the best approaches for recruitment, industry partnerships, and screening decisions, and to improve existing practices and teaching modes.

Have patience when seeking investment returns.
The positive returns to society suggest that this program demonstrates a good use of public resources and should potentially be a model for future training programs. However, as the return-on-investment analysis shows, patience is required.