For decades, companies have offered more or less the same deal to Americans in search of office jobs: You pay for your own higher education and skills training, and then we’ll consider employing you.
But with corporations unable to hire and retain enough workers to fill jobs in IT, cybersecurity and software development, a shift may be underway. More companies are assuming the costs and risks of preparing people for entry-level technology roles by offering apprenticeships.
The model combines paid on-the-job training with classroom instruction. Although such apprenticeships have long been available in Europe for a variety of professions, in the U.S. they have mostly been reserved for the skilled trades.
“Apprenticeship as an alternative to education is something that might be starting to pick up a little bit of traction, but it’s certainly not the same as what you see in other countries,” says Kelli Jordan, director of IBM Career, Skills and Performance.
Employers are counting on a few key selling points to attract Americans to these “white-collar” or “new-collar” apprenticeships. They’re designed to help people kick off careers in growing industries that pay well above minimum wage. They operate according to standards set by state governments or the U.S. Department of Labor. They often culminate in a certificate or college credits.
And unlike college programs and coding bootcamps that charge students money, apprenticeships do the opposite: They pay…