Much has been said in recent years about the push toward skills-based hiring. The idea that we should figure out whether candidates have the skills required to do the job seems so obvious that it hardly warrants further thought. However, taking skills more seriously requires that we pay less attention to other criteria. What might those be?
In practice, skills-based hiring seems mostly to be about dropping academic degree requirements. Why? Degrees would seem to convey a lot of useful information. Someone who can get admitted to a college and then graduate has, at least on average, attributes that are universally valuable.
For instance, they likely possess the executive skills to get things done in a somewhat unstructured environment, have some experience working in teams, and may have some skill in reasoning and presenting arguments. That is even more true for high school grads compared with non-grads: just being able to show up consistently and stay out of trouble is a pretty good screen.
Why the push to drop degree requirements? The first reason is fairness. Thirty years ago, the Clinton administration put forward a grand vision for changing hiring and employment, led by a National Skill Standards Board made up of business and community leaders focused on setting out skill requirements for all jobs and designing assessments to measure them.
The ultimate goal was to break the stranglehold that college graduates seemed to have on good jobs and create more opportunity for people who could not afford to go to college. I was around when that effort was underway, and it failed rather spectacularly, in my view, because employers were unwilling to take this approach to skills-based hiring seriously.
The other reason, and the real motivation for the current attention to skills, is that hiring would be cheaper if we could expand the applicant pool beyond those with college degrees.
Questions to ask about skills-based hiring
First, is the rise of skills-based hiring really happening? That depends on whom you ask. I recently reviewed surveys on the topic, and the share of employers reportedly using this strategy now ranges from 7% to 75%, a testament to the fact that skills-based hiring does not have a clear definition. A CriteriaCorp survey found 20% of employers dropping college degree requirements, at least for some jobs, and that is probably a good measure.
Second, is skills-based hiring a good idea? Like many consultant-driven ideas, it may sound sensible and efficient, but less so if you understand how hiring actually works. “Just see if they can do the job” sounds simple, no? Ideas like these about hiring, and there are many of them, are rooted in what I call The Home Depot view of hiring: “I’m missing a person, so get their part number, go to the store, and pick a replacement with that number. They either fit or they don’t.”
The reality is that both people and jobs are complicated, and “fit” is a continuum. Among candidates who meet the minimum job requirements, some fit better than others, and sorting that out is genuinely difficult.
We should do more skills testing than we do; in fact, in the U.S. you are much more likely to take a drug test than a skills test. But good tests are expensive to develop and administer. Skills tests typically measure only one task, and virtually all jobs involve multiple tasks, so we will likely need multiple tests.
Almost all jobs have requirements other than task-based occupational skills. Will the candidate show up on time, get along with other people, take direction? Many, perhaps most, jobs are not ones we expect people to be able to do on their first day; we also have to predict their ability to learn, and there is no simple test for that attribute. The week-long assessment center exercises that management candidates went through in the 1960s would no doubt help, but I suspect no U.S. employer is willing to pay for that now.
Creating a smart skills strategy
The reality is that high school graduation does predict something about job performance, as does being a college graduate. (Interestingly, college grades have stopped being very predictive because of grade inflation.) Dropping those criteria without replacing them with something better means the candidate pool becomes not just bigger but more varied in a bad way: more candidates who likely cannot do the job well. Let’s repeat that: more candidates who don’t fit are a problem, not a solution. They make it harder to find the good ones, and we have to do more testing to sort them out. Otherwise, we will end up with worse hires.
We’ve cut recruiters and hiring budgets to save money, pushed the work onto line managers who don’t have the time or resources to do skills-based hiring well, and ended up relying on bad tools, such as endless interviews, because they are cheap. As I’ve written before, the No. 1 criterion employers use to assess their hiring is cost per hire; the second is time to fill a position, while quality of hire is typically not measured at all. If you think your CFO is going to give you more money to do this properly, as we say in Philadelphia, God bless.
Of course, we should drop educational requirements where they don’t predict job performance; indeed, we should drop all criteria that don’t predict it. If such criteria create adverse impacts, they likely also violate the law. And if you want to drop them to open access to a broader pool of applicants, by all means do so. Just don’t assume this is a pot of savings waiting to be scooped up, or that it will make it easier to get better hires.