The labor-force-participation rate is rising for those over age 65. So why are companies reluctant to hire more of them?
The American Gerontological Society just released a report about the aging of the U.S. population and what it implies. Part of the motivation for the report was to address the commonly held view that an aging population is a bad thing, especially for the economy.
Guess what? It’s not. (Full disclosure: I chaired the report.) In fact, it’s amazing that we even think that way. The big factor driving the rising average age in the U.S. is that people are living longer; on average, about seven years longer than our parents. Because of improvements in healthcare, those additional years of life do not represent additional years of infirmity. The reality is that the years of seriously diminished abilities and age-related illness are coming later and are usually fewer in number. The report points out that only about one in three older individuals will end up in nursing homes or the equivalent, and for those who do, the average stay is only one year.
Imagine that: more years of life and more healthy years. Sounds like good news to me.
So what’s the problem? The main concern comes from the assumption that those additional years of life will be spent in retirement, supported by Social Security or other assistance programs.
There is no reason to believe this will be the case, though. Older individuals have always wanted to keep working, at least in some way, beyond traditional retirement ages. Recent evidence suggests that the labor-force-participation rate (the proportion of the population that is either working or actively seeking work) has been rising steadily for the over-65 age group.
The question is whether employers will hire them. Why not? They are already trained, and they have better interpersonal skills and work-related attitudes than do their younger counterparts. They aren’t expecting career paths and big payoffs at the end.
The problem, of course, is age discrimination, something that seems to me to have gotten worse, especially with the focus on tech and social media. The IT world likes to present itself as a young person’s game, and we hear all kinds of misinformed assertions about older individuals: that they don’t understand technology, that kids who’ve grown up with computers just think differently, and so forth. We forget that we’ve had personal computers on our desks for 35 years now. The gap in IT use is much more closely related to income (poor people have less internet access) than to age.
Or we hear that older individuals cost so much more in healthcare that we can’t afford to employ them, forgetting that those over age 65 are covered by Medicare and don’t need the basic insurance coverage most employers now provide. Nor do we think about the fact that the most expensive employees to insure are young ones, because they have kids. Or that older workers cost too much in pay, ignoring the fact that nothing requires an employer to pay them more: Let them know what the job pays before they apply; if they can’t live with that, they won’t apply.
This is just my sense, but I felt as though we were making some progress a few years ago when it came to rethinking how we handled older employees and older applicants. The notion of phased retirements was popular, and some attention was paid to recruiting materials that showed only young workers. But I sense we’ve now moved in the other direction. Perhaps the diversity and inclusion movement’s focus on race and gender bias has sucked up all of the attention.
We do know that when applicants get scarce, employers get less picky. That seems especially the case now, given their aversion to raising wages as a way to get more applicants. Recent stories suggest, for example, that many employers are backing off strict drug testing requirements, particularly for marijuana and especially in states where its use is legal.
The few places where older individuals appear to have had job-market success are in retail and fast food, the least-desirable jobs and the ones where employers are most likely to pinch pennies.
So the test comes now, when so many employers say they have no candidates to hire: Will we start recruiting older candidates and address the biases that keep them from getting hired? The jury is out.