
Sumser: Fighting bias through data-driven software development

John Sumser
Emerging Intelligence columnist John Sumser is the principal analyst at HRExaminer. He researches the impact of data, analytics, AI and associated ethical issues on the workplace. John works with vendors and HR departments to identify problems, define solutions and clarify the narrative. He can be emailed at [email protected].

“Two white dudes talking about diversity, inclusion, belonging and equity. What could go wrong?” Jon Stross, Greenhouse Software’s co-founder, wondered aloud as we began our recent conversation. We scheduled the call at my request. I wanted to deepen my understanding of the Greenhouse approach to software development, how it is evolving and how Greenhouse is thinking about bias and equity in its design and development.


There’s a paradox in trying to solve complex human problems with software: The more you focus on the solution, the harder it is to keep seeing the problem. To create software, we have to pin down our idea of the problem and hold everything still. Meanwhile, the problem itself (and our understanding of it) continues to evolve. And not all human behavior lends itself well to logic, rules, models and algorithms.


Once we come up with an approach, we don’t really know whether or how it will work until people start using it. Then we may learn that the problem we thought we were addressing is not the real problem we have. So we address the new issues by continually evolving the solution. On one level, users are always working with tools that are still being developed; the fact that the next version is better is the ultimate confirmation that the current one is worse. Said another way, software design is an inherently iterative process. So is its closest cousin, the design of intelligent tools (artificial intelligence, machine learning, natural-language processing).

SaaS development methods, with their short-cycle iterations and releases, offer an opportunity to tighten the feedback loop between product and market. As the user base grows, useful feedback emerges quickly when new or improved features are added.

Stross referred to the 45 million people who apply for jobs each year through Greenhouse as the “QA testers.”


“They let us know when something is a little off. They help us improve their experience,” he said. “We are constantly studying the way our tools are used. It’s the most reliable way we have to discover our blind spots.”


Greenhouse, he said, is “in the business of helping people make better decisions,” using principles like “Fair decisions are good decisions” to guide the direction of its research. “And,” he added, “we really interrogate the data in search of insight. Once we find an indicator of an opportunity to improve, we dig in.”

Those conversations usually wander into bias, AI, NLP and intelligent tools. At the pinnacles of AI thinking, in the high-end research labs, the question of whether AI/ML/NLP can eliminate bias remains unanswered. The more we learn about intelligent tools, the less likely it seems that we’ll use them to solve the bias question.

Stross said, “We are laser-focused on helping people make better decisions. We mine the data and reevaluate workflows in search of the things that make a real difference.” That was when he told me about the feature the company is building to help users hear and learn how to pronounce someone’s name. “LinkedIn beat us to market with the function, but we think embedding it in the hiring workflow makes all of the difference for creating a sense of belonging.”

Name pronunciation is one of those simple, obvious things that everybody overlooks. Many powerful innovations are like this: small, obvious features that create enormous value.

The idea is that people can record themselves pronouncing their own names or preferred names and have the recording surfaced to the hiring team before a scheduled interview. That way, any time a person’s name comes up, there is a simple way in the ATS to learn the correct pronunciation. The Greenhouse approach brings it one click closer.

As someone whose last name is often mispronounced, I know some of what this means. When people wrinkle their brow and struggle to say my name, I immediately feel different. There’s something uncomfortable about me for them, and it’s my own name. I’m a big, booming, white guy who walks through the world fairly effortlessly. For many, a mispronounced name is a signal of subtle and not-so-subtle discrimination to come. It’s a reinforcement of otherness. It’s a barrier to belonging.

By making sure that every user of the system whom the candidate will come into contact with has one-click access to the pronunciation, Greenhouse removes one more barrier to inclusion while improving the experience for everyone. As Stross said, “We are incrementally improving our tools so that our users make better decisions. Helping to ensure that candidates have equal footing begins with the smallest of things. Little barriers have big consequences.”

Bias interferes with good decision-making. Over-emphatic proclamations that bias has been totally eradicated put you and your company at risk. I prefer data-driven reduction of bias: that way, you keep an eye on the problem and can make sure you are actually solving it.

In my keynote at the virtual HR Tech next month, I’ll be talking about HR’s emerging role as the shepherd of individual and organizational safety, health and development. The organization in which names are consistently pronounced correctly from the beginning is healthier and safer. That makes real development possible.
