Hot on the heels of AI helping recruiters and hiring managers automate their workloads to keep up with crushing demands, news broke that AI proved to have an insidious side. It was biased. Here’s why that was anything but the end of recruiting AI as we know it.
First things first: what’s AI? Artificial intelligence simulates and automates human processes with computers. When it comes to employment, at its simplest level, AI can recommend candidates for a job and schedule screening calls.
More advanced uses include facial recognition to score video interviews or chatbots to answer questions that support the candidate experience.
In other words, AI is a helper. Your assistant. Which makes it an easy target for criticism. It can’t talk back. But here’s the thing about AI: it’s only as biased as the human processes it’s simulating.
That’s right. AI is only as biased as you.
Recruiters and hiring managers we talk to realize they are, in fact, biased. Not because they love the status quo, but, like the rest of us, they have a lot on their plates and rely on intuition to help them make faster decisions.
What’s intuition based on? Experience. If you’ve seen success with a certain type of candidate, your instinct grows around the sense that there’s a “type” that’s right for the job or the company.
Even if that “type” centers on a proven track record of results or referrals from elite leaders in the field, you need to stop and consider whether those achievements are a matter of hard work and talent — or privilege.
What job candidates haven’t been able to accomplish isn’t necessarily a reflection of what they’re capable of; it could be the result of never having the access or opportunity. Systemic bias rears its head in sneaky ways. It begets unconscious bias and perpetuates a cycle that’s hard to escape.
In other words, bias is complicated.
Yet, we often approach bias as a simple, annoying behavior we have to stop, like clicking a pen or tapping your foot. That’s why the problem hasn’t gone anywhere.
Confronting bias is hard work. Exorcising it is an out-of-body experience. You need to step into bias. Take the time to feel its effects. Empathize with its victims long enough that you overwrite biased instincts with awareness. That’s not happening overnight.
We don’t think diversity and inclusion have climbed recruiting priority lists because AI increases bias and has become the singular foe we all need to slay.
Rather: done well, AI strips us of our excuses for not doing better.
It’s time to do better. Here’s how.
The great news is that AI can be designed and built not to mimic human bias. But we have to be aware of our own prejudices, and of where an algorithm can either perpetuate or prevent bias in the hiring process.
The unconscious bias is strong with this one. Sourcing is the art of finding qualified candidates for a job. It shapes the talent pool. We tend to look for bias further down the recruiting funnel, but it often starts here.
Automated decisions at the sourcing stage can cut applicants before they even know there’s a job, period. Consider how the job description is written, what adjectives are used, what channels you’re using to promote the job, how job alerts function on those channels, what criteria you’re using to screen, and what questions you’re asking on the application.
Understanding where bias arises in these variables can help you use AI to optimize the complete sourcing experience to attract more diverse (in every sense) candidates to your job and convert more qualified applications.
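One of those variables, the adjectives in the job description, can even be audited automatically. Here’s a minimal sketch of the idea; the word lists are invented stand-ins, not a vetted lexicon (real tools in this space use research-backed word lists):

```python
# Minimal sketch: flag gender-coded adjectives in a job posting.
# The word sets below are illustrative placeholders, not a real lexicon.
import re

MASCULINE_CODED = {"dominant", "competitive", "rockstar", "ninja", "aggressive"}
FEMININE_CODED = {"supportive", "collaborative", "nurturing", "interpersonal"}

def audit_posting(text: str) -> dict:
    """Return the gender-coded words found in a job description."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

posting = "We want a competitive rockstar with strong collaborative instincts."
print(audit_posting(posting))
# {'masculine': ['competitive', 'rockstar'], 'feminine': ['collaborative']}
```

Even a toy check like this surfaces wording that skews who applies — which is exactly the kind of tedious, repeatable review AI is good at.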
Furthermore, AI makes it possible to review every application in your pipeline, instead of the sample a time-strapped recruiter can squeeze in.
Now, we’re ready to narrow the field. Some companies harness AI and machine learning to predict how likely a candidate is to be successful at the prospective job. These algorithms often use historical data — either from the company itself or industry aggregates — to project who’ll be a good hire. The rest are cut loose.
But what if the historical data is incomplete — or flawed?
Back to our earlier example. Let’s say your historical data focuses on candidates with proven track records of results, 10+ years of experience and references from senior management. There’s no telling if those achievements were from hard work, talent or privilege.
What is telling: for some fields, only certain types of candidates could actually meet those criteria because of systemic bias that has limited access and opportunity for others.
If we’re looking at technology, for instance, where men still hold more than 80% of the jobs, historical data would continue to paint a picture of the same guy who’s worked that job for decades, and no one else would ever elbow their way in.
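The mechanism is easy to see in miniature. In this toy sketch (with invented data), a scorer that rates candidates by how much they resemble past “successful” hires simply replays the historical skew:

```python
# Toy illustration: a model that scores candidates by similarity to past
# hires inherits whatever imbalance the historical data contains.
from collections import Counter

# Invented historical data: past hires tagged by a (fictional) profile trait.
past_hires = ["profile_a"] * 80 + ["profile_b"] * 20  # an 80/20 split

def similarity_score(candidate_profile: str) -> float:
    """Score = fraction of past hires who share the candidate's profile."""
    counts = Counter(past_hires)
    return counts[candidate_profile] / len(past_hires)

print(similarity_score("profile_a"))  # 0.8 — favored by history
print(similarity_score("profile_b"))  # 0.2 — penalized by history
```

Nothing in that code looks malicious; the bias lives entirely in the training data. That’s why auditing the inputs matters as much as auditing the algorithm.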
Instead of leaning on historical data about past hires, put AI to work getting to know your actual candidates and their capabilities. AI can help recruiters and employers evaluate interview responses with video analysis. You can also use it on work samples or skills assessments to predict performance, productivity or how much they want the job.
Chatbots during the evaluation phase can further limit bias by taking questions and answers at face value — and upholding the truth that “there are no stupid questions.”
Remember, AI is a helper. Your assistant. It can give you data, analysis and projections about whom to screen, interview and hire. But don’t let the robot take your job. Only you and your team can connect with the human being behind the application — and know this is the one.
When you do, your AI assistant can spring back into action to help you gauge how likely a particular candidate is to accept the job, and to configure an offer that’s competitive, rewards their experience and doesn’t widen a salary gap to save a few bucks.
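That offer check can be as simple as a guardrail against the role’s salary band. A hypothetical sketch (the band figures and rule names are invented for illustration):

```python
# Sketch: sanity-check a proposed offer against the role's salary band
# before it goes out. Thresholds and labels are invented placeholders.
def check_offer(proposed: int, band_min: int, band_median: int) -> str:
    """Flag offers that undercut the band or sit below its median."""
    if proposed < band_min:
        return "below band: revise"
    if proposed < band_median:
        return "below median: justify or raise"
    return "ok"

print(check_offer(proposed=92_000, band_min=95_000, band_median=110_000))
# below band: revise
```

A rule this small won’t set compensation strategy, but it does make a lowball offer an explicit decision instead of an accident.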
You can also use a chatbot to remove social stigma and stress from salary negotiations and open the floor to questions during a new employee’s ramp-up. Let AI automate your onboarding process, measure engagement and iterate on delivering better onboarding experiences every time a new employee starts.
You’ll see your employee engagement flourish and retention rate grow with an experience that just keeps getting better.
What if you were as biased as your AI?
Dare yourself to look at the facts. Open the door. Let people in. Take the time to connect, human-to-human. Learn about your candidates’ passions, challenges, triumphs and fears. It’s hard to hold onto bias when you’re looking at living, breathing proof this person is awesome. And who would have thought it would take a robot to bring you two together?