While some experts unequivocally see AI as the future, other labor experts have reservations, particularly about human biases (against people of different races, genders, religions, abilities and the like) being replicated in algorithmic software.
But can AI be used, instead, to reduce the effects of human bias? Anirban Chakrabarti, CEO of HireLogic, told HR Dive that his company’s tool is one such example, providing “a smart HR assistant” to take notes while recruiters listen to job candidates during interviews.
The HireLogic AI primarily scans job descriptions for required skills and experience, cross-referencing them against job candidate resumes, generating meaningful interview questions and scanning the audio clip of the conversation for highlights.
He said his software reduces unconscious bias by “extracting meaningful, objective data from interviews,” “helping sort resumes by job fit,” and rating candidates specifically on how well they answered interview questions.
“When used properly to augment human decisions, solutions like HireLogic can help to reduce this unconscious bias that exists in the hiring process, while being careful not to replace human bias with other sources of AI bias,” he told HR Dive via email.
Lessons from the healthcare industry
Christina Silcox, research director for digital health at Duke University’s Margolis Center for Health Policy, published a white paper this year on preventing bias and inequities in AI-enabled technology in healthcare with widely applicable takeaways.
Her team identified four key areas of bias: inequitable framing of challenges, use of unrepresentative data, use of biased training data, and negligent choices regarding data selection, curation, preparation, and model development.
In the healthcare industry, for example, biased AI manifested in a “no show” algorithm that used demographic data to predict which patients might miss their appointments. Health clinics and hospitals would then “double-book certain patients to minimize lost revenue,” Silcox told Pew Trusts.
The algorithm didn’t take into consideration that Black, Indigenous and Latino people disproportionately lack access to reliable transportation, affordable health insurance and paid sick leave, factors that contribute to missed appointments.
Beyond healthcare, human bias in AI has been noted in the creative tech, financial, and law enforcement sectors.
What does this have to do with HR? Failing to account for socioeconomic status, varied racial and ethnic backgrounds, cultural differences and individual lived experience can lead hiring managers to replicate prejudice, just through AI this time.
Chakrabarti also brought up his AI’s anti-bias capabilities regarding compliance in the hiring process. “There is an evolving set of federal and state laws that determine what questions you cannot ask during interviews,” he said, citing salary history bans in California as an example.
The tool can also help compensate for human error where compliance training for hiring managers falls short.
“HireLogic can automatically detect and flag potential bias in compliance questions — so that HR can determine training effectiveness and apply coaching where needed to reduce bias and compliance risk,” Chakrabarti said.
Solutions and safeguards for reducing bias
For HCM software developers, Silcox’s words of caution are worth noting.
“AI developers have the responsibility to create teams with diverse expertise and with a deep understanding of the problem being solved, the data being used, and the differences that can occur across various subgroups,” she told Pew Trusts.
Employers considering such software should also put it to the test before formally adopting it.
“Purchasers of these tools also have an enormous responsibility to test them within their own subpopulations and to demand developers use emerging good machine learning practices — standards and practices that help promote safety and effectiveness — in the creation of those products,” Silcox said.