DJHC Ltd

Is AI the answer to implicit bias in hiring?

As the climate crisis looms, the engineering and construction sector has a vital role to play. That means we need talent – and we need to ensure there’s nothing standing in the way of getting it. Diversity has been at the top of the agenda in engineering for some time, as we try to capitalise on the advantages a diverse workplace brings. However, this ought not to be fuelled by a “box-ticking” approach that undermines the mission to hire the best person for the job.

So how can we eliminate the biases that people hold, whether they’re conscious or unconscious? Are blind interviews or nameless CVs the answer? Or could we use avatars or voice manipulation to hide a candidate’s gender, age, or race? An interesting possibility is the use of AI to eliminate bias – an issue that Harvard Business Review recently discussed in relation to one of the most taboo forms of discrimination: the beauty bias.

What is the beauty bias?

The beauty bias is one of the most ubiquitous forms of discrimination, yet it’s hardly ever discussed – even though “lookism” in the labour market is well documented. One academic review summarised: “Physically attractive individuals are more likely to be interviewed for jobs and hired, they are more likely to advance rapidly in their careers through frequent promotions, and they earn higher wages than unattractive individuals.”

Sounds worrying. The good news is that this bias is relatively easy to identify, which makes interventions easier to develop. The most effective way to identify it is with data and AI analysis – as long as it’s approached responsibly.

How could this data be collected?

To collect these metrics, you can measure attractiveness via a consensus rating of physical appearance. For example, you could ask 10 people to rate the attractiveness of 100 people; although this isn’t objective, you can generally expect some sort of trend, as long as those surveyed are from the same culture. This question of culture is, of course, where certain types of racial discrimination can feed into the beauty bias – a very serious issue in its own right.
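As a purely illustrative sketch – not something the Harvard Business Review piece prescribes – here is how such consensus ratings might be aggregated in Python. The raters, the 1–10 scale and the random data are all assumptions for the sake of the example.

import numpy as np

# Hypothetical example: 10 raters each score the appearance of 100 people
# on a 1-10 scale (rows = raters, columns = people). Real ratings would
# come from a survey, not random numbers.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 11, size=(10, 100))

# The consensus "attractiveness" score is each person's mean rating.
consensus = ratings.mean(axis=0)

# Rough agreement check: correlate each rater with the average of the
# other nine. Low values would suggest the raters share no common
# standard - for example, that they come from very different cultures.
for i in range(ratings.shape[0]):
    others = np.delete(ratings, i, axis=0).mean(axis=0)
    print(f"rater {i}: agreement = {np.corrcoef(ratings[i], others)[0, 1]:.2f}")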

These attractiveness ratings can then be measured against the person’s past successes, promotion or salary data. From this, AI can be used as a diagnostic tool to estimate how strongly someone’s looks predict whether they are deemed better for a job. Research shows that a person’s attractiveness is far more predictive of their career outcomes than it would be in a totally unbiased world. This data can then be used to help strip that unfair advantage out of the hiring process.
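Again purely as an illustration of the diagnostic idea, the sketch below fits a simple logistic regression to simulated promotion data. In a fully unbiased organisation the weight placed on the attractiveness score should sit close to zero, so a large weight flags the scale of the problem. Every variable and number here is an assumption, not a finding.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical, simulated data: a standardised performance score and a
# standardised consensus attractiveness score for 500 employees, with a
# promotion outcome in which looks unfairly play a part.
rng = np.random.default_rng(1)
n = 500
performance = rng.normal(size=n)
attractiveness = rng.normal(size=n)
logit = 1.0 * performance + 0.6 * attractiveness
promoted = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

# Diagnostic model: predict promotion from both factors and inspect the
# weight the model places on looks.
X = np.column_stack([performance, attractiveness])
model = LogisticRegression().fit(X, promoted)
print("performance coefficient:   ", round(model.coef_[0][0], 2))
print("attractiveness coefficient:", round(model.coef_[0][1], 2))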

So why don’t we implement this now?

Employers have certainly tried to remove the beauty bias from their hiring processes by stripping out any appearance data, as we suggested in the introduction. That leaves only the candidate’s CV and past performance – but, as discussed above, both could already have been shaped by their attractiveness in earlier, non-blind processes. So is eliminating this bias ever possible?

The conclusion reached in Harvard Business Review was, in essence, “maybe”. Just as scales keep us accountable when we’re losing weight, responsibly programmed AI could help us keep implicit bias in check. The same approach could be used to tackle a range of biases, using technology to work towards a more inclusive workplace.

