Good talent may slip under AI’s radar in the hiring process

Tahei Kobayashi dropped out of high school and spent some time being homeless in Tokyo.

His rise has been featured on news websites: today, he is the chief executive officer of Sun* Inc, a company with a market value of a billion dollars.

To many, this is a story of inspirational grit and an affirmation of meritocratic values – a sentiment I find popular in comment threads. Perhaps Kobayashi also got lucky and interviewed with the right people – a case of “real recognise real”, where certificates, grades, school names and prior experience do not really matter.

Kobayashi’s journey, though inspiring, is not unique. Do a quick Google search on “rags to riches” and you will find many stories of young men and women overcoming great personal disadvantage to become successful.

There is much to admire in Kobayashi’s story, but the thing that stood out most for me has less to do with Kobayashi himself. How a company took a chance on a young man without a high school certificate and with an unusual background – he comes from a good family, yet had been homeless – should get more people thinking.

If you apply for a job with a big corporation today, your resume will likely go through an artificial intelligence (AI) screening. AI technology is now embedded in our daily lives in many different forms.

However, in its decision-making, an AI’s output is only as good as the data and parameters its operators feed it.

If a company trains its recruitment algorithm on a set of star performers already in its employ, and these staff happen to share the same characteristics – they attended selective boarding schools, hold overseas degrees, speak perfect English and are steeped in the finer cultural things – then an individual who has spent his or her whole life in Ranau or Kapit is unlikely to get through the AI screening stage.

In other words, an AI can become insular: it learns only to reinforce the preferred traits while penalising outliers.

In 2018, exactly this happened to Amazon, when the company had to scrap an AI recruiting programme that was found to be biased against women.

The AI had been trained on resumes submitted over a period when applicants were overwhelmingly male. It then taught itself to penalise characteristics such as a degree from an all-women’s college.
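The dynamic above can be sketched in a few lines. This is a toy illustration with hypothetical data and a deliberately crude scoring rule, not any real vendor’s system: when every “star performer” in the training set shares the same profile, the model rewards that profile and gives zero credit to strengths it has never seen.

```python
from collections import Counter

# Hypothetical training set: every star performer shares one privileged profile.
star_performers = [
    {"boarding_school", "overseas_degree", "fluent_english"},
    {"boarding_school", "overseas_degree", "fluent_english"},
    {"boarding_school", "overseas_degree", "fluent_english"},
]

def learn_weights(profiles):
    """Weight each trait by how often it appears among the star performers."""
    counts = Counter(trait for profile in profiles for trait in profile)
    return {trait: n / len(profiles) for trait, n in counts.items()}

def score(candidate, weights):
    """Sum the learned weights of the traits the candidate has."""
    return sum(weights.get(trait, 0.0) for trait in candidate)

weights = learn_weights(star_performers)

privileged = {"boarding_school", "overseas_degree", "fluent_english"}
outlier = {"self_taught", "local_school", "founded_startup"}

print(score(privileged, weights))  # 3.0 - matches every learned trait
print(score(outlier, weights))     # 0.0 - strengths the model never saw count for nothing
```

The outlier candidate may be the stronger hire, but nothing in the training data lets the model see that.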

It is unlikely that a person like Kobayashi would pass the screening stage of an AI recruitment programme. Were they to apply, Bill Gates’ and Steve Jobs’ resumes would also be dismissed for not having completed their degrees.

AI technology will only get better over time and will help corporations recruit great candidates, as long as we understand its limitations.

We must remember that AI’s limitations in recruitment are a reflection of its operators’ own limitations. To overcome this, we must continually audit recruitment algorithms for bias and insular tendencies.
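What might such an audit look like in practice? One simple, widely used check is the “four-fifths rule” from US employment-discrimination guidance: if one group of applicants passes the screening at less than 80 per cent of the rate of the best-performing group, the step deserves scrutiny. The numbers below are hypothetical, purely to show the arithmetic:

```python
def selection_rates(outcomes):
    """outcomes maps each group to (passed_screening, total_applicants)."""
    return {g: passed / total for g, (passed, total) in outcomes.items()}

def flag_adverse_impact(outcomes, threshold=0.8):
    """Flag any group whose pass rate falls below `threshold` times the best rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

# Hypothetical screening outcomes by applicant background.
outcomes = {
    "urban_private_school": (45, 100),  # 45% pass the AI screen
    "rural_public_school": (9, 100),    # 9% pass the AI screen
}

print(flag_adverse_impact(outcomes))
# {'urban_private_school': False, 'rural_public_school': True}
```

A flag is not proof of bias, but it tells the operators exactly where to start asking questions.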

We must give that young man or woman from Ranau, Kapit or even Lahad Datu a chance.

This is the personal opinion of the writer and does not necessarily represent the views of Twentytwo13.
