An experimental AI algorithm reveals gender bias in recruitment processes

New research from the University of Melbourne has revealed that hiring algorithms can develop a gender bias against women.

The researchers carried out a study in which recruiters were given real-life resumés for jobs at UniBank. The roles were data analyst, finance officer, and recruitment officer, positions that are typically male-dominated, gender-balanced, and female-dominated, respectively.

Half of the recruiters received the resumés with the candidates' genders apparent; the other half received the same resumés with traditionally female and male names interchanged. They were then asked to pick the top three resumés for each role.

Reviewing these choices, the researchers found that recruiters tended to favor male candidates over female candidates with identical qualifications, and were more likely to give men a higher rank.

Using these results, the researchers then developed a hiring algorithm that ranked candidates based on the recruiters' preferences, and found that it reproduced their biases: an algorithm trained to rank candidates on such data can learn that having a male name helps to get the job.
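
To see how this can happen, here is a minimal sketch, using synthetic data and hypothetical features (this is not the study's actual model): a simple classifier trained on short-listing decisions that carry a hidden bonus for male names ends up assigning a positive weight to the name feature.

```python
# Minimal sketch (synthetic data, hypothetical features; not the study's model)
# of a ranker trained on biased short-listing decisions learning to reward a
# male name.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Qualifications are generated independently of gender.
experience = rng.normal(5, 2, n)        # years of experience
skills = rng.normal(0, 1, n)            # skill-test score
is_male_name = rng.integers(0, 2, n)    # 1 if the resumé carries a male name

# Biased labels: recruiters short-list on merit, plus a bonus for male names.
merit = 0.8 * experience + 1.0 * skills
shortlisted = (merit + 1.5 * is_male_name + rng.normal(0, 1, n)) > 5.5

X = np.column_stack([experience, skills, is_male_name])
model = LogisticRegression().fit(X, shortlisted)

# The trained model assigns a clearly positive weight to the name feature:
# the recruiters' bias is now baked into the "objective" ranker.
for name, coef in zip(["experience", "skills", "is_male_name"], model.coef_[0]):
    print(f"{name:>13}: {coef:+.2f}")
```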

Moreover, the AI still tended to favor male candidates even when names were removed, because it had been trained on the panel's biased preferences, which leak through other features correlated with gender.
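
A sketch of that failure mode, again with synthetic data and a hypothetical proxy feature: once the labels are biased, any remaining feature that is correlated with gender lets the model keep scoring men higher, even though the name column is gone.

```python
# Sketch (synthetic data, hypothetical features) of why removing names is not
# enough: a proxy feature correlated with gender carries the bias through.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000

is_male = rng.integers(0, 2, n)
experience = rng.normal(5, 2, n)
# Hypothetical proxy: e.g., a hobby or club listed far more often by men.
proxy_hobby = (rng.random(n) < np.where(is_male == 1, 0.7, 0.1)).astype(float)

# Biased labels, as before: a hidden bonus for being male.
shortlisted = (0.8 * experience + 1.5 * is_male + rng.normal(0, 1, n)) > 5.0

# Train WITHOUT any name or gender column -- only "neutral" features remain.
X = np.column_stack([experience, proxy_hobby])
model = LogisticRegression().fit(X, shortlisted)

scores = model.predict_proba(X)[:, 1]
print("mean score, male resumés:  ", round(float(scores[is_male == 1].mean()), 3))
print("mean score, female resumés:", round(float(scores[is_male == 0].mean()), 3))
# The gap persists: the model routes the bias through the proxy feature.
```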

Although the study relied on only a small sample of data, these kinds of gender bias are quite common at large companies. Amazon, for instance, recently had to scrap a hiring-algorithm tool that was discriminating against female candidates, because the AI had been trained on men's resumés.

Moreover, if more advanced AIs operate inside a 'black box', without human oversight or transparency, any bias could be amplified even further.

The study showed, however, that these risks can be reduced by making AI algorithms more transparent. The underlying problem of human biases still needs to be dealt with before we start training machines on them.
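
A minimal sketch of one form such transparency can take, assuming the protected attribute is kept out of the model's inputs but retained for auditing: comparing the model's selection rates across groups. All numbers below are illustrative.

```python
# Sketch of a simple fairness audit: compare the model's selection rates
# across gender groups. The decisions and groups below are illustrative only.
import numpy as np

def selection_rates(selected: np.ndarray, group: np.ndarray) -> dict:
    """Fraction of candidates selected within each group."""
    return {str(g): float(selected[group == g].mean()) for g in np.unique(group)}

# Hypothetical audit inputs: the model's yes/no decisions, plus the protected
# attribute (excluded from the model's features, retained for the audit).
selected = np.array([1, 1, 1, 1, 0, 1, 1, 0, 0, 0])
gender = np.array(["M", "F", "M", "M", "F", "M", "F", "F", "M", "F"])

rates = selection_rates(selected, gender)
print(rates)  # {'F': 0.4, 'M': 0.8}
ratio = min(rates.values()) / max(rates.values())
print(f"selection-rate ratio: {ratio:.2f}")  # below 0.8 is a common red flag
```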
