Written by Lune Loh

Software engineers at Amazon had been working since 2014 on an AI to automate the painstaking process of job recruitment. The company trained the system on its own data, using resumes collected over the previous decade, which came predominantly from male applicants. Only in 2015 did the team realise that the algorithm was biased towards the resumes of male applicants.

According to Reuters, the Edinburgh-based team developed 500 algorithmic models to sift through 50,000 key terms found in the resumes. From this data, the system taught itself that male candidates were preferable, favouring verbs more common on men's resumes, such as “executed”, and downgrading resumes that included words like “women’s”.
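To see how such a bias can emerge from skewed training data, here is a minimal, purely illustrative sketch. It is not Amazon's actual system; the resumes, labels, and the simple log-odds term weighting are all invented for this example. A model scoring terms by how often they co-occur with past hiring outcomes will penalise words like “women’s” if those words mostly appear in historically rejected resumes:

```python
from collections import Counter
from math import log

# Hypothetical toy corpus: labels mimic historical hiring outcomes
# in a male-dominated dataset. This is NOT real Amazon data.
hired = [
    "executed project roadmap",
    "executed migration plan",
    "led engineering team",
]
rejected = [
    "captain of women's chess club",
    "women's coding society member",
    "organised community events",
]

def term_weights(pos_docs, neg_docs, smoothing=1.0):
    """Log-odds weight per term: positive means the term is
    associated with the 'hired' class in the training data."""
    pos = Counter(w for d in pos_docs for w in d.split())
    neg = Counter(w for d in neg_docs for w in d.split())
    vocab = set(pos) | set(neg)
    pos_total = sum(pos.values()) + smoothing * len(vocab)
    neg_total = sum(neg.values()) + smoothing * len(vocab)
    return {
        t: log((pos[t] + smoothing) / pos_total)
           - log((neg[t] + smoothing) / neg_total)
        for t in vocab
    }

weights = term_weights(hired, rejected)
# Because the training data is skewed, the model "learns" that
# "executed" predicts hiring and "women's" predicts rejection.
print(weights["executed"] > 0)   # term favoured
print(weights["women's"] < 0)    # term penalised
```

The point of the sketch is that no rule ever mentions gender: the penalty falls out of correlations in the historical data, which is exactly the failure mode reported at Amazon.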

The system also recommended unqualified candidates. By early 2017, Amazon executives had lost faith in the project and disbanded the team. The failed AI has since been reduced to a simpler tool, used to clear duplicate candidate profiles from Amazon’s database.

Lune is a core member of /S@BER (/Stop @ Bad End Rhymes), a Singaporean writing collective, and is currently an Undergraduate at the National University of Singapore. Her works have been published in Cha: An Asian Literary Journal, Cordite Poetry Review, 聲韻詩刊 Voice & Verse Poetry Magazine, Math Paper Press' SingPoWriMo 2017 & SingPoWriMo 2018 and Squircle Line Press' Anima Methodi anthology. She has also been featured at Singaporean LGBTQ+ pride events such as Contradiction XIII and TransIt 2. Find her waxing at lune.city.
