Byteside
AI bias isn’t going away anytime soon

By Chris Button | November 24, 2020

Another day, another example of artificial intelligence judging people unfairly based on their appearance.

This time around, researchers found that AI technology analysing images of US members of Congress made drastically different word associations depending on whether the subject was male or female, according to WIRED.

Men were more commonly associated with traits such as “businessperson” and “official”, whereas women received a higher proportion of appearance-based descriptors such as “hairstyle” and “smile”.

Now bloody AI is telling women to smile? Yikes…

Source: Schwemmer et al.

It’s not just a one-off example of AI favouring one group of people over another, as the recent study — published as Diagnosing Gender Bias in Image Recognition Systems — found consistent examples of bias across Google Cloud Vision, Microsoft Azure Computer Vision, and Amazon Rekognition.
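To make the study's comparison concrete: one way to surface this kind of skew is to tally what share of an API's returned labels fall into appearance-related versus occupation-related categories for each group of portraits. The sketch below is a minimal illustration of that idea only — the label lists, category sets, and function are made up for the example and are not the researchers' actual code or data.

```python
from collections import Counter

# Hypothetical label categories, loosely echoing the terms the study
# reports ("hairstyle", "smile" vs. "businessperson", "official").
APPEARANCE = {"hairstyle", "smile", "beauty"}
OCCUPATION = {"businessperson", "official", "spokesperson"}

def category_share(label_lists, category):
    """Fraction of all returned labels that fall within the given category."""
    counts = Counter(label for labels in label_lists for label in labels)
    total = sum(counts.values())
    return sum(counts[label] for label in category) / total if total else 0.0

# Toy, invented label outputs for two groups of portraits.
men_labels = [["official", "businessperson", "suit"],
              ["spokesperson", "official"]]
women_labels = [["smile", "hairstyle", "official"],
                ["beauty", "smile", "dress"]]

print(category_share(men_labels, APPEARANCE))    # low for men in this toy data
print(category_share(women_labels, APPEARANCE))  # higher for women
```

A gap like the one this toy data produces is the sort of discrepancy the researchers measured at scale across Google Cloud Vision, Microsoft Azure Computer Vision, and Amazon Rekognition.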

This follows a recent Twitter experiment testing how the social media platform automatically crops preview images that don’t fit the 2:1 ratio it natively displays.

In this instance, Twitter user Tony Arcieri used various images with Republican Senator Mitch McConnell alongside former President Barack Obama to see whose face was previewed more often.

And who do you think the Twitter algorithm picked? If you guessed the crusty old white dude, you’re absolutely correct.

Trying a horrible experiment…

Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama? pic.twitter.com/bR1GRyCkia

— Tony “Abolish ICE” Arcieri 🦀 (@bascule) September 19, 2020

There are so many factors at play with AI bias, including how algorithms are programmed, what definitions and parameters are included, and what data is used to ‘teach’ the AI.

As it currently stands, AI technologies clearly need more work to represent women and people of colour equally.

More studies like this that point out discrepancies in how AI interprets data are a good start, and will hopefully lead to women not being defined by their hairdos.


Some Byteside outbound links may include affiliate programs to support Byteside’s operations. Our recommendations and review commentary remain independent of any potential revenue that may come through including such links.

Chris Button

Chris is an award-nominated writer based in Adelaide who specialises in covering video games and technology. He loves Donkey Kong Country, sport, and cats. The Last Jedi is the best one, no questions asked.



© 2022 Byteside Pty Ltd
