Google Images ‘Racist Algorithm’ Has A Fix But It’s Not A Great One

Technology is - for the most part - great. It makes a lot of things that used to be quite difficult and tedious much easier and quicker.

However, it can be a bit of a minefield. Even the mighty, all-conquering search giant Google is not immune to the teething problems that come with experimenting with new technology.

The problem that it has run into is a pretty bad one, too.

In 2015, black software developer Jacky Alciné was understandably angry to find that Google Photos had labelled photos of him and his mates as 'gorillas'.

He tweeted Google and the company said it was "appalled and genuinely sorry" and told him that it was "working on long-term fixes".

So what has Google done in the two years since then? Well, it has simply removed the term 'gorilla' from the labels available to the system. Several other primate-related terms have also been blocked to try to stop this situation happening again.

Credit: Google. Google blocked searches for 'gorilla', 'chimp' and 'monkey' within its personal photo organiser

As well as being a monumental egg on Google's collective face, this mix-up has real implications, too.

Having technology that mislabels simple photos of people is an embarrassing problem on its own, but it becomes a dangerous one if you are also trying to develop image recognition software to drive cars (which Google is).

The tech magazine WIRED ran a test of Google Photos' image recognition system, feeding 40,000 images with animals into the system.

While it was able to recognise most of the animals, even identifying individual breeds of dog and some apes such as orangutans, it returned no results at all for the terms 'gorilla', 'chimp', 'chimpanzee' or 'monkey'.

In another test, they fed in 10,000 pictures of people used in facial recognition research.

They found that: "The search term 'African American' turned up only an image of grazing antelope.

"Typing 'black man', 'black woman', or 'black person', caused Google's system to return black-and-white images of people, correctly sorted by gender, but not filtered by race.

"The only search terms with results that appeared to select for people with darker skin tones were 'afro' and 'African', although results were mixed."

Google has confirmed that 'gorilla' has been censored since the incident in 2015 and a spokesman said: "Image labeling technology is still early and unfortunately it's nowhere near perfect."

On this evidence, Black Mirror style driverless cars are off the menu for now - which is probably for the best anyway given how that one ends.

Featured Image Credit: PA

Tom Wood

Tom Wood is a freelance journalist and LADbible contributor. He graduated from University of London with a BA in Philosophy before studying for a Masters in Journalism at the University of Salford. He has previously written for the M.E.N Group as well as working for several top professional sports clubs. Contact him on [email protected]
