Google Images ‘Racist Algorithm’ Has A Fix But It’s Not A Great One

After suffering embarrassment when image recognition software mistook humans for gorillas, Google Images has removed some search terms.

Tom Wood

Technology is - for the most part - great. It makes a lot of things that used to be quite difficult and tedious much easier and quicker.

However, it can be a bit of a minefield. Even the mighty, all-conquering search God Google is not immune to the teething problems that come with experimenting with new technology.

The problem that it has run into is a pretty bad one, too.

In 2015, black software developer Jacky Alciné was understandably furious to find that Google Photos had labelled pictures of him and his mates as 'gorillas'.

He tweeted Google and the company said it was "appalled and genuinely sorry" and told him that it was "working on long-term fixes".

So what has Google done in the two years since then? Well, it has simply removed the term 'gorilla' from the search terms available to the system. The names of several other primates, monkeys included, have also been erased to try to stop the situation happening again.

Google has blocked searches for 'gorilla'. Credit: Google

As well as leaving a monumental amount of egg on Google's collective face, this mix-up has real implications, too.

Having technology that mislabels simple images of humans is a pretty stupid problem to have, but it's a dangerously stupid one if you're also trying to develop image recognition software to drive cars (which Google is).

The tech magazine WIRED ran a test of Google Photos' image recognition system, feeding 40,000 images containing animals into the service.

Whilst it was able to recognise most of the animals, apparently even spotting individual breeds of dog and identifying some apes such as orangutans, no results were returned for the terms 'gorilla', 'chimp', 'chimpanzee' or 'monkey'.

In another test, WIRED fed in 10,000 pictures of people used in facial recognition research.


They found that: "The search term 'African American' turned up only an image of grazing antelope.

"Typing 'black man', 'black woman', or 'black person', caused Google's system to return black-and-white images of people, correctly sorted by gender, but not filtered by race.

"The only search terms with results that appeared to select for people with darker skin tones were 'afro' and 'African', although results were mixed."

Google has confirmed that 'gorilla' has been censored since the incident in 2015 and a spokesman said: "Image labeling technology is still early and unfortunately it's nowhere near perfect."

On this evidence, Black Mirror-style driverless cars are off the menu for now - which is probably for the best anyway, given how that one ends.

Featured Image Credit: PA

Topics: Google, UK News, Interesting, US News, Technology