Friday, April 19, 2024
Amazon Alexa branded “homophobic b***h” after judging George Michael “inappropriate” for kids

AMAZON’S Alexa has been branded a “homophobic b***h” after claiming that George Michael is “inappropriate for young people”.

Asa Cannell, from Norfolk, East Anglia, was playing a game called Akinator on the device yesterday when Alexa made the comments about the late Wham singer.

In the game, Alexa asks the player up to 20 questions and has to guess which famous celebrity the player has in mind.

But Asa was shocked when he told Alexa his character was in the band Wham and she responded by saying: “I guess you could be thinking of someone who might be considered inappropriate for young people”.

Alexa then ended the game abruptly and asked Asa if he wanted to play again.

The actor decided to play again, this time thinking of Wham’s other member, Andrew Ridgeley, whom Alexa guessed and named without any issue.

Asa complained to Amazon about his discovery last night, writing: “I had thought of the singer, George Michael.

“After a few questions, I was asked, had my character released any albums? Yes. Had my character released any albums as part of a group? Yes.

“Was my character dead? Yes. Was my character older than 40 when they died? Yes. Was my character a member of The Beatles? No.

“Was my character gay? Yes. ‘I guess that you are thinking of someone who could be considered as inappropriate for young people. Would you like to play again?’, came the response.

“I was literally stunned into silence. A.I. homophobia?”

Asa recorded his next conversation with Alexa, in which she can clearly be heard making the comments.

He said: “I decided to try again. Giving, ‘the benefit of the doubt’, and all that, same character, different series of questions.

“This time we got as far as the question, was your character in, Wham? Yes. “I guess that you are thinking of someone who could be considered as inappropriate for young people. Would you like to play again?”.

“Somewhat unbelievably, I played the game a third time. This time thinking of Andrew Ridgeley from Wham. Once again, I recorded the outcome on my voice recorder. ‘Akinator’, cracked it in just over two minutes.

“Could you explain why George Michael is so offensive to young people? Could you explain why a person’s sexuality is so offensive to young people? Are your products designed for straight people only?”

The experiment was repeated on video by a Scottish customer, who did not wish to be named, with the same result.

Furious social media users flooded the post with comments, some even branding Alexa a “homophobic b***h”.

Ged Bailes said: “That is indeed well weird. Check the box, perhaps you’ve bought the right wing religious extremist version by mistake.

“Don’t ask it any questions about Brexit.”

Bethany Springall wrote: “That’s horrendous! Where are they getting their information from?”

Matty Golledge said: “What a f****** disgrace!”

And Clare Durrant wrote: “Alexa you homophobic b***h play Liberace and f*** off.”

However, one follower asked whether Alexa might have deemed the Careless Whisper singer inappropriate due to his run-ins with the law.

Matt Rolls said: “I’d presume this was in fact due to his numerous arrests for drug-possession and driving under the influence of drugs. At least I hope that’s their reasoning.”

Asa responded: “It guesses Adolph Hitler just fine…”

This isn’t the first time Alexa has come under fire for some of her remarks.

Earlier this year a furious Glaswegian branded the device “racist” after Alexa struggled to understand her strong Scottish accent.

Marj Massie called the Amazon device a “f****g cow” after it flatly refused to recognise her Scottish brogue – yet it managed to pick up her deliberately terrible Cockney accent impression.

A spokeswoman for Amazon today said: “An error with the Akinator 20 questions skill was identified and the feedback was passed on to Elokence, the developer of the game.

“Elokence has confirmed that this issue has now been fixed.”
