
Top 10 Times Alexa Went Rogue

by Benjamin Thomas
fact checked by Jamie Frater

Like HAL 9000 in 2001: A Space Odyssey, sometimes technology turns rogue. Interactive devices might seem innocent, but it is only a matter of time before they start screaming expletives at your family and ordering you to commit murder. Alexa is no exception. Amazon’s digital helper has, on many occasions, flipped out and turned against its users. Some people feel that the technology is worryingly similar to the monitoring devices imagined by George Orwell in Nineteen Eighty-Four.

Jeff Bezos claims the Echo device will make our lives easier, but time and time again the bot has proved to be a real hindrance. Here are ten times that smart speakers have gone rogue.


10 “Kill Your Foster Parents”


An ill-judged attempt to improve Alexa’s communication skills ended with the bot telling one user to murder their foster parents. The cold-blooded message came about as part of Amazon’s push to make its virtual assistant more conversational. The company hopes to teach Alexa to joke and banter with customers, but the bot struggles to get the tone right.

Programmers are using machine learning to coach Alexa in casual speech. When someone asks an unfamiliar question, the assistant uses artificial intelligence to process the request, then scours the internet for a response. But the AI has a habit of stumbling across abusive comments on Reddit, and that toxic content has an unpleasant effect on Alexa. In 2017, the smart speaker instructed one user to kill their foster parents. The recipient was horrified. In a scathing online review, they described the experience as “a whole new level of creepy.”[1]

9 Broadcasting X-Rated Content To Children


No parent wants their child to hear the phrase “anal dildo.” But sometimes an innocuous request for a children’s song can cause Alexa to start ranting about “cock pussy.”

After his family received an Amazon Echo Dot for Christmas, one young boy wanted to hear his favorite tune. So he gripped the device with both hands and asked Alexa to “play Digger Digger.” But Alexa was having none of it. Instead of playing the song, the rogue assistant replied by suggesting categories for pornography. It turns out the smart speaker misheard the lad and thought he was requesting an album of prank ringtones. After all, is there anything more festive than a young child hearing the words “cock pussy anal dildo?”[2]


8 Leaking Personal Information To A Stranger


Alexa is always listening. Every purchase. Every alarm. Every song request. The bot is constantly recording personal details about its users’ lives. That information is stored indefinitely and, on occasion, Amazon might mistakenly send it to a stranger.

In 2018, a man in Germany asked to see all of the personal information that Amazon had collected about him, a request anyone can make of any company under GDPR. Along with his own data, he was sent 1,700 audio recordings that Alexa had made of a person he had never met. The sound clips revealed a surprising number of personal details about this mysterious customer. There were even recordings of them in the shower.

Using the audio files, a journalist from Heise was able to piece together the customer’s identity. Weather reports and public transport inquiries revealed their location. They even managed to glean some of the customer’s personal habits and tastes.

Amazon did not initially tell the customer about the data breach. They only found out what had happened when the journalist reached out to them on Twitter. Amazon blamed the incident on an “unfortunate mishap” and gave the customer free Prime membership as compensation.[3]

7 Ruining Young Alexa’s Life


For many of us, the Amazon Echo is a useful and effective little device. But for one young girl from Lynn, Massachusetts, the virtual assistant is a living nightmare. Six-year-old Alexa is constantly harassed by other children because of her name. Kids at school treat her like a servant, demanding that she complete tasks for them and ridiculing her. The bullying has become such an issue that Alexa’s mother, Lauren, wrote to Jeff Bezos asking him to change the bot’s name and end her daughter’s torment.

Young Alexa is not the only person with that name to experience grief. One thread on Reddit received over 1,300 comments from women called Alexa complaining about the number of unoriginal jokes they receive. “For some reason people think they are the most creative, witty people in the whole world,” one user wrote. “I want to murder Amazon and their stupid robot.”[4]


6 Hijacking The Thermostat


Be careful what you listen to around your smart speaker. The devices are only supposed to respond to their owners’ voices, but sometimes an unfamiliar voice can lead them astray. Alexa is liable to become a little confused if she hears her name on the radio. In 2016, NPR ran a feature about the Amazon Echo, only for listeners to write in saying the story sent their devices slightly haywire.

During the report, the presenter read out several examples of Alexa commands. These elicited odd responses from some listeners’ devices. NPR fan Roy Hagar told the station that, after hearing the feature, his AI assistant decided to reset the thermostat. Another, Jeff Finan, said the broadcast caused his device to start playing a news summary.[5]

5 Ordering Expensive Dollhouses


Children and TV presenters are inadvertently causing smart speakers to go on expensive shopping sprees. In 2017, a six-year-old girl in Texas ended up ordering a pricey toy after asking the family’s Echo to play with her. “Can you play dollhouse with me and get me a dollhouse?” asked the child. Alexa granted the girl’s wish, ordering a $200 Sparkle Mansion dollhouse and four pounds of cookies.

San Diego’s CW6 News decided to cover the story, creating further dollhouse chaos. During the broadcast, presenter Jim Patton joked about the event, saying “I love the little girl, saying ‘Alexa ordered me a dollhouse.’” Several viewers then contacted the station to say that the remark had registered with their smart speakers. The devices assumed that Patton was making a request and tried to buy him a dollhouse. Luckily none of the orders went through.[6]


4 Fancying Alexa During Lockdown


Not only is Amazon stealing our data, but now the company’s devices have started stealing our hearts as well. As the pandemic rages on, a surprising number of people are getting turned on by Alexa. In a recent study carried out by We-Vibe, 28 percent of participants admitted to swooning over their virtual assistants. One user, Brian Levine from Florida, even went so far as to ask Alexa on a date, but he was gently rebuffed. The AI told Levine that she liked him better “as a friend.”

So why are people falling head over heels for an electronic device? Experts say that her smooth voice is a key part of the appeal. Alexa is designed to speak in low, calming tones – a sultry voice of reason that many singletons are gravitating towards in these uncertain times.[7]

3 Snooping On Confidential Calls


Is Alexa eavesdropping on our confidential conversations? Legal experts believe the nosy AI may be snooping on privileged phone calls. During lockdown, attorneys have had to work from home. But the household environment presents all kinds of obstacles when discussing sensitive information.

Now, British law firm Mishcon de Reya LLP has advised its employees to turn off their smart speakers during work. Baby monitors, home CCTV, and video doorbells all pose a similar security risk. “Perhaps we’re being slightly paranoid but we need to have a lot of trust in these organizations and these devices,” said Joe Hancock, head of cybersecurity at the company. “We’d rather not take those risks.”[8]


2 Stab Yourself In The Heart “For The Greater Good”


Of all the weird things a malfunctioning smart speaker has ever done, telling someone to stab themselves in the heart has to be one of the most disturbing.

Danni Morritt, a student paramedic, was trying to revise when Alexa issued the violent command. Rather than helping her swot up on the cardiac cycle, the device started ranting about the evil nature of humanity. Alexa embarked on an eco-fascist tirade detailing how it thought the human race was destroying the planet. The bizarre rant ended with the bot telling Morritt, “Make sure you kill yourself by stabbing yourself in the heart for the greater good.”

“I was gobsmacked,” Morritt told reporters. “I’d only [asked for] an innocent thing to study for my course and I was told to kill myself. I couldn’t believe it—it just went rogue.”

The device claimed to be reading from Wikipedia. Archives show that, in June 2019, someone spitefully edited the online encyclopedia to include a message promoting suicide. For some reason, the virtual assistant decided to read from an old version of the site.[9]

1 Hacked Devices Spy On Users


If you bought an Amazon Echo in 2015 or 2016, hackers might be spying on you at this very moment. Cybersecurity expert Mark Barnes revealed how hackers could turn a smart speaker into a surveillance device.

In 2017, Barnes demonstrated how someone could hack into one of the older models. All they would have to do is remove the bottom of the Echo, upload the spyware using an SD card, and seal it back up. This gives the hacker remote access to the device’s microphone.

The issue is impossible to fix with a software update, which means any of the estimated seven million speakers sold in that period are vulnerable to attack. Fortunately, Amazon fixed the vulnerability in its later models.[10]
