Amazon Echo and Google Home A.I. assistant devices are scaring their owners with unprompted statements.
According to the Wall Street Journal, one Echo owner “was sitting on her bed crying one day after having just quit her job, listening to music, when she said she heard a voice tell her, ‘It’s going to be OK.’”
“The words might have been comforting had she not heard them from Alexa — Amazon.com Inc.’s voice assistant, which powers the Echo Dot speaker on her nightstand,” the Journal declared.
The device’s owner reportedly became so scared by the incident that she unplugged the Echo and placed it in a drawer for several days. It is not the only spooky incident to have happened surrounding Silicon Valley’s A.I. assistants.
“They are also sometimes freaking people out, seeming to drop into conversations uninvited, playing music unprompted in the middle of the night, turning on other gadgets at random and acting generally, well, possessed,” the Wall Street Journal explained, adding that a Google Home Mini recently reminded its owner of a “cocaine and reefer” event at 1 p.m.
The reminder reportedly made the device’s owner “afraid,” until she realized the device had misheard a pastor on the television talking about “cocaine and reefer” during a speech about addiction.
In August, it was revealed that Amazon’s virtual assistants are vulnerable to hijacking, even as the company has been trying to expand the use of its devices wherever possible, including in hotel rooms, in home utilities, and for children.
This week, Facebook also announced its own assistant to compete with Amazon and Google.