According to KIRO-TV, a Seattle news station, a family of four in Portland, Oregon, said that their Alexa, Amazon's voice-controlled smart speaker, recorded a private conversation and sent it to someone in their contacts without permission.
When contacted by the family, Amazon said it takes privacy "very seriously", but downplayed the incident as an "extremely rare occurrence".
The speaker heard the phrase "send message" during the conversation, at which point the device asked, "To whom?" Alexa then asked out loud, "[contact name], right?" and interpreted further background conversation as "right".
Amazon said it is evaluating options to make such a case even less likely.
Although Amazon maintains this was a malfunction rather than proof Alexa is always listening, the company has filed patent applications in the past for functionalities that involve always listening, such as an algorithm that would analyse when people say they "love" or "bought" something.
However, Alexa's growing integration into everyday life has been shaken by the Portland family's claim that a private conversation was recorded by the diminutive device.
According to US broadcaster KIRO 7, every room in the family's home was wired with Amazon Echo devices to control the heating, lights and security: a showcase of the connected home.
The woman said Alexa isn't only listening to everything said around it but storing it, and noted that the device gave no indication it was recording their conversation.
Amazon confirmed the woman's conversation had been inadvertently recorded and sent, blaming an "unlikely" string of events for the error. Initially, she suspected someone had hacked her system; an Amazon engineer suggested she unplug her devices before digging deeper, though he did not explain why the incident had happened.
The family is seeking a full refund, but Amazon has reportedly only offered to "de-provision" their Alexa communications. If you want to make sure this doesn't happen to you, avoid sending messages through Alexa at all.
Separately, Amazon said it was changing the voice assistant's response to laughter requests: rather than simply laughing, Alexa would first say, "Sure, I can laugh", and only then laugh. In the Portland case, Alexa mistakenly heard the request to "send message" amid the couple's conversation.