Artificial intelligence: some can hardly wait for the future, while others are already demonizing it. Some use Apple's Siri, others Microsoft's Cortana or Google's solution. Amazon's Alexa, too, enjoys great popularity among many fans. It is smart, clever and useful - but data protection does not appear to be Alexa's top priority.
According to a new Bloomberg report, Amazon has a department that listens to Alexa users' private conversations and transcribes them. Alexa reportedly records conversations in the home and sends them to Amazon employees in the USA, Costa Rica, India and Romania for analysis. These employees' task is to improve Alexa: by listening to the conversations and writing them down, they are meant to make the virtual assistant's voice recognition significantly better. In the process, private conversations would inevitably be recorded, according to the news outlet.
The tricky part: affected users know nothing about it. Amazon Alexa's terms of use contain no indication that such recordings are subsequently listened to and transcribed by real employees. They merely state: "For example, we use your commands to Alexa to train our speech recognition and natural language understanding systems" - whether the analysis is carried out by humans or machines, and whether entire conversations are included, is not mentioned.
Recording without activation command
Amazon itself has partially acknowledged this practice in the past. According to the online retailer, only a small fraction of such data is processed further, and employees do not have access to personal customer data; the recordings therefore would not allow individuals or accounts to be identified. Bloomberg, however, has received different information: according to its sources, the data packages contain the first name of the respective user as well as the account and serial number of the individual device. Employees are said to process around 1,000 such recordings every day, and roughly 10 percent of these data packages contain recordings made without the affected users having issued Alexa's activation command. According to Bloomberg's source, the question "Alexa, is anyone else listening?" is particularly popular. A current statement from Amazon is still pending; it remains to be seen whether and how the company will respond to the allegation.
How to deactivate the Alexa development feature
The topic is currently being hotly debated in various forums. Some Alexa fans are not bothered by Bloomberg's latest report; others want to put a stop to the practice. The function that feeds recordings into Alexa's further development can of course be deactivated - we'll show you how. If you want to be on the safe side, you can turn the feature off in the Alexa app for iOS and Android: open the settings via the menu symbol, select "Alexa account" and then "Alexa privacy". There, the option to help improve Alexa can be enabled or disabled as desired. The setting can also be changed via Amazon's official website. To do so, log into your Amazon account, navigate to "My account" and open the "My content and devices" tab. On the subpage that opens, select the "Alexa privacy" section, where the same setting is available.