To improve Xbox’s voice command features, contractors working for Microsoft have listened to the audio recordings from the gaming console, which were collected without user consent.
The audio recording feature on Xbox was supposed to be triggered by the voice commands “Xbox” or “Hey Cortana”. However, audio was sometimes recorded by mistake and sent for processing, Motherboard reports, citing one Microsoft contractor.
Multiple contractors spoke on condition of anonymity, saying they were asked to sign non-disclosure agreements with Microsoft for this work.
Microsoft reportedly paid contractors $10 an hour. Human review of such recorded conversations and audio clips has been going on for a long time and, in many instances, involved recordings of minors playing on the Xbox console.
A former employee alleged that Microsoft contractors worked on this data from 2014 to 2015, before Microsoft’s voice assistant Cortana was rolled out. At the time, recording was triggered by voice commands that controlled the Xbox’s Kinect device.
Concerned about their privacy, users raised the issue with Microsoft. In a subsequent statement, Microsoft assured users that the system was designed and built with “strong privacy protection” in place and that the company would continue its commitment to privacy.
With the implementation of Cortana in 2016, Microsoft created a database of users’ conversations. This July, Microsoft removed Cortana from the console, though Cortana can still be connected to the console via its Android and iOS apps.
A Microsoft spokesperson said the company has stopped listening to user audio recordings. “We’ve recently updated our privacy statement to add greater clarity that people sometimes review this data as part of the product improvement process.”
Are Conversations With Virtual Assistants Really Private?
The report is the latest revelation in a string of controversies involving tech giants and their increasingly popular voice assistants. Many companies have admitted to listening to users’ conversations to improve their technology.
Earlier this month, Microsoft contractors were found listening to private Skype calls and personal conversations through the app’s translation service, as well as audio recorded by Cortana.
Apple and Google came under scrutiny over similar allegations in July. Reports revealed that Apple’s Siri and Google Assistant were accessing users’ private conversations to process voice data and improve commands.
The Guardian reported that Apple contractors had overheard people having sex, making drug deals, and describing medical symptoms. Earlier this month, both companies suspended their human review programs worldwide.
In July, a Google contractor allegedly leaked thousands of voice recordings to a Belgian news outlet, which was able to identify individuals from the locations and content of the recordings. Google says users can change the settings in their Google account so that no audio is stored.
Finally, in April, Amazon’s Alexa was also accused of recording and listening to voice conversations. The company allegedly hired contractors in Romania to listen to the voice assistant’s recordings; these contractors were reportedly able to view user locations as well. An Amazon spokesperson responded at the time, saying the company was ready to give users an “opt out” option.