Google has been forced to admit its employees are listening to the conversations of its customers through the smart speakers of Google Home.
The revelation came after more than 1000 recordings of private customer conversations with the Google Assistant were leaked to the Belgian news site VRT.
These conversations are used to develop the Google Assistant artificial intelligence system, which is used in its Google Home smart speakers and Android smartphones.
One anonymous Google contractor, speaking to VRT, revealed he transcribes around 1000 clips per week in Dutch and Flemish.
"I've heard people who are addressing their device but also random conversations," said the employee.
In the audio clips provided to the Belgian news service, children's conversations and those between couples can be heard.
In one case, the contractor said he transcribed a recording in which a woman sounded like she was in distress. "I felt that physical violence was involved," he revealed. "It becomes real people you are listening to, not just voices."
According to The Independent, Google said in a statement that a small number of anonymised recordings were transcribed by its experts, and that it was launching an investigation into the leaked Dutch audio data.
"We partner with language experts around the world to improve speech technology by transcribing a small set of queries - this work is critical to developing technology that powers products like the Google Assistant," revealed the tech giant.
"Language experts only review around 0.2 percent of all audio snippets, and these snippets are not associated with user accounts as part of the review process."
In its privacy pages for Google Home, the tech giant says language experts are employed to analyse "snippets" of recordings made by users, which Google claims helps improve its voice recognition technology. "That's meant to make our services faster, smarter, more relevant, and more useful to you," the site reads, adding that no information leaves the device until the wake word is detected - while omitting the fact that the system can detect the wake word mistakenly.