


AppleInsider - Amazon, Google follow Apple’s lead on voice assistant review policies - xSicKxBot - 08-03-2019

Amazon, Google follow Apple’s lead on voice assistant review policies

<div><p><span class="article-leader">Following Apple’s decision to temporarily halt Siri grading as it evaluates the program’s privacy safeguards, Amazon and Google this week followed suit and updated their respective policies on human reviews of recorded voice assistant audio.</span></p>
<div align="center">
<div class="article-img"><img src="https://www.sickgaming.net/blog/wp-content/uploads/2019/08/amazon-google-follow-apples-lead-on-voice-assistant-review-policies-1.jpg" alt="Amazon Echo"></div>
</div>
<p>Apple on Thursday <a href="https://appleinsider.com/articles/19/08/02/apple-suspends-siri-quality-control-program-following-whistleblower-report">suspended its Siri grading program</a>, which seeks to make the virtual assistant more accurate by having workers review snippets of recorded audio, after a contractor <a href="https://appleinsider.com/articles/19/07/26/siri-query-manual-review-process-detailed-by-whistleblower">raised privacy concerns</a> about the quality control process. </p>
<p>Now, Apple’s competitors in the space, namely Google and Amazon, are making similar moves to address criticism of their own audio review policies. </p>
<p>Shortly after Apple’s announcement, Google in a statement to <em>Ars Technica</em> on Friday said it, too, <a href="https://arstechnica.com/tech-policy/2019/08/apple-and-google-temporarily-stop-listening-to-siri-and-ok-google-queries/">halted a global initiative</a> to review Google Assistant audio. Like Siri grading, Google’s process runs audio clips by human operators to enhance system accuracy. </p>
<p>Unlike in Apple’s case, however, a contractor at one of Google’s international review centers leaked 1,000 recordings to <em>VRT NWS</em>, a news organization in Belgium. In a <a href="https://www.vrt.be/vrtnws/en/2019/07/10/google-employees-are-eavesdropping-even-in-flemish-living-rooms/">subsequent report</a> in July, the publication claimed it was able to identify people from the audio clips, adding that a number of snippets were of “conversations that should never have been recorded and during which the command ‘OK Google’ was clearly not given.”</p>
<p>The <em>VRT</em> leak prompted German authorities to investigate Google’s review program and impose a three-month ban on voice recording transcription.</p>
<p>“Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally,” Google told <em>Ars Technica</em>.</p>
<p>Google did not disclose the global pause until Friday.</p>
<p>Amazon is also taking steps to temper negative press about its privacy practices and on Friday <a href="https://www.bloomberg.com/news/articles/2019-08-02/amazon-gives-option-to-disable-human-review-of-alexa-recordings">rolled out a new Alexa option</a> that allows users to opt out of human reviews of audio recordings, <em>Bloomberg</em> reports. Enabling the feature in the Alexa app excludes recorded audio snippets from analysis. </p>
<p>“We take customer privacy seriously and continuously review our practices and procedures,” an Amazon spokeswoman said. “We’ll also be updating information we provide to customers to make our practices more clear.”</p>
<p>Amazon came under fire in April after a <a href="https://appleinsider.com/articles/19/04/10/thousands-of-amazon-workers-are-listening-to-echo-conversations-report-says">report revealed</a> the company records, transcribes and annotates audio captured by Echo devices in an effort to train its Alexa assistant. </p>
<p>While it may come as a surprise to some, human analysis of voice assistant accuracy is common practice in the industry; it is up to tech companies to anonymize and protect that data to preserve customer privacy. </p>
<p>Apple’s method is outlined in a security white paper (<a href="https://www.apple.com/business/site/docs/iOS_Security_Guide.pdf">PDF link</a>) that notes the company ingests voice recordings, strips them of identifiable information, assigns a random device identifier and saves the data for six months, over which time the system can tap into the information for learning purposes. Following the six-month period, the identifier is erased and the clip is saved “for use by Apple in improving and developing Siri for up to two years.” </p>
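<p>As a rough sketch only, and not Apple’s actual implementation (which is not public), the lifecycle described in the white paper can be modeled as: strip identifying metadata at ingest, attach a random identifier, erase that identifier after roughly six months, and delete the clip once the longer retention window passes. The window lengths, type names and functions below are assumptions made purely for illustration.</p>
<pre><code>import Foundation

// Illustrative sketch of the retention lifecycle described in the white paper.
// Window lengths and names are assumptions, not Apple's implementation.
let identifiedWindow: TimeInterval = 182 * 24 * 3600     // ~six months with a random identifier
let retentionWindow: TimeInterval = 2 * 365 * 24 * 3600  // up to two further years without it

struct VoiceClip {
    let audio: Data
    let ingestedAt: Date
    var deviceIdentifier: UUID?   // random, not tied to the user's Apple ID
}

/// Strip identifying metadata and tag the clip with a random device identifier.
func ingest(_ audio: Data, at now: Date = Date()) -> VoiceClip {
    VoiceClip(audio: audio, ingestedAt: now, deviceIdentifier: UUID())
}

/// Erase the identifier after six months; drop the clip after the full retention window.
func applyRetention(to clip: VoiceClip, now: Date = Date()) -> VoiceClip? {
    var clip = clip
    let age = now.timeIntervalSince(clip.ingestedAt)
    if age > identifiedWindow + retentionWindow { return nil }  // clip deleted outright
    if age > identifiedWindow { clip.deviceIdentifier = nil }   // identifier erased, clip kept for training
    return clip
}</code></pre>
<p>This is only meant to make the timeline in the white paper concrete; the actual storage and review pipeline runs server-side and is not described publicly in that level of detail.</p>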
<p>Apple does not explicitly mention the possibility of manual review by human contractors or employees, nor does it currently offer an option for Siri users to opt out of the program. The company will address the latter issue in a future software update.</p>
</div>