Privacy problems on the web are widespread, and voice apps for Google Assistant and Amazon Alexa rank high on the list. According to researchers, they are often "problematic" and violate baseline requirements. Researchers at Clemson University's School of Computing analysed tens of thousands of Alexa skills and Google Assistant actions to gauge the effectiveness of their data practice disclosures. They characterize the current state of affairs as "worrisome" and claim that Google and Amazon run afoul of their own developer rules.
Google Assistant
Hundreds of millions of people around the world use Google Assistant and Alexa to order products, manage bank accounts, catch up on news, and control smart home devices. Voice apps (called "skills" by Amazon and "actions" by Google) extend the platforms' capabilities, in some cases by tapping into third-party tools. But in spite of app store regulations and legislation that mandates data transparency, developers are inconsistent when it comes to disclosure, the co-authors of the Clemson study found.
To determine which Google Assistant and Alexa app developers' privacy policies were sufficiently "informative" and "meaningful," the co-authors scraped the content of skill and action web listings, which both Google and Amazon make available on the web in addition to the app storefronts for their voice platforms. They then analysed the data practices described in the policies and descriptions. To do so, the researchers developed a keyword-based approach: drawing on Amazon's skill permission list and developer services agreement, they compiled a dictionary of nouns related to data practices.
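A keyword-based approach of this kind can be sketched as a simple dictionary match. This is a minimal illustration, not the study's actual implementation; the keyword set below is a hypothetical excerpt, whereas the paper's dictionary is derived from Amazon's skill permission list and developer services agreement.

```python
import re

# Hypothetical excerpt of a data-practice noun dictionary (the study's real
# dictionary is built from Amazon's permission list and developer agreement).
DATA_PRACTICE_KEYWORDS = {
    "email", "address", "location", "name", "phone",
    "birthday", "password", "gender", "health",
}

def mentions_data_practices(text: str) -> set:
    """Return the data-practice keywords found in a policy or description."""
    tokens = set(re.findall(r"[a-z]+", text.lower()))
    return tokens & DATA_PRACTICE_KEYWORDS

description = "This skill uses your location and email to deliver alerts."
print(sorted(mentions_data_practices(description)))  # ['email', 'location']
```

Running the matcher over both a listing's description and its linked policy makes it possible to flag cases where the two disagree.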
Data:
Across a total of 64,720 unique Alexa skills and 2,201 Google Assistant actions (every skill and action scrapeable via the study’s approach), the researchers sought to identify three types of problematic policies:
- Those that don’t outline data practices.
- Those with incomplete policies (i.e., apps that mention data collection in their descriptions but whose policies don’t elaborate).
- Those with missing policies.
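The three categories above can be expressed as a small decision procedure. This is a sketch under assumed inputs (a policy link, policy text, and listing description, plus a keyword set like the one the researchers compiled), not the study's code.

```python
# Minimal sketch of the study's three-way problem classification.
# The parameter names and keyword matching are illustrative assumptions.
def classify(policy_url, policy_text, description, keywords):
    """Label a voice app listing by the study's three problem categories."""
    collects = any(k in description.lower() for k in keywords)
    if policy_url is None:
        # No policy link at all.
        return "missing policy"
    policy_mentions = any(k in policy_text.lower() for k in keywords)
    if collects and not policy_mentions:
        # Description mentions data collection, but the policy doesn't elaborate.
        return "incomplete policy"
    if not policy_mentions:
        return "does not outline data practices"
    return "ok"

keywords = {"location", "email"}
print(classify(None, "", "Uses your location.", keywords))  # missing policy
```

Broken policy links (counted separately in the study) would surface as fetch failures before this classification step.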
The researchers report that 46,768 (72%) of the Alexa skills and 234 (11%) of the Google Assistant actions don't include links to policies and that 1,755 skills and 80 actions have broken policy links. (Nearly 700 links lead to unrelated webpages with advertisements, and 17 lead to Google Docs documents that aren't publicly viewable.) The disparity is partially attributable to Amazon's lenient policy, which, unlike Google's, doesn't require developers to provide a policy if their skills don't collect personal information. But the researchers point out that skills that collect information often bypass the requirement by choosing not to declare it during Amazon's automated certification process.
A substantial portion of skills and actions share a privacy policy link (10,124 skills and 239 actions), with 3,205 skills sharing the top three duplicate links. Publishers with multiple voice apps account for this, but the practice becomes problematic if one of the links breaks. The researchers found 217 skills using the same broken link, as well as actions linking to a generic policy that lists company names and addresses but not action names, which Google requires.
Damningly, the researchers accuse Google and Amazon of violating their own requirements regarding app policies. One official weather Alexa skill asks for users' locations but doesn't provide a privacy policy, while 101 Google-developed actions lack links to privacy policies. Moreover, nine Google-developed actions point to two different general privacy policies, disregarding Google's requirement that Google Assistant actions have app-specific policies.
Amazon Spokesperson:
When reached for comment, an Amazon spokesperson provided this statement via email to VentureBeat: "We require developers of skills that collect personal information to provide a privacy policy, which we display on the skill's detail page, and to collect and use that information in compliance with their privacy policy and applicable law. We are closely reviewing the paper, and we will continue to engage with the authors to understand more about their work. We appreciate the work of independent researchers who help bring potential issues to our attention."
Google Spokesperson:
A Google spokesperson denied that Google's actions fail to abide by its policies and said third-party actions with broken policies have been removed as the company "continually" enhances its processes and technologies. "We've been in touch with a researcher from Clemson University and appreciate their commitment to protecting consumers. All actions are required to follow our developer policies, and we enforce against any action that violates these policies."
Privacy policy content and readability:
In their survey of voice app privacy policy content, the researchers found that the bulk didn't clearly define what data collection the apps were capable of. Only 3,233 Alexa skills and 1,038 Google Assistant actions explicitly mention skill or action names, respectively. And some privacy policies for kids' skills mention that the skills could collect personal information. In fact, 137 skills in Alexa's kids category disclose that data collection could occur but provide only a general policy, running afoul of Amazon's Alexa privacy requirements for kids' skills.
More troubling still, the researchers identified 50 Alexa skills that don't inform users of what happens to collected information such as email addresses, account passwords, names, birthdays, locations, phone numbers, health data, and gender, or who that information is shared with. Other skills potentially violate regulations, including the Children's Online Privacy Protection Act (COPPA), the Health Insurance Portability and Accountability Act (HIPAA), and the California Online Privacy Protection Act (CalOPPA), by collecting personal information without providing a policy.
Beyond the absence of policies, the researchers take issue with the linked-to policies' lengths and formats. More than half (58%) of skill and action policies are longer than 1,500 words, and none are available through Alexa or Google Assistant themselves; instead, they must be viewed through a store webpage or a smartphone companion app.
Privacy Policies
"Amazon Alexa and Google Assistant not explicitly requiring app-specific privacy policies results in developers providing the same document explaining the data practices of all their services. This leads to certain uncertainties and confusion among end users. Available documents do not give end users a proper understanding of the capabilities of the skill," the co-authors wrote. "In some cases, even if the developer writes the privacy policy with proper intention and care, there can be some discrepancies between the policy and the actual code. Updates made to the skill might not be reflected in the privacy policy."
Solution:
As a solution, the researchers propose a built-in intent that takes the interaction model of a voice app, scans it for data collection capabilities, and generates a response notifying users that the skill has those specific capabilities. The intent could be invoked when the app is first enabled, they say, so the brief privacy notice could be read aloud to users. The intent could also direct users to the detailed policy provided by the developers.
This would give users a better understanding of what the skill they just enabled is capable of collecting and using. Users could also invoke the intent later to hear a brief version of the privacy policy. "As our future work, we also plan to extend this approach to help developers automatically generate privacy policies for their voice apps," the co-authors concluded.
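The proposed intent could work roughly as follows: scan the voice app's interaction model for slot types that imply personal-data collection and assemble a short spoken notice. This is a hypothetical sketch, not the researchers' implementation; the model structure and slot-type names loosely follow Alexa's interaction model JSON, and the slot-to-phrase mapping is an assumption.

```python
# Sketch of a built-in privacy intent: scan an interaction model for
# slot types that imply data collection and build a spoken notice.
# The slot-type-to-phrase mapping below is an illustrative assumption.
SENSITIVE_SLOT_TYPES = {
    "AMAZON.US_FIRST_NAME": "your name",
    "AMAZON.PhoneNumber": "your phone number",
    "AMAZON.PostalAddress": "your address",
}

def build_privacy_notice(interaction_model: dict) -> str:
    """Produce a brief spoken privacy notice from an interaction model."""
    collected = []
    for intent in interaction_model.get("intents", []):
        for slot in intent.get("slots", []):
            label = SENSITIVE_SLOT_TYPES.get(slot.get("type"))
            if label and label not in collected:
                collected.append(label)
    if not collected:
        return "This skill does not appear to collect personal information."
    return ("This skill can collect " + ", ".join(collected) +
            ". See the developer's privacy policy for details.")

model = {"intents": [{"name": "OrderIntent",
                      "slots": [{"name": "phone", "type": "AMAZON.PhoneNumber"}]}]}
print(build_privacy_notice(model))
```

Because the notice is generated from the interaction model itself, it would stay in sync with the skill's actual capabilities even when a separately written policy document lags behind code changes.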