Teenagers and Mobile App Privacy

According to a new study from the Pew Internet Project, more than half of all U.S. teenagers are concerned about privacy when using mobile devices. The study, done in conjunction with Harvard University’s Berkman Center for Internet & Society, surveyed about 800 teens aged 12 to 17. Key findings in the report include:

  • 58% of all teens have downloaded apps to their cell phone or tablet computer.
  • 51% of teens who use apps have avoided certain apps due to privacy concerns.
  • 26% of teen app users have uninstalled an app because they learned it was collecting personal information that they did not wish to share.
  • 46% of teen app users have turned off location tracking features on their cell phone or in an app because they were worried about the privacy of their information.

As the Wall Street Journal remarked, “teenagers aren’t exactly the reckless app devourers you think they are.” Indeed, the Family Online Safety Institute (FOSI) recently discussed this topic during its briefing on Capitol Hill, supported by Congressman Honda (D-Cal.). Panelists Tim Sparapani of the App Developers Alliance, Emma Llanso of the Center for Democracy & Technology, Carl Szabo of NetChoice and Heather Federman of the Future of Privacy Forum responded to questions from moderator Jennifer Hanley of FOSI on the implications of regulatory efforts geared toward minors. The panel discussed the NTIA’s multistakeholder process on mobile app transparency, U.S. state actions – especially those in California and Maryland – the impact of international proposals, and current industry efforts around promoting privacy and parental controls.

Both the FOSI panel and the Pew study found that the “privacy concerns” of teens are different from those of adults. As Amanda Lenhart, the Pew study’s lead author, stated in a CBS News interview: “Teens are more concerned about privacy from their parents, their teachers, their schools.” In other words, teens care more about whether an app is “creepy” or whether they have “social privacy” than about advertising or governmental surveillance. Perhaps when it comes to teenagers and online privacy, then, context is key.

 


FTC Provides Limited “Safe Harbor” for Users of a “Do Not Track for Kids” Flag

The new Children’s Online Privacy Protection Act (COPPA) rule that went into effect earlier this month restricts almost all forms of tracking across child-directed sites other than for a set of limited “internal operations purposes.”  Child-directed sites are now strictly liable for any third party tracking on their sites that does not meet COPPA’s limited exceptions, unless they obtain verified parental consent.

Third party code providers, such as analytics companies, ad networks, or social plug-in providers, can also be liable under the new COPPA rule if they have “actual knowledge” they are dealing with children – that is, if the first party site has effectively communicated its online status to the third party or if a “representative of the online service recognizes the child directed nature of the site.”  Yet for many third party code providers, who distribute their code freely to millions of web developers, there is no way to assess whether they are being used by services directed at children.

Earlier this month, the Future of Privacy Forum (FPF) announced its support for a model proposed by FTC Chief Technologist Steve Bellovin calling for a special “flag” to be passed between companies that would indicate the child directed status of a site.  FPF has been working with a number of stakeholders to refine a technical proposal that could help standardize this type of communication, effectively creating a limited “Do Not Track for Kids” signal.  We have urged the FTC to provide a “safe harbor” for users of this flag in order to provide more certainty in this area and to help ensure compliance from web publishers and third parties.

Last week, the FTC released updated FAQs to help businesses comply with the COPPA rule.  These FAQs include a provision recognizing the COPPA flag as a viable tool for compliance; the FAQ sets forth a technical system for a site to affirmatively certify whether it is “child-directed” or “not child-directed.”  According to the FAQs, companies may rely on a signal that a site is “not child-directed,” but “only if first parties affirmatively signal that their sites or services are ‘not child-directed.’”  Companies that wish to limit their liability cannot set this option as a default for their clients. The FTC is requiring a “forced choice” or a “double flag” process, rather than the single flag that Bellovin proposed and that FPF championed.
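As a purely illustrative sketch (the status values and function names below are hypothetical, not drawn from any FTC specification), the “forced choice” logic for a third party deciding whether it may rely on a first-party signal might look like this:

```python
from enum import Enum

class SiteStatus(Enum):
    """Hypothetical three-state signal a first party could send."""
    CHILD_DIRECTED = "child-directed"
    NOT_CHILD_DIRECTED = "not-child-directed"
    UNSET = "unset"  # the first party never made an affirmative choice

def may_rely_on_signal(signal: SiteStatus) -> bool:
    """Under the FAQs' reading, a third party may rely on the signal only
    when the first party affirmatively declares the site 'not child-directed'.
    A default or unset value provides no protection from liability."""
    return signal is SiteStatus.NOT_CHILD_DIRECTED
```

Note that `UNSET` behaves exactly like `CHILD_DIRECTED` here: because the option cannot be set on a client’s behalf as a default, silence earns no protection.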

We are pleased that the FTC recognized the COPPA flag as an effective way to both protect children and ensure that companies meet their obligations.  Technology can offer a meaningful, low-cost solution that can be widely implemented across industry to encourage compliance.

The new FAQs describe stringent requirements that must be met for a COPPA signal that companies “may ordinarily rely on.”  Our view is that this FTC language creates a safe harbor of sorts, providing protection for companies worried that they will be arbitrarily imputed actual knowledge.

While the FTC’s version of the flag will work for some companies, it will not be practical for many others.  And even for those for whom it will work, it will likely be feasible for their new clients only, because retroactively forcing many thousands of current clients to make a forced choice or be terminated is not realistic.

A number of leading companies, including Facebook, AdMob, Twitter, The Rubicon Project, and Yahoo!, began to roll out a single flag option to their clients even before the FTC released its new FAQs.  We believe this single “Do Not Track for Kids” option still has value even though it may not meet the FTC standard for a safe harbor.  The FTC has reiterated that “actual knowledge” requires a fact-specific inquiry.  As a practical matter, companies that send and receive a COPPA flag as part of their compliance efforts are demonstrating a good-faith attempt to meet their obligations under the new COPPA rule.  Those who implement such technology as part of a broader compliance strategy will be in a far better position should the FTC come calling than those who do not.

The next step for companies is to standardize a format for the COPPA flag signal so that it can more easily be passed along from company to company.  If you are interested in learning more about the FPF’s efforts to standardize this Do Not Track for Kids signal, please email [email protected].
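One purely hypothetical shape for such a standardized signal (no format has been agreed upon; the parameter name “coppa” is invented for illustration) is a key-value pair appended to the query string of existing ad or content requests, so that every company in the chain parses it the same way:

```python
from urllib.parse import urlencode, parse_qs

def tag_request(params: dict, child_directed: bool) -> str:
    """Append a hypothetical standardized COPPA flag to a request's
    query string (parameter name invented for illustration)."""
    tagged = dict(params)
    tagged["coppa"] = "1" if child_directed else "0"
    return urlencode(tagged)

def read_flag(query: str):
    """Return True/False if the flag is present, or None if the
    sender never set it (no affirmative signal was made)."""
    values = parse_qs(query).get("coppa")
    return None if not values else values[0] == "1"
```

Distinguishing “flag absent” (`None`) from “flag affirmatively set to 0” mirrors the FTC’s distinction between silence and an affirmative declaration.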


Article on “Mobile Privacy Initiatives and Actions Creating a Patchwork Landscape”

Mary Ellen Callahan of the law firm Jenner & Block co-wrote an article with associates Michael T. Borgia, David M. Didion and Sabrina N. Guenther that examines recent federal and state initiatives to define consumers’ privacy rights when they use mobile devices. The authors eloquently explain that various state and federal agencies have similar goals when determining mobile privacy standards, but that without a comprehensive nationwide framework, their efforts have resulted in a patchwork of binding and nonbinding measures. The article appeared in the June 2013 edition of Communications Lawyer, the publication of the ABA Forum on Communications Law.

An excellent read for your Friday afternoon on recent initiatives and legal developments in the mobile privacy space!


NTIA User Interface Mockups

“I am pleased to support the NTIA Short Form Notice Code of Conduct,” said Jules Polonetsky, Executive Director of the Future of Privacy Forum. “A ‘food label’ type approach to a privacy notice will give consumers a standardized way to get key privacy information at a glance and will help consumers better understand how apps collect and share data.”

The sample notices below show examples of implementations of the short notice developed by a number of the multistakeholder participants. We expect that consumer testing will lead to even better versions that will deliver easy-to-use information to consumers.

Example 1:  Data Use Highlighted

Example 2:  Data Used on Top & Data Not Used on Bottom

Example 3:  “YES/NO” Highlighted Accordion

Example 4:  Categories Separated (Long List)

Example 5:  Categories Separated (Short List)

Click to download a compilation of all five user interface designs.

Please also check out the short form notice example from the Association for Competitive Technology, which demonstrates another clear way that apps can implement the code in an easy-to-use manner!


Privacy “Fail” for Popular Health Apps

Privacy Rights Clearinghouse’s recent study found that many health and fitness apps are lacking on the privacy front. The report evaluated 43 paid and free apps in the health and fitness categories on Google Play and Apple’s App Store. The privacy-focused non-profit discovered that many of these apps lacked privacy policies, failed to encrypt data and failed to notify users that data is transmitted to third parties. “Given the often sensitive nature of health data stored on wellness apps, which can range from weight loss trackers to blood glucose monitors, apps used for health purposes should be adhering to a much higher standard of privacy and security protection,” said Beth Givens, founder and director of PRC.

According to the report:

  • 13 percent of free apps and 10 percent of paid apps encrypt all data connections and transmissions between the app and the developer’s website.
  • 39 percent of free apps and 30 percent of paid apps send data to someone not referenced by the developer in the app or privacy policy.
  • 26 percent of free apps and 40 percent of paid apps did not have a privacy policy.
  • 43 percent of free apps and 25 percent of paid apps provided a link from the app to a privacy policy on the developer’s site.

Perhaps if these developers had used PRC’s best practices for mobile apps, these apps would have received a better grade!


App Quality Alliance’s Privacy Guidelines

The App Quality Alliance (AQuA) has updated its mobile application development Best Practice Guidelines to incorporate consumer privacy-focused recommendations. The update is designed to help mobile developers address topics such as “users’ rights, location data, and information security and accountability.”

Developed in direct collaboration with the GSM Association (GSMA), the guidelines take a privacy-by-design approach: “The Best Practices are what you can use when you’re designing your application, and trying to work out how you should [handle] some of these aspects so you can avoid any errors in the early design stage.”


Getting COPPA Right with a New Directed-at-Children Signal

One of the most important provisions of the updated Children’s Online Privacy Protection Act (COPPA) rule that took effect yesterday is the extension of child privacy protection to behavioral advertising, the practice of tracking users across online sites and services to tailor advertising.  The Future of Privacy Forum supported the Federal Trade Commission’s move to restrict behavioral ads for children and we are pleased to see many companies working hard to come into compliance.

However, when the FTC focused on behavioral ads, it drew its rulemaking scope widely, capturing almost all forms of tracking across sites other than a set of limited “internal operations purposes.”  Third party code providers, such as analytics companies, ad networks, or social plug-in providers, are deemed to have “actual knowledge” they are dealing with children if the first party site has effectively communicated its online status to the third party or if a “representative of the online service recognizes the child directed nature of the site.”

This last provision is challenging, since many third party code providers distribute their code freely to millions of web developers, with no way to assess whether they are being used by services directed at children.  Does an email from anyone in the world to an employee of a social network put the company on notice that it is dealing with a child directed site?  How should an ad network know if a “representative” of its service has recognized the child directed nature of an app?  Some apps are obviously directed at children, but for others the legal analysis is quite fact specific.  Given the strict liability standard under COPPA, all third parties that distribute code widely are facing a substantial and amorphous risk.  We trust that the FTC staff will be reasonable in their enforcement efforts, but more certainty in this area would help ensure compliance from web publishers and third parties.

One way to help provide certainty is to develop a technical method for child directed sites to communicate their status to third parties.  FTC Chief Technologist Steve Bellovin proposed a promising model several months ago, calling for a special site flag to be passed between companies that would indicate the child directed status of a site.  FPF has been working with a number of stakeholders to refine a technical proposal that could help standardize this type of communication, effectively creating a limited “Do Not Track for Kids” signal.

In this direction, we are pleased to note that a number of companies have started rolling out technical flag options for sites directed at children to use.  Facebook just released a new kid_directed_site parameter, which sites can use to let Facebook know that they are directed towards the under-13 set.  Google’s AdMob mobile ad network SDK now includes a new setting called tag_for_child_directed_treatment, which allows mobile apps to indicate they want their content treated as child directed for ad requests.  The Rubicon Project emailed its clients advising them to use a new site naming convention, “[Site Name] – Children’s Site,” which publishers should insert in their ad tags.  And Twitter just advised sites directed to children that they must use the data-dnt parameter, which Twitter provides for sites that wish to opt-out their users from tailored content and suggestions.

For many companies, creating such a flag will be far more complex.  Tags will need to be created by complex content management systems for sites that dynamically assemble pages. For companies that operate ad networks or exchanges, flags will need to be reliably passed from one ad network to another; sites or networks that don’t pass site data will need to develop a means to generate a flag.   But the effort to implement this flag could be an effective way to both protect children and ensure compliance.
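To make the propagation problem concrete, here is a minimal sketch (field names hypothetical) of a flag moving through a chain of ad networks. Any hop that fails to pass site data drops the flag, forcing everyone downstream back to the most conservative treatment:

```python
def forward_request(request: dict, preserves_flag: bool) -> dict:
    """Model one hop in an ad chain; a hop that strips site data
    also loses the child-directed flag."""
    out = dict(request)
    if not preserves_flag:
        out.pop("child_directed", None)
    return out

def effective_treatment(request: dict) -> str:
    """A downstream network can apply standard treatment only when an
    affirmative 'not child-directed' flag actually reached it."""
    if request.get("child_directed") is False:
        return "standard"
    return "coppa-restricted"  # conservative default when the flag is missing

# A publisher's affirmative flag survives one well-behaved hop...
req = {"site": "example.com", "child_directed": False}
hop1 = forward_request(req, preserves_flag=True)
# ...but is lost at a hop that does not pass site data along.
hop2 = forward_request(hop1, preserves_flag=False)
```

The asymmetry is the point: a dropped flag does not silently enable tracking, it disables it, which is why every hop in the chain has to carry the signal reliably.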

The FTC could play a key role here in encouraging this new technical method of COPPA compliance by recognizing that services that designate a primary technical method for sites to communicate their status or to restrict data use should not be deemed to have gained actual knowledge via alternate means.  To be clear, services that get this flag are now on the hook for full COPPA compliance, as are their child-directed site partners.  By sending or distributing the flag, companies are distributing and expanding a significant legal compliance obligation and accepting the risk of substantial penalties.  In choosing to use this flag, they should have certainty that they will not be held responsible for knowledge attributed to them in an uncertain manner.

Much criticism of the COPPA rule has focused on the compliance burden it places on small companies and start-up app developers. By looking to technology for a solution, the FTC and industry could turn a legal burden into an effective, no-cost and widely distributed method of advancing children’s privacy.


FTC Sends Out Educational Letters to Help Companies Prepare for Updated COPPA

With the July 1st deadline approaching, the FTC sent out over 90 letters to companies that may be affected by the updated COPPA rule. These letters were sent to domestic and international online services – including mobile apps – that “appear to collect personal information from children under 13.”  Under the updated rule, the definition of “personal information” now includes photos, videos, audio recordings and persistent identifiers that can recognize users over time and across different websites and online services. The Commission issued one set of letters to companies that may be collecting images or sounds of children, and another set of letters to companies that may be collecting persistent identifiers from children.

Rep. Hank Johnson Introduces The “Apps Act”

Rep. Hank Johnson (D-GA) introduced the “Apps Act,” a bill that would require developers to have privacy policies detailing their data sharing practices. Developers would also need to obtain consent from consumers before collecting data and securely maintain the data they collect. The proposed bill is based on AppRights, the Congressman’s online effort to build mobile privacy legislation from the bottom up. “These engaged citizens also wanted simple controls over privacy on devices, security to prevent data breaches, and notice and information about data collection on the device,” Johnson said when officially announcing the bill at yesterday’s State of the Mobile Net Conference. “The Apps Act answers the call.”


Preventing Digital Keg Parties on Your Website

Sharing on Facebook, LinkedIn and elsewhere has made it incredibly easy for users to distribute their favorite Internet finds, but building the plugins that make sharing easy and writing the code to do so is extra work for many companies. Consequently, free plugins that manage multiple social networks, like AddThis, have become quite popular. Consumer experts know that when you say “free” you ought to ask, “free in return for what?” Well, B2B companies need to be just as smart as consumer experts, for while the plugins may be free, the exchange is usually your site’s data. Indeed, this was all a surprise to ITWorld blogger Dan Tynan when he discovered that a plugin on his Web site was selling his data to third parties: “It appears that the social media widget AddThis invited a bunch of pals over for a kegger on my Web site and didn’t tell me.”

For our friends in the advertising and publishing world, if your plan is to share data with third parties, then make sure you use the appropriate notice & choice mechanisms and good luck to you. However, if it’s not your plan to sell this information, then free or low-cost alternatives do exist. One example we are proud to be associated with is Gigya’s SocialPrivacy™ Certification, which assures companies and users that specific guidelines are followed regarding social data collection and usage practices.

We want to remind companies that collect sensitive data, specifically kids’ data: you risk massive fines under the new COPPA rule if your data is used for online behavioral advertising purposes. Make sure that the social media plugins you use and do business with are COPPA compliant. And as always, be sure to know whether the vendors you use are in the business of selling your data.
