Rep. Hank Johnson (D-GA) introduced the “Apps Act,” a bill that would require developers to have privacy policies detailing their data sharing practices. Developers would also need to obtain consent from consumers before collecting data and securely maintain the data they collect. The proposed bill is based on AppRights, the Congressman’s online effort to build mobile privacy legislation from the bottom up. “These engaged citizens also wanted simple controls over privacy on devices, security to prevent data breaches, and notice and information about data collection on the device,” Johnson said when officially announcing the bill at yesterday’s State of the Mobile Net Conference. “The Apps Act answers the call.”
Sharing on Facebook, LinkedIn and elsewhere has made it incredibly easy for users to distribute their favorite Internet finds, though building and maintaining the plugins that make sharing easy is extra work for many companies. Consequently, free plugins that manage multiple social networks, like AddThis, have become quite popular. Consumer experts know that when you say “free” you ought to ask, “free in return for what?” Well, B2B companies need to be just as smart as consumer experts, for while the plugins may be free, the exchange is usually your site’s data. Indeed, this was all a surprise to ITWorld blogger Dan Tynan, who discovered that a plugin on his Web site was selling his data to third parties: “It appears that the social media widget AddThis invited a bunch of pals over for a kegger on my Web site and didn’t tell me.”
For our friends in the advertising and publishing world, if your plan is to share data with third parties, then make sure you use the appropriate notice & choice mechanisms and good luck to you. However, if it’s not your plan to sell this information, then free or low-cost alternatives do exist. One example we are proud to be associated with is Gigya’s SocialPrivacy™ Certification, which assures companies and users that specific guidelines are followed regarding social data collection and usage practices.
We want to remind companies that collect sensitive data, specifically kids’ data, that they risk massive fines under the amended COPPA rule if the data is used for online behavioral advertising purposes. Make sure that the social media plugins you use and do business with are COPPA compliant. And as always, be sure to know whether the vendors you use are in the business of selling your data.
Today, the FTC issued updated FAQs on the amended COPPA rule. There are many requirements relevant to app developers including:
(No. 47) Whether you are covered if your app collects data but does not transmit it off of the device:
Developers that simply access data on the device (i.e. the data is not being “transmitted” off of the device) do not have to worry that they are “collecting” personal information under the law.
(No. 66) Whether you can rely on the adult app store parental account as permission:
The collection of a parent’s app account number or password, without more, is insufficient to fulfill the Rule’s notice and consent requirements.
(No. 50) Whether coarse location is considered personal information under the new definition:
No, you do not need to worry about collecting coarse location under the updated rule.
(No. 30) When the app must present the notice to the parents (in the app store or immediately upon download):
There are many other requirements for developers – please review the FAQs for more details.
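The first two FAQ points above amount to a simple decision rule. The sketch below encodes them in Swift; the type and function names are our own invention for illustration, and this is of course not legal advice:

```swift
import Foundation

// Hypothetical sketch of FAQ Nos. 47 and 66, not legal advice.
// All names here are ours; nothing comes from an official SDK.

enum ConsentEvidence {
    case appStoreAccountCredentials   // a parent's account number or password alone
    case verifiableParentalConsent    // a method that actually satisfies the Rule
}

struct DataUse {
    let transmitsOffDevice: Bool      // does personal data leave the device?
    let consent: ConsentEvidence?
}

// FAQ No. 47: accessing data that never leaves the device is not
// "collecting" personal information under the amended rule.
// FAQ No. 66: an app store account number or password, without more,
// does not satisfy the notice-and-consent requirements.
func requiresFurtherConsent(_ use: DataUse) -> Bool {
    guard use.transmitsOffDevice else { return false }
    switch use.consent {
    case .verifiableParentalConsent: return false
    default: return true              // no consent, or credentials alone
    }
}
```

Under this reading, an app that only reads data on the device is outside the rule entirely, while one that transmits it off the device needs verifiable parental consent, not just an app store password.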
At yesterday’s workshop, “Future of Privacy + Innovation,” held at U.C. Hastings, participants heard from thought leaders on the evolving privacy space for app developers. California Attorney General Kamala Harris spoke to the audience about the need to balance consumer privacy and innovation while finding new ways to innovate on privacy: “Let’s not stop the innovation. I don’t want to shut it down…But what we do have to do is give the user information, and let the user, not anyone else, make the choice about the trade off.” The AG urged developers to give consumers the appropriate “tools” to let them make choices regarding uses of their information. Ultimately, developers need to be aware of the legal and privacy requirements they must fulfill when building apps.
The California Attorney General and UC Hastings’ Privacy and Technology Project are holding a workshop, “Future of Privacy + Innovation,” for app developers in California. Participants will hear from thought leaders at the intersection of technology, entrepreneurship, and policy on the future of privacy and innovation. The workshop will cover the evolving privacy space for application developers as we strive to balance consumer privacy and innovation, and find new ways to innovate on privacy.
The workshop will be held on Wednesday, April 10th.
Click here for more details!
Apple stated on its developer blog that starting May 1st, “the App Store will no longer accept new apps or app updates that access UDIDs. Please update your apps and servers to associate users with the Vendor or Advertising identifiers introduced in iOS 6.”
In August 2011, Apple announced that it would phase out third party use of UDIDs – third party app developers were instructed to stop tracking iPhone, iPod Touch, and iPad users by the unique identifier number attributed to each of its devices and instead create their own unique identifiers. Apple provided the Advertising Identifier, an alternative to the UDID, when it released iOS 6 last September. The recent Apple iOS 6.1 update included the option to reset this “non-permanent, non-personal, device identifier,” which is located below the Limit Ad Tracking feature.
As Gigaom notes, by May 1st the Advertising Identifier will have been available for eight months, “plenty of time for those who want to understand how their apps are being used to switch over to the new system.”
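For developers making the switch, Apple’s sanctioned replacements are the vendor identifier (`UIDevice.identifierForVendor`) and the Advertising Identifier (`ASIdentifierManager`); both require iOS frameworks, so the Foundation-only sketch below models just the app-generated fallback Apple suggested back in 2011: a random UUID minted once and persisted. The function and storage-key names are hypothetical:

```swift
import Foundation

// Hypothetical sketch of an app-generated identifier replacing the UDID.
// On iOS you would prefer UIDevice.current.identifierForVendor or the
// Advertising Identifier from ASIdentifierManager; here we model only
// the app-generated case: a random UUID created once and reused.

func appScopedIdentifier(in defaults: UserDefaults = .standard) -> String {
    let key = "com.example.appIdentifier"    // hypothetical storage key
    if let existing = defaults.string(forKey: key) {
        return existing                      // reuse the identifier we minted earlier
    }
    let fresh = UUID().uuidString            // random, not derived from hardware
    defaults.set(fresh, forKey: key)
    return fresh
}
```

Unlike a UDID, this value is scoped to a single app and disappears when the app’s storage is cleared, mirroring the “non-permanent, non-personal” design of the Advertising Identifier.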
The Federal Trade Commission released a new video for mobile app developers that offers tips on complying with truth-in-advertising standards and privacy principles. The Commission recently revised its online advertising disclosure guidelines to illustrate how to make advertising disclosures clear and conspicuous, even on smaller screens. For example, developers should avoid using pop-up windows and hyperlinks to convey critical information. When hyperlinks are used, they should be labeled as specifically as possible and should function on all devices. Indeed, the new guidelines serve as a reminder that federal consumer protection laws apply to advertisements on mobile devices.
Intuit announced today that it supports the draft code of conduct being developed in the National Telecommunications and Information Administration’s privacy multistakeholder process, and that the company plans to adopt the code. When finalized and adopted, the code will improve disclosures regarding mobile applications’ privacy practices. Other key stakeholders have also voiced general support for the direction the code is taking.
“This commitment from a key stakeholder is a major milestone for NTIA’s privacy multistakeholder process; however, there is more work to be done,” said Lawrence E. Strickling, Assistant Secretary of Commerce for Communications and Information and Administrator of NTIA. “Intuit’s announcement signals that the multistakeholder approach is making real progress; we encourage stakeholders to move swiftly to finalize and adopt a code of conduct that will improve privacy disclosures on mobile devices. Genuine disagreements remain among stakeholders on a few issues, and NTIA anticipates that as compromises are reached, more stakeholders will commit to adopting the code.”
A recent study claims that the top apps on iOS and Android engage in some risky privacy behaviors. We appreciate the privacy concerns raised by the app ecosystem: we work to educate developers about best practices, encourage the platforms to improve privacy controls, and urge the FTC to act against deceptive apps. That said, we have previously been critical of reports such as this one that scare folks away from ever wanting to download an app.
The report released by Appthority analyzed “the top 10 free apps across five common categories from the Apple App Store and Google Play Store.” We found the study to be more confusing than informative. We reached out to them (no response as of yet) since some of the results seemed incomplete. Some key issues we have include:
- The study did not mention how many of these apps explicitly ask users’ permission to access the relevant information. Of course, accessing certain information without clear user permission is a problem, but this study asserts that any collection of sensitive data is risky. For example, the study found that 60% of iOS apps use location compared to 42% of Android apps, and thus implied that iOS apps are more privacy invasive than Android ones. However, the report does not explain how many of these apps actually request permission to use location. In fact, the Apple platform forces all iOS apps to ask for user permission before accessing location. Indeed, there is valid concern about how and for what purposes apps collect and use location data. Yet there is a big difference between apps that notify users and ask for location information versus apps that run amok with this type of data.
- The study also states that 14% of iOS apps access users’ calendars. Are these apps stealing people’s calendars? Or are these apps that add travel plans and the like to your current calendar? Moreover, just because the top iOS apps access users’ calendars does not say anything about iOS vs. Android being more privacy invasive – perhaps more calendar apps are enjoyed by iOS users.
- In addition, the study found that most apps transmit data in unencrypted format. It’s not unusual for apps that rely on analytics code to transmit relatively non-sensitive information. Although some may argue that all traffic sent off phones should be encrypted as a best practice, that is certainly not the current standard. It would be far more useful if the study had tested apps that send and receive personal or sensitive data – the type of data that should be encrypted.
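The distinction drawn in the last bullet – sensitive data over plain HTTP is the real problem, not unencrypted analytics pings – can be sketched as a simple audit check. The `Endpoint` type and the example URLs are ours, invented purely for illustration:

```swift
import Foundation

// Minimal sketch of the encryption distinction drawn above: flag only
// endpoints that would carry personal or sensitive data over plain HTTP.
// The Endpoint type and URLs are hypothetical, for illustration only.

struct Endpoint {
    let url: URL
    let carriesSensitiveData: Bool
}

// Unencrypted analytics traffic is common practice; what deserves a flag
// is sensitive data on an unencrypted (non-HTTPS) channel.
func flaggedEndpoints(_ endpoints: [Endpoint]) -> [Endpoint] {
    endpoints.filter { $0.carriesSensitiveData && $0.url.scheme == "http" }
}
```

A study structured this way would surface the login form posting credentials over HTTP while leaving the harmless analytics ping alone.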
We do think that there are real privacy concerns in the mobile app ecosystem. For instance, some people are cool with their data being sent to ad networks that use it to tailor advertisements in return for free or cheap services. Others find it intrusive and would prefer to easily decline this targeting. Unfortunately, “opting out” of targeted mobile advertisements is still inconvenient. There is much work to be done, and studies/surveys of app practices are useful. That being said, reports that fan privacy fears to promote security services too often throw out the privacy baby with the digital bathwater.