BlurryEdge Strategies

California Attorney Releases Recommendations for Best Privacy Practices

The California Attorney General recently released a guide with recommendations on how websites and mobile services can convey information about privacy practices to users. During the guide’s development, BlurryEdge Strategies participated in discussions with the attorney general’s staff, privacy advocates, and service providers. The guide is based on California’s one-of-a-kind privacy policy law, the California Online Privacy Protection Act, which was amended in 2013 to require transparency about Do Not Track signals. The guide is intended to help businesses craft policies that meet and exceed current legal requirements.
 
Inside, the guide contains a general checklist of what websites and mobile applications should explain to users about information practices. The recommendations range from the structural to the substantive and include suggestions for how to describe what a service collects, uses, and shares. The guide also provides a useful checklist for websites seeking to understand and implement California’s Do Not Track law. The guide places special emphasis on the need for services to avoid legalese in favor of easy-to-read, straightforward language.
 
This set of best practices is the latest in a series of moves by the California Attorney General to educate web and mobile-based services on the state’s privacy laws and the importance of privacy by design. These efforts recognize that clear and approachable information about privacy practices is essential to ensuring users are informed of how services work. As California continues to lead the nation on matters of privacy, BlurryEdge will take advantage of further opportunities to make these protections workable and effective for both services and users.

Posted on 06/11/2014 | Permalink | Comments (0)


FTC Settles With Flashlight App Over Deceptive Data Use

On Thursday, December 5th the Federal Trade Commission (“FTC”) announced a settlement with Goldenshore Technologies, producer of the “Brightest Flashlight Free” app (“Flashlight App”), one of the top free apps available for Google’s Android operating system. The Flashlight App works by activating lights on Android phones, including the LED flash, which turns users’ smartphones into a flashlight.

The settlement followed an FTC investigation revealing that the Flashlight App shares information, including geolocation and device identifiers, with third parties such as advertising networks, without informing users via Android’s “permissions” process or via the App’s privacy policy. The App’s End User License Agreement (“EULA”) repeated these misrepresentations. Finally, the FTC discovered that the Flashlight App transmitted geolocation and device identifiers to third parties even before users could accept or refuse the terms of the EULA.

The FTC alleged that Goldenshore’s failure to disclose the full scope of its information practices was “deceptive” under the FTC Act.

The Flashlight App case provides a few lessons for app developers:

  • The failure to disclose information about how an app collects, uses, or shares data may be considered “deceptive” by the FTC; in other words, the FTC is concerned with omissions as well as affirmative misrepresentations.
  • Privacy policies, license agreements, and terms of use must disclose all the information an app collects and uses, and how it is shared with third parties.
  • Consumers and regulators expect apps to collect and share only the information necessary to provide the apps’ functions, and the collection and sharing of “extra” information will raise eyebrows.
  • Opt outs must provide users with real choices, and they must be honored by developers. Developers should also limit the collection of information via an app before users have the opportunity to exercise choice.
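That last point can be made concrete. Below is a minimal Python sketch (the class and method names are invented for illustration, not taken from any real SDK) of a collector that refuses to transmit anything until the user has affirmatively accepted the terms:

```python
# Hypothetical sketch: defer all data collection until explicit consent.
class ConsentGatedCollector:
    def __init__(self):
        self.consented = False
        self.sent = []  # record of what was shared, and with whom

    def accept_terms(self):
        """Called only when the user affirmatively accepts the EULA."""
        self.consented = True

    def report_location(self, lat, lon, recipient):
        # The Flashlight App's mistake: transmitting before consent.
        if not self.consented:
            return False  # nothing leaves the device
        self.sent.append({"lat": lat, "lon": lon, "to": recipient})
        return True
```

The point of the design is that the transmission path simply does not exist until consent is recorded, so there is no window in which data can leak before the user has had a real choice.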

Posted on 12/23/2013 | Permalink | Comments (0)


Federal Trade Commission to Host November Workshop on the “Internet of Things”

On November 19, 2013 in Washington, D.C., the Federal Trade Commission (“FTC”) will hold a public workshop on the “Internet of Things”—the emerging world of devices that communicate with each other and with people via the Internet. The Internet of Things has the potential to be a massive market of devices with varied tasks that include everything from monitoring an individual’s vital signs and communicating them to a doctor’s tablet, to measuring the amount and age of milk in the fridge at home. In the past year, this quickly developing area has garnered press attention for both its benefits and potential privacy and security risks.

The FTC’s public workshop will focus on the consumer privacy and security issues this new set of technologies raises and will feature experts and other interested players in the space. In advance of this workshop, the FTC called for public comments and asked questions such as “What are the various technologies that enable this connectivity?” and “How should privacy risks be weighed against potential societal benefits, such as the ability to generate better data to improve health-care decision-making or to promote energy efficiency?” Those comments were due June 1st.

As the Internet of Things develops, we continue to monitor the potential security and privacy issues raised by new technologies and business applications in this space. As we see it, here are a few issues that technology companies, consumer advocates, and regulators will need to address within the Internet of Things:

  • As users begin to fill their lives with these easy-to-activate devices, there will be challenges to providing consumers with meaningful notice of how their information is collected and shared.
  • The seamless way in which Internet of Things devices will connect and communicate also highlights the importance of providing information about uses of consumer information and the parties or devices that receive it in a transparent and understandable manner. Determining what transparency means in the Internet of Things context will be key.
  • There may also be challenges providing consumers with access to their data, as there may not be a direct relationship between a consumer and a device, and information from one device may be shared with other devices and companies.
  • Since many Internet of Things devices will continually communicate consumer information via the cloud, parties must ensure the security of this data both in transit and at rest.
  • Finally, the role design will play in addressing these challenges will need to be examined.
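One way to approach the notice and transparency challenges above is to give each device a machine-readable manifest of what it collects and who receives it, which a companion app could render for the user. A minimal sketch, with an invented schema and field names:

```python
# Hypothetical data-sharing manifest for a connected device.
DEVICE_MANIFEST = {
    "device": "smart-fridge",
    "collects": {
        "milk_level": {"shared_with": ["grocery-service"], "retention_days": 30},
        "door_open_events": {"shared_with": [], "retention_days": 7},
    },
}

def recipients_of(manifest, field):
    """List every party that receives a given data field, for display to the user."""
    return manifest["collects"].get(field, {}).get("shared_with", [])
```

A companion app could walk this structure to show, in plain language, which parties receive which readings and how long they are kept.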

Here’s some interesting reading on the topic:

  • Federal Trade Commission, “FTC Seeks Input on Privacy and Security Implications of the Internet of Things,” (Apr. 17, 2013).
  • “Welcome to the Programmable World,” Wired.com (May 14, 2013).
  • “The Internet of Things Has Arrived — And So Have Massive Security Issues,” Wired.com (Jan. 11, 2013).

Posted on 11/13/2013 | Permalink | Comments (0)


reddit Launches New Privacy Policy

For the past few months I've been working with BlurryEdge associate Megan Worman to help reddit overhaul its privacy policy. The new version went live today, and I'm participating in a reddit AMA right now. Come ask us anything!

Posted on 05/01/2013 | Permalink | Comments (0)


With Warning that Data Brokers’ Tenant Reports May Be Subject to FCRA, FTC Highlights the Need to Value Consumers’ Personal Information

Last week, the Federal Trade Commission revealed that it had sent letters to six different data brokers, all of whom provide requestors with reports detailing individual tenant histories, warning them that their practices may be subject to the Fair Credit Reporting Act (“FCRA”). This move follows the FTC’s announcement that it is investigating data brokers that mine consumer information and a congressional inquiry into the industry’s practices.

In its letter to the data brokers, the FTC points out that data brokers that assemble and share individuals’ rental histories are likely “consumer reporting agencies” issuing “consumer reports” and thus subject to the FCRA. As the letter describes in detail, the FCRA requires that consumer reporting agencies issuing these sorts of reports ensure the reports are used for permissible purposes and are as accurate as possible, and that consumers have access to the reports and a chance to dispute information they believe is inaccurate. Companies that fail to do this may be subject to damages for each violation of the law.

Although the FTC’s letter is primarily concerned with the requirements of the FCRA, there are some general lessons that can be taken from this move. First, businesses that covertly collect and share consumer information risk bad press and legal action by the FTC. Second, when a company collects information for one purpose (e.g., an application for a first apartment), the law frowns upon subsequent uses of that information that are different and that a normal consumer wouldn’t expect (e.g., to deny them the next apartment). Lastly, as technology increases the detail of data brokers’ consumer profiles and expands the types of personal information they can trade, the public’s expectation of privacy in such information should not be discounted.

Posted on 04/11/2013 | Permalink | Comments (0)


Mobile Unique Identifiers and Location Information

We're keeping up with the latest developments in mobile data collection and have issued an update to our 2011 White Paper, Mobile Unique Identifiers and Location Information. Two new developments in this area of privacy law:
- The Location Privacy Protection Act of 2011, which had gained traction in the Senate, was not passed by Congress by the end of 2012.
- Google settled with 38 states and the District of Columbia for $7 million over its Wi-Fi data collection.

Download 2013BESWPMobile

Posted on 03/22/2013 | Permalink | Comments (0)


Room for Debate: A National Priority and a Business Priority

My NYTimes op-ed makes the case for public disclosure of serious cyber security breaches, in response to the question "Should Companies Tell Us When They Get Hacked?"

Posted on 02/24/2013 | Permalink | Comments (1)


FTC Recommends Best Practices for Mobile Privacy

On February 1, 2013, the FTC released a new report, Mobile Privacy Disclosures: Building Trust Through Transparency, setting out current data protection best practices for mobile operating system (OS) providers and app-developers.

The report’s guiding principle is that these providers must work to give mobile device users:

(1)  a clear understanding of how their information is being collected, and

(2)  tools to manage and protect access to their data.

The FTC recommends that app-developers and OS-providers integrate specific privacy designs into their products, to protect themselves from future FTC actions.  It also recommends a general ‘privacy by design’ approach, which would prioritize data minimization, data security, and procedural safeguards at every stage of product development.

It also pushes ad networks, third-party data collectors, and app-industry groups to put a priority on data protection measures, so that they encourage OS-providers and app-developers to provide users more notice and controls.

Recommendations for OS-providers

The FTC focuses on OS-providers as the main stakeholders who can promote data protection. This is because OS-providers largely determine users’ experience and awareness of data privacy, and because they have substantial leverage over app-developers.

The FTC recommends OS-providers build in privacy alerts and management tools for users, and that they implement enforceable standards for app-developers. These best practices are:

Privacy Alerts for Users

  • Definitely provide ‘just-in-time’ warnings (i.e., just prior to the collection of information) to device owners before apps can access ‘sensitive content’ -- especially geolocation. Ask the user whether she agrees to let the app access the data, and grant the app access only if she consents.
  • Consider providing ‘just-in-time’ consent interfaces for apps’ collection of semi-sensitive content, including contacts, photos, calendar entries, and the recording of audio or video.
  • Publish a clear policy about how the OS-provider reviews apps before they are released for download.

Management Tools for Users

  • Build a dashboard into the platform, on which the user can review what types of content certain apps can access, and what data apps have already accessed.
  • Create a set of universal icons that communicate to the user what data is being accessed by an app.
  • Offer users a Do Not Track mechanism, which would let them prevent tracking by ad networks and other third parties as they use apps, unless they consent to such tracking.
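The dashboard recommendation above boils down to keeping an auditable record of what each app may access and what it has actually accessed. A minimal Python sketch (all names are invented for illustration):

```python
# Hypothetical permission dashboard: tracks grants, accesses, and revocations.
class PermissionDashboard:
    def __init__(self):
        self.grants = {}       # app name -> set of permitted content types
        self.access_log = []   # (app, content_type) history, shown to the user

    def grant(self, app, content_type):
        self.grants.setdefault(app, set()).add(content_type)

    def revoke(self, app, content_type):
        self.grants.get(app, set()).discard(content_type)

    def record_access(self, app, content_type):
        """Allow the access and log it only if the app holds the grant."""
        if content_type not in self.grants.get(app, set()):
            return False
        self.access_log.append((app, content_type))
        return True
```

Because every permitted access is logged in one place, the platform can show users both what apps *can* reach and what they *have* reached, which is exactly the distinction the FTC draws.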

 

[Icon on Android OS, notifying the user that the app is accessing her geolocation data]

[Icon on Apple's iOS, notifying the user that her geolocation data is being accessed]

[Privacy notice icon, which appears when data is being collected and which the user can expand to read more]


 

Supervision of App-Developers

  • Require developers, through contract provisions, to disclose data collection to users and to have a privacy policy in place.
  • Educate developers about best practices in data protection.
  • Conduct compliance checks of apps to determine whether they meet data protection standards, and take action against developers whose apps fall short.

Recommendations for App-Developers

The FTC also focuses on what app-developers could be doing better regarding data protection.  It recommends the following best practices:

Privacy Alerts for Users

  • Post a privacy policy in the app store describing how the app may collect and distribute users’ data.
  • If the OS-provider does not do so already, provide ‘just-in-time’ warnings to users before collecting data, and access the data only if the user explicitly consents.

Oversee Ad Networks & 3rd Parties

  • Before integrating third-party code into an app (e.g., for ads or for analytics), first determine what user information the third-party will be collecting.
  • Communicate to the user that this third-party data collection will occur.
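The pre-integration check recommended above can be made routine: compare what the third-party SDK declares it collects against what the app’s own privacy policy discloses. A minimal sketch, assuming both lists are available (the names are invented):

```python
# Hypothetical pre-integration audit: find data types a third-party SDK
# collects that the app's privacy policy never discloses.
def undisclosed_collection(sdk_declares, policy_discloses):
    """Return the data types collected by the SDK but absent from the policy."""
    return sorted(set(sdk_declares) - set(policy_discloses))
```

Any nonempty result means the developer must either update the policy or drop the SDK before shipping; either way, the mismatch surfaces before users are affected.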

Reach out for Guidance

  • Take advantage of self-regulatory programs, trade associations, and industry organizations to stay up-to-date on what best practices are.
  • Follow the National Telecommunications and Information Administration’s upcoming privacy code of conduct.

Enforcement & consequences

The FTC emphasizes that it will enforce data protection standards for mobile businesses. 

It points to its recent action against Path for collecting users’ address book data and collecting information from children under 13 without parental consent, and to its action against Frostwire for a peer-to-peer file-sharing app that could lead users to unwittingly expose personal files on their devices.

The FTC has put together this report of recommendations so that mobile businesses can avoid such actions.  If OS-providers and app-developers implement these designs and comply with the upcoming NTIA privacy code of conduct, the FTC indicates that doing so should help insulate companies from law enforcement actions.

Posted on 02/11/2013 | Permalink | Comments (2)


CPUC Energy Data Access Workshop

I attended the California Public Utilities Commission (CPUC) Energy Data Access Workshop last week (1/15-16) in San Francisco. I have been following the CPUC's numerous proceedings on energy data privacy, including the Privacy Rule for SmartGrid Data and the discussion about implementing processes for users to authorize the utility to directly transfer their energy usage data to third party providers (e.g. for demand response purposes).

This particular meeting was focused on access to users' energy data for research purposes.  A number of research institutes and city planners want access to both personalized and anonymized/aggregated energy consumption data for energy efficiency planning and research into alternative energy programs. This makes for interesting politics because the Privacy Rule places the burden on utilities to protect their users' privacy.

Also interesting was how similar this debate is to others (e.g., over cookie data or health data) where the question is how to derive useful information from data in a way that is sufficiently aggregated and anonymized to protect user privacy.  There were also interesting presentations, particularly by the Census Bureau, on secure means of providing access when the research requires data in a form that cannot be considered to meet this standard.

This Workshop was meant to give the CPUC enough information to start a proceeding that will likely determine whether a new data center with data from the three IOUs will be created, or whether some other means of facilitating access to the data will be pursued.  Hopefully, it will spark input from researchers and professionals in other fields where these issues are being discussed.

Posted on 01/23/2013 | Permalink | Comments (0)


CA AG Issues Report "Privacy on the Go: Recommendations for the Mobile Ecosystem"

The California Attorney General's office today released its long-awaited report Privacy on the Go (pdf), with recommendations for application developers, platforms, and ad networks.  It is a must-read -- both for its easy-to-understand language and clear suggestions, and because it promotes implementations that generally are not considered "required by law."  This is the future of privacy design.  I urge you to take a look.

Highlights:

Limit collection:  Avoid or minimize the collection of personally identifiable data that you do not need to provide your service, and limit the time you keep it.  Or as I tell my clients, make a fair bargain with your users!  Have policies that make sense and describe them in a way that makes the exchange of data for your service seem reasonable to the user.
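Mechanically, limiting collection can be as simple as dropping any field the service does not need before the data is ever stored. A minimal sketch (the field names here are illustrative only):

```python
# Hypothetical minimization filter: keep only the fields the service needs.
NEEDED_FIELDS = {"email", "display_name"}

def minimize(record, needed=NEEDED_FIELDS):
    """Strip a collected record down to the fields the service actually uses."""
    return {k: v for k, v in record.items() if k in needed}
```

Anything filtered out here never needs to be secured, disclosed, or retained -- which is the whole appeal of minimization as a design principle.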

Surprise Minimization:  Don't collect data or use it in a way that will surprise your users.  In other words-- notice early, contextually, and repeatedly!

Enhanced Notice:  Use "special" notices to highlight things that would or should be important to users of your service.  What are special notices, you ask?  Well, the report doesn't say, exactly.  But I'm betting it means notice that is not a tiny 8-point disclosure hidden in paragraph 27 of a privacy policy.

 

Posted on 01/10/2013 | Permalink | Comments (1)




Recent and Upcoming Presentations

  • 2/23: What's Hot in Copyright
    for Virtual Worlds
    and User Generated Content
    Copyright Society of Northern California
  • 2/24: Privacy, Free Speech, and
    ‘Blurry-Edged’ Social Networks
    The Community Roundtable
  • 3/17: Can Publishers Take Ownership of Privacy? OMMA Global, SF
  • 4/11: Technologic Change And The Courts Northern District of California Judicial Conference
  • 1/7: Social Media and Social Norms American Association of Law Schools

@laurengelman



    BlurryEdge Strategies is powered by Typepad. Blog design by Eliza Grace Design.