Senate Mobilizes for Mobile Data Privacy


Transparency & Privacy By Design: Workable Mobile Privacy Laws

Presumably without geolocating ourselves today on Capitol Hill (apart from a Ukrainian TV crew's short interview in the halls of the Dirksen Building, where we admittedly fired up an iPhone), we covered the Senate Judiciary Subcommittee on Privacy, Technology & the Law's hearing on mobile technology and privacy.  Seeking information from the public and private sectors on privacy issues for users of smartphones, tablets, and mobile devices at large, the US Senate is mobilizing for reasonableness and fair-use practices within the exploding mobile application and device industry.  Heading into the hearing, much concern had been levied at providers' and mobile devices' and applications' use of geolocation and other personal, private data in ways often entirely unknown to the end user.  With panelists from the Justice Department and Federal Trade Commission providing a federal perspective, and industry consultants, public policy think tanks, and high-level representatives from Google and Apple (internally referred to as "Gapple" by us here at ConsultED) all offering opinions and fielding Senate inquiry, here is the take from our Capitol coverage beat.

Senator Patrick Leahy (D-Vermont) outlined the crux of the issues: the potential for tracking, storing, and reselling data without users' consent; the possibility that sensitive user data may be maintained by providers in unencrypted formats; and the sale of location data without consumer knowledge, resulting in unsolicited ads from third parties.  What is striking to us as technologists as well as citizens is the current state of data-sharing responsibility: as we users of mobile devices and applications give up our information, there are seemingly little to no legal restrictions in place to prevent that data from being shared with other third-party businesses.  The attending federal regulatory agency representatives admitted as much, but also provided the Subcommittee with some scope and direction on how the issues might be addressed.  Led by Senator and Subcommittee Chairman Al Franken (D-Minnesota), the hearing commenced with Senator Franken neatly condensing the government's footing: the 4th Amendment doesn't apply to companies, and the Freedom of Information Act does not apply to Silicon Valley.


Jessica Rich (Deputy Director, Bureau of Consumer Protection, FTC) stated that consumers seem to have no idea of the layers involved in the sharing of their data and emphasized tackling consumer privacy concerns early in product design: devices must be developed from the beginning to maximize user safety without "inhibiting innovation".  Jason Weinstein (Deputy Assistant Attorney General, Criminal Division, DOJ) noted that while there are no restrictions against third-party selling, sharing, and distribution of consumers' information, there are restrictions preventing unjustified sharing with government agencies.  He also noted that federal law does not currently require a company to disclose a data breach, and that regulations are needed to govern each of these situations, pointing out that he was unaware of any legal requirement setting data security standards for provider companies, which instead set their own policies based on their own assessments.

Senator Tom Coburn, M.D. (R-Oklahoma) asked the agencies what statutory tools are needed to better address these issues without detriment to public use of these technologies.  Rich noted that the FTC has been using Section 5 of the FTC Act with a degree of success, but reiterated several items for the Senate to consider when scoping a resolution: upfront disclosure with clearer privacy agreements that users could more easily understand; improved interaction between providers and users; a default rule that data not necessary to a company's business model is off-limits for any other use; and better data-retention policies (the interesting industry counterpoint here being that longer data retention actually increases the risk of being hacked and of privacy being more broadly compromised).  For the Justice Department, Weinstein stated the need for federal legal requirements, asking for a strengthening of 18 U.S.C. § 1030 (the Computer Fraud and Abuse Act) for deterrents and consequences, and for a better balance of data-retention policies among users, law enforcement, and industry.  Coburn also fairly reminded the panel that we cannot demand standards for security breaches and the like that the federal government cannot uphold itself.


Ashkan Soltani (Independent Researcher and Consultant), Justin Brookman (Director of the Project on Consumer Privacy, Center for Democracy and Technology), and Jonathan Zuck (President, Association for Competitive Technology) provided the testimony we felt was most useful to the Senate's agenda: making mobile privacy regulation more transparent and better defining the concepts involved.  "Gapple" was represented by Alan Davidson (Director of Public Policy, Americas, Google Inc.) and Guy "Bud" Tribble (Vice President of Software Technology, Apple Inc.).

Snapshots of their testimony:

Soltani - Not only consumers, but often the platform providers themselves, are surprised at the information apps and platforms have access to.  Platforms need to provide adequate measures to make absolutely clear (to all parties, including themselves) what information is collected at any given time and for what purpose.  We need greater clarity as to what mobile privacy involves, even down to the terms used to describe it: how exactly do we define "opt-in", and isn't "opt-out" more appropriate, as proposed by Sen. Franken?  How do we define first and third parties?  How do we define "location" and "anonymized"?

Brookman - The mobile space is governed by a patchwork of outdated privacy laws, and mobile applications access far more private data than can be captured from the Web, yet there is no comprehensive data privacy law in the United States.  The baseline as set by the FTC is too low: companies cannot affirmatively lie about how they are using or sharing your data, which results in companies simply making no representations about use at all and keeping their use policies vague and legalistic.  Privacy laws are so vague that the easiest way for a company to get into trouble is to actually make a concrete statement about what it is doing.  It is not really possible for the public to find out how their data is being shared (echoing an earlier sentiment from Rich).

Zuck - The face of these mobile devices IS the applications found on them, 85% of which are made by small businesses; this is a national phenomenon with international implications.  We need to approach data privacy in a holistic manner, focusing on the data itself and how it is used, not necessarily how it is collected.  Small-business application developers are generally not collecting or storing users' personal data, nor transferring or selling it back to providers.

Tribble - Apple places a link to its usage policies on every page of its website and requires all third-party application developers to adhere to customer privacy policies (though it does not specifically require each app to have a privacy policy).  As to location, Apple makes use of this data only in determining available towers for relay points and in mapping Wi-Fi networks.  Apple devices have a master location-services switch built in that customers can turn off, and a pop-up accept/decline dialog box that cannot be overridden.

Davidson - Google provides a location-setup "opt-in" as part of its "privacy by design", is highly transparent with users as to what is being collected, and maintains high security standards to anonymize and protect that information.  Google puts a lot of emphasis on its standards for openness and consumer trust, with a permissions-based model.

Much of the Senate inquiry was of course directed toward (and slightly deflected by) Gapple, but instead of recounting the verbal volley of testimony we'd rather close on a different slant.  There are some interesting philosophical differences, if not an outright dichotomy, between Apple and Google in how they handle application usage on their devices.  Fundamentally, we are talking about architectural design for openness versus a somewhat closed approach; each certainly has its merits as well as its pitfalls.  To enforce the privacy policies of third-party application providers, Apple is rather restrictive: providers are screened before being allowed into the App Store, and once within it are subject to random audits (Apple does not check every single app provider), with network traffic examined to see whether the provider is properly respecting privacy.  Should a provider be found in violation, it is put on 24-hour notice to correct the problem or be removed from the App Store.

Google, on the other hand, is much more open, intending not to be a gatekeeper.  With repeated emphasis on its policies toward openness, Google contends that the device itself provides policing for users, who make their own choice to install an app, presented not with a ten-page legal user agreement but with a policy (usually just one screen) that is clear and specific (Google even displayed the actual permissions screen on a whiteboard for the Subcommittee to see).  Senator Sheldon Whitehouse (D-Rhode Island) remarked that total openness is unacceptable, as we need boundaries of safety; openness is not an adequate standard, and he questioned how informed the choice being made by consumers really is.  Davidson's reply: we all need to educate consumers better, and Google's position is not openness at all costs but increased openness with a content policy; Google is agreeable and trying to strike the right balance.


As the hearing closed on the complex layers of mobile data privacy for users, platform providers, and application developers, the "where are we now" is a little clearer than just our geolocation, but we expect these issues to be addressed by a determined Senate initiative that will undoubtedly take considerable time to sort out.  The Electronic Communications Privacy Act (ECPA) is a 25-year-old law, outdated and outmoded for covering the development and use of mobile devices and applications; CPNI (Customer Proprietary Network Information) rules do not cover data plans and do not extend to information services, leaving customer use of those services unregulated.  The positive is that all involved parties clearly recognize their stake, and their share of the concern and responsibility, in addressing these issues through a consortium of public and private ethics and agreement.  True cyber security will require a multi-layered approach, with a decided public-private partnership.  Good, open government at work.

This information is not an advertisement on ConsultED's part but merely alerts our Members to a potentially useful company, website, application or idea.

