Category Archives: Law: Privacy

Dystopian Fiction in Everyday Life

The Tampa Bay Times has the scoop on a new surveillance plan in Pasco County, Florida.  The Sheriff’s Department there is targeting people for enhanced police scrutiny based on what it claims is an “unbiased, evidence-based risk assessment designed to identify prolific offenders in our community.”

“As a result of this designation,” the Sheriff’s office warns targeted residents, “we will go to great efforts to encourage change in your life through enhanced support and increased accountability.”

Naturally, there’s a federal lawsuit.

Indeed, last year, the paper reports, “a Tampa Bay Times investigation revealed that the Sheriff’s Office creates lists of people it considers likely to break the law based on criminal histories, social networks and other unspecified intelligence. The agency sends deputies to their homes repeatedly, often without a search warrant or probable cause for an arrest.”  In addition, there’s “a separate program that uses schoolchildren’s grades, attendance records and abuse histories to label them potential future criminals.”

To rub salt in the wound, the Sheriff’s Office has a video telling the targets of this increased harassment that inclusion is “good news” because it will give them opportunities to receive “assistance.” A hint of what that looks like comes in its letter to the surveilled, which warns, “Our desire to help you will not hinder us from holding you fully accountable for your choices and actions,” and promises that recipients’ names and criminal histories will get sent to local, state and federal law enforcement agencies to ensure “the highest level of accountability” for any future crimes they commit.

Spotted via Crooks & Liars’ Susie Madrak, Dept. Of Pre-Crime: Florida Sheriff Harassing Pre-Criminals — What could possibly go wrong, other than civil rights violations? Photo licensed via Creative Commons Attribution 4.0 International License by Fabius Maximus Blog.

Posted in Law: Criminal Law, Law: Privacy, Surveillance | 5 Comments

Apple’s Great New Privacy Commercial vs Reality

Apple has unveiled a terrific new video/commercial for the privacy features of the iPhone:

While I do think Apple deserves real credit for resisting government attempts to get a back door into iPhone encryption, I can’t help but view that video a little cynically in light of reports, not so long ago, that more than half of the App Store privacy labels were false.

Bonus shout-out to “Mind Your Own Business” by Delta 5 which provides the background.

Posted in Kultcha, Law: Privacy, Sufficiently Advanced Technology | 8 Comments

‘Contact Tracing in the Real World’

Excellent essay by Ross Anderson on Contact Tracing in the Real World, especially apposite in light of a number of government and private tracker apps being floated and even implemented.

Posted in COVID-19, Cryptography, Law: Privacy | 1 Comment

New Article: Privacy as Safety

Fresh up on SSRN, a pre-publication draft of Privacy as Safety, co-authored with Zak Colangelo and forthcoming in the Washington Law Review. Here’s the abstract:

The idea that privacy makes you safer is unjustly neglected: public officials emphasize the dangers of privacy while contemporary privacy theorists acknowledge that privacy may have safety implications but hardly dwell on the point. We argue that this lack of emphasis is a substantive and strategic error, and seek to rectify it. This refocusing is particularly timely given the proliferation of new and invasive technologies for the home and for consumer use more generally, not to mention surveillance technologies such as so-called smart cities.

Indeed, we argue—perhaps for the first time in modern conversations about privacy—that in many cases privacy is safety, and that, in practice, United States law already recognizes this fact. Although the connection rarely figures in contemporary conversations about privacy, the relationship is implicitly recognized in a substantial but diverse body of U.S. law that protects privacy as a direct means of protecting safety. As evidence we offer a survey of the ways in which U.S. law already recognizes that privacy is safety, or at least that privacy enhances safety. Following modern reformulations of Alan Westin’s four zones of privacy, we explore the safety-enhancing privacy protections within the personal, intimate, semi-private, and public zones of life, and find examples in each zone, although cases in which privacy protects physical safety seem particularly frequent. We close by noting that new technologies such as the Internet of Things and connected cars create privacy gaps that can endanger their users’ safety, suggesting the need for new safety-enhancing privacy rules in these areas.

By emphasizing the deep connection between privacy and safety, we seek to lay a foundation for planned future work arguing that U.S. administrative agencies with a safety mission should make privacy protection one of their goals.
Enjoy!

Posted in Law: Privacy, Writings | 1 Comment

In America TV Watches You

I find this creepy:

First came product placement. In exchange for a payment, whether in cash, supplies or services, a TV show or a film would prominently display a brand-name product.

Then there was virtual product placement. Products or logos would be inserted into a show during editing, thanks to computer-generated imagery.

Now, with the rise of Netflix and other streaming platforms, the practice of working brands into shows and films is likely to get more sophisticated. In the near future, according to marketing executives who have had discussions with streaming companies, the products that appear onscreen may depend on who is watching.

Tiffany Hsu, You See Pepsi, I See Coke: New Tricks for Product Placement (NYT)

It is creepy, right? I’m not just being cranky?

Posted in Law: Privacy, Sufficiently Advanced Technology | Leave a comment

New Paper–“Big Data: Destroyer of Informed Consent”

Just posted: A near-final draft of my latest paper, Big Data: Destroyer of Informed Consent. It will appear later this year in a special joint issue of the Yale Journal of Health Policy, Law, and Ethics and the Yale Journal of Law and Technology.

Here’s the tentative abstract (I hate writing abstracts):

The ‘Revised Common Rule’ took effect on January 21, 2019, marking the first change since 2005 to the federal regulation that governs human subjects research conducted with federal support or in federally supported institutions. The Common Rule had required informed consent before researchers could collect and use identifiable personal health information. While informed consent is far from perfect, it is and was the gold standard for data collection and use policies; the standard in the old Common Rule served an important function as the exemplar for data collection in other contexts.

Unfortunately, true informed consent seems incompatible with modern analytics and ‘Big Data’. Modern analytics hold out the promise of finding unexpected correlations in data; it follows that neither the researcher nor the subject may know what the data collected will be used to discover. In such cases, traditional informed consent in which the researcher fully and carefully explains study goals to subjects is inherently impossible. In response, the Revised Common Rule introduces a new, and less onerous, form of “broad consent” in which human subjects agree to as varied forms of data use and re-use as researchers’ lawyers can squeeze into a consent form. Broad consent paves the way for using identifiable personal health information in modern analytics. But these gains for users of modern analytics come with side-effects, not least a substantial lowering of the aspirational ceiling for other types of information collection, such as in commercial genomic testing.

Continuing improvements in data science also cause a related problem, in that data thought by experimenters to have been de-identified (and thus subject to more relaxed rules about use and re-use) sometimes proves to be re-identifiable after all. The Revised Common Rule fails to take due account of real re-identification risks, especially when DNA is collected. In particular, the Revised Common Rule contemplates storage and re-use of so-called de-identified biospecimens even though these contain DNA that might be re-identifiable with current or foreseeable technology.

Defenders of these aspects of the Revised Common Rule argue that ‘data saves lives’. But even if that claim is as applicable as its proponents assert, the effects of the Revised Common Rule will not be limited to publicly funded health sciences, and its effects will be harmful elsewhere.

This is my second foray into the deep waters where AI meets Health Law. Plus it’s well under 50 pages! (First foray here; somewhat longer.)

Posted in AI, Law: Privacy, Writings | Leave a comment