First came product placement. In exchange for a payment, whether in cash, supplies or services, a TV show or a film would prominently display a brand-name product.
Then there was virtual product placement. Products or logos would be inserted into a show during editing, thanks to computer-generated imagery.
Now, with the rise of Netflix and other streaming platforms, the practice of working brands into shows and films is likely to get more sophisticated. In the near future, according to marketing executives who have had discussions with streaming companies, the products that appear onscreen may depend on who is watching.
Just posted: A near-final draft of my latest paper, Big Data: Destroyer of Informed Consent. It will appear later this year in a special joint issue of the Yale Journal of Health Policy, Law, and Ethics and the Yale Journal of Law and Technology.
Here’s the tentative abstract (I hate writing abstracts):
The ‘Revised Common Rule’ took effect on January 21, 2019, marking the first change since 2005 to the federal regulation that governs human subjects research conducted with federal support or in federally supported institutions. The Common Rule had required informed consent before researchers could collect and use identifiable personal health information. While informed consent is far from perfect, it is and was the gold standard for data collection and use policies; the standard in the old Common Rule served an important function as the exemplar for data collection in other contexts.
Unfortunately, true informed consent seems incompatible with modern analytics and ‘Big Data’. Modern analytics hold out the promise of finding unexpected correlations in data; it follows that neither the researcher nor the subject may know what the data collected will be used to discover. In such cases, traditional informed consent in which the researcher fully and carefully explains study goals to subjects is inherently impossible. In response, the Revised Common Rule introduces a new, and less onerous, form of “broad consent” in which human subjects agree to as varied forms of data use and re-use as researchers’ lawyers can squeeze into a consent form. Broad consent paves the way for using identifiable personal health information in modern analytics. But these gains for users of modern analytics come with side-effects, not least a substantial lowering of the aspirational ceiling for other types of information collection, such as in commercial genomic testing.
Continuing improvements in data science also cause a related problem, in that data thought by experimenters to have been de-identified (and thus subject to more relaxed rules about use and re-use) sometimes proves to be re-identifiable after all. The Revised Common Rule fails to take due account of real re-identification risks, especially when DNA is collected. In particular, the Revised Common Rule contemplates storage and re-use of so-called de-identified biospecimens even though these contain DNA that might be re-identifiable with current or foreseeable technology.
Defenders of these aspects of the Revised Common Rule argue that ‘data saves lives’. But even if that claim is as applicable as its proponents assert, the effects of the Revised Common Rule will not be limited to publicly funded health sciences, and its effects will be harmful elsewhere.
Consent, that is ‘notice and choice,’ is a fundamental concept in the U.S. approach to data privacy, as it reflects principles of individual autonomy, freedom of choice, and rationality. Big Data, however, makes the traditional approach to informed consent incoherent and unsupportable, and indeed calls the entire concept of consent, at least as currently practiced in the U.S., into question.
Big Data kills the possibility of true informed consent because by its very nature one purpose of big data analytics is to find unexpected patterns in data. Informed consent requires at the very least that the person requesting the consent know what she is asking the subject to consent to. In principle, we hope that before the subject agrees she too comes to understand the scope of the agreement. But with big data analytics, particularly those based on Machine Learning, neither party to that conversation can know what the data may be used to discover.
I then go on to discuss the Revised Common Rule, which governs any federally funded human subjects research. The revision takes effect in early 2019, and it relaxes the informed consent rule in a way that will set a bad precedent for private data mining and research. Henceforth researchers will be permitted to obtain open-ended “broad consent” – i.e., “prospective consent to unspecified future research” – instead of obtaining informed consent, or even ordinary consent, on a case-by-case basis. That’s not a step forward for privacy or personal control of data, and although it’s being driven by genuine public health concerns the side-effects could be very widespread.
In #4 it says you can “stop Amazon from tracking your browsing” but in fact, if you go to the “Your Browsing History” page at Amazon, it appears to offer only to stop showing you your browsing history–it doesn’t actually say they’ll stop collecting it.
Even so, most or all of these steps are worth taking.
Find the EFF surveillance self-defense guide. It offers advice tailored for different groups that may have greater or lesser needs for privacy and defense (e.g. LGBTQ people, activists, journalists, lawyers).
Use VPNs — virtual private networks. And only use good ones – be careful about jurisdiction and policies:
The UM off-campus VPN is a valuable service, and good to protect against third parties … but not against UM. Does UM log your usage? Do they record your originating IP#? The sites you visit? Despite some frantic Google searches, I can’t tell — it seems they don’t say. I think therefore you have to assume they do. And if I were the UM General Counsel, my first instinct would probably be to say they need to do the logging to protect themselves.
Is your VPN service dirt-cheap or free? Does it cost only a few dollars for a lifetime subscription? There’s probably a reason for that, and your browsing history may be the actual product the company is selling to others.
Look for establishment in a democratic country with a strong commitment to the rule of law. Without that, even the best promises in the Terms of Service (ToS) not to log web page access or IP# and access times are meaningless. Note that VPNs in many, probably most, other countries are required to do some logging. (UM’s VPN page: https://it.miami.edu/a-z-listing/virtual-private-network/index.html)
Does the VPN promise to prevent DNS leakage to your ISP?
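One rough way to see where your DNS queries are headed is to check which resolvers your system is configured to use; after connecting to a VPN, they should normally be the VPN’s resolvers, not your ISP’s. A minimal Unix-only sketch in Python — it just parses /etc/resolv.conf, so it will not catch applications that bypass the configured resolver:

```python
def configured_resolvers(path: str = "/etc/resolv.conf") -> list:
    """Return the nameserver IPs listed in a resolv.conf-style file.

    A rough check only: real DNS-leak tests also watch live traffic.
    """
    servers = []
    try:
        with open(path) as fh:
            for line in fh:
                parts = line.split()
                # resolv.conf lines look like: "nameserver 10.8.0.1"
                if len(parts) >= 2 and parts[0] == "nameserver":
                    servers.append(parts[1])
    except OSError:
        pass  # file missing (e.g. non-Unix system)
    return servers
```

If the list still shows your ISP’s resolvers after the VPN connects, DNS is leaking.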
Ideally, the VPN should support IPv6 as well as IPv4 to prevent leakage when the remote site is on IPv6. This will become more important in the future as more and more sites move to IPv6.
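One quick way to see whether IPv6 is even in play for a given site is to check whether its hostname publishes an IPv6 (AAAA) address. A small stdlib-only sketch — which hostname you test is up to you:

```python
import socket

def resolves_over_ipv6(host: str) -> bool:
    """Return True if `host` resolves to at least one IPv6 address."""
    try:
        return bool(socket.getaddrinfo(host, None, socket.AF_INET6))
    except socket.gaierror:
        return False  # no AAAA record, or name doesn't resolve
```

If a remote site resolves over IPv6 but your VPN only tunnels IPv4, traffic to that site may go around the tunnel.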
Inspect your browser settings on your phone and computer and set maximum privacy options (including blocking third-party cookies and enabling Do Not Track). Use a privacy-hardened browser on your phone, such as the Warp browser. On both computer and phone, always use a search engine, such as DuckDuckGo, that will not track you.
Encrypt every drive, every email (when possible), and especially all cloud-stored data before uploading it.
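For cloud storage, one common approach is to encrypt each file locally — for example with GnuPG’s symmetric mode — and upload only the ciphertext, so the provider never sees the plaintext. A hedged sketch, assuming the `gpg` binary is installed; the filename is just an example:

```python
import subprocess

def gpg_encrypt_command(path: str) -> list:
    """Build a GnuPG command that symmetrically encrypts `path` with
    AES-256, prompting for a passphrase and writing `path + '.gpg'`."""
    return ["gpg", "--symmetric", "--cipher-algo", "AES256",
            "--output", path + ".gpg", path]

# To actually run it (requires GnuPG on your PATH):
#   subprocess.run(gpg_encrypt_command("notes.txt"), check=True)
# Then upload only notes.txt.gpg, never notes.txt.
```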
Get a password manager and use it – never re-use a password. Use 2-factor authentication for Google and any other services that support it. (Only 10% of Google users do!)
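The six-digit codes most authenticator apps generate follow the TOTP standard (RFC 6238): an HMAC of the current 30-second time interval, truncated to a few digits. A stdlib-only sketch, just to show there is no magic involved:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Compute an RFC 6238 time-based one-time password from a
    base32-encoded shared secret (the string inside a 2FA QR code)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Counter = number of `step`-second intervals since the Unix epoch.
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because both sides derive the code from a shared secret plus the clock, an attacker with only your password still cannot log in.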
Don’t put any apps on your phone that connect to anything financial (due to risk of ID theft if phone stolen).
Lobby UM to make it easier to use VPNs and Tor, on both the wired and wireless networks. Ask UM to be more transparent about what cookies its web pages set and what they track and record. And, importantly, ask UM not to require you to accept every single UM cookie in order to use the “remember me for 30 days” feature of its authentication app DUO. Also, ask UM to promise that it has your back, and that it will challenge any request for your data to the maximum extent the law allows (right now it makes no such promises at all; even National Security Letters are sometimes withdrawn if the data-holding entity says it will go to court to ask for them to be reviewed).
Resist the frame: understand that the true definition of the ‘greater good’ is one in which the individual is able to flourish. Remember that ‘terrorist’ is a label that fits best after conviction – before that what we have is a ‘suspect’; conceivably any of us can be a suspect. So arguments that we should control crypto or prevent privacy in order to give law enforcement access to all our data when they decide they need it should be viewed with great caution and a firm eye on how the powers they want could be misused by them or by others who get hold of their tools. And even if we someday find ourselves in a world where things have gone badly wrong, and we do find ourselves subject to pervasive surveillance, follow Vaclav Havel, who in his great work ‘Living in Truth’ reminded us that so long as we choose not to self-censor we have chosen not to surrender a key part of our freedom.