I was interviewed on WLRN Radio’s Sundial program today about privacy issues relating to home assistants like Alexa, and other privacy topics such as the risks of using private DNA sequencing companies. They’ve published a transcript.
Author Archives: Michael Froomkin
The “lock her up” chants at Trump rallies were often scary and always disgusting. But turnabout doesn’t seem like the right play: even if it was meant ironically, and I rather doubt it was, I got no joy from the anti-Trump “lock him up” chants at Nationals Park during Game 5 of the World Series.
I think we need a new chant.
And, just in time to celebrate the first day of the impeachment hearings, I’ve got it, presented here as a poster or t-shirt worthy graphic:
I’ve just uploaded the final text of Big Data: Destroyer of Informed Consent which is due to appear Real Soon Now in a special joint issue of the Yale Journal of Health Policy, Law, and Ethics and the Yale Journal of Law and Technology. This pre-publication version has everything the final version will have except the correct page numbers. Here’s the abstract:
The ‘Revised Common Rule’ took effect on January 21, 2019, marking the first change since 2005 to the federal regulation that governs human subjects research conducted with federal support or in federally supported institutions. The Common Rule had required informed consent before researchers could collect and use identifiable personal health information. While informed consent is far from perfect, it is and was the gold standard for data collection and use policies; the standard in the old Common Rule served an important function as the exemplar for data collection in other contexts.
Unfortunately, true informed consent seems incompatible with modern analytics and ‘Big Data’. Modern analytics hold out the promise of finding unexpected correlations in data; it follows that neither the researcher nor the subject may know what the data collected will be used to discover. In such cases, traditional informed consent in which the researcher fully and carefully explains study goals to subjects is inherently impossible. In response, the Revised Common Rule introduces a new, and less onerous, form of “broad consent” in which human subjects agree to as varied forms of data use and re-use as researchers’ lawyers can squeeze into a consent form. Broad consent paves the way for using identifiable personal health information in modern analytics. But these gains for users of modern analytics come with side-effects, not least a substantial lowering of the aspirational ceiling for other types of information collection, such as in commercial genomic testing.
Continuing improvements in data science also cause a related problem, in that data thought by experimenters to have been de-identified (and thus subject to more relaxed rules about use and re-use) sometimes proves to be re-identifiable after all. The Revised Common Rule fails to take due account of real re-identification risks, especially when DNA is collected. In particular, the Revised Common Rule contemplates storage and re-use of so-called de-identified biospecimens even though these contain DNA that might be re-identifiable with current or foreseeable technology.
Defenders of these aspects of the Revised Common Rule argue that ‘data saves lives.’ But even if that claim is as applicable as its proponents assert, the effects of the Revised Common Rule will not be limited to publicly funded health sciences, and its effects will be harmful elsewhere.
An earlier version, presented at the Yale symposium which the conference volume memorializes, engendered significant controversy — the polite form of howls of rage in a few cases — from medical professionals looking forward to working with Big Data. Since even the longer final version is shorter, and if only for that reason clearer, than much of what I write, I wouldn’t be surprised if the final version causes some fuss too.
Via the NYT’s Criticize Israel? For Democratic Voters, It’s Now Fair Game comes news of a study showing that Jews donating to a Democratic Presidential candidate gave most often to … Pete Buttigieg?
According to an analysis by The Forward, a Jewish publication, roughly 5.5 percent of all donors to Democratic presidential candidates in the first half of this year were Jewish, and they accounted for 7 percent of all funds given. (The Forward used several methods, including a research tool called the Distinctive Jewish Names list, to identify donors who were most likely Jewish in campaign finance reports.)
The Forward found that the Democratic presidential candidate receiving the most donations from Jewish backers this cycle has been Mr. Buttigieg, even as he has issued sharp critiques of Mr. Netanyahu.
“You can be committed to the U.S.-Israel alliance without being supportive of any individual choice by a right-wing government over there,” Mr. Buttigieg said at the J Street conference on Monday.
Mr. Sanders, who would become the country’s first Jewish president if elected, railed against giving “carte blanche to the Israeli government, or for that matter to any government at all.”
I find this surprising. And not because of his position on Israel, which doesn’t sound all that different from Sen. Warren’s or even Sen. Sanders’s. It’s just that he seems, for all his undeniable linguistic virtuosity, willing to oppose progressive ideas, and relatively unprepared for the national and international aspects of the job compared to all the candidates who have held federal office.
One of the striking features of the US economy is how much concentration we’ve allowed in major industries: national monopolies, regional monopolies (e.g. cable), and oligopolies. Now economists are tying that trend to why raising the minimum wage doesn’t cause the job losses that microeconomics might predict: one of the things that makes market concentration profitable is that it allows firms to underpay workers (that is, pay them below their marginal productivity). So long as the minimum wage doesn’t rise to a level exceeding the value of the work, businesses rationally decide to hold on to their workers.
José Azar, Emiliano Huet-Vaughn, Ioana Marinescu, Bledi Taska, and Till von Wachter: Minimum Wage Employment Effects and Labor Market Concentration:
Why is the employment effect of the minimum wage frequently found to be close to zero? Theory tells us that when wages are below marginal productivity, as with monopsony, employers are able to increase wages without laying off workers, but systematic evidence directly supporting this explanation is lacking. In this paper, we provide empirical support for the monopsony explanation by studying a key low-wage retail sector and using data on labor market concentration that covers the entirety of the United States with fine spatial variation at the occupation-level. We find that more concentrated labor markets–where wages are more likely to be below marginal productivity–experience significantly more positive employment effects from the minimum wage. While increases in the minimum wage are found to significantly decrease employment of workers in low concentration markets, minimum wage-induced employment changes become less negative as labor concentration increases, and are even estimated to be positive in the most highly concentrated markets. Our findings provide direct empirical evidence supporting the monopsony model as an explanation for the near-zero minimum wage employment effect documented in prior work. They suggest the aggregate minimum wage employment effects estimated thus far in the literature may mask heterogeneity across different levels of labor market concentration.
Spotted via Brad DeLong.
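The monopsony logic above can be reduced to a very simple decision rule. Here is a minimal sketch with hypothetical numbers (the $15 marginal product and wage figures are illustrative, not from the paper):

```python
# Toy illustration of the monopsony argument: a profit-maximizing firm
# keeps a worker as long as the wage it must pay does not exceed the
# worker's marginal productivity (the value the worker produces).

def keeps_worker(marginal_product: float, wage: float) -> bool:
    """The firm retains the worker iff the wage is no greater than
    the worker's marginal product."""
    return wage <= marginal_product

# In a concentrated (monopsony) labor market, suppose the worker
# produces $15/hour of value but is paid only $10/hour.
marginal_product = 15.0

print(keeps_worker(marginal_product, 10.0))  # status quo wage: True
print(keeps_worker(marginal_product, 13.0))  # higher minimum wage, still below productivity: True
print(keeps_worker(marginal_product, 16.0))  # wage above productivity: False (layoff)
```

The point of the sketch: in the concentrated market there is a $5/hour gap between pay and productivity, so a minimum-wage increase anywhere inside that gap raises wages without costing jobs; in a competitive market where the wage already equals marginal product, the same increase would push the wage past the threshold.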
My brother has a new project: Press Watch, “a collaborative project to monitor political reporting and encourage more responsible, informed and informative campaign and government coverage before the 2020 election.”
About This Site
We’re entering a critical period in American politics and American political journalism is not up to the task. Donald Trump’s campaign and presidency have exposed and exploited chronic weaknesses like never before. And despite some progress, elite political press coverage insufficiently rebuts lies; normalizes abnormal behavior; asserts false equivalences; remains overly susceptible to spectacle, conflict, and gamesmanship; fails to contextualize the news with expertise – and on and on.
Over the past several years, a considerable number of expert groups, commissions, panels and individuals have voiced elements of what, writ large, is a fairly coherent and consistent critique of the current practice of political journalism at our major news outlets (see above). But on a day-to-day basis, it’s diffuse. Press Watch will aggregate, amplify, curate and centralize the consistent application of that critique by a network of smart, critical readers.
We’ve also identified some solutions, such as prominently rebutting misinformation; practicing radical transparency; holding politicians accountable to the citizens’ agenda; imbuing our work with civics lessons; pursuing solutions journalism; and encouraging civic engagement. But too much of our discussion of these solutions is theoretical. There’s an urgent need for practical, recreatable models and best practices.
The work product
- A four-day-a-week, real-time assessment of political coverage in the form of a column with critiques harvested from a wide network of expert readers. Our first publishing partner is Salon.com.
- Guided, goal-oriented workshops — physical and virtual, held in collaboration with journalism schools and other organizations — that dive into specific elements of political reporting and generate concrete deliverables including guidelines, examples, and recreatable models.
Political reporters are hard to influence. But they are more likely to respond to pressure if the critiques are reasoned, detailed, constant, and coming from respected members of their profession and other experts. They are more likely to do their jobs better if we offer them plausible alternative approaches that don’t create more work or risk. Meanwhile, a lively ongoing discussion of political coverage will encourage the public to read more critically.
The project lead
Dan Froomkin is a trailblazer in the area of online accountability journalism with 21 years of experience building, editing and contributing to websites including the Huffington Post, The Intercept, and the Nieman Foundation’s Watchdog Project. Over 12 years at the Washington Post, he served as Editor of the website and wrote its enormously popular White House Watch column, which aggregated and amplified insightful political coverage. He has taught online journalism at the Poynter Institute and the American University Graduate School of Communication.