This episode of the Cyberlaw Podcast delves into the use of location technology in two big events – the surprisingly outspoken lockdown protests in China and the January 6 riot at the U.S. Capitol. Both were seen as big threats to the government, and both produced aggressive police responses that relied heavily on government access to phone location data. Jamil Jaffer and Mark MacCarthy walk us through both stories and respond to the provocative question: What's the difference? Jamil's answer (and mine, for what it's worth) is that the U.S. government gained access to location information from Google only after a multi-stage process meant to protect innocent users' information, and that there is now a court case that will determine whether the government actually did protect users whose privacy should not have been invaded.
Whether we should be relying on Google's made-up and self-protective rules for access to location data is a separate question. It becomes more pointed as Silicon Valley has started imposing a set of self-protective penalties on companies that assist law enforcement in gaining access to phones that Silicon Valley has made inaccessible. The movement to punish law enforcement access providers has moved from trashing companies like NSO, whose technology has been widely misused, to punishing companies on a lot less evidence. This week, TrustCor lost its certificate authority status mostly for appearing suspiciously close to the National Security Agency, and Google outed Variston of Spain for ties to a vulnerability exploitation system. Nick Weaver is there to hose me down.
The UK is working on an online safety bill, likely to be finalized in January, Mark reports, but this week the government agreed to drop its direct regulation of "lawful but awful" speech on social media. The step was a symbolic victory for free speech advocates, but the details of the bill before and after the change suggest it was more modest than the brouhaha implied.
The Department of Homeland Security’s Cyber Security and Infrastructure Security Agency (CISA) has finished taking comments on its proposed cyber incident reporting regulation. Jamil summarizes industry’s complaints, which focus on the risk of having to file multiple reports with multiple agencies. Industry has a point, I suggest, and CISA should take the other agencies in hand to agree on a report format that doesn’t resemble the State of the Union address.
It turns out that the collapse of FTX is going to curtail a lot of artificial intelligence (AI) safety research. Nick explains why, and offers reasons to be skeptical of the “effective altruism” movement that has made AI safety one of its priorities.
Today, Jamil notes, the U.S. and EU are getting together for a divisive discussion of the U.S. subsidies for electric vehicles (EV) made in North America but not Germany. That's very likely a World Trade Organization (WTO) violation, I offer, but one that pales in comparison to thirty years of WTO-violating threats to constrain European data exports to the U.S. When you think of it as retaliation for the use of the General Data Protection Regulation (GDPR) to attack U.S. intelligence programs, the EV subsidy is easy to defend.
I ask Nick what new we learned this week from Twitter. His answer – that Elon Musk doesn't understand how hard content moderation is – doesn't exactly come as news. Nor, really, does most of what we learned from Matt Taibbi's review of Twitter's internal discussion of the Hunter Biden laptop story and whether to suppress it. Twitter doesn't come out of that review looking better. It just looks bad in ways we already suspected were true. One person who does come out of the mess looking good is Rep. Ro Khanna (D., Calif.), who vigorously advocated that Twitter reverse its ban, on both prudential and principled grounds. Good for him.
Speaking of San Francisco Dems who surprised us this week, Nick notes that the San Francisco Board of Supervisors approved the use of remote-controlled bomb "robots" to kill suspects. He does not think the robots are fit for that purpose.