“Once they lose trust in their leaders, in mainstream media, in political institutions, in each other, in the possibility of truth, the game’s won.”
Barack Obama, April 2022[1]
Regulation (EU) 2022/2065, the Digital Services Act or “DSA”, was published in the Official Journal of the EU in October 2022 and entered into force the following month. This wide-ranging EU regulation will revolutionize the way in which online platforms conduct their business in the European Union and poses a significant compliance challenge for many providers of intermediary services, who must now make sweeping adjustments to internal procedures, online interfaces, and reporting practices, among many other aspects.
The DSA pursues the goal of ensuring a safe, predictable, and trustworthy online environment and does so, essentially, through a focus on online moderation of illegal content, a concept which is construed broadly, to cover information relating to illegal content, products, services, and activities. It may be the case that certain information is, in itself, unlawful (e.g., illegal hate speech or terrorist content), or is rendered unlawful because it relates to illegal activities (e.g., sharing of content depicting child sexual abuse, online stalking, sale of counterfeit goods or content piracy).
Much has been made of the dangers of mis- and disinformation, but the risks the new DSA framework poses to diversity of opinion have mostly been swept under the rug. This is dangerous, both because mechanisms introduced by the DSA may, depending on how they are implemented, be used to “shut down” dissenting opinion, and because it fails to acknowledge that there is a crisis of confidence well upstream of the perceived crisis of content.
Trust in mainstream and officially sanctioned sources is broken, at least in Western nations. In Europe, one intuitively notices a growing mistrust of national governments and of the European Commission. Not only do people remember a stream of “official” lies and scandals (weapons of mass destruction in Iraq, Mr. Juncker’s “when it gets serious, you have to lie” – sadly, the list of examples could go on), but basic transparency is also being slowly but surely replaced by secrecy and obfuscation.
Two examples from last year, one from the US and the other closer to home, in the EU – both related to the Covid-19 pandemic and almost entirely ignored by mainstream media – are a glaring illustration of this worrying trend.
In August 2021, a group of physicians and scientists organized as the Public Health and Medical Professionals for Transparency (PHMPT) submitted a Freedom of Information Act (FOIA) request to the U.S. Food and Drug Administration (FDA) seeking access to all relevant information and documents pertaining to the licensing procedure of Pfizer/BioNTech’s Covid-19 vaccine[2]. Following the FDA’s refusal to comply with an expedited procedure, the plaintiffs sued the agency in the federal court for the Northern District of Texas.
In its response to the claim, the FDA proposed to release the requested documents at a rate of 500 pages per month[3], which meant it would have taken 55 years before the entire file had been made public: “FDA proposes to process and produce the non-exempt portions of responsive records at a rate of 500 pages per month. This rate is consistent with processing schedules entered by courts across the country in FOIA cases. Plaintiff’s request (as set forth below) that FDA process and produce the non-exempt portions of more than 329,000 pages in four months would force FDA to process more than 80,000 pages per month.”
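For readers who want to check the figure, the 55-year horizon is simple arithmetic on the numbers quoted in the FDA’s own filing (a back-of-envelope sketch, using the page counts cited above):

```python
# Back-of-envelope check of the timeline implied by the FDA's proposal:
# ~329,000 responsive pages, released at 500 pages per month.
total_pages = 329_000
pages_per_month = 500

months = total_pages / pages_per_month  # 658 months
years = months / 12                     # ~54.8 years
print(f"{months:.0f} months, i.e. roughly {years:.0f} years")  # → 658 months, i.e. roughly 55 years
```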
That 55-year delay compares with the approximately 108 days it took the FDA to review the same documentation before deciding to authorize the Pfizer mRNA vaccine – an impressive discrepancy. In January 2022, however, the court ordered the FDA to produce the entire file in approximately eight months, an order with which the agency apparently complied.
Closer to home, Ursula von der Leyen, President of the European Commission, was heavily criticised by the EU Ombudsman, Emily O’Reilly, for failing to produce – or even search for – text messages she exchanged with the CEO of Pfizer while negotiating the procurement of Covid-19 vaccines in the first half of 2021.
According to the New York Times[4], the Commission President corresponded intensively with the CEO of this pharmaceutical company by text message, whilst negotiating the deal that would secure 1.8 billion doses of the Pfizer/BioNTech Covid-19 vaccine for distribution in the EU. On May 4, 2021, a journalist submitted a request for access to documents under Regulation (EC) No. 1049/2001 encompassing “text messages and other documents relating to the exchange between President Ursula von der Leyen and Albert Bourla, the chief executive of Pfizer, since January 1, 2021”. In response, the Commission gave full or partial access to certain documents but said it held no text messages.
As the EU Ombudsman’s Decision noted, in July 2022, its inquiry “showed that the Commission had asked its President’s private office (cabinet) to search only for documents that fulfil the Commission’s recording criteria. As the Commission does not register text messages, the search did not yield any results. Thus, the Commission had not attempted to identify any text messages beyond what had been registered in its record management system, and it had therefore not even assessed whether any such text messages should be disclosed.” In other words, the Commission chose to construe the access request restrictively, avoiding the infamous text messages like the plague: the circular logic seems to have been that only relevant information sources are registered and, since text messages are not registered, they must not, as a rule, contain relevant information.
Indeed, in a reply to a question from the EU Parliament, the Commission Vice-president and Commissioner for Values and Transparency (quite the misnomer…), Vera Jourová, actually said the following: “Due to their short-lived and ephemeral nature, text and instant messages are not meant to contain important information relating to policies, activities and decisions of the Commission; they therefore neither qualify as a document subject to the Commission record-keeping policy nor are they falling within the scope of Regulation 1049/2001 on access to documents.”[5] Well, perhaps Mrs. von der Leyen should have avoided texting to negotiate one of the largest EU deals ever, then.
It’s hardly a surprise that, in a decision dated July 12, 2022[6], the EU Ombudsman confirmed maladministration by the Commission in failing to conduct a thorough search on whether the text messages in question existed and, if so, in producing them. And, coincidentally or not, the European Public Prosecutor’s Office later confirmed in October 2022 that an investigation was ongoing into “the acquisition of COVID-19 vaccines in the European Union”[7]. And in January of this year, the New York Times brought legal action against the Commission for failing to give access to the controversial text messages (Case T-36/23, Stevi and The New York Times v. Commission).[8]
Now, this is the backdrop against which the DSA has been designed and is expected to operate as a vehicle to enhance online safety, predominantly by promoting content moderation. Commissioner Breton has lost no opportunity to drive home the high expectations the Commission has for this landmark piece of legislation, for instance by making it clear to Elon Musk last December that he expects Twitter 2.0 to fully comply and that “there is still huge work ahead, as Twitter will have to implement transparent user policies, significantly reinforce content moderation and protect freedom of speech, tackle disinformation with resolve, and limit targeted advertising.”[9]
In my view, the challenge of rebuilding trust in governments and public agencies must not be underestimated if implementation of the DSA is to yield better (fairer, faster, and less arbitrary) results than the previous framework. The technicalities of notice-and-action mechanisms should not be relied on to trivialize content removal, at the risk of further undermining trust and transparency. And some new roles introduced by the DSA – such as that of “trusted flagger” – must be viewed with caution to prevent a gatekeeping, or mainstreaming, effect on access to information.
Rebuilding trust is essential and that will require inclusive and grown-up content moderation policies and, most of all, more transparent public institutions and corporations. Will the DSA be up to the task of promoting safety without harming transparency? Fingers crossed.
