The European Commission published on December 15, 2020, its Proposal for a Regulation on a Single Market For Digital Services (Digital Services Act, “DSA”). This new horizontal framework aims to be a lex generalis, applying without prejudice to the e-Commerce Directive, adopted in 2000, and to the Audiovisual Media Services Directive, revised in 2018.
This new Regulation is likely to dramatically change how online platforms, particularly social media platforms, moderate content posted by their users. Very large platforms, defined by the DSA as those providing services to an average of 45 million or more monthly active users in the European Union, will face heightened responsibilities, such as assessing and mitigating the risks of disseminating illegal content. Whether this heightened scrutiny will lead illegal speech to migrate to smaller platforms remains to be seen.
Each Member State will designate its Digital Services Coordinator, the primary national authority ensuring the consistent application of the DSA. Article 47 of the DSA would establish the European Board for Digital Services, which will independently advise the Digital Services Coordinators on supervising providers of intermediary services. A few months earlier, the creation of Facebook’s Oversight Board (“the Oversight Board”) was met with skepticism, but also with hope and great interest. Dubbed by some “Facebook’s Supreme Court,” it is an intriguing body, appearing to be both a corporate board of the U.S. public company Facebook and a private court of law, with powers to regulate the freedom of expression of Facebook and Instagram users around the world. The main mission of this independent body is to issue recommendations on Facebook’s content policies and to decide whether Facebook may, or may not, keep or remove content published on its two platforms, Facebook and Instagram. The Oversight Board’s decisions are binding on Facebook, unless implementing them would violate the law.
The law of the States thus remains the ultimate arbiter of the illegality of content. However, this legal corpus is far from homogeneous. Both the European Court of Justice and the European Court of Human Rights have addressed the issue of illegal content. In the European Union, several instruments, such as the revised Audiovisual Media Services Directive and the upcoming Regulation on preventing the dissemination of terrorist content online, address harmful and illegal content. Member States each have their own legal definitions. France is currently anticipating the DSA by amending its Law of June 21, 2004 for Confidence in the Digital Economy. If the French bill becomes law, platforms will have to make public the resources devoted to the fight against illegal content, and will have to implement procedures, as well as human and technological resources, to inform judicial or administrative authorities, as soon as possible, of the actions taken following an injunction issued by these courts or authorities.
The DSA does not provide a substantive definition of illegal content, referring instead to Union and national law; rather, it aims at harmonizing the due diligence obligations of the platforms and how they address the issue. For instance, article 12 of the DSA is likely to influence how platforms’ terms and conditions are written, as it directs platforms to inform users publicly, clearly, and unambiguously about their content moderation policies and procedures, including the roles played by algorithms and by human review.
This requirement is particularly welcome, as the use of algorithms to control speech and amplify messages currently suffers from a lack of transparency and has credibly been accused of fostering bias: is algorithmic bias illegal content? The Oversight Board will only examine a few posts among the billions published each year, and the cases selected for review may be the proverbial tree hiding the forest: the use of algorithms to control speech by amplifying messages likely to trigger reactions, clicks, and likes, often monetizing hate and preventing minority speech from being heard. As freedom of expression is not only the right to impart information but also the right to receive it, are algorithms, which strive to keep us in well-delimited echo chambers, the ultimate violation of the freedom of expression?
Users are given a way to have a say about what should be considered illegal online speech, as both the DSA and the Oversight Board aim at giving users more power. The DSA would provide users with a user-friendly complaint and redress mechanism and with the right to an explanation when content they have posted is taken down by a platform. The Oversight Board uses a crowdsourcing scheme allowing users around the world to provide their opinion on a particular case before it is decided.
International users will also be given the power to define and shape what illegal content is. Is this a new form of private due process? The DSA would provide several new rights to users: the right to send a notice (article 14 on “Notice and action mechanisms”); the right to be provided an explanation (article 15 on “Statement of reasons”); the right to complain (article 17 on “Internal complaint-handling system”); the right to out-of-court redress (article 18 on “Out-of-court dispute settlement”); and a special right to flag, if considered particularly trustworthy (article 19 on “Trusted flaggers,” whose notices will be given more weight than those of common users).
The Oversight Board also gives powers to users: they may appeal a decision made by Facebook about content posted on its two platforms and are also given the opportunity to share their opinions, knowledge, or expertise in each case. Users are invited to explain local idioms and standards, shedding further light on what content should be considered illegal. As such, they are given an easy way to file an “amicus brief” comment.
Perhaps surprisingly, both the DSA and the Oversight Board take the fundamental rights of users as the ultimate standard of their mission. As stated in its Explanatory Memorandum, the DSA “seeks to improve users’ safety online across the entire Union and improve the protection of their fundamental rights.” More curiously, the Oversight Board has shed new light on international human rights. Its first decisions, issued on January 28, 2021, were based not on U.S. law or on the private rules of the Facebook platform, but on international human rights law. As such, it can be argued that these decisions were “based on European values – including the respect of human rights, freedom, democracy, equality and the rule of law,” the same values on which the DSA is founded. This research paper will analyze the decisions of the Oversight Board during its first year, including how the cases were chosen, where the authors of the speech at stake live, and the legal issues raised by the cases.
However, assessing the legality of content in accordance with the law remains the prerogative of the courts. The DSA provides them the right to be informed by the platforms of the effect given to their orders to act against specific illegal content (article 8). This power is, however, also granted by the DSA to administrative authorities. Will the administrative judge soon have more power than the judiciary to protect freedom of expression? The DSA may lead human rights to become a mere component of corporate compliance, overseen by administrative authorities, as is, arguably, already the case with the right to privacy under the GDPR. Would tort law and civil liability be more adequate to protect the rights of users?
Regardless of their differences, the DSA and the Oversight Board share a common goal: transparency. The DSA aims at setting a higher standard of transparency and accountability; its article 13 on “Transparency reporting obligations for providers of intermediary services” directs platforms to publish, at least once a year, a clear and comprehensible report on their content moderation practices. Very large online platforms will have to undergo external risk audits and meet public accountability requirements. The Oversight Board is committed to publicly sharing written statements explaining its decisions and their rationale.
This research paper will compare the place and power the Oversight Board’s case law and recommendations give to victims of illegal speech with the place and power the platforms give such victims in their DSA compliance practice. It will also examine how Facebook has implemented the Oversight Board’s decisions. Are the future European Union Digital Services Act and the corporate Oversight Board at the service of users, or are they overseeing them?