August 23, 2022
The "Platforms and the '2040 Problem'" project is part of KGRI's "2040 Independence and Self-Respect Project." As part of this project, the "Study Group on Democracy in the Digital Age" was convened, welcoming Albert Ingold, a constitutional law and media law scholar from the University of Mainz in Germany.
In Germany, the concept of the "public sphere (Öffentlichkeit)" refers to an intermediate domain between the purely private sphere and the sphere of state authority. In addition to denoting a civil society sustained by autonomous individuals, it is also used to describe a communication space for the formation of public opinion, in which anyone can participate in autonomous and rational discussion. Historically, it originated in the coffee houses of 18th and 19th century England and the salons of France, and later transformed with the emergence of print media such as newspapers, and then broadcast media. Today, the development of the internet and social media is said to have given rise to a new "digitized public sphere (digitalisierte Öffentlichkeiten)."
But what exactly is this digitized public sphere? Furthermore, in the legal domain, what is the significance of the public sphere and its transformation, and what is the optimal form of its regulation? Professor Ingold's lecture and a discussion with KGRI members addressed these unavoidable issues when considering platforms, democracy, and freedom of expression. This report provides a digest of the content.
Speaker: Albert Ingold
Professor, Faculty of Law, University of Mainz. Born in Bielefeld, Germany, in 1980. After studying law at Humboldt University of Berlin, he served as an assistant at the same university and obtained his doctorate in law in 2007. After completing his legal clerkship, he passed the second state examination in law in 2008. He became an assistant at the University of Munich in 2009 and has been in his current position since 2017. His specializations are constitutional law and media law.
Moderator and Interpreter: Tomoaki Kurishima
Associate Professor, Graduate School of Humanities and Social Sciences, Saitama University. Born in Tokyo in 1989. He majored in constitutional law at the Graduate School of Law, Keio University, and studied abroad at the Faculty of Law, University of Munich, from 2015 to 2017 as a German Academic Exchange Service (DAAD) scholar. He served as an assistant professor at the same graduate school from 2018 and has been in his current position since 2019. He also serves as a council member of the German-Japanese Association of Jurists (DJJV).
Lecture: Albert Ingold, "A Typology of Public Sphere Regulation for Internet Platform Operators in Germany and the EU"
Tomoaki Kurishima: I am Kurishima, and I will be serving as today's moderator and interpreter. We will proceed with our discussion with Professor Albert Ingold from the University of Mainz in Germany, who researches issues of the internet and the digitized public sphere.
(The following is an edited excerpt from the lecture)
Albert Ingold: Today, I will speak about the "regulation of the public sphere" in Germany and the EU. The premise is the change in the social discourse space brought about by the internet. In Germany, the so-called "Social Media Countermeasures Act" (officially the Network Enforcement Act, NetzDG/Netzwerkdurchsetzungsgesetz) was enacted in 2017, obligating social media operators of a certain size to take measures against illegal content, including hate speech and defamatory expressions. Behind this is the idea that online communication, including aggressive debates and fake news, affects the political public sphere in a democratic society. The question, in turn, is how to balance this with the protection of individual personality rights.
First, regarding the position of the public sphere in the German legal system, multiple perspectives are possible from a constitutional law standpoint depending on the context, but in any case, it must be understood as something that changes with societal progress. Particularly in the digital domain, we must first correctly recognize how the structure of social communication is changing and then consider how norms should be structured.
There are two major problems to consider when thinking about regulating the public sphere. One is that communication in the digital domain is extremely broad and complex, and research on its substance is still insufficient. The interpretive theory concerning risk (Risiko) provides the legal leeway to respond to this situation. This allows for precautionary measures to be taken even for matters that have not been scientifically proven, for the purpose of protecting legal interests and preventing crime.
The other is that due to the influence of the digitized network structure, the approach focused on ensuring diversity, as has been considered for public broadcasting, has become difficult. Based on this situation, there is a growing movement to shift the purpose of regulation to guaranteeing individual autonomy. Specifically, one approach is to regulate the tendency to pursue economic value by excessively driving attention on online news and social media.
Next, regarding possible regulatory methods. The first is to impose positive content obligations on platform operators. For example, this could include obligating them to prioritize the display of high-quality content when showing search results. However, the question here is how to balance this with ensuring diversity.
Second, negative content obligations. This could include obligating the removal of undesirable content, such as by filtering posts to see if they contain content that could lead to crime. This method is central to the EU's DSA (Digital Services Act), but how to protect the fundamental rights of each party—such as providers, content creators, and users—is being debated.
The third is to demand algorithmic transparency for communication infrastructure. This can be said to be a relatively easy method to apply.
These discussions are not yet settled and show how difficult it is in terms of legal policy to respond to the digitized public sphere. I look forward to discussing this with all of you as we continue to consider it with interdisciplinary knowledge.
Discussion: Challenges and Prospects of Internet Regulation, Considered from the German Case
(The following is an edited excerpt from the discussion)
Tatsuhiko Yamamoto (Professor, Keio University Law School; Deputy Director, KGRI): From my side, I would first like to ask about the difference between illegal content and harmful content. Assuming that platforms are obligated to take measures against illegal content like hate speech, to what extent should platforms be held responsible for content that is not illegal but is harmful, such as fake news? Second, please share your thoughts on the obligations that the EU's Digital Services Act (DSA) seeks to impose on very large online platforms like Google.
Ingold: The first question is a difficult issue that has long been debated. Should platform operators be allowed to remove whatever they deem harmful, or should they be obligated to respect users' fundamental rights, such as freedom of expression, and thus remove only what is truly illegal? In a case concerning Facebook, the Federal Court of Justice (Bundesgerichtshof) last year expressed the view that, while the operator's discretion is recognized, a high procedural hurdle should be imposed in relation to freedom of expression.
As a premise, we must consider what fundamental rights platform operators themselves possess. I personally believe that since these operators handle expressive content, they possess not only economic fundamental rights but also the fundamental right to communication related to freedom of expression.
Regarding the second question, I would note that the thinking behind the DSA proposal is said to have developed out of the EU's e-Commerce Directive, as civil legislation framed from a consumer-protection perspective.
Participant (Private Company): You mentioned granting a certain degree of discretion to platform operators. In Japan, too, initiatives are underway, such as publishing transparency reports, to clarify the principles behind their content moderation (monitoring and removing inappropriate content).
However, while there is a social demand for measures against fake news, I think it is very difficult to decide where to draw the line while maintaining a balance with democracy and the public space. If there are any relevant German cases that could serve as a reference, please let us know.
Ingold: It is indeed a difficult problem. We are required to respond without knowing whether moderating in line with social norms ultimately promotes social polarization or counteracts it. However, I believe it is more desirable for companies to make judgments on a case-by-case basis than for the state to draw a uniform line from above.
In Germany, there are cases of self-regulation by companies in which the state is also involved, often described as "regulated self-regulation." It is similar to "co-regulation" in Japan, but I think its characteristic feature is that the rules created by companies are entrusted to a third-party body, and the state also accepts them.
A typical example of this is the "Presserat" (German Press Council), a monitoring body established by the various press companies. As for the internet, while there is potential to extend the provision concerning the "duty of care to be met by journalists" in Article 19 of the Interstate Media Treaty (Medienstaatsvertrag) into "regulated self-regulation," it is not yet functional at this point. From the perspective of platform operators, being regulated by the state would reduce their own obligations, so I think they may be waiting for the government to act.
Kodai Hakeyama (Lecturer, Faculty of Regional Co-creation, Kyushu Sangyo University): I sensed a similarity to consumer law in the understanding that the approach is shifting from content-based regulation to attention-based regulation. In other words, it is the idea of carrying over to the digital public sphere the consumer-law concept of protecting parties to contract types in which they are vulnerable to adverse effects from businesses.
Ingold: When considering the future of regulation, you are right to observe that consumer-protection and data-protection perspectives underlie the argument that moving from regulating content toward regulating attention is a natural progression. In a sense, the question is paternalistic: to what extent regulation can be imposed in order to protect people's capacity for free judgment is, I believe, an extremely difficult problem.
Haruna Kawashima (Project Associate Professor, KGRI): Regarding "regulated self-regulation," are conditions in Germany such that platform operators could unite as one body? I ask because the model of operators imposing ethical rules on themselves is easy to imagine where professionals in a field come together to create their own rules, as in a medical association. Yet even in such fields there are cases, as in France, where state regulation is central. Is voluntary discipline by operators possible in Germany or the EU? Also, in Germany, is regulation based on state-enforced journalist deontology (professional ethics) applied to network operators as well?
Ingold: As for the media situation in Germany, various newspaper companies are moving online one after another, much of the information provided by public broadcasting is also published online, and "T-Online," operated by Deutsche Telekom, is widely used as a news feed. Unlike traditional media entities such as newspaper companies and broadcasting stations, new internet operators currently try to evade regulation as much as possible by not positioning themselves as media providers or news operators.
In response to this trend, there have also been changes in the movement toward "regulated self-regulation" by professional industry groups. The "Interstate Media Treaty" has crafted its wording to say, "insofar as they are involved in editing, they must meet the standards of a journalist," in an attempt to extend its scope to internet platform operators. Considering these developments, I believe the future trend will be toward the establishment of journalist organizations and a move to mandate the participation of internet operators.
Kurishima: Our time is almost up. Professor Ingold, thank you very much.
Ingold: I would be glad if you could continue to keep me informed about the state of regulation in Japan, which I heard about today; it would be very meaningful for my own research. Thank you for today.
Held online on March 1, 2022
*Affiliations and titles are as of the time of the event.