
[Special Feature: Social Media and Society] Takayuki Matsuo: Practice of Social Media Legal Affairs—Taking Account Suspension Responses under the Information Platform Management Act as an Example

Published: October 7, 2025

Writer Profile

  • Takayuki Matsuo

    Attorney at Law; Project Associate Professor, Graduate School of Media and Governance

The author has written works such as "Theory and Practice of Defamation on the Internet in Recent Precedents" *1 and practices social media legal affairs, also known as SNS legal affairs. Social media legal affairs cover a wide range of topics; recently, for example, SNS elections, deepfakes, and automated posting accounts known as "bots" have become hot topics. This article, however, focuses on account suspension *2.

1. The Three Perspectives in Social Media Legal Affairs

In social media legal affairs, there are broadly three perspectives: the subject, the expresser, and the operator.

The subject is often called the "victim": the person who claims to have suffered harm from an expression on social media. While such persons are indeed often victims, their legal claims may not ultimately be established; this article therefore uses the neutral term "subject." Subjects seek post deletion, disclosure of the sender's identity, apologies, and prevention of recurrence (including account suspension).

The expresser is often called the "user" or "sender": the person who engages in activities such as posting or streaming on social media. From the subject's perspective, they may be called the "perpetrator." As expressers, they generally do not want their posts deleted by the operator (though they often delete indefensible posts themselves). Furthermore, from a privacy standpoint, they do not want their anonymous accounts unmasked. And since they treat the social media in question as their "place of belonging," they strongly reject (unjust) account suspension, which deprives them of that place.

The operator is sometimes called the social media operator or platform operator, referring to the entity that runs the social media in question. Operators receive requests for disclosure and deletion (account suspension requests) from subjects, while expressers demand that they do not disclose information, delete posts, or suspend accounts. They are truly in a position of being "caught in the middle" between the two parties.

As an attorney, I have handled practice from all of these perspectives.

2. The Need for a Certain Degree of Content Moderation and the Former Provider Liability Limitation Act

If the service an operator provides degenerates into "social media with poor public order," negative consequences follow, such as users leaving. Operators therefore wish to perform content moderation, such as deletion, on certain problematic posts. However, excessive deletion, disclosure, or account suspension invites strong protests from expressers; conversely, insufficient deletion, disclosure, or account suspension invites strong protests from subjects. Against this background, the former Provider Liability Limitation Act stipulated the requirements for deletion and disclosure by operators, ensuring that operators would not be held liable so long as they handled content moderation in accordance with those requirements.

Taking deletion as an example, unless there were "reasonable grounds to find that the operator could have known that the rights of others were being infringed by the distribution of information" (Article 3, Paragraph 1, Item 2 of the same Act), the operator was not liable to the subject for distributing the post. Furthermore, even if they actually deleted it, they were not liable to the expresser if there were "reasonable grounds to believe that the rights of others were being unjustly infringed by the distribution of information" (Paragraph 2, Item 1 of the same Article) *3.

In short, even before the Act was amended into the Information Platform Management Act, a system existed under which, if a subject requested deletion on the ground that a post was illegal and the operator had reasonable grounds to believe the post was infringing, the operator could delete it without fear of being held liable by the expresser.
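To make the two-sided structure of this safe harbor concrete, the following is a minimal Python sketch. The class and function names are my own illustrative shorthand for the statutory tests quoted above, not terms taken from the Act itself.

```python
from dataclasses import dataclass

# Minimal sketch of the former Act's two-sided safe harbor. Field and
# function names are illustrative shorthand, not statutory terms.

@dataclass
class PostAssessment:
    # Art. 3(1)(ii): reasonable grounds to find the operator could have known
    # that the post infringed another's rights.
    could_have_known_infringement: bool
    # Art. 3(2)(i): reasonable grounds to believe the post unjustly infringed
    # another's rights.
    reasonable_grounds_unjust_infringement: bool

def safe_from_subject_if_left_up(a: PostAssessment) -> bool:
    """Safe harbor toward the subject for NOT deleting, absent knowability."""
    return not a.could_have_known_infringement

def safe_from_expresser_if_deleted(a: PostAssessment) -> bool:
    """Safe harbor toward the expresser for deleting, given reasonable grounds."""
    return a.reasonable_grounds_unjust_infringement

# The situation described above: with reasonable grounds of infringement,
# deletion carries no fear of liability to the expresser.
assert safe_from_expresser_if_deleted(
    PostAssessment(could_have_known_infringement=True,
                   reasonable_grounds_unjust_infringement=True))
```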

II. Account Suspension Practice in the Era of the Information Platform Management Act

1. Practice of Account Suspension Response under the Former Act

As mentioned, while I believe a certain level of content moderation itself is necessary, the method used can significantly impact the rights of the expresser. Operators possess various means of content moderation, such as deleting individual posts, temporary account suspension, and (temporary) functional restrictions.

Among these many means, "(permanent) account suspension" has an extremely large negative impact on the expresser. In particular, terms of use often prohibit anyone who has received a permanent suspension from creating another account thereafter. In that sense, suspension is an extremely heavy penalty that forecloses future expression on that social media.

For example, even if an expresser has actually violated the terms of use and the terms provide for suspension as a penalty, it should not follow that a minor violation immediately justifies suspension. Such a penalty should be limited to cases where it is justified in comparison with other options; specifically, where the purpose of securing the expresser's compliance with the terms or of protecting the rights of the parties cannot be achieved by other, more lenient measures. Nevertheless, the problem that some operators resort to suspension rather readily has been pointed out for a long time. For example, in the 2024 Hyogo Prefecture gubernatorial election, the X account of Governor Saito's opponent was suspended multiple times during the election period. It has been suggested that X's AI algorithm may have been "tricked" by organized reporting, leading to the repeated suspensions *4.

A major problem is that some operators' terms of use or guidelines do not clearly define the criteria for choosing permanent suspension as a last resort and can be read as permitting arbitrary suspension at the operator's "discretion." In principle, suspending an account constitutes the operator's nonperformance of the social media service agreement, and strict requirements must be satisfied for that nonperformance to be justified. Even if such measures are stipulated in the terms of use, allowing them to be taken arbitrarily is unacceptable in light of the importance of social media services and the social responsibility of operators.

I have fought against operators on behalf of expressers for many years regarding inappropriate suspensions, including the unfair use of AI algorithms. For example, I represented a VTuber in a lawsuit against an international video streaming platform at the Tokyo District Court and won *5.

In such practical responses, I have achieved the lifting of many account suspensions through a two-step approach: first attempting to resolve the suspension through out-of-court communication and, if that proves impossible, pursuing legal proceedings. That is, if an unfair suspension is simply an error by an AI algorithm, notifying the operator that an attorney has been retained may prompt a human representative to review the case, notice the error, and lift the suspension. If the suspension is not lifted after such notification, initiating legal proceedings brings a Japanese attorney for the operator into the case, who can then investigate which post was judged to be a serious violation warranting suspension and why; this, too, may lead to the suspension being lifted.

As stated above, I recognize the necessity of content moderation itself and do not argue that "no content moderation should be done at all" *6. However, since content moderation methods range from those with a high impact on the expresser to those with a low impact, the measure chosen should be appropriate and proportionate to the seriousness of the expresser's conduct, as sketched below.
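To illustrate this proportionality principle, here is a minimal sketch of a "least restrictive measure" ladder in Python. The severity tiers, thresholds, and measure names are my own illustrative assumptions; nothing here is prescribed by the Act or by any operator's terms of use.

```python
from enum import IntEnum

class Measure(IntEnum):
    """Moderation measures ordered from least to most restrictive (illustrative)."""
    NONE = 0
    DELETE_POST = 1           # remove the individual post only
    RESTRICT_FEATURES = 2     # e.g., temporarily disable posting or streaming
    TEMPORARY_SUSPENSION = 3  # lock the account for a limited period
    PERMANENT_SUSPENSION = 4  # last resort: forecloses future expression

def choose_measure(violation_severity: int, prior_failed_measures: int) -> Measure:
    """Pick the least restrictive measure that can still secure compliance.

    Permanent suspension is reached only when lighter measures have
    demonstrably failed (proxied here by prior_failed_measures).
    """
    if violation_severity <= 0:
        return Measure.NONE
    if violation_severity == 1:
        return Measure.DELETE_POST
    if violation_severity == 2:
        return (Measure.RESTRICT_FEATURES if prior_failed_measures < 2
                else Measure.TEMPORARY_SUSPENSION)
    # Severe violations: still prefer a temporary suspension unless lighter
    # measures have repeatedly failed to achieve the terms' purpose.
    return (Measure.TEMPORARY_SUSPENSION if prior_failed_measures < 3
            else Measure.PERMANENT_SUSPENSION)

# Example: a first severe violation yields a temporary, not permanent, suspension.
assert choose_measure(3, prior_failed_measures=0) is Measure.TEMPORARY_SUSPENSION
```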

Furthermore, I understand the necessity of using AI for such content moderation. However, problems can arise, such as algorithms operating unfairly *7. Therefore, to prevent such problematic AI processing from being presented to the expresser as the company's official judgment, responses that build in a human verification process, such as the so-called "Centaur model," should be considered *8.
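As one way to picture such a human-in-the-loop arrangement, the following is a minimal sketch assuming a hypothetical AI classifier score and a review queue. None of these names correspond to any real platform's API; the point is only that the AI flags, while a human makes the final judgment that is presented as the company's decision.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Flag:
    post_id: str
    ai_score: float                        # hypothetical model confidence of a violation
    human_decision: Optional[bool] = None  # set only by a human reviewer

@dataclass
class ReviewQueue:
    """Centaur-style gate: the AI proposes, a human decides before any suspension."""
    pending: list = field(default_factory=list)

    def ai_flag(self, post_id: str, ai_score: float, threshold: float = 0.9) -> None:
        # The AI alone never suspends; high-scoring posts are merely queued.
        if ai_score >= threshold:
            self.pending.append(Flag(post_id, ai_score))

    def human_review(self, flag: Flag, violates_terms: bool) -> str:
        # Only the human decision becomes the company's official judgment.
        flag.human_decision = violates_terms
        return "suspend account" if violates_terms else "dismiss flag"

# Example: the AI flags a post, but a human reviewer overturns the flag.
queue = ReviewQueue()
queue.ai_flag("post-123", ai_score=0.95)
print(queue.human_review(queue.pending[0], violates_terms=False))  # "dismiss flag"
```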

2. The Information Platform Management Act

The Information Platform Management Act expects problematic posts to be deleted promptly, outside of legal proceedings, by mandating that certain large-scale operators establish systems for deletion and other measures.

Specifically, certain large-scale operators are designated and made subject to special obligations on the basis of three requirements: a scale requirement, such as "10 million or more monthly active users or 2 million or more monthly posts"; the technical feasibility of measures to prevent the transmission of infringing information; and not being a specified telecommunications service with a low risk of rights infringement. Currently, Google (YouTube), LY Corporation (Yahoo! Chiebukuro, LINE OpenChat, etc.), Meta (Facebook, Instagram), TikTok Pte. Ltd. (TikTok, TikTok Lite), X Corp. (X), Dwango (Niconico), CyberAgent (Ameba Blog), Shonan Seibu Home (Bakusai), and Pinterest Europe Limited (Pinterest) have been designated.

These large-scale operators are primarily required to respond to rights infringements on their platforms under the Information Platform Management Act as follows:

• Publication of methods for accepting deletion requests (Article 22): They are obligated to establish and publish online methods for accepting deletion requests from subjects. These methods must allow for requests in Japanese and must not impose an excessive burden on the subject.

• Conduct of infringement information investigations (Article 23): When a deletion request is made, they must conduct the necessary investigation without delay.

• Appointment of specialists (Article 24): To conduct necessary investigations without delay upon a deletion request, they must appoint specialists (such as attorneys) with sufficient knowledge and experience regarding responses to rights infringements.

• Notification of results to the subject (Article 25): They are obligated to notify the subject of whether deletion measures were taken and the reasons why, in principle within 7 days of the request. If measures are not taken, specific reasons must be provided to assist in any subsequent requests.

• Publication of deletion criteria (Article 26): They must publish their own deletion criteria and take deletion measures in accordance with them.

• Notification to the expresser (Article 27): If deletion measures are taken, they must notify the sender of that fact and the reasons without delay.

• Publication of implementation status (Article 28): They are obligated to publish their operational status once a year, including the status of accepting deletion requests, the status of notifications to subjects and expressers, and their own evaluations of these situations.

If these obligations are violated, the Minister of Internal Affairs and Communications can issue recommendations or orders to correct the violation (Article 30). If an order is violated without a justifiable reason, penalties such as imprisonment or fines for the individual, and fines for the corporation, will be imposed (Articles 35 and 37).
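To see how the obligations listed above chain together operationally, here is a minimal Python sketch of a deletion-request workflow. The class and function names are my own illustrative assumptions; the 7-day figure is the in-principle notification period under Article 25 described above.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class DeletionRequest:
    request_id: str
    received_on: date
    claim: str    # the subject's description of the alleged infringement

@dataclass
class Decision:
    deleted: bool
    reasons: str  # Art. 25: specific reasons are required when no measure is taken

def handle_request(req: DeletionRequest, investigate) -> Decision:
    # Art. 23: conduct the necessary investigation without delay
    # (Art. 24: in practice, with appointed specialists such as attorneys).
    decision = investigate(req)

    # Art. 25: notify the subject of the outcome and reasons, in principle within 7 days.
    deadline = req.received_on + timedelta(days=7)
    print(f"[to subject] deleted={decision.deleted}; reasons={decision.reasons}; by {deadline}")

    # Art. 27: if a deletion measure is taken, notify the expresser without delay.
    if decision.deleted:
        print(f"[to expresser] post removed; reasons={decision.reasons}")
    return decision

# Usage: a toy investigation applying the operator's published criteria (Art. 26).
toy = lambda r: Decision(deleted="defamatory" in r.claim, reasons="matched published criteria")
handle_request(DeletionRequest("R-1", date(2025, 10, 7), "defamatory post"), toy)
```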

3. The Information Platform Management Act and Account Suspension

Interestingly, the Information Platform Management Act contains provisions regarding account suspension.

Specifically, Article 2, Item 8 of the Act defines "measures to prevent the transmission of infringing information," which refers to so-called deletion *9.

In contrast, Item 9 of the same Article defines "measures to prevent the transmission of said information and to stop the provision of specified telecommunications services to the sender of said information" as "service provision suspension measures." These service provision suspension measures involve stopping the provision of the specified telecommunications service—in other words, not providing the service at all—and thus include the concept of account suspension. It should be noted that this is not limited to "permanent" suspension; temporary suspension is also conceptually included in service provision suspension measures.
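To keep the two statutory categories apart, the following minimal sketch models them as types. The field names and example values are my own illustrative framing of Article 2, Items 8 and 9, not the statute's wording.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class MeasureKind(Enum):
    # Art. 2, Item 8: stop distribution of a specific post ("deletion").
    TRANSMISSION_PREVENTION = "transmission prevention"
    # Art. 2, Item 9: stop providing the service itself to the sender.
    SERVICE_PROVISION_SUSPENSION = "service provision suspension"

@dataclass
class StatutoryMeasure:
    kind: MeasureKind
    target: str                   # a post ID or an account ID, depending on kind
    duration_days: Optional[int]  # for suspensions: None models "permanent";
                                  # both temporary and permanent fall under Item 9

# Examples: deleting one post vs. a 7-day lock vs. a permanent account suspension.
delete_post = StatutoryMeasure(MeasureKind.TRANSMISSION_PREVENTION, "post-123", None)
temporary_ban = StatutoryMeasure(MeasureKind.SERVICE_PROVISION_SUSPENSION, "account-9", 7)
permanent_ban = StatutoryMeasure(MeasureKind.SERVICE_PROVISION_SUSPENSION, "account-9", None)
```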

Furthermore, Article 26, Paragraph 2, Item 2 of the Act stipulates a duty of effort for large-scale operators when establishing criteria for deletion and other actions: "In cases where service provision suspension measures may be taken, the criteria for implementing service provision suspension measures shall be established as specifically as possible." *10

This applies only to large-scale operators and is merely a duty of effort. Nonetheless, it has the potential to be a more effective response to the current problems mentioned above, such as unclear suspension criteria or terms of use and community guidelines that allow for discretionary suspension.

This can be seen as an indirect stipulation that account suspension, as a last resort, should be implemented only when truly unavoidable and on the basis of specifically predefined criteria. I believe this is highly instructive for future account suspension response practice.

4. Challenges and Directions for Improvement of the Information Platform Management Act

As described, the Information Platform Management Act has introduced provisions pointing in a desirable direction regarding account suspension.

However, as mentioned in Section 3 above, establishing specific criteria for account suspension remains a mere duty of effort. Furthermore, at least on its wording, the provision only requires that criteria be "established as specifically as possible," without spelling out the specific content that must be included.

To address such challenges, for example, as part of the publication of implementation status mentioned in Section 2 above, Article 18, Paragraph 5, Item 8 of the Enforcement Regulations of the Information Platform Management Act requires publication of the number of account suspensions, categorized by reason and by the circumstances leading to the measures. In addition, Item 10 of the same paragraph requires publication of the number of account suspensions carried out using AI; Item 15, the number of appeals against account suspensions (Item 16 for those using AI); and Item 17, the number of withdrawals of account suspension measures (Item 18 for those using AI). I hope that the Information Platform Management Act and the practice under it will further improve on the basis of these published details.
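To make concrete what such a transparency report aggregates, here is a minimal sketch keyed to the Enforcement Regulation items just cited. The field names and the derived metric are my own illustrative mapping, not the regulation's wording.

```python
from dataclasses import dataclass, field

@dataclass
class SuspensionTransparencyReport:
    """Annual counts loosely mirroring Art. 18(5) of the Enforcement Regulations."""
    suspensions_by_reason: dict = field(default_factory=dict)  # Item 8
    suspensions_using_ai: int = 0           # Item 10
    appeals: int = 0                        # Item 15
    appeals_against_ai_decisions: int = 0   # Item 16
    withdrawals: int = 0                    # Item 17
    withdrawals_of_ai_decisions: int = 0    # Item 18

    def ai_withdrawal_rate(self) -> float:
        """Illustrative derived metric: share of AI-based suspensions later withdrawn."""
        if not self.suspensions_using_ai:
            return 0.0
        return self.withdrawals_of_ai_decisions / self.suspensions_using_ai

# Example: 3 of 120 AI-based suspensions were later withdrawn (2.5%).
report = SuspensionTransparencyReport(
    suspensions_by_reason={"terms violation": 150, "spam": 80},
    suspensions_using_ai=120, appeals=40, appeals_against_ai_decisions=25,
    withdrawals=10, withdrawals_of_ai_decisions=3,
)
print(f"AI suspension withdrawal rate: {report.ai_withdrawal_rate():.1%}")
```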

III. Conclusion

I have briefly discussed social media legal practice from the perspective of account suspension response. While there is room for debate on individual points, the intent of the Information Platform Management Act itself is very sound. I strongly hope that future operations, especially regarding account suspension, will build up a body of practice that fully realizes this intent.

*1 Takayuki Matsuo and Yuichiro Yamada, "Theory and Practice of Defamation on the Internet in Recent Precedents" (Keiso Shobo, 2nd Edition, 2019).

*2 Regarding the Information Platform Management Act, an article provisionally titled "Procedural Responses to Defamation and Infringement of Honor under the Information Platform Management Act" is scheduled to be published in Gakushuin Law Studies within this fiscal year.

*3 Note that there is no substantial change in this content even after the amendment to the Information Platform Management Act described later.

*4 Regarding elections and social media, see Broadcasting Ethics & Program Improvement Organization, "Broadcasting Human Rights Committee Holds Kinki District Opinion Exchange Meeting: 'Election Reporting in the SNS Era: Discussion Across Station Boundaries,'" February 3, 2025 (https://www.bpo.gr.jp/?p=12337); Takayuki Matsuo, "AI Concerning Elections and Politics: Focusing on Deepfakes on SNS," Trust Formation from Social Aspects in the Information Society (Japan-Korea Busan Research Group, February 21, 2025); and Takayuki Matsuo, "Disinformation Centered on Deepfakes in the Context of Elections as 'AI Evil' and Civil/Criminal Responses: Focusing on Defamation and Obstruction of Business," Trust Formation from Social Aspects in the Information Society (Japan-Korea Busan Research Group, July 25, 2025).

*5 Takayuki Matsuo, "On Private Law Remedies for Account Suspension by Platforms" (Information Law Studies No. 10, 2021), p. 66 et seq. See also Takayuki Matsuo, "Legal Issues of Cybernetic Avatars" (Kobundo, 2024), p. 168 et seq., and Takayuki Matsuo and Akio Takada, "Legal Relationships Surrounding Platformers," in Internet Case Practice Research Group, "Case System of Internet-Related Incidents: Thinking and Practical Response for Dispute Resolution" (Gyosei, 2025), p. 284 et seq.

*6 As mentioned above, I myself sometimes represent subjects to request content moderation.

*7 See Satoshi Narihara and Takayuki Matsuo, "Discrimination and Fairness by AI: Using the Financial Sector as a Subject" (Quarterly Personal Finance, Winter 2023), p. 11.

*8 See Takayuki Matsuo, "Legal Practice of Generative AI" (Kobundo, 2025), pp. 492–493, which discusses the necessity of human verification in light of the possibility of hallucinations.

*9 However, it should be noted that even if information is not physically deleted from a server, if the post becomes inaccessible, it is considered that measures to prevent the transmission of infringing information have been taken because the transmission of the problematic post is stopped.

*10 See also Guideline III-1 regarding the obligations of large-scale specified telecommunications service providers under the Act on Dealing with Rights Infringement Caused by Information Distribution via Specified Telecommunications, which states that the criteria for implementing transmission prevention measures formulated by large-scale specified telecommunications service providers must be defined "as specifically as possible" (Article 26, Paragraph 2, Item 1 of the Act); specifically, they should be described clearly, using not only legal terminology but also expressions used by the general public, and categorized by types such as slander, piracy, and suicide.

*Affiliations and titles are as of the time of publication of this magazine.