Two weeks ago, the German Federal Government adopted its long-awaited digital strategy (“Digitalstrategie”). Among other things, it concretises a plan already set out in the coalition agreement: a Law Against Digital Violence (“Gesetz gegen digitale Gewalt”) is intended to enable more consistent action against hate and incitement on the internet, especially on social networks. The law is meant to reduce “legal obstacles for those affected” and close “gaps in the right to information”. In addition, “the legal framework conditions for electronic procedures for filing charges” are to be created and “account suspension ordered by a judge” is to be enabled. The latter in particular would constitute a major interference with the platform sovereignty of online platform providers: courts could order them to delete accounts that users accuse of spreading hate speech.

In Germany, the Network Enforcement Act (“Netzwerkdurchsetzungsgesetz – NetzDG”), introduced in 2017, has so far served to combat hate crime and other punishable content online. Among other things, the NetzDG provides for specific deletion deadlines and obligations to report suspicious content to the Federal Criminal Police Office (“Bundeskriminalamt – BKA”); these provisions are currently suspended due to lawsuits filed by numerous platforms, including Facebook, Google, YouTube and TikTok. However, according to the Federal Government’s answer to a written question by the CDU/CSU, the NetzDG will largely be replaced by the Digital Services Act (DSA) due to the DSA’s “fully harmonising effect”.

The DSA will apply from the beginning of 2024; for particularly large platforms with more than 45 million monthly users, some of its requirements will take effect earlier. According to the final draft of the DSA, social network operators must in future offer extensive contact and complaint options so that users can report insults and threats more directly and quickly. In contrast to the German NetzDG, the DSA provides for neither concrete deletion deadlines nor reporting obligations to judicial and law enforcement authorities. Nor does it provide for account suspensions ordered by a court. Instead of the courts, the DSA places responsibility with the platforms themselves: Article 20 of the final draft states that “providers of online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.” This is apparently not enough for the Federal Government: as it announced in an answer to a written question by the CDU/CSU, its concern to enable account suspensions ordered by a judge could not be taken into account in the DSA because the DSA’s general approach had “already been completed before the coalition agreement was published”. The Federal Government therefore wants to examine “the extent to which judicially ordered account suspensions can be implemented in national law.”

The project is thus important enough to the Federal Government that it wants to go beyond the requirements of the DSA. Providers of online platforms, and especially of social networks, should therefore closely follow the development of this project and further announcements by the Federal Government. If a corresponding legal regulation were introduced, users of online platforms would in future be able to enforce the suspension of accounts disseminating illegal content in civil law proceedings. This would require a new division of responsibility between platform providers and the state, one whose practical feasibility must be ensured.

According to its digital strategy, the Federal Government wants to be judged by whether, by 2025, “the law against digital violence and the corresponding advisory services offer effective support to those affected to defend themselves against digital violence.” After a comprehensive legal review, the Federal Ministry of Justice (“Bundesministerium der Justiz – BMJ”), which is responsible for the law, will soon prepare a first draft bill. Operators of potentially affected platforms should therefore seek discussions with the responsible persons at the BMJ at an early stage. Only by closely engaging with the legislative process from the beginning can a division of responsibility between online platform providers and the state be achieved that is workable in practice.
