U.S. tech firms have always found the European market a tricky one to navigate. They struggled with a public backlash following Edward Snowden’s disclosures and have had to cope with what they perceive to be awkward court decisions, like the right to be forgotten.
Now, they face an entirely new challenge. This week, Germany will adopt the very German-sounding Netzwerkdurchsetzungsgesetz, literally the “network enforcement law.” Dismayed by the reluctance of Facebook and other social media platforms to police hate speech—especially in the wake of the refugee crisis—and with a federal election on the horizon in September, German lawmakers want to rein in online discourse.
The bill aims to improve the enforcement of social media providers’ existing legal obligation to delete online content once they learn it violates German law. Under the bill, social media companies with over two million users must set up an effective procedure to process complaints about certain criminal offenses, such as incitement to violence or defamation. Moreover, they have to establish a system to delete all “evidently unlawful” content within 24 hours, and other unlawful content within seven days.
This is where the mess begins. Neither the bill nor the Ministry of Justice provides a clear explanation of what constitutes “evidently unlawful” content or what criteria social media platforms should use to make that determination. It also remains unclear how the law would be enforced. Should platforms only remove unlawful content posted by German users? Should they block German users from seeing unlawful content that originates outside Germany? How do companies determine a user’s location? The bill’s preamble even references the fight against “fake news” as one of the reasons for introducing new legislation, despite the fact that misinformation is rarely illegal.
The legal mess aside, there are real concerns that the law will incentivize social media companies to delete content excessively. Faced with fines of up to €50 million, social media platforms will likely err on the side of caution and delete lawful content when in doubt. Combined with the short deadlines and the absence of a framework allowing users to challenge decisions, this clearly threatens freedom of expression online, as the German Parliament’s research service lamented. Even worse, the determination of lawfulness will not be made by judges or government officials, but by the social media companies themselves. David Kaye, the UN Special Rapporteur on freedom of opinion and expression, rightly criticized the responsibility the bill places “upon private companies to regulate the exercise of freedom of expression.” The government tried to address some of these concerns at short notice, announcing last week that particularly difficult cases can be passed on to a new public agency. How that would work remains unclear.
Given these serious concerns and criticism from experts across the political spectrum, it is all the more unfortunate that the bill is likely to achieve little. In the fast-paced social media environment, even deletion within 24 hours or seven days may come too late to protect victims of online hate speech. Moreover, the law is unlikely to effectively deter perpetrators, especially if there are no penalties for reposting deleted content.
The bill does have a silver lining. In the past, some platforms such as Twitter have refused to comply with law enforcement requests for user subscriber information and metadata on the grounds that they did not have an office in Germany. The bill now requires social media companies to establish a domestic point of contact to assist law enforcement with these types of requests (requests for content data will still need to go through a mutual legal assistance process if the content is hosted by U.S. platforms). This is a step in the right direction, since it recognizes that effective law enforcement is needed to signal that criminal offenses online will not go unpunished.
Unfortunately, the weak bill and the resulting criticism have overshadowed the critical need to discuss rules for social media companies and user behavior. Whether companies like Facebook and Google should adapt to different cultural environments is an open question, but they certainly need to respect the legal frameworks of different jurisdictions, such as those on hate speech.
Setting the rules of the digital public square, including determining what is lawful and what is not, should not be left to private companies. They already exert broad powers over content on their platforms through their user guidelines and community standards. Given the importance of social media in our daily lives and political discourse, this issue should also not be left to the courts, where the law is almost certain to be challenged. Hopefully, Germany’s next government will learn from these mistakes and come up with a more sensible approach and a better-written law.
This commentary was originally published by the Council on Foreign Relations on June 28, 2017.
Research for this publication was conducted within the Transatlantic Digital Debates (TDD) program.