The German Parliament passed a new social media law, called the “Network Enforcement Act” (“NetzDG” in German), that punishes social media companies with fines ranging from 5 million to 50 million euros if they fail to promptly remove “obviously illegal” content. Human rights groups have called the law rushed and harmful to free expression.
Striking Back Against Hate Speech
According to the German Federal Criminal Police Office, hate crime has grown by 300% in the past two years alone. In a speech today, Germany’s Federal Minister of Justice and Consumer Protection, Heiko Maas, said that freedom of expression is a great asset to have in an open society, but it ends where criminal law begins.
The minister believes that the real attacks on freedom of speech happen when some people use hateful comments and threats to silence and harass others online. Regarding the mandate that companies such as Facebook and Google remove hateful comments from their platforms, the minister noted that these companies are not “above the law” and that they must fulfill their obligations.
Presumably, Maas was referring to the fact that, up until this point, companies that managed online platforms weren't liable for their users' posts. However, the minister seems to believe that platform operators are also at fault for the hateful content they allow on their platforms.
Although Germany is the first European country to pass such strict laws, Maas has also called for the adoption of EU-wide regulations against hate speech as well as "fake news."
Criticism Against Germany’s New Law
At a recent hearing, eight of the 10 experts invited to testify criticized the law, which was still in its draft stage at the time. Five of them said the law would be incompatible with the German Constitution. The Network Enforcement Act has been slightly modified since then; for instance, the provision requiring companies to implement automatic content filters has been removed.
However, a new “self-regulatory” body will have to be created, with its operating costs paid by private industry. Companies will be able to refer content to this body for evaluation when they are uncertain of its legality. According to the European Digital Rights (EDRi) non-profit organization, it’s not clear how transparent or accountable this body will be.
A recent ProPublica report showed that Facebook’s own censorship policies are inconsistent at best. For instance, the company would censor hate speech against white people, but not against certain Muslim groups that Facebook's low-paid contractors considered radicalized. It’s also not the first time Facebook’s inconsistent censorship rules have attracted the public’s attention, whether over iconic war photographs or the blocking of competitors’ links.
The bottom line is that private companies are expected to become a sort of privatized internet police, even though they don’t have a good track record of banning actually illegal content without infringing on people’s free expression in the process.
EDRi also had some harsh words for the European Commission, because it didn’t do more to stop the alleged violation of the European Union’s Charter of Fundamental Rights:
The European Commission had legal duties to take action to protect citizens’ interest. It failed to do so. It is not just the Charter of Fundamental Rights of the European Union that the Commission had promised to respect, but also not to “seek nor take instructions” from Member States. It is a pity the European Commission appears to have chosen politics over legal obligations. We can only assume that it is a coincidence that the Commission failed to act against an illegal proposal from the largest EU Member State, while saying that it might take action against other Member States that would propose similar measures.
The Network Enforcement Act won’t go into effect until October, after the German federal election. However, it’s unlikely the law will see any further changes before then.
The UK and France have also recently announced a partnership that aims to “tackle online radicalization” by imposing fines and penalties on technology companies that fail to remove content in time.