EU: New rules on hate speech for social media platforms


There is news for social media providers operating in the European Union regarding the prevention of hate speech and hate crimes: Austria has enacted a law against hate and crime on social media, the Communication Platforms Act (KoPl-G). Like the German Network Enforcement Act (NetzDG), it aims to make the removal procedure simpler and more transparent and to shift responsibility to the social network provider. A unified European law, the Digital Services Act (DSA), could soon replace these national rules.

1. Germany: Network Enforcement Act

The German Parliament has just passed an act amending the NetzDG, which brings several changes for social media providers. The NetzDG, enacted in Germany in 2017, was the first law in Europe to combat hate speech and crime on social media (more information on the provisions of the NetzDG on our previous blog).

The latest amendment, first proposed in April 2020 (more information on our previous blog), simplifies the reporting channels for the complaints procedure and adds information obligations for platform operators’ semi-annual transparency reports. A direct right to information against the platform operator is created in the Telemedia Act (TMG) for victims of illegal content on the networks. The amendment to the NetzDG provides that users can request a review of the platform provider’s decision to remove or keep reported content and have the right to have removed content restored. This is intended to prevent so-called “overblocking”, i.e. the removal of legal content, and to strengthen users’ freedom of expression. The network provider is now obliged to obtain comments from the parties concerned and to justify each decision individually. Video-sharing platforms are also subject to the NetzDG under the new Sec. 3 (e) NetzDG, but only with regard to videos and user-generated broadcasts.

2. Austria: Communication Platforms Act

On April 1, 2021, the KoPl-G entered into force in Austria. Its structure appears to be inspired by the German NetzDG.

The law applies to platforms worldwide with a connection to Austria

Domestic and foreign communication platforms that had more than 100,000 users in the previous year and revenue of more than EUR 500,000 fall within the scope of the law. Excluded are platforms that only provide sales and brokerage services for goods or real estate, platforms operated by media companies in connection with journalistic content, educational platforms, and online encyclopedias. An exception is also made for video-sharing platforms for broadcasts and user-generated videos.

New obligation: review and delete

Like the NetzDG, the KoPl-G requires easy-to-find, always-available, and easy-to-use features for reporting illegal content. Content must be removed within 24 hours if its illegality is obvious to a layperson. Where the illegality of the content can only be identified through a detailed review, the social network provider must remove the content no later than seven days after the end of the review. Content is illegal if it constitutes a criminal offense under the Austrian Criminal Code, including criminal harassment, persistent stalking, unauthorized recording of images, and pornographic depictions of minors.

Complaint procedure against “overblocking”

To ensure transparency of the process, both the user who reported the content and the reported user must be informed of the decisive reasons for the removal of the content. The KoPl-G further requires that platform providers offer a transparent review process for decisions to remove or retain content. In order not to unduly restrict users’ freedom of expression, Sec. 3 (4) KoPl-G gives both the reported user and the reporting user the opportunity to have the platform’s decision on the (non-)deletion reviewed.

Obligation to appoint a local representative

Social networks are obliged to appoint a representative for service in official and judicial proceedings. Unlike the German NetzDG, the KoPl-G additionally requires the appointment of a German-speaking responsible officer residing in Austria who has the necessary skills and resources to ensure compliance with the law. This responsible officer must be a natural person, while the representative for service may also be a legal person. It is also possible for one person to hold both positions.

What social media operators must do

Social media providers should act quickly to add the required reporting functionality to their websites and to appoint both a representative for service and a responsible officer. The supervisory authority is empowered to check whether these obligations are met; failure to comply may result in a fine.

3. EU: Digital Services Act

To date, Germany and Austria are the only EU member states to have adopted provisions against hate and crime on social media platforms. France did enact a law against hate and crime on social media platforms, but the Constitutional Council deemed it unconstitutional due to its incompatibility with freedom of expression. However, a European law, the Digital Services Act (DSA), could soon replace all national rules in the member states. In December 2020, the European Commission published a proposal that addresses the procedures social networks must follow in the event of illegal content and is intended to prevent a fragmentation of legislation across Europe. Once the proposal is adopted, the DSA, as a regulation, will be directly applicable in all member states.

4. EU: New rules adopted against terrorist content online (social networks and websites)

Another novelty concerning the legal situation of hosting service providers in Europe concerns the fight against terrorist content. Recently, the European Parliament adopted a regulation for the swift removal of terrorist content from online platforms and websites. Hosting service providers must remove or disable access to content that incites criminal offenses under the Anti-Terrorism Directive (EU) 2017/541. The procedure is as follows: the competent authority of a member state notifies the service provider of terrorist content. Within one hour of receiving the notification, the content must be removed or made inaccessible in every EU member state. Providers can have removal orders reviewed. Unless an ongoing investigation requires otherwise, the service provider is obliged to notify the user of the deletion. General monitoring or review of content is not required, but online platforms must act promptly after receiving a removal order. The regulation applies to hosting service providers that allow users to post content on a platform or website. This means that not only social media platforms are covered, but also websites with comment functionality. Hosts have less than a year to prepare for this new regulation, which will apply from June 7, 2022.

5. Code of Conduct on Combating Illegal Hate Speech Online

The Code of Conduct on Countering Illegal Hate Speech Online imposes obligations similar to those described above. In 2016, major IT companies signed this voluntary code of conduct, committing to review reported illegal content and remove it within 24 hours. The European Commission’s annual assessment of the code of conduct shows that 90 percent of notifications were reviewed within 24 hours and 71 percent of the reported content was removed. This is a step forward in the fight against illegal content on online platforms.
