The European Union remains committed to stronger protection for Internet users. The new Digital Services Regulation is designed to push the biggest online players, those with more than 45 million users, towards greater transparency and accountability. Giant platforms such as Facebook, Twitter, TikTok and Google have had to comply with the requirements of the so-called Digital Services Act since 25 August 2023. The main ambition of the new rules is to make content-targeting mechanisms more transparent and to protect minors.
The European Union is once again trying to regulate the online environment and protect European Internet users more effectively. This time it is training the spotlight on the biggest players in particular, through the Digital Services Regulation (EU) 2022/2065 (“DSA”), which entered into force in November 2022.
Under the DSA, online platforms were required to publish their user numbers by 17 February 2023. Any online platform or search engine with more than 45 million monthly active users (roughly 10% of the EU population) was designated a “very large online platform” (hereinafter “VLOP”) by the European Commission (“EC”).
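The designation threshold described above can be expressed as a simple check. The sketch below is purely illustrative, using only the figures from the text (45 million users, about 10% of an EU population of roughly 450 million); the names are hypothetical, not part of the regulation:

```python
# Illustrative only: the DSA's VLOP designation threshold, assuming the
# figures from the text (45 million users ~ 10% of ~450 million EU residents).
EU_POPULATION = 450_000_000           # approximate EU population
VLOP_THRESHOLD = EU_POPULATION // 10  # 45,000,000 users

def is_vlop(monthly_active_users: int) -> bool:
    """Return True if a platform exceeds the DSA's VLOP user threshold."""
    return monthly_active_users > VLOP_THRESHOLD

print(is_vlop(50_000_000))  # True  - above the 45M threshold
print(is_vlop(30_000_000))  # False - below the threshold
```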
Based on the data provided, the EC has identified 19 of them: Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube, Zalando, Bing and Google Search. The last two are designated as very large online search engines. According to the EC, Google is so large and operates so many essential services and platforms that its designation has been split across multiple services.
VLOPs will have to conduct an annual assessment of the risks their platforms pose in areas such as public health, child safety and freedom of expression. At the same time, they will need to identify measures to address these risks. The first such assessment must be completed by 25 August 2023.
In broad outline, the obligations can be summarised as follows: more rights and controls for users, strong protection of minors, stricter content moderation with less disinformation, and more transparency and accountability.
For example, European users must be given an explanation of why they are being shown certain content; they will have the right to opt out of profiling; all ads will have to be labelled so that it is clear who is promoting them; and the terms of use of each platform will have to be available in clear language and in the users’ national language.
Systems will have to be redesigned to ensure a high level of protection for minors, and targeted advertising to minors will not be allowed. In addition to the general risk assessment above, each VLOP must carry out a specific risk assessment for minors, including the negative impacts on their mental health, and publish it by 25 August 2024.
Platforms should also introduce tools that allow users to easily report illegal content while minimising the impact on freedom of expression, and must act swiftly on such reports. At the same time, the EC is using the DSA to press VLOPs to do more to combat disinformation. European Commissioner Thierry Breton said that the September 2023 elections in Slovakia, for example, will be the first real test for VLOPs. In his opinion, Facebook influences public opinion in Slovakia to a large extent, and Meta (the operator of Facebook) must ensure that the hybrid war being waged on social media (for example in connection with the Russian-Ukrainian war) does not harm Slovak society.
Furthermore, VLOPs must ensure that their risk assessments and their compliance with the DSA are externally and independently verifiable. Researchers will have access to platform data, and vetted researchers will later be given a dedicated data-access mechanism. Platforms must also publish transparency reports on their content moderation and risk management decisions.
Most of the platforms have not reacted publicly so far. The only one that has announced concrete steps to comply with the DSA is the social network TikTok. It will allow users in the European Union to turn off the personalised recommendation algorithm and live streams; such users will instead be shown posts ranked by their popularity in the EU.
New tools for reporting objectionable posts will also be introduced. Another significant change concerns minors: profiles of users aged 13 to 17 will be set to private by default, their posts will be excluded from the personalised recommendation algorithm, and they may no longer be targeted with personalised ads.
Zalando and Amazon have taken the opposite approach. These companies chose the route of resistance and challenged their inclusion in the VLOP group before the General Court, the second-highest court in the European Union. Amazon argues, among other things, that it is not the largest retailer in any of the EU countries in which it operates and that its direct competitors in those markets have not been designated as VLOPs; it has therefore asked the General Court to annul its designation. Zalando says that, given its business model, it is not possible for it to distribute hateful or otherwise objectionable content.
The EC has a very powerful tool for enforcing the DSA. It may decide that the provider of a very large online platform or very large online search engine is not complying with one or more provisions of the DSA. In that case, the provider faces a fine of up to 6% of its total worldwide annual turnover in the preceding financial year, a truly significant blow even for these big players.
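To put the 6% ceiling in perspective, the fine cap scales directly with turnover. The helper below is a hypothetical back-of-the-envelope sketch (not part of the regulation), computing the maximum fine in whole euros:

```python
def max_dsa_fine(turnover_eur: int) -> int:
    """Upper bound of a DSA fine: 6% of the provider's total worldwide
    annual turnover in the preceding financial year (per the article).
    Uses integer euros to keep the arithmetic exact."""
    return turnover_eur * 6 // 100

# A provider with EUR 100 billion in annual turnover could face a fine
# of up to EUR 6 billion.
print(max_dsa_fine(100_000_000_000))  # 6000000000
```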
Author: Tomáš Přibyl