What Is the Digital Services Act?
The European Union Digital Services Act (DSA) is legislation intended to protect the fundamental rights of Internet users in the EU by holding Internet service providers (ISPs), search engines, social media platforms, online marketplaces, content delivery networks (CDNs), and other intermediaries that provide online services more accountable. The DSA entered into force on November 16, 2022, and became fully applicable on February 17, 2024.
Just like the EU’s General Data Protection Regulation (GDPR) and the EU AI Act, the DSA applies to all companies that provide digital services to users in the EU, regardless of where the provider is established. It covers services targeted at EU users as well as services with a significant number of users in the EU.
Non-compliance can result in fines of up to 6% of an intermediary’s annual worldwide turnover (revenue) for the preceding financial year.
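To make that ceiling concrete, here is a minimal sketch of the arithmetic, assuming a hypothetical intermediary with EUR 10 billion in prior-year worldwide turnover (the figure and function names are illustrative, not drawn from the Act):

```python
# Illustrative only: the DSA caps fines at 6% of an intermediary's
# worldwide annual turnover for the preceding financial year.
DSA_MAX_FINE_RATE = 0.06

def max_dsa_fine(annual_worldwide_turnover: float) -> float:
    """Return the upper bound on a DSA fine for the given turnover."""
    return annual_worldwide_turnover * DSA_MAX_FINE_RATE

# Hypothetical example: EUR 10 billion in prior-year turnover
print(f"Maximum fine: EUR {max_dsa_fine(10_000_000_000):,.0f}")
# -> Maximum fine: EUR 600,000,000
```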
Scope of the Digital Services Act
A large portion of the Digital Services Act focuses on content moderation and the responsibility that online platforms have to remove harmful content and identify misinformation.
Accountability is another core element of the DSA. It stipulates that online intermediaries must cooperate with public authorities to combat illegal content and online behavior that could harm others.
This includes the illegal collection, use, or disclosure of personal data without proper consent, as well as the immediate removal of the following (an illustrative classification sketch follows the list):
- Content that promotes or incites violence, hatred, discrimination, or hostility based on factors such as race, religion, ethnicity, nationality, gender, sexual orientation, or disability;
- Material that supports, glorifies, or encourages terrorist acts, violent extremism, or recruitment into terrorist organizations;
- Content that includes deceptive dark pattern UX practices designed to obtain personal or financial information;
- Any text, image, or video content that exploits or harms children;
- Content related to the sale, distribution, or promotion of illegal drugs, narcotics, or substances controlled by law;
- The unauthorized sharing, distribution, or sale of copyrighted materials, including pirated software, movies, music, and counterfeit goods;
- Content that encourages or instructs individuals to self-harm, harm others, or engage in criminal behavior;
- Content that facilitates cyber threats, including the development and distribution of malware and low-code hacking tools such as ransomware-as-a-service.
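As an illustration of how a platform might operationalize these categories in a notice-and-action workflow, the sketch below models an incoming report and flags the categories that typically warrant escalation to public authorities. The enum values, the `ContentNotice` structure, and the escalation rule are assumptions for illustration; none of them are defined by the DSA itself.

```python
# Illustrative sketch: tagging abuse reports against the illegal-content
# categories listed above. The category names, ContentNotice fields, and
# escalation rule are hypothetical, not definitions from the DSA.
from dataclasses import dataclass
from enum import Enum, auto

class IllegalContentCategory(Enum):
    HATE_OR_INCITEMENT = auto()
    TERRORIST_CONTENT = auto()
    DECEPTIVE_DARK_PATTERNS = auto()
    CHILD_EXPLOITATION = auto()
    ILLEGAL_DRUGS = auto()
    COPYRIGHT_OR_COUNTERFEIT = auto()
    SELF_HARM_OR_CRIME = auto()
    CYBER_THREATS = auto()

@dataclass
class ContentNotice:
    content_id: str
    category: IllegalContentCategory
    explanation: str  # why the reporter considers the content illegal

def must_notify_authorities(notice: ContentNotice) -> bool:
    """Escalate categories that suggest a threat to life or safety,
    reflecting the cooperation obligation described above."""
    return notice.category in {
        IllegalContentCategory.TERRORIST_CONTENT,
        IllegalContentCategory.CHILD_EXPLOITATION,
    }

notice = ContentNotice("post-42", IllegalContentCategory.CYBER_THREATS,
                       "Distributes ransomware-as-a-service kits.")
print(must_notify_authorities(notice))  # False: handled via removal only
```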
Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), defined as services with more than 45 million average monthly active users in the EU, are required to conduct risk assessments of their platforms, publish annual transparency reports, and establish a mechanism for handling user complaints in a timely manner (a simple threshold check is sketched after the table below).
| Very Large Online Platforms (VLOPs) | Very Large Online Search Engines (VLOSEs) |
|-------------------------------------|-------------------------------------------|
| Alibaba AliExpress                  | Bing                                      |
| Amazon Store                        | Google Search                             |
| Apple AppStore                      |                                           |
| Booking.com                         |                                           |
| Google Play                         |                                           |
| Google Maps                         |                                           |
| Google Shopping                     |                                           |
| Snapchat                            |                                           |
| TikTok                              |                                           |
| Wikipedia                           |                                           |
| YouTube                             |                                           |
| Zalando                             |                                           |
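The designation itself turns on a single numeric threshold. A minimal sketch of that check, using made-up reporting figures rather than real designation data, might look like this:

```python
# Illustrative check of the DSA's designation threshold: services that
# average more than 45 million monthly active users in the EU fall under
# the stricter VLOP/VLOSE obligations. The service names and user counts
# below are hypothetical.
VLOP_THRESHOLD_MAU = 45_000_000

def is_very_large(avg_monthly_active_eu_users: int) -> bool:
    """Return True if a service crosses the VLOP/VLOSE threshold."""
    return avg_monthly_active_eu_users > VLOP_THRESHOLD_MAU

for service, mau in [("example-marketplace", 112_000_000),
                     ("example-forum", 9_500_000)]:
    label = "VLOP/VLOSE obligations" if is_very_large(mau) else "standard obligations"
    print(f"{service}: {label}")
```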
Key Provisions of the DSA
The DSA has a number of key provisions, including:
- Mandates for online service provider transparency and accountability;
- Measures to address illegal and harmful content;
- Protection for end users’ right to privacy;
- Protection for end user freedom of expression;
- Increased obligations for very large online platforms.
Additionally, the DSA is designed to protect and empower end users in a number of ways. It requires intermediaries to:
- Obtain users’ consent before collecting or processing their personal data. This consent must be freely given, specific, informed, and unambiguous (a minimal consent-record sketch appears below).
- Provide users with clear and concise information about the intermediary’s data collection and data sharing practices. This information must be easy to understand and must be accessible to users in a variety of formats.
- Give users the right to access their data and request that it be deleted. Users also have the right to have their data transferred to another platform.
- Protect users’ privacy when they use online services. This includes measures to prevent tracking users across different websites and to prevent their data from being used for advertising purposes without permission.
- Prohibit dark pattern design elements that deliberately obscure, mislead, coerce, and/or deceive website visitors into making unintended choices.
- Be more transparent about their content moderation practices. This will make it easier for users to understand why their content has been removed or restricted, and it will help ensure that governments and artificial intelligence (AI) algorithms are not censoring content that is protected by the right to freedom of expression.
- Give users the right to appeal decisions to remove or restrict their content. This will allow users to challenge decisions that they believe are unfair or unjustified.
- Take the right to freedom of expression into account when making content moderation decisions. This means that platforms cannot remove or restrict content simply because they disagree with it.
Beyond these obligations on intermediaries, implementation of the DSA is overseen by the European Commission together with national Digital Services Coordinators. Independent bodies such as the European Digital Media Observatory (EDMO), which predates the Act, monitor disinformation and report on potential threats to freedom of expression.
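To illustrate what the consent requirement above might look like in practice, here is a minimal consent-record sketch; the field names and validation rule are assumptions for illustration, not requirements spelled out in the DSA:

```python
# Minimal sketch of a consent record capturing the requirement that
# consent be freely given, specific, informed, and unambiguous.
# Field names are assumptions for illustration only.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str           # "specific": one purpose per record
    disclosure_shown: str  # "informed": what the user was actually told
    affirmative_act: bool  # "unambiguous": an explicit opt-in, never a default
    not_bundled: bool      # "freely given": not tied to unrelated services
    timestamp: datetime

    def is_valid(self) -> bool:
        """Check the machine-verifiable conditions; 'specific' and
        'informed' are recorded as fields for later audit."""
        return self.affirmative_act and self.not_bundled

record = ConsentRecord(
    user_id="u-123",
    purpose="personalized advertising",
    disclosure_shown="We use your browsing history to tailor ads.",
    affirmative_act=True,
    not_bundled=True,
    timestamp=datetime.now(timezone.utc),
)
print(record.is_valid())  # True
```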
Controversy
Despite its benevolent goals, the DSA has sparked controversy and raised important questions about the heavy regulatory compliance burden the Act places on smaller intermediaries as well as on VLOPs and VLOSEs.
Concerns have also been voiced about the potential for the Act to censor content unnecessarily due to a lack of clearly defined terminology within the legislation. Critics maintain that phrases like “appropriate and proportionate measures” are so open to interpretation that stakeholders will find the legislation difficult to comply with, or to enforce.