EU Digital Services Act (DSA) Explained
What is the EU Digital Services Act? It is a landmark piece of EU legislation aimed at creating a safer, more transparent, and more accountable digital environment.
Feature writer for Geopolitical Affairs: Leo von Breithen Thurn; Geopolitics, Policy, Risk Intelligence, Crisis Management Consultant
Key Policy Facts
Transparency and Accountability
Enhanced User Safety
Coordinated Rules Across the EU
Protection of Fundamental Rights
Due Diligence Obligations
Compliance and Enforcement
Transparency and Accountability
The DSA requires platforms to disclose how they moderate content, use algorithms for recommendations, and handle advertisements. This transparency is designed to hold platforms accountable for their decisions.
Example: Social Media app
Content Moderation Policies
Social Media app would need to provide clear and accessible explanations of its content moderation policies, detailing what types of content are not allowed on the platform and the processes it uses to detect and remove such content. For instance, if Social Media app uses automated systems to identify and remove hate speech, it would need to disclose how these systems work and the criteria they apply.
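As a rough illustration, such a disclosure could be backed by a machine-readable record attached to each automated decision. The sketch below is hypothetical: the DSA requires that this information be provided, but it does not prescribe a format, and all field names here are invented.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class StatementOfReasons:
    """Hypothetical record a platform might attach to each automated
    moderation decision to document what was done and why."""
    content_id: str
    decision: str          # e.g. "removed" or "visibility_restricted"
    policy_violated: str   # the rule in the public moderation policy
    detection_method: str  # "automated", "human_review", or "hybrid"
    criteria: str          # plain-language summary of the criteria applied
    decided_at: datetime

# Example: an automated hate-speech removal, documented for transparency.
record = StatementOfReasons(
    content_id="post-48213",
    decision="removed",
    policy_violated="Community Rules: hate speech",
    detection_method="automated",
    criteria="classifier flagged slurs targeting a protected group",
    decided_at=datetime.now(timezone.utc),
)
print(record)
```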
Advertising Transparency
The DSA requires platforms to be transparent about their advertising practices. Social Media app would have to clearly label content that is sponsored or part of an advertising campaign, and disclose any targeting criteria used in serving advertisements. If advertisements are personalised based on users' online activity or demographic information, Social Media app must inform users about the data collection and processing involved in these practices.
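To make this tangible, here is a minimal sketch of the per-advertisement metadata a platform might keep and surface to users. All names and values are invented for illustration; the DSA mandates the disclosure, not this structure.

```python
from dataclasses import dataclass, field

@dataclass
class AdDisclosure:
    """Hypothetical metadata a platform could surface so that sponsored
    content is clearly labelled and its targeting explained."""
    ad_id: str
    sponsor: str                 # who paid for the advertisement
    labelled_as_sponsored: bool  # shown to the user as "Sponsored"
    targeting_criteria: list[str] = field(default_factory=list)

ad = AdDisclosure(
    ad_id="ad-0071",
    sponsor="Example Retail GmbH",
    labelled_as_sponsored=True,
    targeting_criteria=["age 25-34", "interest: running"],
)
# A user-facing label could render these fields directly:
print(f"Sponsored by {ad.sponsor}; targeted on: {', '.join(ad.targeting_criteria)}")
```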
Algorithmic Transparency
The platform would also have to explain how its algorithms recommend content to users. This means disclosing the main parameters and logic behind the recommendation systems, without necessarily revealing proprietary algorithms. For example, if user engagement metrics (such as time spent on content or interaction rates) significantly influence content recommendations, Social Media app would need to inform its users about this practice.
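For instance, a simplified recommender might compute a weighted score over a handful of engagement signals; the disclosure would then name those signals and their relative influence rather than the exact weights. The signals and weights below are invented, not drawn from any real platform.

```python
# Hypothetical "main parameters" of a recommendation system: engagement
# signals and their relative influence on the ranking score.
WEIGHTS = {
    "watch_time": 0.5,        # normalised time spent on the item
    "interaction_rate": 0.3,  # likes, shares, comments per view
    "recency": 0.2,           # newer content scores higher
}

def recommendation_score(signals: dict[str, float]) -> float:
    """Weighted sum of normalised signals; a disclosure would describe
    these signals and their rough weighting in plain language."""
    return sum(WEIGHTS[name] * signals[name] for name in WEIGHTS)

candidate = {"watch_time": 0.8, "interaction_rate": 0.4, "recency": 0.9}
print(round(recommendation_score(candidate), 2))  # 0.7
```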
User Reporting Mechanisms
The platform must implement and detail mechanisms through which users can report illegal content or activity. It should clearly outline the steps a user can take to flag content, the process Social Media app follows upon receiving such reports, and the expected timelines for action.
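A minimal sketch of such a notice-and-action entry point might look like the following; the structure, identifiers, and status values are all hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class UserReport:
    """A user's notice about potentially illegal content."""
    report_id: str
    content_id: str
    reason: str               # e.g. "illegal hate speech", "counterfeit goods"
    submitted_at: datetime
    status: str = "received"  # -> "under_review" -> "actioned" / "rejected"

def submit_report(content_id: str, reason: str) -> UserReport:
    """Acknowledge the report immediately and queue it for review
    against the platform's policies and the applicable law."""
    report = UserReport(
        report_id=f"r-{content_id}",
        content_id=content_id,
        reason=reason,
        submitted_at=datetime.now(timezone.utc),
    )
    # ...enqueue for moderator or automated triage here...
    return report

print(submit_report("post-48213", "illegal hate speech"))
```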
Appeal Process
Social Media app would need to provide an appeal process for users who disagree with content moderation decisions, explaining how users can appeal, what the review process looks like, and the timeframe in which users can expect a decision.
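Continuing the sketch above, an appeal could reference the original decision and commit to a decision window up front, so the user knows when to expect an answer. The 14-day window is illustrative only; the DSA does not fix a specific number of days.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Appeal:
    """A user's challenge to a moderation decision."""
    appeal_id: str
    original_decision_id: str
    grounds: str              # the user's argument against the decision
    filed_on: date
    decide_by: date           # deadline communicated to the user
    outcome: str = "pending"  # -> "upheld" or "reversed"

def file_appeal(decision_id: str, grounds: str, review_days: int = 14) -> Appeal:
    """Open an appeal; review_days is an invented service level,
    not a deadline taken from the DSA."""
    today = date.today()
    return Appeal(
        appeal_id=f"a-{decision_id}",
        original_decision_id=decision_id,
        grounds=grounds,
        filed_on=today,
        decide_by=today + timedelta(days=review_days),
    )

print(file_appeal("post-48213", "the post is satire, not hate speech"))
```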
Enhanced User Safety
The DSA aims to protect users from illegal content, goods, and services online, and establishes clear obligations for digital platforms to remove such content swiftly once it is identified. Under the DSA, a social media platform is required to establish clear and effective mechanisms for users to report illegal content, such as hate speech or illegal goods being sold through the platform. Once content is reported, the platform must promptly assess it against its policies and the law and, if it is found to be illegal, remove or disable access to it in a timely manner.
Example: User comes across a post promoting hate speech or violence
Under the DSA, the user should be able to easily report this post through a clearly defined process provided by the social media platform. The platform, upon receiving the report, would need to review the content promptly and decide on the appropriate action based on its terms of service and the legal framework defined by the DSA. If the content is deemed illegal, the platform must remove it and possibly take further steps, such as informing law enforcement or taking action against the user account that posted the illegal content.
Additionally, the DSA mandates transparency regarding these decisions. The social media platform would need to provide information about the number of reports received, the action taken on reported content, and how quickly these actions were taken. This requirement aims to hold platforms accountable for their role in moderating content and protecting users from harmful and illegal content online.
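The figures in such a report are straightforward aggregates over the platform's moderation log. A toy example with invented data:

```python
from statistics import median

# Hypothetical log of handled reports: (action_taken, hours_to_action).
handled = [
    ("removed", 6), ("removed", 20), ("no_violation", 48),
    ("removed", 3), ("restricted", 12),
]

total = len(handled)
removed = sum(1 for action, _ in handled if action == "removed")
median_hours = median(hours for _, hours in handled)

# Aggregates like these would feed the public transparency report.
print(f"reports handled: {total}")
print(f"content removed: {removed} ({removed / total:.0%})")
print(f"median time to action: {median_hours} hours")
```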
Coordinated Rules Across the EU
By applying the same rules across all EU member states, the DSA aims to create a single digital market, making it easier for digital services to operate across borders.
Example: Illegal content, goods, or services sold/purchased on the platform
Before the DSA, online marketplaces operating in multiple EU member states faced a complex landscape of national laws and regulations. This complexity made it challenging and costly for these platforms to ensure compliance as they expanded across borders, potentially stifling growth and innovation.
With the implementation of the DSA, an online marketplace, regardless of its country of origin within the EU, is subject to a unified set of rules concerning the handling of illegal content, goods, or services sold on its platform. This means that whether the marketplace is based in Croatia and sells to customers in Germany, France, and Italy, or any other combination of EU countries, it must adhere to the same fundamental regulations regarding user safety, transparency, and accountability.
For instance, if a consumer in Germany purchases a counterfeit product from a seller in Portugal through an online marketplace based in Slovenia, the DSA ensures that the consumer has the same rights and protections regarding reporting the illegal item and having it removed, regardless of where the marketplace or seller is based. The marketplace is required to have mechanisms in place for the consumer to report the counterfeit product easily, investigate the report, and take appropriate action in a timely manner. This uniformity in regulations helps protect consumers, provides clarity for online marketplaces, and simplifies the regulatory environment for digital services operating across the EU.
Protection of Fundamental Rights
The DSA seeks to balance the removal of illegal content with the protection of users' fundamental rights, including freedom of expression and information.
Example: Post criticises a government's policy
Imagine a scenario where a user posts content on a social media platform that criticises a government's policy. The content is sharp but does not incite violence, nor does it spread hate speech or misinformation. However, the platform receives complaints claiming the content is harmful and should be removed.
Under the DSA, the platform is required to assess this content not just against its own policies but also in light of the DSA's emphasis on protecting users' fundamental rights, such as freedom of expression and information. The DSA mandates that platforms cannot arbitrarily remove content that is legally permissible, even if it is controversial or unpopular.
In this case, the platform would need to carefully evaluate whether the content genuinely violates any laws or its terms of service regarding illegal content. If the content is found to be a lawful exercise of freedom of expression, the platform, guided by the DSA, should allow it to remain accessible, despite the complaints. This decision would be in line with the DSA's goal to ensure that content moderation practices do not unjustly infringe on fundamental rights.
Furthermore, the DSA requires platforms to provide detailed reports on their content moderation activities, including decisions that involve balancing freedom of expression with content removal. This transparency allows users and regulatory bodies to assess how well the platform is adhering to its obligations under the DSA, including the protection of fundamental rights.
Due Diligence Obligations
Platforms are required to implement mechanisms for users to flag illegal content and for the platforms to cooperate with national authorities. They must provide clear terms of service, respect users' rights when moderating content, and implement systems for users to appeal content moderation decisions.
Example: Illegal goods and e-commerce platform
Consider an e-commerce platform that operates across several EU countries, offering a wide range of products, including electronics, clothing, and books. Under the DSA, this platform is required to implement a robust system that allows users and consumers to report listings that may involve illegal goods, such as counterfeit products, unsafe toys, or electronics that do not meet EU safety standards.
Upon receiving a report of such illegal goods, the platform must act quickly to assess the report and determine the appropriate course of action. This could involve removing the listing, suspending the seller's account, or even reporting the issue to relevant law enforcement or regulatory bodies if required by the DSA or national laws.
The platform must provide clear terms of service that explain the obligations of sellers, including the requirement to comply with all applicable laws regarding the safety and legality of products. It also needs to establish an appeal process for sellers whose listings are removed or whose accounts are suspended, ensuring that they have the opportunity to contest decisions they believe were made in error.
Compliance and Enforcement
The DSA establishes a framework for the enforcement of its provisions, including the role of national Digital Services Coordinators, the establishment of a European Board for Digital Services, and significant penalties for non-compliance.
Example: VLOP addressing risks
A very large online platform (VLOP), such as a major social media company with millions of users across the EU, is found to have systemic issues with spreading misinformation. Despite having policies in place to combat fake news, investigations by national Digital Services Coordinators reveal that the platform's efforts are insufficient and that its algorithms inadvertently promote misleading content, posing a significant risk to public health and safety, especially during an election period.
Under the DSA, the Digital Services Coordinator of the member state where the VLOP has its main establishment in the EU initiates an investigation, leveraging the DSA's provisions for access to data and conducting audits (for VLOPs, the European Commission also holds direct supervisory powers). The findings are shared with the European Board for Digital Services, a collective body comprising representatives from all EU member states, to assess the cross-border impact and recommend actions.
Given the severity and potential cross-border impact of the issue, the European Board for Digital Services works in coordination with the relevant national coordinator to draft a set of mandatory risk mitigation measures for the VLOP. These measures include revising its content recommendation algorithms to prevent the amplification of misinformation and implementing more robust content moderation processes.
The VLOP is given a deadline to comply with these measures or face significant penalties, which can reach fines of up to 6% of its total worldwide annual turnover for failing to address systemic risks effectively. The enforcement process involves close monitoring of the platform's compliance efforts, including periodic audits and assessments by the national Digital Services Coordinator and the European Board for Digital Services.
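For a sense of scale, the penalty ceiling is simple arithmetic; the turnover figure below is invented.

```python
# The DSA caps fines at 6% of total worldwide annual turnover.
MAX_FINE_RATE = 0.06

annual_turnover_eur = 25_000_000_000  # hypothetical: EUR 25 bn turnover
max_fine_eur = annual_turnover_eur * MAX_FINE_RATE

print(f"Maximum fine: EUR {max_fine_eur:,.0f}")  # EUR 1,500,000,000
```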
VLOPs and very large online search engines (VLOSEs) must conduct annual risk assessments to identify systemic risks associated with the dissemination of illegal content, negative effects on fundamental rights, and the intentional manipulation of their service. They are also required to take measures to mitigate these risks.
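Internally, such an assessment could be as simple as a mapping from risk category to severity and planned mitigations. The categories below mirror those named in the paragraph above; the severity score and mitigation are invented.

```python
# Sketch of an annual systemic-risk assessment record for a VLOP/VLOSE.
RISK_CATEGORIES = [
    "dissemination of illegal content",
    "negative effects on fundamental rights",
    "intentional manipulation of the service",
]

assessment = {category: {"severity": None, "mitigations": []}
              for category in RISK_CATEGORIES}

assessment["dissemination of illegal content"]["severity"] = "high"
assessment["dissemination of illegal content"]["mitigations"].append(
    "expand the trusted-flagger programme"
)

for category, entry in assessment.items():
    print(f"{category}: severity={entry['severity']}, "
          f"mitigations={entry['mitigations']}")
```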
Summary
For users, the DSA promises a safer online environment, greater transparency about the workings of online platforms, and enhanced mechanisms for redress and appeal.
The implementation of the DSA also poses challenges, including potential impacts on smaller platforms and startups due to the cost and complexity of compliance. There are also concerns about the balance between removing harmful content and safeguarding freedom of expression, as well as the technical feasibility of some of the DSA's requirements.
The EU Digital Services Act is a landmark piece of legislation aimed at creating a safer, more transparent, and more accountable digital environment. Its success will depend on effective implementation and the ability of both platforms and regulators to adapt to its requirements.