The Global Consequences of Europe’s New Digital Regulatory Regime
Why technology companies should care about the EU’s Digital Services Act
A new and dramatic approach to regulating big technology firms is coming into force across the European Union. The Digital Services Act (DSA), which entered into force on November 16, 2022, aims to protect the digital space against the spread of illegal content, particularly on social networks, content-sharing platforms and e-commerce sites. The DSA carries significant financial penalties and enforcement actions against infringing companies, including fines of up to 6% of global annual turnover and, for repeat offenders, even a temporary suspension of service in the EU single market.
DSA Enforcement Regime
Digital Services Coordinators (DSCs)
- A national authority designated by each EU member state
- Responsible for all supervision and enforcement of the DSA at the national level
- Has authority to: request and seize documents; inspect premises; impose fines or periodic penalty payments; adopt interim measures
The European Commission
- Primary regulator for very large online platforms and very large online search engines; can initiate investigations on its own or at the recommendation of a DSC
- Responsible for streamlining enforcement and preventing lax enforcement by member states
- Has authority to (in addition to the powers available to DSCs): perform due diligence activities, including risk assessments; initiate independent audits; issue noncompliance decisions; make commitments offered by companies legally binding
The European Board for Digital Services
- Advisory group of national Digital Services Coordinators tasked with ensuring the consistent application of the DSA
- Responsible for drafting Codes of Conduct and supporting joint investigations between DSCs and the European Commission
- Has authority to: advise the European Commission and DSCs on enforcement measures and adopt opinions addressed to DSCs
Summary of Key Obligations
All Intermediary Services Providers [e.g., internet access providers, domain name registrars]
- Service contracts (terms and conditions) must meet certain minimum requirements to ensure clarity, transparency and fairness.
- Terms and conditions should include information on any policies, procedures, measures and tools used for content moderation.
- Companies must publish an annual transparency report (a minimal sketch of such a report's structure follows this list).
- A company without an establishment in the EU must appoint a legal representative in one of the member states where it operates.
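For the transparency-report obligation, a machine-readable structure is a natural starting point. The Python sketch below is a minimal illustration of how a provider might model the figures such a report aggregates; the class and field names are hypothetical, since the DSA specifies categories of information rather than a schema.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class TransparencyReport:
    """Hypothetical annual DSA transparency report record (illustrative fields)."""
    year: int
    authority_orders_received: int    # orders received from member state authorities
    notices_received: int             # illegal-content notices from third parties
    items_actioned: int               # items removed or access-disabled
    median_action_time_hours: float   # median time taken to act on a notice
    automated_decision_share: float   # share of decisions made by automated tools

    def to_json(self) -> str:
        # Machine-readable output suitable for publication.
        return json.dumps(asdict(self), indent=2)

report = TransparencyReport(
    year=2023,
    authority_orders_received=42,
    notices_received=15_300,
    items_actioned=9_870,
    median_action_time_hours=18.5,
    automated_decision_share=0.62,
)
print(report.to_json())
```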
Hosting Services Providers (Including Online Platforms and Very Large Online Platforms) [e.g., companies offering cloud computing and web hosting services]
- User-friendly notice-and-takedown mechanisms must be provided so that third parties can report illegal content (one possible workflow is sketched after this list).
- Notices of illegal content must be processed swiftly, and prompt action taken to address the issue (i.e., removing or disabling access to the content).
- The anonymity of content reporters should be protected, except in cases involving violations of image rights or intellectual property rights.
- Serious criminal offences involving a threat to the life or safety of persons must be reported to law enforcement or judicial authorities.
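In engineering terms, a notice-and-action mechanism is a small workflow: accept a structured notice, triage it, and record the action taken. The Python sketch below shows one plausible shape for that workflow under simplified assumptions; the names (Notice, handle_notice) and the keyword-based triage rule are hypothetical, not a design prescribed by the DSA, and a real system would pair rules like these with human review.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Optional

class Action(Enum):
    ACCESS_DISABLED = "access_disabled"
    REJECTED = "rejected"
    ESCALATED = "escalated_to_authorities"

@dataclass
class Notice:
    content_id: str
    reason: str                      # reporter's explanation of the alleged illegality
    reporter_contact: Optional[str]  # may be None to preserve the reporter's anonymity
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def handle_notice(notice: Notice) -> Action:
    """Simplified triage; a production system combines policy rules with human review."""
    reason = notice.reason.lower()
    if "threat to life" in reason or "safety" in reason:
        # Offences threatening life or safety must be reported to the authorities.
        return Action.ESCALATED
    if "illegal" in reason or "counterfeit" in reason:
        return Action.ACCESS_DISABLED
    return Action.REJECTED

print(handle_notice(Notice("post-123", "listing of illegal goods", None)).value)
```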
Online Platforms (all) [e.g., online marketplaces, app stores, social media platforms]
- An internal complaint-handling system must be created so that service recipients affected by content moderation decisions can lodge complaints within a set period (see the sketch after this list).
- Companies must provide transparency reports detailing the number of complaints received, disputes submitted to out-of-court dispute settlement bodies, suspensions imposed, advertisements removed, and the use of automated content moderation tools.
- Companies are prohibited from using “dark patterns,” a range of potentially manipulative user interface designs used on websites and mobile apps.
- Qualified staff must be assigned to review complaints and ensure compliance with standards.
- Platform operators must remove traders offering illegal products or content and maintain a record of those removals.
- Companies are required to clearly identify the parameters used to determine which users receive an advertisement.
- Providers that use software to predict user preferences must disclose to users how the system operates and what options are available to modify those preferences.
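The complaint-handling obligation amounts to a small state machine: a complaint is lodged against a moderation decision within a set window, reviewed by qualified staff, and resolved with an outcome. The following is a minimal Python sketch under those assumptions; the names and the 180-day window are illustrative, not figures taken from the regulation's text.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from enum import Enum

# Illustrative complaint window; the DSA sets a minimum period, not this exact figure.
COMPLAINT_WINDOW = timedelta(days=180)

class Status(Enum):
    OPEN = "open"
    UPHELD = "decision_upheld"
    REVERSED = "decision_reversed"
    OUT_OF_TIME = "lodged_too_late"

@dataclass
class Complaint:
    decision_id: str        # the content moderation decision being contested
    decided_at: datetime
    lodged_at: datetime
    status: Status = Status.OPEN

def lodge(decision_id: str, decided_at: datetime) -> Complaint:
    """Accept a complaint only within the allowed window after the decision."""
    now = datetime.now(timezone.utc)
    complaint = Complaint(decision_id, decided_at, now)
    if now - decided_at > COMPLAINT_WINDOW:
        complaint.status = Status.OUT_OF_TIME
    return complaint

def resolve(complaint: Complaint, reviewer_reverses: bool) -> Complaint:
    """Qualified staff, not automation alone, determine the outcome."""
    if complaint.status is Status.OPEN:
        complaint.status = Status.REVERSED if reviewer_reverses else Status.UPHELD
    return complaint

c = lodge("decision-42", datetime.now(timezone.utc) - timedelta(days=10))
print(resolve(c, reviewer_reverses=True).status.value)
```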
Very Large Online Platforms (only) [i.e., platforms with a reach of more than 10% of the 450 million consumers in the European Union, or roughly 45 million users]
- Companies must identify systemic risks stemming from the use of their services, particularly those related to the sharing of illegal content.
- Mitigation measures must be implemented to address these systemic risks.
- Compliance with the DSA and related obligations must be assessed through audits performed by independent firms.
- Providers using recommender systems must offer at least one option that is not based on profiling and allow users to set their preferred options for content ranking (a sketch of this follows the list).
- Providers are required to publish information on advertisements that have been displayed on their platform, including the targeted audience, relevant parameters and number of recipients reached.
- Content deemed to be “deep-fakes” shall be clearly labeled.
- Companies will be required to share data with authorities so that they can monitor and assess compliance with the DSA, and to provide data access to vetted researchers who meet stipulated requirements.
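The recommender obligation effectively requires a user-selectable ranking mode. The Python sketch below illustrates the idea with hypothetical names: a profiled ranking alongside a non-profiling alternative (here, purely chronological), switched by an explicit user setting rather than by the platform.

```python
from datetime import datetime

def rank_by_profile(items: list[dict], user_profile: dict) -> list[dict]:
    """Personalized ranking based on inferred interests (profiling)."""
    interests = set(user_profile.get("interests", []))
    return sorted(items, key=lambda i: len(interests & set(i["topics"])), reverse=True)

def rank_chronologically(items: list[dict]) -> list[dict]:
    """Non-profiling alternative: newest first, no personal data involved."""
    return sorted(items, key=lambda i: i["published_at"], reverse=True)

def build_feed(items: list[dict], user_profile: dict, use_profiling: bool) -> list[dict]:
    # The user's stored preference, not the platform, selects the ranking mode.
    return rank_by_profile(items, user_profile) if use_profiling else rank_chronologically(items)

items = [
    {"id": 1, "topics": ["sports"], "published_at": datetime(2024, 5, 1)},
    {"id": 2, "topics": ["music"], "published_at": datetime(2024, 5, 3)},
]
profile = {"interests": ["music"]}
print([i["id"] for i in build_feed(items, profile, use_profiling=False)])  # -> [2, 1]
```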
The Digital Services Act has the potential to become the gold standard for regulators around the world. By setting new standards for a safer and more accountable online environment, the DSA marks the beginning of a new relationship among online platforms, users and regulators in the European Union and beyond.
While its short-term enforceability remains uncertain, the long-term impact of the DSA will fundamentally change how some companies approach doing business online and, more specifically, how they approach online content moderation. Many organizations will face multiyear efforts to rebuild or reengineer existing internal processes and control functions.
Many non-EU regulators are likely to follow the European Union in introducing similarly stringent measures to fight so-called harmful content. In the United States, establishing such a regulatory framework at the federal level will require treading carefully between enabling free expression and access to information, and fostering a digital environment that is safe for all people without stymieing competition and innovation. In the absence of a DSA-like federal law, individual U.S. states are likely to continue the trend of regulating internet content.
The sheer complexity of the DSA's obligations means companies should begin building their compliance strategies and operational approaches now. This is the time to evaluate your organization's current practices and legal obligations, and to develop a comprehensive program able to detect, notify, remove and prevent illegal content. Developing such a program, one that aligns with where global regulatory trends are headed, will allow organizations to build trust with consumers and regulators, and go a long way toward ensuring they act with integrity and the highest ethical standards in a digital environment.