In the small hours local time, European Union lawmakers secured a provisional deal on a landmark update to rules for digital services operating in the region — clinching political agreement after a final late-night/early-morning round of compromise talks on the detail of what is a major retooling of the bloc’s existing ecommerce rulebook.
The political agreement on the Digital Services Act (DSA) paves the way for formal adoption in the coming weeks and for the legislation to enter into force — likely later this year. The rules won’t start to apply until 15 months after that, however — so there’s a fairly long lead-in time to allow companies to adapt.
The regulation is wide ranging — setting out to harmonize content moderation and other governance rules to speed up the removal of illegal content and products. It addresses a grab-bag of consumer protection and privacy concerns, and introduces algorithmic accountability requirements for large platforms to dial up societal oversight of their services, while ‘know your customer’ (KYC) requirements are intended to do the same for online marketplaces.
A prohibition on the use of so-called ‘dark patterns’ for online platforms is also included — but not, it appears, a full blanket ban for all types of digital service (per details of the final text shared with TechCrunch via our sources).
See below for a fuller breakdown of what we know so far about what’s been agreed.
The Commission’s mantra for the DSA has always been that the goal is to ensure that what’s illegal offline will be illegal online. And in a video message tweeted out in the small hours local time, a tired but happy looking EVP, Margrethe Vestager, said it’s “not a slogan anymore that what’s illegal offline should also be seen and dealt with online”.
“Now it is a real thing,” she added. “Democracy’s back.”
In a statement, Commission president Ursula von der Leyen added:
In a statement, the European Parliament’s rapporteur for the file, MEP Christel Schaldemose, further suggested the DSA will “set new global standards”, adding: “Citizens will have better control over how their data are used by online platforms and big tech-companies. We have finally made sure that what is illegal offline is also illegal online. For the European Parliament, additional obligations on algorithmic transparency and disinformation are important achievements. These new rules also guarantee more choice for users and new obligations for platforms on targeted ads, including bans to target minors and restricting data harvesting for profiling.”
Other EU lawmakers are fast dubbing the DSA a “European constitution for the Internet”. And it’s hard not to see the gap between the EU and the US on comprehensive digital lawmaking as increasingly wide.
It’s worth emphasizing that the full and final text hasn’t been published yet — and won’t be for a while. It’s pending legal checks and translation into the bloc’s many languages — which means the full detail of the regulation, and the implications of all its nuance, remain tbc.
But here’s an overview of what we know so far…
Scope, supervision & penalties
On scope, the Council says the DSA will apply to all online intermediaries providing services in the EU.
The regulation’s obligations are intended to be proportionate to the nature of the services concerned and the number of users — with extra, “more stringent” requirements for “very large online platforms” (aka VLOPs) and very large online search engines (VLOSEs).
Services with more than 45M monthly active users in the EU will be considered VLOPs or VLOSEs. So plenty of services will reach that bar — including, for example, the homegrown music streaming giant Spotify.
“To safeguard the development of start-ups and smaller enterprises in the internal market, micro and small enterprises with under 45 million monthly active users in the EU will be exempted from certain new obligations,” the Council adds.
The Commission itself will be responsible for supervising VLOPs and VLOSEs for the obligations that are specific to them — which is intended to avoid bottlenecks in oversight and enforcement against larger platforms (as happened with the EU’s GDPR).
But national agencies at the Member State level will supervise the wider scope of the DSA — so EU lawmakers say this arrangement maintains the country-of-origin principle that’s baked into existing digital rules.
Penalties for breaches of the DSA can scale up to 6% of global annual turnover.
Per the parliament, there will also be a right for recipients of digital services to seek redress for any damages or loss suffered due to infringements by platforms.
Content moderation & marketplace rules
The content moderation measures are focused on harmonizing rules to ensure “swift” removal of illegal content.
This is being done through what the parliament describes as a “clearer ‘notice and action’ procedure” — where “users will be empowered to report illegal content online and online platforms will have to act quickly”, as it puts it.
It also flags support for victims of cyber violence — who it says will be “better protected especially against non-consensual sharing (revenge porn) with immediate takedowns”.
MEPs say fundamental rights are protected from the risk of over-removal of content — a risk created by the regulation’s pressure on platforms to act quickly — through “stronger safeguards to ensure notices are processed in a non-arbitrary and non-discriminatory manner and with respect for fundamental rights, including the freedom of expression and data protection”.
The regulation is also intended to ensure swift removal of illegal products/services from online marketplaces. So there are new requirements incoming for ecommerce players.
On this, the Council says the DSA will impose a “duty of care” on marketplaces vis-à-vis the sellers offering products or services on their platforms.
“Marketplaces will in particular have to collect and display information on the products and services sold in order to ensure that consumers are properly informed,” it notes, although there will be plenty of devil in the detail of the exact provisions.
On this, the parliament says marketplaces will “have to ensure that consumers can purchase safe products or services online by strengthening checks to prove that the information provided by traders is reliable (‘Know Your Business Customer’ principle) and make efforts to prevent illegal content appearing on their platforms, including through random checks”.
Extra obligations for VLOPs/VLOSEs
These larger platform entities will face scrutiny of how their algorithms work from the European Commission and Member State agencies — which the parliament says will both have access to the algorithms of VLOPs.
The DSA also introduces an obligation for very large digital platforms and services to analyse “systemic risks they create” and to carry out “risk reduction analysis”, per the Council.
The analysis must be done annually — which the Council suggests will allow for monitoring of, and reduced risks in, areas such as: the dissemination of illegal content; adverse effects on fundamental rights; manipulation of services having an impact on democratic processes and public security; adverse effects on gender-based violence and on minors; and serious consequences for the physical or mental health of users.
Additionally, VLOPs/VLOSEs will be subject to independent audits each year, per the parliament.
Large platforms that use algorithms to determine what content users see (aka “recommender systems”) will have to provide at least one option that is not based on profiling. Many already do — although they often also undermine these choices by applying dark pattern techniques to nudge users away from control over their feeds — so holistic supervision will be needed to meaningfully improve user agency.
There will also be transparency requirements for the parameters of these recommender systems with the goal of improving information for users and any choices they make. Again, the detail will be interesting to see there.
Limits on targeted advertising
Restrictions on tracking-based advertising appear to have survived the trilogue process with all sides reaching agreement on a ban on processing minors’ data for targeted ads.
This applies to platforms accessible to minors “when they are aware that a user is a minor”, per the Council.
“Platforms will be prohibited from presenting targeted advertising based on the use of minors’ personal data as defined in EU law,” it adds.
A final compromise text shared with TechCrunch by our sources suggests the DSA will stipulate that providers of online platforms should not do profile based advertising “when they are aware with reasonable certainty that the recipient of the service is a minor”.
A restriction on the use of sensitive data for targeted ads has also made it into the text.
The parliament sums this up by saying “targeted advertising is banned when it comes to sensitive data (e.g. based on sexual orientation, religion, ethnicity)”.
The wording of the final compromise text which we’ve seen states that: “Providers of online platforms shall not present advertising to recipients of the service based on profiling within the meaning of Article 4(4) of Regulation 2016/679 [aka, the GDPR] using special categories of personal data as referred to in article 9(1) of Regulation 2016/679.”
Article 4(4) of the GDPR defines ‘profiling’ as: “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements;”.
The GDPR, meanwhile, defines special category data as personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs or trade union membership, as well as biometric and health data, and data on a person’s sex life and/or sexual orientation.
So targeting ads based on tracking or inferring users’ sensitive interests is — on paper — facing a hard ban in the DSA.
Ban on use of dark patterns
A prohibition on dark patterns also made it into the text. But, as we understand it, this only applies to “online platforms” — so it does not look like a blanket ban across all types of apps and digital services.
That is unfortunate. Unethical practices shouldn’t be acceptable no matter the size of the business.
On dark patterns, the parliament says: “Online platforms and marketplaces should not nudge people into using their services, for example by giving more prominence to a particular choice or urging the recipient to change their choice via interfering pop-ups. Moreover, cancelling a subscription for a service should become as easy as subscribing to it.”
The wording of the final compromise text that we’ve seen says that: “Providers of online platforms shall not design, organize or operate their online interfaces in a way that deceives, manipulates or otherwise materially distorts or impairs the ability of recipients of their service to make free and informed decisions” — after which there’s an exemption for practices already covered by Directive 2005/29/EC [aka the Unfair Commercial Practices Directive] and by the GDPR.
The final compromise text we reviewed further notes that the Commission may issue guidance on specific practices — such as platforms giving more prominence to certain choices, repeatedly requesting a user makes a choice after they already have and making it harder to terminate a service than sign up. So the effectiveness of the dark pattern ban could well come down to how much attention the Commission is willing to give to a massively widespread online problem.
The wording of the associated recital in the final compromise we saw also specifies that the dark pattern ban (only) applies for “intermediary services”.
An entirely new article was also added to the DSA following Russia’s invasion of Ukraine — and in connection with rising concern over the impact of online disinformation — creating a crisis response mechanism that will give the Commission extra powers to scrutinize VLOPs/VLOSEs in order to analyze the impact of their activities on the crisis in question.
The EU’s executive will also be able to come up with what the Council bills as “proportionate and effective measures to be put in place for the respect of fundamental rights”.
The mechanism will be activated by the Commission on the recommendation of the board of national Digital Services Coordinators.