Tomorrow, the world's biggest tech companies will finally be confronted with a European Union law designed to change the Internet forever, one requiring unprecedented transparency and accountability from companies operating large platforms like Meta, TikTok, X, Google, Apple, and Amazon.
This landmark legislation, the Digital Services Act (DSA), could end up cornering platforms into a difficult choice: incur heavy fines or risk disrupting their core business models by making it easier than ever for users to opt out of recommendation systems. Repeat offenders could be banned from operating in the EU entirely.
Today, as Reuters reported that platforms are bracing for this major change, the question remains: How prepared is the EU to actually enforce the DSA?
Under the DSA, platforms will have to stop harmful content from spreading, stop or limit targeting users with personalized content and ads, and give researchers insight into how their opaque recommendation algorithms work.
Violations could occur if platforms don't provide clear information on what they're recommending to users or how to opt out of those recommendations. Platforms could also risk violations for not diligently processing reports of illegal content.
As of tomorrow, these rules only apply to 19 very large online platforms and search engines, but by mid-February, "they will apply to a variety of online platforms, regardless of size," Reuters reported.
Platforms do not appear to be prepared to comply with the DSA. This summer, Facebook, Instagram, Twitter, TikTok, and Snapchat all conducted "stress tests" to see if they could "detect, address and mitigate systemic risks, such as disinformation," Reuters reported, and all five platforms failed to satisfy DSA standards.
The European Commission (EC) instructed platforms to do more to prepare for compliance before the DSA takes effect, but the clock has now run out, and critics have said that platforms may simply pay the massive fines they can easily afford. With questions remaining about how consistently the EU will enforce the DSA, some platforms may calculate that inevitable violations are worth the risk, especially if they anticipate little to no enforcement.
The stakes remain high for the EU, which continues to set the bar for ever-stricter tech regulation. How effectively these rules can be enforced will likely influence not just how platforms behave and how the Internet functions in the EU, but also whether officials elsewhere decide to copy legislation like the DSA.
Last year, it was already clear to many observers that the DSA would be challenging to enforce. One EU policy expert, Asha Allen, the advocacy director for Europe, Online Expression & Civic Space at the Centre for Democracy & Technology's Europe Office, wrote in a Wired op-ed that when the DSA was passed, there "simply" wasn't "the institutional capacity to enact this legislation effectively."
It's unclear what changed in the past year or if the EU followed through on what Allen claimed was among the most critical steps to effectively support the DSA's ambitious goals: hiring enough "world-class data scientists and experts to aid in the enforcement of the new algorithmic transparency and data accessibility obligations."
"Quite simply, without the meaningful engagement of advocates in the implementation and enforcement of the entire DSA, the potentially groundbreaking provisions we have collectively worked so diligently to obtain in the text won't come to fruition," Allen warned.
Ars couldn't immediately reach the EC or Allen for comment on what has changed since last summer when the commission provided a "sneak peek" of what DSA enforcement would look like. At that point, the EC admitted that "introducing new obligations on platforms and rights for users would be pointless if they are not properly enforced."
EU’s plan to enforce the DSA
In July 2022, the EC's press release explained that the commission planned to designate a regulator in each EU member state with powers of enforcement, noting that DSA takedown "orders sent by national public authorities will be more effective to tackle illegal content and get information about the wrongdoer, in particular cross-borders." The EC also planned to rely on non-government organizations, hotlines, and rightsholders as "trusted flaggers" of illegal content—which the DSA requires platforms to make easier to report.
Any platform found violating the DSA risks fines of up to 6 percent of the company's global turnover, with sanctions expected to "be gradual and unprecedented in their scope." (For a company with, say, €100 billion in annual global turnover, that cap would amount to €6 billion.) The EC said that its enforcement plan would also make it easier for "damaged consumers" to raise "class actions against platforms breaking the rules" and seek compensation.
Platforms themselves will cover the costs of EU regulators monitoring them and enforcing the DSA, including funding the EU's efforts to hire the experts tasked with that scrutiny.
Earlier this year, the EC launched its "high-profile" European Centre for Algorithmic Transparency, which "will provide the Commission with in-house technical and scientific expertise to ensure that algorithmic systems used by the Very Large Online Platforms and Very Large Online Search Engines comply with the risk management, mitigation, and transparency requirements in the DSA." These experts will "not only focus on identifying and addressing systemic risks stemming from Very Large Online Platforms and Very Large Online Search Engines, but also investigate the long-term societal impact of algorithms," the EC promised.
To help platforms avoid violations, each platform will be linked to a legal representative in the EU, the press release said. This will seemingly ensure that platforms can always get pressing compliance questions answered quickly.
Platforms still not ready for DSA
Before Twitter rebranded as X, the platform failed a stress test in March, leading the EU's internal market commissioner, Thierry Breton, to direct X to hire more people to help with content moderation. At the time, Twitter said that it intended to "comply fully with the Digital Services Act" but that hiring moderators would take time. Twitter also promised to hire enough staff this year to comply with the DSA, but it's unclear whether X has achieved that goal in the months since.
While X appeared to embrace DSA compliance, other tech companies have fought to exclude their products from the list of platforms the DSA currently targets. Notably, Amazon claimed it was being "unfairly singled out" in an appeal asking the EU to leave Amazon out of the DSA, insisting that it shouldn't be counted as a very large online platform when competitors with larger businesses in certain EU countries are not.
Amazon is still fighting that battle in court, Reuters reported, although any success there would seemingly only kick the can down the road for the e-commerce giant. Once the DSA gets past this initial rollout for large platforms, the rules will apply to even more platforms, regardless of size, by mid-February 2024.
Despite Amazon's efforts to wriggle out of complying with the DSA, Reuters reported that the company has taken some steps toward compliance, including providing a new channel for customers to report incorrect information in its product listings.
Google has also made some changes to comply with the DSA, a company blog said today. Those changes include a DSA-required complaint-handling system that allows YouTube users to appeal decisions to remove or restrict videos, a "priority flagger" program where experts can report illegal content, and new metrics in transparency reports to "give more context about our work to protect users from harmful content."
Additionally, Google promised to share more data on how targeted ads work and to "increase data access" for researchers investigating "systemic content risks in the EU" and "looking to understand more about how Google Search, YouTube, Google Maps, Google Play, and Shopping work in practice."
Google, for its part, appears to have rushed to ensure compliance. But platforms that repeatedly fail to comply with the DSA could risk being shut down across the EU, and that prospect has become a top concern for human rights groups. The DSA "gives a lot of power to government agencies and other parties with partisan interests to flag and remove potentially illegal content," the digital rights nonprofit the Electronic Frontier Foundation (EFF) warned in July. This could have "negative consequences for vulnerable and historically oppressed groups," including allowing governments to use arbitrary Internet shutdowns "as a weapon to punish platforms for not removing 'hateful content,'" the EFF said.
The EFF and 66 human rights and free speech advocacy groups have urged EU lawmakers to embrace a "human rights-centered approach" to DSA enforcement. That demand adds another layer of complexity to regulators' ambitious supervisory obligations under the landmark law, making it harder to predict how effectively the EU can enforce the DSA and harder to avoid what the EFF described as any "wrong turn in shaping enforcement policies."
Ars could not immediately reach the EFF for comment. [Update: The director of the Europe office for the Centre for Democracy & Technology (CDT), Iverna McGowan, told Ars that "to avoid the DSA becoming a paper tiger, it will be crucial that national-level enforcement bodies are independent and well-resourced, that civil society be given a formal role in enforcement oversight, and that there be careful attention to maintaining a public interest focus on questions such as the foreseen auditing of algorithms." McGowan said that it's still too early to tell if the European Centre for Algorithmic Transparency has the staffing levels or correct methodology it needs to audit platforms' algorithms.
McGowan said that CDT expects fines to be significant enough to encourage compliance, noting that it would be a "cynical" approach for platforms to pay fines to avoid compliance, because "the aim of this law is to better protect human rights and democracy online." CDT is perhaps more "concerned at how some provisions will pan out in practice," like allowing law enforcement to flag and remove content outside usual due process procedures. To prevent negative consequences of these kinds of potential regulatory oversights, CDT "is calling for a formal consultative role for civil society to oversee effective enforcement."]
[Second update: The EFF's analyst and media relations specialist Karen Gullo told Ars that "the DSA has a co-regulatory model for a reason. The regulation gives governments and entities with partisan interests a lot of power to remove content from the internet, but there is a fine line to walk between removing allegedly illegal content and censorship. Users must have a voice in how the DSA is enforced; free expression and other rights are at stake, and taking a human rights-based approach to enforcement won't just happen by itself."
Because "platforms’ decision-making on what content to remove could be susceptible to outside influences that undermine fundamental rights protections," Gullo said that regulatory authorities at both the national and EU levels "should be talking with civil society organizations like EFF and groups in the DSA Human Rights Alliance, whose members monitor human rights abuses related to legislation." Consulting with "civil society and digital right defenders" that advocate for users’ interests is "is especially important for vulnerable groups frequently impacted by bad enforcement schemes," Gullo said.
"The DSA HR Alliance has a critical role to play to help make sure that doesn’t happen," Gullo told Ars.]