European Commission cites burdensome research data access, unclear reporting, and inadequate appeal processes as violations.
Nguyen Hoai Minh • 11 days ago
The European Commission isn't pulling any punches. In a significant move, it has formally charged tech giants Meta and TikTok over alleged failures to comply with the stringent rules of the Digital Services Act (DSA). These aren't minor technicalities, either. We're talking about fundamental obligations regarding the fight against illegal content, especially child sexual abuse material (CSAM), and ensuring researchers have proper access to data. This signals a serious escalation in the EU's ongoing efforts to rein in powerful online platforms.
The charges stem from investigations that highlight what the Commission views as systemic issues. Both Meta, the parent company of Facebook and Instagram, and TikTok, owned by ByteDance, stand accused of creating "burdensome procedures and tools" for researchers. What does this mean in practical terms? Well, it makes it incredibly difficult for independent experts to access public data, which is crucial for studying pressing issues like how minors are exposed to illegal or harmful content online. Incomplete or unreliable information is simply not good enough when it comes to child safety. The DSA is crystal clear: transparency through data access is a foundational requirement.
Let's drill down into the specifics. For both Meta and TikTok, the researcher-access complaint centers on restrictive APIs and lengthy verification processes. Meta's new tools, for instance, demand approvals that can take weeks, significantly slowing critical research. TikTok's API caps how much data researchers can query, which is hardly conducive to comprehensive studies. How can anyone truly understand the spread of dangerous content if the data gates are locked?
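To make that bottleneck concrete, here is a rough, purely illustrative sketch of how a per-day query ceiling stretches a study's timeline. The daily cap, page size, and dataset size below are invented numbers for the sake of the arithmetic, not TikTok's or Meta's actual limits:

```python
import math

# Hypothetical numbers for illustration only -- not the platforms' real limits.
records_needed = 5_000_000   # posts a researcher wants to analyse
records_per_query = 100      # records returned per API call
queries_per_day = 1_000      # daily query cap imposed by the platform

queries_total = math.ceil(records_needed / records_per_query)  # 50,000 calls
days_to_collect = math.ceil(queries_total / queries_per_day)   # 50 days

print(f"{queries_total:,} queries needed -> {days_to_collect} days of collection")
```

Fifty days of data collection, before any access-approval wait is even counted, is exactly the kind of friction the Commission says keeps independent researchers from studying fast-moving harms in time to matter.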
And what about appeals? Under the DSA, users have a right to challenge content moderation decisions. Yet, the Commission found that neither Facebook nor Instagram provides adequate mechanisms for users to explain their side of the story or provide supporting evidence when appealing a content removal or account suspension. That seriously limits the effectiveness of the appeal process, doesn't it? It's like asking someone to defend themselves without letting them speak.
So, how are the companies responding? Meta, for its part, maintains it has already implemented changes since the DSA came into force, covering content reporting, appeals, and data access, and expressed confidence that its current solutions meet EU legal requirements. TikTok, meanwhile, acknowledged the findings and is reviewing them, but also raised a rather interesting point: the DSA's push to open platform data to researchers can sit in tension with the GDPR's privacy safeguards. TikTok is asking regulators for guidance on how to reconcile the two obligations, which is a fair question to tackle.
These are significant charges, folks. If found non-compliant, Meta and TikTok could face eye-watering fines of up to 6 percent of their total worldwide annual turnover. For a company like Meta, that translates into billions (see the quick calculation at the end of this piece). Both companies now have the opportunity to examine the Commission's investigation files and reply in writing. More importantly, they have a chance to implement changes to comply with DSA rules. The clock is ticking, with a deadline generally set for mid-November. The EU is serious about online safety, and this crackdown is proof positive. But will these tech giants finally fall in line, or will we see this saga unfold further into the new year?
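For a sense of scale, here's the back-of-the-envelope arithmetic behind that "billions" figure. It assumes Meta's reported full-year 2024 revenue of roughly $164.5 billion as the base; the exact turnover figure the Commission would use, and its conversion into euros, is a detail for the lawyers:

```python
# Illustrative only: the DSA caps fines at 6% of total worldwide annual turnover.
meta_revenue_usd = 164.5e9   # Meta's reported 2024 revenue, ~$164.5 billion
dsa_fine_cap = 0.06          # 6 percent ceiling under the DSA

max_fine = meta_revenue_usd * dsa_fine_cap
print(f"Maximum possible fine: ${max_fine / 1e9:.1f} billion")  # ~$9.9 billion
```

A ceiling of nearly $10 billion is a worst case rather than a prediction, but it shows why even a fraction of the maximum would be one of the largest regulatory penalties in tech history.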