
Bluesky's First Transparency Report Reveals Growing Pains of a 41M-User Platform


Bluesky published its first comprehensive transparency report showing 60% user growth alongside surging moderation reports and legal requests, revealing the challenges facing the Twitter alternative.

41.2 million users and counting—Bluesky just dropped its first transparency report, and the numbers tell a story that goes far beyond simple growth metrics. The Twitter alternative didn't just share success stories; it laid bare the messy realities of running a rapidly expanding social platform in 2025.

The Price of Popularity

Bluesky grew nearly 60% in 2025, jumping from 25.9 million to 41.2 million users. That's impressive for any platform, let alone one positioning itself as a decentralized alternative to X and Threads. Users posted 1.41 billion times last year—representing 61% of all posts ever made on the platform.

But growth came with growing pains. User reports surged 54%, from 6.48 million to 9.97 million. Legal requests from law enforcement and regulators jumped more than sixfold, from 238 to 1,470. It's the classic social media paradox: more users mean more problems, and Bluesky is learning this lesson in real time.

What's particularly telling is that user reports increased 54% while the user base grew nearly 60%, meaning the rate of reports per user stayed roughly constant (and even dipped slightly). That suggests Bluesky's moderation challenges aren't necessarily getting worse per capita, but they're definitely getting bigger in absolute terms.
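A quick back-of-the-envelope check, using only the figures quoted above, bears this out. The snippet below is just arithmetic on the report's published numbers, not data pulled from Bluesky:

```python
# Sanity check on the per-capita claim, using the figures quoted in the report.
users_2024, users_2025 = 25.9e6, 41.2e6
reports_2024, reports_2025 = 6.48e6, 9.97e6

user_growth = users_2025 / users_2024 - 1        # ~0.59
report_growth = reports_2025 / reports_2024 - 1  # ~0.54

rate_2024 = reports_2024 / users_2024  # ~0.250 reports per user
rate_2025 = reports_2025 / users_2025  # ~0.242 reports per user

print(f"user growth: {user_growth:.0%}, report growth: {report_growth:.0%}")
print(f"reports per user: {rate_2024:.3f} -> {rate_2025:.3f}")
```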

The Anatomy of Online Conflict

The most reported category? "Misleading" content at 43.73% of all reports, with spam accounting for 2.49 million of those. Harassment came second at 19.93%, followed by sexual content at 13.54%.

Here's where it gets interesting: Bluesky admits that most harassment reports fell into a "gray area" of antisocial behavior—think rude remarks rather than outright hate speech. This highlights one of the thorniest problems in content moderation: where exactly do you draw the line between free expression and harmful behavior?

The sexual content reports tell another story. Of 1.52 million reports, most concerned "mislabeling": adult content that wasn't properly tagged with metadata. This reflects Bluesky's user-driven moderation philosophy, where people control their own experience through labels and filters. But it also shows that the system only works when everyone plays by the rules.
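To make that concrete, here's a minimal sketch of how label-driven filtering works in principle. The label names, data structures, and preference values are illustrative assumptions for this article, not the AT Protocol's actual label schema; the point is simply that the filter can only act on labels that are actually attached.

```python
# Illustrative sketch of label-driven filtering: each post carries labels
# (applied by the author, the platform, or third-party labelers), and each
# user decides how those labels are handled. Names and values here are
# hypothetical, not the AT Protocol's real schema.
from dataclasses import dataclass, field

HIDE, WARN, SHOW = "hide", "warn", "show"

@dataclass
class Post:
    text: str
    labels: set[str] = field(default_factory=set)

@dataclass
class UserPrefs:
    # e.g. {"adult": "hide", "graphic": "warn"}
    label_actions: dict[str, str] = field(default_factory=dict)

def resolve(post: Post, prefs: UserPrefs) -> str:
    # The strictest action requested by any matching label wins.
    actions = [prefs.label_actions.get(label, SHOW) for label in post.labels]
    if HIDE in actions:
        return HIDE
    if WARN in actions:
        return WARN
    return SHOW

prefs = UserPrefs(label_actions={"adult": HIDE})
print(resolve(Post("tagged post", {"adult"}), prefs))  # hide
print(resolve(Post("untagged post", set()), prefs))    # show
```

The correctly labeled post gets hidden per the user's preference, while the untagged one sails straight through, which is exactly the mislabeling gap the report describes.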

Automation vs. Human Judgment

Bluesky's automated systems flagged 2.54 million potential violations on top of user reports. One success story: implementing a system that identifies toxic replies and hides them behind an extra click led to a 79% drop in daily antisocial behavior reports. Sound familiar? It's similar to what X does.
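The general pattern behind that feature is straightforward: score each reply, and collapse anything over a threshold behind a click instead of deleting it. The heuristic below is a crude stand-in for whatever classifier Bluesky actually uses, which the report doesn't detail; it only illustrates the hide-don't-remove approach.

```python
# Generic illustration of "hide behind an extra click": replies above a
# toxicity threshold are collapsed rather than removed, so readers can
# still opt in to see them. The scoring function is a stand-in, not
# Bluesky's actual classifier.
def toxicity_score(text: str) -> float:
    # Crude word-list heuristic; a real system would use a trained model.
    rude_words = {"idiot", "stupid", "moron"}
    words = text.lower().split()
    return sum(w.strip(".,!?") in rude_words for w in words) / max(len(words), 1)

def render_reply(text: str, threshold: float = 0.2) -> str:
    if toxicity_score(text) >= threshold:
        return "[reply hidden - click to view]"
    return text

print(render_reply("Great point, thanks for sharing."))
print(render_reply("You absolute idiot."))
```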

The platform also showed a preference for labeling over banning, applying 16.49 million labels to content (up 200% year-over-year) while account takedowns grew 104%. This suggests Bluesky is trying to preserve user agency, letting people see content with warnings rather than removing it entirely.

The Decentralization Dilemma

Bluesky's approach raises fundamental questions about platform governance. Built on the AT Protocol, it allows users to run their own servers and create custom moderation rules. In theory, this distributes power away from a central authority. In practice, Bluesky still had to remove 3,619 accounts for suspected influence operations (likely Russian) and handle thousands of legal requests.

This tension between decentralization ideals and practical governance needs isn't unique to Bluesky. Every platform faces the same challenge: how do you maintain a safe, functional community while preserving user autonomy?

The Transparency Paradox

Bluesky's decision to publish comprehensive transparency data deserves credit. Most platforms share limited moderation statistics, if any. But transparency creates its own challenges—every number invites scrutiny, every policy decision becomes public debate.

Compare this to X under Elon Musk, which has largely abandoned regular transparency reporting, or Meta, which shares data but often lacks context. Bluesky's approach suggests they're betting that radical transparency will build trust, even when the numbers aren't flattering.

