Bluesky’s 2024 moderation report shows how rapidly harmful content grew as new users flocked in
Bluesky experienced explosive growth last year, necessitating that the platform ramp up its moderation efforts. In its recently released moderation report for 2024, Bluesky said it grew by about 23 million users, jumping from 2.9 million users to nearly 26 million. And its moderators received 17 times the number of user reports they got in 2023: 6.48 million in 2024 compared with 358,000 the previous year.
The majority of these reports were related to “harassment, trolling, or intolerance,” spam, and misleading content (including impersonation and misinformation). Accounts posing as other people cropped up in the wake of Bluesky’s popularity spike, and the platform responded with a “more aggressive” approach in an attempt to crack down on them. At the time, it said it had quadrupled its moderation team. The new report says Bluesky’s moderation team has grown to about 100, and hiring is ongoing. “Some moderators specialize in particular policy areas, such as dedicated agents for child safety,” it notes.
Other categories Bluesky says it received a number of reports about include “illegal and urgent issues” and unwanted sexual content. There were also 726,000 reports marked as “other.” Bluesky says it complied with 146 requests from “law enforcement, governments, legal firms” out of a total of 238 last year.
The platform plans on making some changes to the way reports and appeals are handled this year that it says will “streamline user communication,” like providing users with updates about actions it has taken on content they’ve reported and, further down the line, letting users appeal takedown decisions directly in the app. Moderators took down 66,308 accounts in 2024, while its automated systems took down 35,842 spam and bot profiles. “Looking ahead to 2025, we’re investing in stronger proactive detection systems to complement user reporting, as a growing network needs multiple detection methods to rapidly identify and address harmful content,” Bluesky says.