- A survivor of child sexual abuse is calling on Elon Musk to take urgent action against the circulation of exploitative content on his social media platform, X.
- The woman, known publicly as “Zora,” discovered that images of her abuse—originally confined to the dark web—are now being openly marketed on X by traders offering illegal material to global buyers.
- A BBC investigation traced one such trader to Jakarta, Indonesia, revealing a network of over 100 accounts linked to the sale of child sexual abuse material (CSAM).
- Despite platform takedowns, new accounts continue to emerge, raising serious concerns about enforcement and accountability.
A survivor of child sexual abuse is urging Elon Musk to intervene after discovering that images of her trauma are being circulated and sold on his social media platform, X. The woman, who goes by the pseudonym “Zora,” was first abused over two decades ago in the United States. Although her abuser was prosecuted, the images captured during that time have continued to haunt her—now resurfacing in online marketplaces frequented by predators.
A BBC investigation into the global trade of child sexual abuse material (CSAM) uncovered a disturbing network operating on X. One trader, believed to be based in Jakarta, Indonesia, was offering “VIP packages” of illegal content through Telegram. The BBC, working alongside members of the hacktivist group Anonymous, traced the trader’s financial footprint to multiple bank accounts and payment platforms. When confronted, a man linked to the accounts denied involvement and claimed the accounts were dormant.
The scale of the operation was staggering. Activists reported that the trader managed over 100 nearly identical accounts on X. Each time one was removed, another would appear—highlighting the challenges platforms face in permanently banning offenders. The trader claimed to have thousands of files for sale, including material involving children as young as seven.
Experts from the Canadian Centre for Child Protection (C3P), who are legally authorized to view such content, confirmed the presence of Zora’s images among the trader’s offerings. These materials, once hidden on the dark web, are now being promoted in plain sight using coded hashtags and cropped images that evade detection.
Zora, who has endured years of stalking and threats from individuals who discovered her identity, says the continued circulation of these images feels like a second violation. She believes platforms like X must do more than remove accounts—they must prevent them from reappearing and take proactive steps to protect survivors.
X maintains that it has zero tolerance for child exploitation and works closely with law enforcement and child protection agencies. Telegram, also implicated in the trade, claims to have banned over half a million groups and channels related to CSAM in 2025 and says it employs thousands of moderators to monitor content.
Despite these assurances, the persistence of such networks raises urgent questions about the effectiveness of current safeguards. For survivors like Zora, the fight is not just about justice—it’s about reclaiming dignity in a digital world that too often fails to protect the vulnerable.