Pump.fun, a Solana-based launchpad for memecoins, has come under fire for its livestream feature, which devolved into a showcase for extreme and disturbing content, including threats of self-harm, animal abuse, and explicit material. Legal experts warn the platform could face criminal charges or civil liability over its lack of moderation.
On Nov. 25, the platform announced it had indefinitely paused livestream functionality, acknowledging community concerns. “We strongly stand for free speech and expression, but it’s our responsibility to ensure users don’t see repulsive or dangerous content,” co-founder “Alon” stated on X.
Livestream Shutdown Amid Growing Backlash
Pump.fun’s livestream feature, initially designed to promote tokens, reportedly became a hotbed for controversial and unlawful acts. Developers used the platform to stage provocative stunts, including one in which a streamer threatened suicide if a token failed to reach a certain market capitalization and another in which a streamer harmed a goldfish on camera.
Mikko Ohtamaa, co-founder of Trading Strategy, noted on X that platforms like Pump.fun face two choices: self-regulation or eventual forced closure by regulators. “These streams are breaking laws live, and this will prompt action when mainstream media takes notice,” he said.
Despite its success as a code-free token launchpad, Pump.fun’s hands-off moderation has placed it in regulators’ crosshairs, especially as the platform has birthed several scam tokens and rug pulls.
Legal and Ethical Implications
Legal analysts point to potential liability under United States law, including Section 230 of the Communications Decency Act. The provision shields platforms from liability for user-generated content, but that shield is not absolute: as the precedent of Barnes v. Yahoo! showed, a platform that publicly commits to removing harmful content and then fails to do so can face legal exposure on that promise.
Yuriy Brisov of Digital and Analogue Partners called the livestream incidents a “legitimate reason” for investigations. He added that regulatory action is increasingly likely given the platform’s unchecked activities.
Content Moderation: A Persistent Challenge
Despite claims by Alon of having a “large team of moderators working around the clock,” Cointelegraph’s review of the livestream board on Nov. 25 revealed explicit, racially offensive, and violent content. Some videos appeared to have been removed, but the volume of material that remained highlighted significant lapses in real-time oversight.
Alon acknowledged Pump.fun’s moderation shortcomings and pointed to an NSFW toggle to obscure extreme content, though critics argue this measure is insufficient. Industry participants have urged stricter oversight or a permanent shutdown of the livestream feature.
Regulatory Storm Brewing for User-Generated Platforms
Pump.fun’s controversy underscores a broader dilemma for platforms reliant on user-generated content. With moderation tools unable to keep pace with the sheer volume of uploads, experts predict heightened scrutiny of any platform that enables harmful or illegal activity.
Whether Pump.fun’s recent decision to pause its livestream feature will satisfy regulators or prompt further inquiries remains to be seen. In the meantime, the platform’s future—and that of its users—hangs in the balance.