Meta and TikTok Let Harmful Content Rise After Evidence That Outrage Drove Engagement: Whistleblowers
More than a dozen insiders revealed that TikTok and Meta let harmful content rise to boost engagement, with internal research showing Instagram Reels drew 75% more bullying comments.
- Whistleblowers and insiders told the BBC that algorithms promoted outrage and engagement at the expense of user safety, with internal research showing harmful content was amplified for young users.
- Pressure to grow users and ad revenue drove staffing and product trade-offs: Meta assigned 700 staff to Instagram Reels while safety teams were denied specialist hires.
- Internal examples show safety teams deprioritised teen-harm reports: TikTok's trust and safety team prioritised politicians over a 17-year-old in France, while Meta engineers were instructed to allow more borderline harmful content on Reels.
- Public scrutiny prompted company statements and rebuttals as the BBC documentary Inside the Rage Machine aired, with Meta saying 'Any suggestion that we deliberately amplify harmful content for financial gain is wrong' and TikTok rejecting whistleblowers' claims.
- Engineers ran large-scale ranking experiments on hundreds of millions of users, while whistleblowers warned of radicalisation from age 14 and pointed to Meta's global reach of more than three billion users.
15 Articles
A BBC investigation published yesterday only confirmed what we already suspected – that big tech platforms are turning a blind eye to harmful content in the pursuit of profit. The platforms allow so-called fringe content – misogynistic, sexist, racist, conspiracy-based – that is harmful but legal.
TikTok and Meta's algorithm race compromised safety for engagement, say whistleblowers: Report
Whistleblowers told the BBC that Meta and TikTok prioritised user engagement over safety, allowing harmful content on their platforms. The algorithm race led to increased harassment and exploitation, putting vulnerable users, such as children, at risk.
According to the BBC, TikTok and Meta have grown bolder in allowing harmful content to be pushed on users in order to make money or gain popularity.
Inside TikTok-Meta algorithm war: How the race for engagement is putting users at risk
In the high-stakes arena of digital dominance, a silent war is being waged not with weapons but with codes and algorithms. In 2026, the “algorithm arms race” between TikTok and Meta has...
Coverage Details
Bias Distribution
- 50% of the sources are Center
Factuality