A whistleblower has told US senators that Facebook puts profits before people.

Testifying before Congress, Frances Haugen, 37, claimed the social media giant and its sister platform Instagram are damaging to both children and democracy, yet the company makes decisions based on maximising interactions and revenue.

Haugen came forward as the source of a series of stories in the Wall Street Journal last month, based on internal Facebook documents, which revealed the company knew Instagram was damaging teenagers’ mental health and that changes to its News Feed feature were making the platform more polarising and divisive.

“I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy,” she said in an opening address. 

“The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people.”

Referring to a worldwide six-hour outage on Monday of Facebook, Instagram and messaging platform WhatsApp, also owned by Facebook, she said: “For more than five hours Facebook wasn’t used to deepen divides, destabilise democracies and make young girls and women feel bad about their bodies.”

Haugen claims an algorithm “led children from very innocuous topics like healthy recipes… all the way to anorexia-promoting content over a very short period of time”.

She added: “It’s just like cigarettes… teenagers don’t have good self-regulation.”

As well as targeting teens, Haugen told lawmakers the platform “definitely” targets children as young as eight via its Messenger Kids app.

Facebook reported profits of more than $29 billion in 2020 and has 3.5 billion monthly active users across its platforms.

“The core of the issue is that no one can understand Facebook’s destructive choices better than Facebook, because only Facebook gets to look under the hood,” she said, calling for regulators to step in to “build sensible rules and standards to address consumer harms, illegal content, data protection, anticompetitive practices, algorithmic systems and more”.

Referencing Facebook’s apparently poisonous influence in unstable democracies such as Ethiopia – “it is literally fanning ethnic violence” – Haugen said the buck stops with CEO and founder Mark Zuckerberg.

“We have a few choice documents that contain notes from briefings with Mark Zuckerberg where he chose metrics defined by Facebook like ‘meaningful social interactions’ over changes that would have significantly decreased misinformation, hate speech and other inciting content.

“The buck stops with him.”


She said Facebook should share internal information and research with the authorities, and remove algorithms on its News Feed so posts are ranked chronologically.

In response, Zuckerberg wrote in a blog post: “At the heart of these accusations is this idea that we prioritise profit over safety and wellbeing. That’s just not true.

“[The testimony] just doesn’t reflect the company we know… we care deeply about issues like safety, wellbeing and mental health. It’s difficult to see coverage that misrepresents our work and our motives.

“We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content. And I don’t know any tech company that sets out to build products that make people angry or depressed.

“The moral, business and product incentives all point in the opposite direction.”

Senior Facebook employees played down Haugen’s role at the social media giant, questioning whether she had enough access to top-level decision-making to blow the whistle.

Lena Pietsch, Facebook’s director of policy communications, said: “Today, a Senate commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives and testified more than six times to not working on the subject matter in question. We don’t agree with her characterization of the many issues she testified about.

“Despite all this, we agree on one thing: it’s time to begin to create standard rules for the internet. It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”
