WHAT do eating disorders, Trumpist conspiracy theories and ethnic violence in Ethiopia all have in common? The answer, according to whistleblower Frances Haugen, is that Facebook is fuelling them. And she’s not asking us to take her word for it – she took a mountain of evidence with her when she quit the tech giant in May.

The 37-year-old spent two years at Facebook, initially focusing on fighting misinformation as part of its “civic integrity” team. The role appealed to her, she says, because she had lost a friend to online conspiracy theories and wanted to help save others from falling down the same rabbit holes of digital disinformation.

The focus of her team’s work was countering election interference, but following the US election she was told it was to be dissolved. One lesson firms can learn from this is that giving temporary teams lofty names might come back to haunt them. Speaking to US current affairs show 60 Minutes, Haugen said Facebook told the staff: “We’re dissolving civic integrity”.

What a phrase.

She went on: “Fast-forward a couple months [and] we got the insurrection. When they got rid of civic integrity, it was the moment where I was, like … I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.”

I do hope there’s a fledgling death metal band in a garage somewhere near Silicon Valley, hastily printing a run of T-shirts with “DISSOLVING CIVIC INTEGRITY” on the back.


Haugen’s clear implication was that re-opening these online extremism floodgates set the stage for the astonishing scenes in Washington DC in January 2021.

Naturally, Nick Clegg isn’t happy about her implying this. Remember him? Friendly chap, chums with Cameron, everyone agreed with him for about 48 hours in 2010? Yes, that Nick Clegg, the LibDem! Liberally defending the democratic right of his big tech paymaster to stand by and shrug while Trump rallied his troops, “stop the steal” echoed across social media and armed terrorists stormed the Capitol building.

He’s now Facebook’s vice-president of policy and global affairs, and in a memo to employees he warned them that 60 Minutes would “suggest that the extraordinary steps we took for the 2020 elections were relaxed too soon and contributed to the horrific events of January 6 in the Capitol”. Clegg says that’s not fair because people have been falling out with each other for decades and there’s no reason to believe thon shirtless chap with the face paint and furry hat even knows how to work a computer. I am paraphrasing slightly. Vince Cable must be so proud.

“The idea that Facebook is the chief cause of polarisation isn’t supported by the facts,” Clegg asserts, somewhat missing the point that you don’t need to be the chief cause to nonetheless serve as a chief facilitator. No-one is suggesting (or at least not yet) that Facebook itself is starting arguments or making up conspiracies to share with its gullible users – what Haugen is saying is that the platform will quite happily put those arguments and conspiracies into the eyeballs and earphones of its users if it means keeping them engaged with the website or app for longer.

This is by no means a new claim – what’s new is the sheer volume of evidence Haugen has collected, from internal research reports, draft presentations and online discussions among employees.

Netflix documentary The Social Dilemma featured countless former big tech employees testifying that algorithms are used to point users to harmful content. Tristan Harris, a former design ethicist for Google, argued that a “disinformation-for-profit business model” allows firms to make more money by allowing unregulated content to be disseminated, while the film’s director Jeff Orlowski describes “a machine whose main currency is outrage and anger”.

At best, this trade-off is likely to make Facebook users feel unhappy and insecure – at worst, it can be deadly. Giving testimony at the US Senate on Tuesday, Haugen said the platform was “literally fanning ethnic violence” in places like Ethiopia, where the rapid spread of disinformation and fake videos has had horrific outcomes. In 2019 the former Olympic athlete Haile Gebrselassie told the BBC that Facebook was to blame for days of violence in which 78 people died, and the following year the UN Human Rights Council warned the company to “ensure that its platform contributes to people’s expressions, rather than becoming a tool for the spread of hatred or disinformation”.


Not all of the harms of social media are as obvious. Facebook-owned Instagram is primarily about sharing inspirational lifestyle images, and 68% of its users are female. It’s not a shocking new revelation that scrolling through pictures of beautiful people having a wonderful time can have negative as well as positive effects, but one presentation leaked by Haugen includes the troubling admission that “32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse”.

Of course, it’s good that this is being discussed behind closed doors, but the concern is that rather than steering vulnerable users away from harms (which might mean away from the app), the algorithm sucks them back in.

“Profits before people” is the charge each time. The question now is, can Facebook find a way to make its platforms both profitable and safe? And if it refuses to even try, then what?