After whistleblower Frances Haugen unleashed a torrent of unflattering revelations about Facebook in the Wall Street Journal and on CBS’ “60 Minutes,” the social media giant pledged to “tackle the spread of misinformation and harmful content.” But as long as the social network makes money off such garbage, the promise comes across as a sick joke rather than reassurance.
I’ve never been a fervent Facebook hater, though I’ve also never been an everyday user. I’ve spent time with some of Facebook’s top executives (not including founder and CEO Mark Zuckerberg) and I’ve found them to be bright, personable and excellent at projecting social consciousness. Their products, however, are very different. Haugen’s leaks make clear just how vast the gap is between the friendly facade and the ugly reality.
Haugen worked at Facebook as part of a team that was supposed to figure out how to stop the platform from being used to interfere in elections. She left after two years, disappointed and disillusioned. After the Journal’s series of articles and the “60 Minutes” interview, it’s hard to avoid the conclusion that “misinformation and harmful content” are a feature of the platform, not a bug.
Perhaps Haugen’s most explosive allegation is that Facebook executives are aware that Instagram — which Facebook owns and operates, and which has been a growth center for the company as its core customers age — is toxic to the mental health of some users, especially teenage girls. Vulnerable young Instagram users can spend hours each day scrolling through photos and blaming themselves for not living up to the unrealistic, Plasticine standards of “beauty” that proliferate there.
“There were conflicts of interest between what was good for the public and what was good for Facebook,” Haugen said on “60 Minutes.” “And Facebook, over and over again, chose to optimize for its own interests, like making more money.”
Haugen also alleges that, while Facebook did tighten its policies against incendiary political misinformation in the run-up to the 2020 election, the company relaxed those policies again as soon as the election was over. We know the result: Much of the “Stop the Steal” nonsense — a weaponized lie alleging widespread election fraud, encouraged by President Donald Trump, that fueled the violent Jan. 6 insurrection at the Capitol — circulated on Facebook.
Instead of blaming Facebook, the company’s chief of global affairs, Nick Clegg, responded, we should blame “the perpetrators of the violence, and those in politics and elsewhere who actively encouraged them.” He maintains that Facebook is not a “primary cause” of the polarization that splits the country into warring tribes.
But is that supposed to absolve the company? Do Facebook executives feel blameless because their decisions merely made it easier for a dangerous falsehood to spread, with deadly consequences? By all means, deliver justice to the rioters. But Facebook’s hand-wringing is not exactly convincing.
Facebook cannot deny that its algorithms amplify toxic misinformation. I believe wholeheartedly in free speech, so yes, people should have the right to say crazy things. But there’s a difference between allowing users to post vile nonsense and feeding someone who “likes” that nonsense more of the same bile.
It appears the company understands that. If Zuckerberg can dial down the heat as an election approaches, he can keep it at a simmer afterward. Maybe that would come at the cost of some of the engagement that keeps users signed into Facebook and exposed to advertising. But what counts as enough profit? This trillion-dollar company, with nearly 3 billion users worldwide, ought to be able to survive without putting democracy in peril and causing anguish to impressionable young people.
If Facebook will not take seriously the need for reform, governments must act at the federal and perhaps the state level.
One obvious but radical solution would be to classify social media sites not as platforms but as publishers. As such, they would have the same liability for spreading false, defamatory or otherwise damaging information as do the publishers of The Washington Post and other media outlets. They could be sued for monetary damages — a prospect that tends to concentrate the mind.
But in the news media, editors police content. How many editors would Facebook need to comb through everything users posted? Millions?
A less drastic step would be for Facebook to be completely transparent about how its algorithms work and about the process for adjusting them. That would still require a considerable number of humans to oversee the process and make sure the algorithms were doing their job.
But what is no longer acceptable is the status quo. In pursuit of profit, Facebook has cost the rest of us too much.