Wall Street Journal reporter Jeff Horwitz (left), who led the 14-part series known as “The Facebook Files,” spoke with Harvard Professor Latanya Sweeney. Horwitz said the public’s understanding of Facebook and its motivations is outdated.

Jon Chase/Harvard Staff Photographer

Exploring the dark, puzzling inner workings of Facebook

Reporter who wrote about whistleblower documents outlines findings at Kennedy School event

Republicans think Facebook is silencing conservative voices because Silicon Valley is full of Democrats. Democrats think Facebook is promoting right-wing hate speech and dangerous conspiracy theories because it’s profitable.

“The answer is that Facebook is deeply partisan in favor of Facebook,” said Jeff Horwitz, a technology reporter for The Wall Street Journal who led the paper’s investigation into the company’s abandoned effort to tamp down political polarization on the platform after the 2020 election. The inquiry offered an unprecedented look into the inner workings of the enormously popular and influential social media service.

In conversation with Latanya Sweeney, Daniel Paul Professor of the Practice of Government and Technology, at Harvard Kennedy School on Nov. 1, Horwitz said the public’s understanding of Facebook and its motivations is quite outdated.

“It’s not that Mark Zuckerberg is just trying to get another buck. It’s not that the company is some sort of evil force that is trying to affect our politics in one way or another,” he said. “It’s they built a thing, tremendously successful, and in very typical Silicon Valley fashion, did not really think that much about externalities … and then realized very late in the game that their products had some significant deleterious effects around the world.”

The 14-part Journal series, known as “The Facebook Files,” was based on tens of thousands of internal documents provided by whistleblower Frances Haugen, which included research and employee communications about negative social and political effects Facebook and Instagram had facilitated.

The still ongoing series showed that the company knew that Instagram damaged the mental health of teen girls. It also found that, despite public statements to the contrary, Facebook applied its rules limiting abusive behavior by users, including harassment and incitement to violence, unevenly, often giving a free pass to celebrities and high-profile political figures.

Outside the U.S., where regulation or enforcement is “minimal,” Horwitz said, the platform remains a haven for trafficking of humans and body parts, radicalization by terrorists, and foreign interference in elections.

He said one fact that shocked him most was learning that many of the issues Facebook had been pilloried for “seemed solvable.”

Despite company claims that platform changes to curb the widespread sharing of misinformation or hate speech would be hard to develop and implement, or would involve vague “trade-offs,” he said many solutions had already been developed internally and were fairly easy to put in place.

There appeared to be many areas where Facebook simply opted not to act, even though it could have addressed the issues and minimized public fallout with fixes that wouldn’t really damage “the underlying business that heavily. And I’m still trying to figure out why that is.”

He pointed to ideas such as eliminating the reshare button (“just kill it”) or limiting how far reshares can travel. “That alone was a very effective tool that required zero staffing and no hard choices in terms of censorship,” he said.

Slowing how fast content spreads is another tool the company knows would cut down on misinformation, he said: its own research showed that capping the speed of the fastest-spreading posts at the 85th percentile yields a “double digit” reduction in content Facebook has already classified as misinformation.

But Horwitz demurred when asked what should be done. He said the goal he and Haugen had for the series was not “necessarily” to deliver solutions, but to update and expand the public’s understanding of Facebook, disclose how the company understood itself, and figure out why it seemed “tone deaf” to the problems.

Facebook has criticized the conclusions about teen body image, saying the Journal’s reporting is based on noncausal research.

“And they are totally correct,” Horwitz said. “It does represent, however, the company’s best understanding of reality.”

As Facebook faces its worst public-relations crisis yet, Sweeney asked whether employees can still feel comfortable raising questions internally about the company’s actions. Horwitz answered, “Probably not.”

He said the company’s ongoing culture of secrecy remains deeply problematic for Facebook and the world, pointing out that the value of Haugen’s documents stems largely from the fact that they are the largest single tranche of information about the 17-year-old company ever made public.

“Frances Haugen talking to me is not a functional system of getting information about the world’s most widely-used communications platform.”