
Facebook’s Monika Bickert joins Harvard’s Jonathan Zittrain for a talk hosted by the Berkman Klein Center for Internet & Society.

Jon Chase/Harvard Staff Photographer


The view from inside Facebook


Professor Jonathan Zittrain discusses social media giant’s ‘long year’ with head of global policy management

At a time when social media affects everything from our private lives to our public discourse, the rules governing online behavior are increasingly under scrutiny. At Facebook, the process behind those rules — how they are determined, and how they continue to change — is the province of Monika Bickert, the head of global policy management.

On Monday, Bickert, who holds a J.D. from Harvard Law School, joined Jonathan Zittrain, the George Bemis Professor of International Law, for a wide-ranging conversation about the social media giant’s policies and its evolution. The event, which included tough questions from audience members on the company’s recent headline-making controversies, was hosted by the Berkman Klein Center for Internet & Society.

Citing his sense of the “pessimism and near-despair that permeate our feelings about social media,” Zittrain opened the conversation by recalling a September 2017 discussion in which he and Bickert looked at the rise of white nationalism and the first indications of how social media manipulation had been at play in the 2016 elections. Since then, of course, more information about fake accounts and online attacks has come to light.

“It’s been a long year,” Zittrain added, before asking Bickert how the time had been spent.

Much of it has been focused on process, answered Bickert. Specifically, she described the company’s efforts to update and implement standards and rules, and to make them as transparent as possible. Addressing the first, she explained how Facebook sets the guidelines for its approximately 2.2 billion regular users, 87 percent of whom are outside the U.S.

“Every single decision we make is vetted across the company,” she said.

Bickert oversees policy operations at 11 locations around the world, staffed by lawyers as well as specialists in areas such as child sexual abuse, terrorism, and hate speech. These centers set and oversee standards, handling more than a million reports a day in areas including fake accounts, violence and criminal behavior, and sexual solicitation.

Many of those reports are evaluated by increasingly sophisticated algorithms. Terrorist propaganda, for example, uses various patterns and phrases that are easy for machines to recognize. Other offenses have been harder to codify, meaning that many posts still need to be seen by one or more of the company’s 15,000 content reviewers.

Context shades meaning, and subtleties of expression may elude a program, Bickert said. For example, “If you don’t come to my party I’ll kill you” is probably not a real threat, even if an algorithm might flag it as one.

The rules governing what is permissible have changed over time, Bickert noted. The definition of hate speech, for example, has been broken down into three tiers. The first two involve attacks on a person or group of people because of a characteristic such as ethnicity, nationality, gender, or sexual orientation. Posts or comments that fall into these tiers are not allowed on the platform. However, a third tier involves posts that call for the exclusion or segregation of a certain group, which may be allowed in the context of a political discussion about policy.

“‘Burn the immigrants’ … is hate speech,” Bickert explained. “‘I don’t want more immigrants,’ we would allow. We want people to discuss immigration.”

How Facebook responds to such posts differs as well. Offensive posts that are caught on review will be taken down. Others, caught by technology, will not even be uploaded. Those that are explicitly criminal will be sent to the relevant authorities.

“If it’s child sexual abuse we don’t just stop the upload, we report it,” said Bickert.

Exactly what gets reported and to whom can be a thorny question. In countries with more restrictive laws governing speech, some posts may break laws but not international norms. Bickert used the example of flag burning, which is illegal in India. After considering issues of legality, she said, a flag-burning post would be removed or blocked in that country, but not in others.

Attendees pushed back at several points, addressing recent controversies including Facebook’s hiring of an opposition research firm to investigate critics of company policy and behavior.

While Bickert answered that her work does not touch directly on the company’s strategic efforts, she acknowledged that some content-moderation policies have drawn valid criticism, and said the company has been working to be more inclusive in its decision-making.