
How the government can support a free press and cut disinformation

Law School’s Martha Minow on the First Amendment and making web and media firms more accountable

The mainstream news industry has been in sharp decline since the 1990s, owing to a series of financial and cultural changes brought by the rise of the internet. Amid the closing or shrinking of newspapers, magazines, and other legacy news outlets, Americans have increasingly turned to social media and heavily partisan websites and cable networks as their main sources of news and information, which has led to a proliferation of disinformation and misinformation and fueled polarization.

Given the vital role a free and responsible press plays in American democracy and the unique protections the Constitution provides for it under the First Amendment, is it time for the government to get involved? Is it government’s place to do so? And how could that happen without infringing on that freedom?

In a new book, “Saving the News: Why the Constitution Calls for Government Action to Preserve Freedom of Speech” (Oxford University Press, 2021), Martha Minow, 300th Anniversary University Professor at Harvard Law School, says the First Amendment not only does not preclude the federal government from protecting a free press in jeopardy, it requires that it do so. Minow spoke with the Gazette about some of the ways to potentially clean up social media and bankroll local news, and why arguing on Twitter isn’t a First Amendment right.


GAZETTE: There seems to be broad misunderstanding about what speech is protected by the First Amendment and what is not. Underlying “cancel culture” and complaints about “deplatforming” is a belief that people should not be penalized for saying things online that others find objectionable or that are inaccurate or even false because of their right to freely express themselves. Can you clarify how the First Amendment applies and doesn’t apply to social media platforms, like Twitter or Facebook, and online generally?

MINOW: I wrote a book to examine the challenges and decline of the news industry during a time of exploding misinformation and disinformation, a global pandemic, and great challenges to democracies in the United States and elsewhere. Certainly, one big dimension of this context is [what] some people are calling [an] infodemic: the flood of information that is enabled by the internet, and particularly social media. But it is not just social media. It’s conventional media, particularly cable news, but also some broadcast news.

Most sources of communication are private, and private communications are not governed by the First Amendment. Private companies are entitled to edit, elevate, suppress, or remove [speech], whether it's in broadcast, cable, or on a social media platform. Indeed, private companies have their own First Amendment freedoms against any government intervention.

We in America are very fond of rights, and rights may be what hold us together more than shared traditions or shared identities. One way that has evolved is that we talk about rights as a cultural phenomenon, as part of our identities. But that kind of informal claim, "I have First Amendment freedom," may work as a metaphor on a social media platform; it is not a legal right. We sign terms-of-service agreements with platform companies. They are the ones that control what is communicated and what is not, with much less editing than broadcast, cable, or print media. So, we're living in an unprecedented time of lowered barriers to communicating to mass audiences: almost anybody can reach a mass audience. But that access is all enabled by private providers, and the private providers are not restricted by the First Amendment in what they remove or amplify.

GAZETTE: What are a few of the measures that could effectively hold tech firms to account for what is published and shared on their platforms?

MINOW: When it comes to holding the platform companies responsible for conveying, amplifying, even escalating hateful communications, misinformation, [and] disinformation, there are some techniques, but we have to be careful because if the government is involved, then the First Amendment is front and center. The techniques include eliminating or reducing the immunity currently granted under the [1996] Communications Decency Act, which has a section, Section 230, that treats platform companies differently from any other media and specifically immunizes them from liabilities that apply to all these other entities. They include liabilities for fraud, for defamation, for violating contract terms. [But] even Section 230 does not immunize the platforms from criminal responsibility or from violations of intellectual property rights. So, one very direct step to hold companies responsible would be to either eliminate this immunity or make it conditional. I actually prefer that alternative.

Companies adopt, and should adhere to, content-moderation standards. They can develop their own, but the idea would be that they'd have to announce their standards; they'd have to report on them; and they'd have to have processes to act when anyone calls them out for violating their own standards. That's pretty direct, and it would put them on a par with all the other media entities in the country.

Another possibility would be to take intellectual property seriously and make the platforms pay when they take or steal or promote information from other news sources. They don’t put the revenues that they gain, particularly from advertising, back into investment in news. It’s not a punishment; it’s simply the idea of holding them responsible like [the] grown-up companies that they are.

You know, the fact of the matter is, the big disinformation source is as much broadcast and cable [television as it is online], and on those, there is a basis for government regulation. The FCC could take that seriously and withhold, remove, or terminate licenses for companies that are misleading people, that are labeling as news something that's entirely opinion.

Cable is largely a monopoly. Local communities grant franchises to cable companies; local communities could hold them more responsible. I don't look forward to a day, and I hope we never see it, when the government, at any level, decides the content. But when scarce opportunity to amplify communications is given to private companies, it's only fair that they should have standards that they then deliver on [by] providing some quality control of what they amplify. There is no right to have your message sent to everybody in the world. What there is, is a right to be free from government restrictions on your speech.

So, one very specific suggestion that I have is that when we deal with digital communications, there could be a delay, and there could be speed bumps. Before people can spread messages to large numbers of people, there could be a delay; platforms could even use artificial intelligence to review a message before it spreads beyond a handful of people.

GAZETTE: The era of self-policing hasn’t worked very well so far, but you say there are things companies can and should be doing right now to act more responsibly and to help support the news. What are a few of those?

MINOW: I agree with you that self-regulation has not worked. It's striking to me that Mark Zuckerberg has said, in effect, "We need help. We can't do it alone." And so, I think this is a problem that's bigger than any one company, and it does require government action. The government can act by enforcing, or strengthening and then enforcing, consumer protection rules, including rules about the uses of our data. The government can act by limiting the immunity granted to internet platforms and conditioning it on the development of codes of conduct that are then enforced. And the government can act by making rules that require sharing information about the algorithms and their uses with watchdogs, whether academic or nonprofit organizations. We need to improve the entire ecosystem in which information circulates.

GAZETTE: Local news has been a vital part of that ecosystem. Government can support local news without necessarily wading into a First Amendment quagmire, you argue. What are some ways that could be accomplished?

MINOW: Local news is more trusted by people. There's less polarization in local communities; there's more accountability. But with its decline, which is massive, there's a loss of accountability journalism in local communities and a loss of this ecosystem. And so, one thing to consider is to tax the big platforms, cordon off the revenues that are generated, and plow them back into supporting local news, public media, and nonprofit media.

There's a Local Journalism Sustainability Act introduced in the Senate [in July], and it parallels a bipartisan [bill] in the House that uses tax deductions and tax credits to strengthen local news. One of the interesting ideas there is to give a tax break to local companies that buy ads in local news. Another is to relieve payroll taxes for nonprofit and for-profit local news outlets if they hire more journalists. And finally, a dimension that I think is interesting but has its tricky elements is to allow [tax] deductions for individuals who either subscribe to local news or make gifts to local news. That's great in the sense that there's no government involvement. But it's problematic in that we already know such vehicles can serve as disguises for disinformation: foreign governments can pretend to be nonprofit organizations in this country and hijack what's otherwise a good idea. So that needs some work.

GAZETTE: So, not a government-funded news service, like Voice of America, but financial assistance for individual news organizations so they can continue working independently and ideally, thrive?

MINOW: That is what's being proposed, and it certainly poses far fewer worries. It is really meant to strengthen this ecosystem of multiple, diverse news sources, public and private. [These are] especially needed in the local context, where we have "news deserts" — thousands of communities that have no local news. When Michael Brown was killed in Ferguson, Missouri, and the Department of Justice undertook a massive investigation exposing the way that the legal system relied on fines and fees on the backs of poor people, one of the things that emerged was that there was no local news. Bad things happen where no one is watching.

This interview has been edited for clarity and length.