Nation & World

AI presents challenges to journalism — but also opportunities

Sotiris Sideris speaking during the event.

Photos by Niles Singer/Harvard Staff Photographer

Data editor explains how digital tools sift through mountains of government and business data to find ways to make things better or to unearth crimes

The surge of AI-produced articles has ignited concerns about the accuracy of news amid the dwindling number of working journalists who serve as a counterforce against the dissemination of inaccurate or false information.

AI certainly poses ethical and other challenges, but it also offers reporters greater opportunities to do more high-impact, consequential stories, said Sotiris Sideris, data editor at the Center for Collaborative Investigative Journalism and at Reporters United in Greece, during a talk at the Center for European Studies on Tuesday.

Data- and generative AI-driven tools allow reporters to analyze in a timelier fashion vast troves of government and business data and identify important patterns that point the way to improvements or uncover questionable, or even illegal, activities, he said.

“The question today isn’t whether we are using AI in journalism, because we do it already,” but whether “we can do journalism without outsourcing our skepticism, our ethics, and our sense of accountability, both as journalists ourselves and the accountability we are asking people and organizations that hold power to provide,” said Sideris, who is studying how generative AI can better assist investigative reporting as a 2026 Nieman Fellow.

Human reporting, writing, and editing are still essential to getting stories, but data and generative AI can play an important dual role as a “microscope” that helps reporters quickly cut through the information “noise” hidden within disparate documents and reports, he said.

At the same time, these tools can serve as a “mirror that reflects our own biases and our own stereotypes” that can mislead journalists into drawing the wrong conclusions, he said.

Sideris shared with the students and journalists present how AI tools helped him and his colleagues uncover a fleet of Greek-owned ships that was stealthily transporting Russian oil to Europe in violation of sanctions.

They were also able to show how the popularity of Airbnb since its introduction in Greece less than a decade ago drove up rents and home sale prices, which led to widespread displacement of Athens residents as areas became unaffordable and foreclosures spiked.

Reporters were able to pull and scrutinize data from records and filings publicly available on the internet and piece together the complicated global network of property ownership for a follow-up piece. That report showed how foreclosed properties were auctioned off on platforms not accessible to the public and bought at steep discounts by the very banks that had foreclosed on the homes.

Journalists need to learn how to use data and generative AI, understand their power and their limitations, and still do the old-fashioned hard work of vetting and documenting what they know and how they know it, he said.

In this new era of journalism, “total transparency” is more imperative than ever, he said.

“When we ask for somebody to be transparent, we cannot ask them without us being transparent in the very, very beginning of the story about how we are using the tools, about who is funding our work, about any editorial decision that we make along the way,” said Sideris.

Whether in long investigative pieces or daily newsgathering, journalists need to be up front about how they’re working with AI, he said.

“There is no reason to conceal it. We know that you are using it; we know that everybody is using it. It’s not a secret anymore. It’s not something to feel bad about.

“But let us know.”