Fighting disinformation

Q&A: Jane Lytvynenko on disinformation and how journalists can navigate an increasingly cloudy social media landscape

Alanna Dvorak, Poynter.org

Photo courtesy of Jane Lytvynenko

The Kyiv-born journalist, who has worked on stories such as Russia's stolen grain and the Wagner mercenary group, discusses reporting techniques, trends in disinformation, and covering elections

This article was originally published on Poynter.org on March 28, 2024.

A pioneering reporter in the disinformation space, Jane Lytvynenko has taken her work across continents and deep into the underbellies of the internet and social media. In a career that includes a five-year stint at BuzzFeed News and a research fellowship at Harvard’s Shorenstein Center, Lytvynenko has reported on extremism, disinformation networks and elections.

Now a full-time freelancer, she focuses her current efforts on Russia's war against Ukraine, a topic of personal significance to the Kyiv-born Lytvynenko.

On April 4, Lytvynenko will be hosting a webinar for the international fact-checking community on investigating Telegram, a cloud-based, cross-platform messaging service with more than 900 million monthly active users. During this 90-minute event, Lytvynenko will explain how journalists can use the platform in their reporting and fact-checking. While the event is free to participants, registration is required.

Before her webinar and in anticipation of International Fact-Checking Day on April 2, Lytvynenko sat down with us to discuss her work and how fact-checkers can approach a rapidly evolving disinformation space.

This interview has been edited for clarity and brevity.

Alanna Dvorak: Hi Jane. Thanks for taking some time to talk with the IFCN and provide training for the fact-checking community. First, can you talk to me a little bit about some projects you’ve worked on recently?

Jane Lytvynenko: I’m a freelance reporter, and I’ve been very lucky to be able to contribute to The Wall Street Journal’s video investigations team on some heavy-hitting projects. The team’s approach has been to use publicly available information and combine it with source work, document work and traditional reporting. Our first project together, which was an immense team effort, showed the entire path of grain stolen by Russia from occupied territories in Ukraine and shipped internationally. Last year, it won the Gerald Loeb Award in the video category.

We applied that approach to other projects, the biggest of which was an investigative documentary about the Wagner group. The idea from the team and the editors was to look at the funding of the Wagner group, their actions, and, of course, the victims. The reporting techniques we used were very diverse. We gathered social media posts, worked with documents, spoke to sources and victims, looked at satellite photos — if it was useful to us, we utilized it. The result from the team is a comprehensive investigation, and a testament to the growing set of tools available to reporters right now.

I remain focused on covering Ukraine with a mix of both online investigations and traditional reporting. One thing that became clear early on in the full-scale invasion is that social media will be key to accessing otherwise difficult-to-tell stories. At the moment, a large part of the country is occupied by Russian forces, and many Ukrainian residents who remain don’t want to speak with journalists out of fear. Yet, we have some idea of what’s going on there. Social media, satellite imagery, and propaganda are all windows that can help reporters.

Dvorak: How has covering Ukraine affected you? Not only are you covering a conflict, but you are covering a conflict in Ukraine, where you are from.

Lytvynenko: For every reporter, covering stories about your own community carries added responsibility. At the same time, understanding the cultural context can lead to fuller stories and better angles, and the hope is for that to resonate with the audience. But when global events directly affect your loved ones and your community, it’s immensely difficult to witness.

From my perspective, despite the challenge and responsibility of it, an understanding of a culture, a history and a language is an obvious asset. It helps translate on-the-ground reality into well-rounded stories, focusing on important perspectives that can sometimes remain hidden from people outside the community.

Dvorak: What do you think are the emerging trends in the disinformation space? Is it all just AI or are there more things coming down the pipeline?

Lytvynenko: It’s a complex question! Algorithmically generated false information is, of course, one of them. The biggest worry here is the volume, which far outpaces human capacity. Machine-generated content is being created and targeted at specific audiences, sometimes on ideological or political grounds, and sometimes just to reap profits from online advertising. This is something that not only journalists but also social media companies have to grapple with if they want to protect their user base.

The social media environment itself has changed a lot, too. We see less appetite from companies to tackle disinformation on their platforms, work with researchers or maintain tools that help journalists dissect the environment. Those compounding factors make the work of fact-checkers and journalists more challenging.

But, just as we’re keeping an eye on new trends, we have to remember the fundamentals of disinformation. Although a lot of it spreads online, that’s not the only medium it functions in — traditional media are also important. And we have to remember the purpose of disinformation and propaganda. Some of it is financially motivated, some of it is used to build political will, and some of it is created to widen societal divisions. In many cases, it’s all of the above. So as researchers and reporters, we have to keep in mind not only the tools behind disinformation, but the intent, too. That focus will allow us to keep an eye on the bigger picture and not get lost in the weeds.

Dvorak: You’ve done a lot of research on the social media space. What changes do you anticipate in that sphere? Do you expect platforms’ fact-checking to improve or worsen?

Lytvynenko: The social media experience is becoming ever-more fractured and more personalized. Ephemeral media, like Instagram Stories or Snapchat, continue to be popular, but in addition to that, each mainstream social media platform hyper-individualizes the user experience. This ecosystem makes it challenging to evaluate the prevalence, spread, and origin of a given false narrative or manipulation. Additionally, written content is becoming secondary to visual or audio content, which adds another layer of difficulty in tracking.

Some false narratives still rely on websites for spread, but we’ve seen other trends like the use of influencers, online advertising and algorithmically generated content grow. In this environment, reporters and fact-checkers will have to work actively with their audiences to understand the ecosystems and narratives they’re exposed to. Our coverage also has to educate audiences about how social media platforms function and where their vulnerabilities lie.

Dvorak: Your workshop for the IFCN focuses on Telegram, a cloud-based, cross-platform messaging service. Why should this platform be on fact-checkers’ and journalists’ radars?

Lytvynenko: Telegram is not a new social media platform, but its user base has skyrocketed globally in recent years. It functions like a mix between a newsfeed, a forum and a group chat. It hosts a breadth of content, from large news aggregation channels to local neighborhood community groups, and even government agencies use it.

But it’s also ground zero for visual evidence from conflicts. While more mainstream social media networks limit graphic or violent content, Telegram does not. That means for a group that wants to get information out, Telegram is usually the first stop. I often see images or videos that were first posted on Telegram and then aggregated to other platforms. This makes Telegram increasingly indispensable for reporters and fact-checkers. For many breaking news events, we just cannot see the full picture without looking at Telegram.

Dvorak: You also have a history of covering elections. With more than 64 elections slated for 2024, what impact do you think this will have on the disinformation space? How can journalists prepare? What can they do to help their audiences navigate this noisy ecosystem?

Lytvynenko: Each country’s experience of elections will be unique, so we have to be mindful of the local context above all else. One important thing to remember during election coverage is that any disinformation narrative has to start somewhere, usually with a grain of truth. When I covered the 2020 election in the U.S., I saw that firsthand. In one ballot-counting center, a man was removed after causing a disturbance and filming the count, a fact that was confirmed to me by election workers and independent observers at that center. By the end of the day, the man was on Fox News, making false statements to a wide audience about why he was asked to leave. It was a narrative based on a real incident, but the wrong facts.

From that story and others, I can say that the most important thing for reporters covering elections, including fact-checkers, is to have contact with election authorities and people on the ground who can help explain what went on. National narratives often build on local narratives, and reporters have to watch both. This means setting up a robust monitoring environment across social media platforms, particularly in contentious races, and inviting audience participation where possible. To avoid adding confusion, reporters should go to great lengths to explain how they obtained their information. Without an explanation of the journalistic process, audiences can be more likely to distrust and discount the findings of a story.

Dvorak: The media faces a lot of challenges right now, but we wanted to end on a positive note. What do you think are the most positive developments in reporting for fact-checkers and investigative journalists?

Lytvynenko: For me, the most positive development in newsrooms and the industry at large is the growth of open-source and visual investigative journalism. It’s a trend we’ve seen for a few years now, and it’s developing further every day.

I was recently one of the judges for the inaugural Centre for Information Resilience Open Source Film Awards, and I was blown away by the creativity of the entrants. Journalists are finding ways to combine shoe-leather reporting with online investigations in immensely effective ways. And not only are more and more newsrooms investing in these skills, the toolset available to reporters is growing. From social media analysis to satellite imagery to targeted machine learning and algorithmic tools, it’s clear to me that this direction will continue developing. Audiences respond well to it, too. Because a key component of this type of reporting is showing your work, it helps build trust and understanding.

But my favorite part of online investigations is the skill exchanges. Reporters in this field can be pretty nerdy, excited to share a new tool or approach. This camaraderie is infectious, and it’s at the core of open-source investigations. – Rappler.com

Register for Poynter’s April 4 webinar with Jane Lytvynenko here.

Alanna Dvorak is the International Training Manager with IFCN (International Fact-Checking Network), where she helps produce interactive learning materials for journalists around the globe. 
