Shelby Grossman of the Stanford Internet Observatory

Q&A: How ‘fake news’ and disinformation fuel the war in Ukraine

Shelby Grossman spends her days on the internet studying disinformation campaigns as a research scholar at the Stanford Internet Observatory. The nonpartisan research, teaching and policy group, a program of Stanford University’s Cyber Policy Center, is dedicated to uncovering and studying false information online, from “fake news” on TikTok to the creation of whole news sites dedicated to spreading untruths. The Internet Observatory was set up in 2019 as a resource for policymakers and produces publicly available data and reports they can use to understand and stay ahead of online trends around disinformation.

Grossman, whose work focuses on political disinformation online, recently spoke to J. about false stories concerning Ukraine being put online to muddy the waters, sow dissension and present false information as fact.

This interview has been condensed and edited.

J.: How dangerous is disinformation about the war in Ukraine and the Russian invasion?

Shelby Grossman: I think it’s important to take it seriously, but I also think it’s important to not overstate its effect. For example, maybe you heard that a Ukrainian TV station was hacked [on March 16], and they broadcast a deep fake video of [Ukrainian President Volodymyr] Zelensky saying that Ukraine was surrendering. And it didn’t work. Apparently, the audio and the video were really janky so people saw it, and they were like, “What? This isn’t like Zelensky.” Also Zelensky countered it very quickly. But you could imagine a scenario where that kind of thing was effective, so we need to be alert.

One of the reasons that they’re not taking off is because the media and disinformation researcher community is just so on top of things. It’s just been really incredible to watch. A pro-Kremlin false flag allegation will come out and within 12 hours someone on Twitter will have geolocated the video and show that the video was from before the invasion. But I’m still scared that something could slip through the cracks and change people’s minds about what’s happening on the ground.

A tweet from an account with a stolen profile picture sharing a forged document supposedly from a Ukrainian gay rights group. (Courtesy/Stanford Internet Observatory)

Tell me more about the kind of false information you’ve been seeing around the invasion.

Some of the narratives we’ve seen are recycled and some of them are brand new. There’s this narrative going around that the U.S. was funding a bioweapons lab in Ukraine. [Russia] made that allegation in many different countries over the years. The U.S. was funding a bio lab in Ukraine — but they’re claiming it was this nefarious thing where they were trying to create bioweapons, which is not true.

All the false flag allegations that Ukraine did something to somehow trigger the Russian invasion, those are brand new stories that are being made up for this particular moment in time.

What methods do people spreading disinformation use to get their story out?

One thing that we’ve seen is the use of fake news outlets. This is a super-common tactic. Instagram suspended a network of accounts that were pretending to be Polish news outlets. These accounts were spreading narratives that Ukrainian refugees had committed hundreds of crimes in Poland, and that these Ukrainian refugees were organizing fights with Poland — which didn’t really make that much sense.

Another tactic that we’re seeing is this network called Secondary Infektion. They’ve been around for a long time. All we know about Secondary Infektion is that it originates in Russia — we can’t say that it’s attributed to the Russian government — but their MO is to create these fake profiles. They steal a profile photo from someone else, they come up with a fake name. They post to a blog that’s kind of marginal … and then they hope it gets picked up. They were pushing this narrative, for example, that Ukraine is anti-LGBTQ, maybe to try to draw a wedge between the U.S. and Ukraine.

A cartoon shared by a pro-Kremlin Telegram channel pushing the view that the U.S. was causing the Ukraine crisis. (Courtesy/Stanford Internet Observatory)

You and your colleagues spend a lot of time tracing these campaigns. What is your work like on a daily basis?

We go on these blogs that we know from previous research, and we’ll just look at the posts. And if the person says they’re Vladimir so-and-so, and then you reverse image search the photo and it’s a guy from New Jersey and his name is John Smith — that’s interesting and we’ll take note of that.

We do a lot of narrative tracing. We’ll Google key phrases from the narrative to try and figure out where it originated, and where it spread. And then we’ll look at who on social media is sharing those narratives, and if those social media accounts have anything in common. So if you find that there is a weird article that seems kind of random, but it was shared by 100 Twitter accounts all created in the same month in 2017 and all with profile photos like cartoon characters — those are the kinds of things that we’re looking for. You go down a lot of rabbit holes! For everything that we put out, there are five rabbit holes we went down that were completely unproductive.

Are you seeing an evolution of disinformation from Russian parties about the war as it progresses?

In the context of the war, they [are] taking the approach of “throw everything at the wall and see what sticks,” so I don’t really anticipate any arc to this. I think the bioweapon narrative resonated with some conspiracy theory people, so they might stick with that one for a bit, but they’re almost certainly going to continue to try to generate other random [stories] and see what works.

The common refrain these days — and this is one that I agree with — is that Russia is losing the information war in Ukraine and in the West. And I think one of the reasons is that a lot of the narratives that they’re pushing are just, on their face, insane. Like this idea that Ukraine has a neo-Nazi problem, when the president of Ukraine is Jewish. That just doesn’t make sense. That narrative is not going to resonate with anyone in the West. But I think there’s an argument that those narratives are not really meant to resonate with the West, but they’re meant to resonate with Russians in Russia.

Maya Mirsky

Maya Mirsky is a J. Staff Writer based in Oakland.