Society Needs to Adapt to a World of Widespread Disinformation
Renée DiResta is leading the fight against online disinformation. On the World Class Podcast, she describes what it’s like to expose malign actors in the emerging world of ceaseless propaganda and conspiracy theories.
Renée DiResta has been studying how information spreads across the internet and social media platforms for almost a decade. During that time, the sharing of false information and propaganda online has accelerated dramatically, and DiResta is not surprised that it has gained so much traction within partisan politics, she told host Michael McFaul on the World Class Podcast.
The explosion of conspiracy theories and conspiratorial communities — such as QAnon — is something she says she couldn’t have predicted, however.
“What has happened is a devolution into bespoke realities,” DiResta said. “People are operating in complete parallel universes.”
DiResta is the research manager at the Stanford Internet Observatory (SIO), where she and her team study the emergent ways in which information is spreading online, with a focus on malign narratives. They work to understand how the internet and tech platforms are misused by both foreign and domestic actors in order to help governments and the platforms fight back.
Oftentimes, the platforms will reach out to the SIO for help in analyzing suspected coordinated inauthentic behavior, or "manipulative campaigns," DiResta said. Once they've identified the suspicious behavior, SIO researchers examine the tactics used to execute the operation and study its narratives to try to understand the motivations of the actors behind it.
Drawing on the data gained from their research, the SIO suggests policy changes to members of government and the tech platforms to potentially mitigate the misuse going forward.
These days, DiResta and the SIO are spending a lot of time working with the Election Integrity Partnership (EIP), a coalition of research entities that are trying to detect and mitigate the impact of attempts to prevent or deter people from voting or to delegitimize election results.
Researchers are working around the clock to monitor keywords and platforms in an attempt to understand how certain voting narratives are spreading online. In addition to partnering with tech platforms, the EIP is also working with organizations within civil society and government — including the Department of Homeland Security's Cybersecurity and Infrastructure Security Agency — that are assisting in the analysis process and in communicating the EIP's findings to the public.
DiResta thinks that the tech platforms are much more prepared to handle the onslaught of election-related disinformation this time around than they were four years ago. All of the large companies now have "integrity teams" in place, whose entire focus is to ensure that the integrity of the 2020 election (and elections worldwide) is not compromised by malign actors.
She's concerned about what may happen in the days following November 3, though, especially if half of the U.S. population does not believe that the election was conducted fairly.
“A real challenge we have is that intervention in the form of labeling a presidential tweet [as potentially misleading] is now seen as censorship,” she said. “We’re still in a transitional period — we as a society have not yet developed ways of living in a time of high-velocity, high-virality, and often completely false stories.”