The authors, from the UCLA College and the UCLA Samueli School of Engineering, illustrated the difference between the storytelling elements of a debunked conspiracy theory and those that emerged when journalists covered an actual event in the news media. Their approach could help shed light on how and why other conspiracy theories, including those around COVID-19, spread — even in the absence of facts.
The study, published in the journal PLOS ONE, analyzed the spread of news about the 2013 “Bridgegate” scandal in New Jersey — an actual conspiracy — and the spread of misinformation about the 2016 “Pizzagate” myth, the completely fabricated conspiracy theory that a Washington, D.C., pizza restaurant was the center of a child sex-trafficking ring that involved prominent Democratic Party officials, including Hillary Clinton.
The researchers used machine learning, a form of artificial intelligence, to analyze the information that spread online about the Pizzagate story. The AI can automatically tease out all of the people, places, things and organizations in a story spreading online — whether the story is true or fabricated — and identify how they are related to each other.
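As a rough illustration of that kind of entity-and-relationship extraction — a sketch, not the authors' actual pipeline — the snippet below uses the open-source spaCy library. The example sentence and the same-sentence co-occurrence rule are assumptions made purely for demonstration.

```python
# Minimal sketch of entity-and-relationship extraction, not the study's pipeline.
# Assumes spaCy and its small English model are installed:
#   pip install spacy && python -m spacy download en_core_web_sm
import itertools
import spacy

nlp = spacy.load("en_core_web_sm")

# Hypothetical forum post used purely for illustration.
text = ("Hackers released emails that mentioned a pizza restaurant "
        "in Washington and several party officials.")

doc = nlp(text)

# Pull out the people, places, things and organizations the model recognizes.
entities = [(ent.text, ent.label_) for ent in doc.ents]
print(entities)

# A crude relationship rule: treat entities that appear in the same sentence
# as connected. Real relation extraction would use the dependency parse or a
# trained model, but co-occurrence is enough to sketch the idea.
edges = set()
for sent in doc.sents:
    names = [ent.text for ent in sent.ents]
    for a, b in itertools.combinations(names, 2):
        edges.add((a, b))
print(edges)
```

Run over thousands of posts, pairs like these accumulate into the kind of relationship network the researchers describe.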
Finding the puzzle pieces
In either case — whether for a conspiracy theory or an actual news story — the narrative framework is established by the relationships among all of the elements of the storyline. And, it turns out, conspiracy theories tend to form around certain elements that act as the adhesive holding the facts and characters together.
“Finding narratives hidden in social media forums is like solving a huge jigsaw puzzle, with the added complication of noise, where many of the pieces are just irrelevant,” said Vwani Roychowdhury, a UCLA professor of electrical and computer engineering and an expert in machine learning, and a lead author of the paper.
In recent years, researchers have made great strides in developing artificial intelligence tools that can analyze batches of text and identify the pieces to those puzzles. As the AI learns to identify patterns, identities and interactions that are embedded in words and phrases, the narratives begin to make “sense.” Drawing on the massive amount of data available on social media, and on steadily improving technology, the systems are increasingly able to teach themselves to “read” narratives, almost as if they were human.
The visual representations of those story frameworks showed the researchers how false conspiracy theory narratives are held together by threads that connect multiple characters, places and things. But they found that if even one of those threads is cut, the other elements often can’t form a coherent story without it.
“One of the characteristics of a conspiracy theory narrative framework is that it is easily ‘disconnected,’” said Timothy Tangherlini, one of the paper’s lead authors, a professor in the UCLA Scandinavian section whose scholarship focuses on folklore, legend and popular culture. “If you take out one of the characters or story elements of a conspiracy theory, the connections between the other elements of the story fall apart.”
Which elements stick?
In contrast, he said, the stories around actual conspiracies — because they’re true — tend to stand up even if any given element of the story is removed from the framework. Consider Bridgegate, for example, in which New Jersey officials closed several lanes of the George Washington Bridge for politically motivated reasons. Even if any number of threads were removed from the news coverage of the scandal, the story would have held together: All of the characters involved had multiple points of connection by way of their roles in New Jersey politics.
“They are all within the same domain, in this case New Jersey politics, which will continue to exist irrespective of the deletions,” Tangherlini said. “Those connections don’t require the same ‘glue’ that a conspiracy theory does.”
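One way to picture the “glue” argument is as a connectivity test on a narrative graph: delete a story element and see whether what remains still hangs together. The toy graphs below, built with the networkx library, are assumptions for illustration only, not the study’s data or method.

```python
# Toy illustration of the "disconnection" idea, not the study's actual graphs.
import networkx as nx

def falls_apart_without(graph, node):
    """Return True if deleting `node` leaves the remaining story elements disconnected."""
    trimmed = graph.copy()
    trimmed.remove_node(node)
    return not nx.is_connected(trimmed)

# Hypothetical conspiracy-theory-style graph: everything hangs off one hub.
theory = nx.Graph()
theory.add_edges_from([
    ("email dump", "restaurant"),
    ("email dump", "officials"),
    ("email dump", "forum posters"),
])

# Hypothetical actual-conspiracy-style graph: the actors share a common domain,
# so they are connected to one another in multiple ways.
actual = nx.Graph()
actual.add_edges_from([
    ("governor's office", "bridge authority"),
    ("governor's office", "local officials"),
    ("bridge authority", "local officials"),
    ("local officials", "reporters"),
    ("bridge authority", "reporters"),
])

print(falls_apart_without(theory, "email dump"))        # True: the story shatters
print(falls_apart_without(actual, "bridge authority"))  # False: still one piece
```

In the first graph, removing the hub leaves only isolated fragments; in the second, the remaining actors are still linked through their shared domain.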
Tangherlini calls himself a “computational folklorist.” Over the past several years, he has collaborated regularly with Roychowdhury to better understand the spread of information around hot-button issues like the anti-vaccination movement.
To analyze Pizzagate, in which the conspiracy theory arose from a creative interpretation of hacked emails released in 2016 by Wikileaks, the researchers analyzed nearly 18,000 posts from April 2016 through February 2018 from discussion boards on the websites Reddit and Voat.
“When we looked at the layers and structure of the narrative about Pizzagate, we found that if you take out Wikileaks as one of the elements in the story, the rest of the connections don’t hold up,” Tangherlini said. “In this conspiracy, the Wikileaks email dump and how theorists creatively interpreted the content of what was in the emails are the only glue holding the conspiracy together.”
The data generated by the AI analysis enabled the researchers to produce a graphic representation of narratives, with layers for major subplots of each story, and lines connecting the key people, places and institutions within and among those layers.
Quick build versus slow burn
Another difference that emerged between real and false narratives concerned the time they take to build. Narrative structures around conspiracy theories tend to build and become stable quickly, while narrative frameworks around actual conspiracies can take years to emerge, Tangherlini said. For example, the narrative framework of Pizzagate stabilized within a month after the Wikileaks dump, and it stayed relatively consistent over the next three years.
“The fact that additional information related to an actual conspiracy emerged over a prolonged period of time (here, five and a half years) might be one of the telltale signs of distinguishing a conspiracy from a conspiracy theory,” the authors wrote in the study.
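One simple way to make the “stabilizes quickly” observation concrete — a sketch under assumed data, not the paper’s method — is to compare the narrative graph’s set of relationships from one time window to the next; once the overlap stops changing, the framework has settled.

```python
# Sketch of quantifying how quickly a narrative framework stabilizes, assuming
# monthly snapshots of the relationship edges are available. Illustration only.

def jaccard(edges_a, edges_b):
    """Overlap between two sets of relationships, from 0 (disjoint) to 1 (identical)."""
    if not edges_a and not edges_b:
        return 1.0
    return len(edges_a & edges_b) / len(edges_a | edges_b)

# Hypothetical cumulative edge sets, one per month of forum discussion.
monthly_snapshots = [
    {("emails", "restaurant")},
    {("emails", "restaurant"), ("emails", "officials")},
    {("emails", "restaurant"), ("emails", "officials"), ("restaurant", "basement")},
    {("emails", "restaurant"), ("emails", "officials"), ("restaurant", "basement")},
]

# Month-to-month similarity that quickly approaches 1 would signal a rapidly
# stabilized framework; an actual conspiracy would keep adding new
# relationships as reporting continues over years.
for earlier, later in zip(monthly_snapshots, monthly_snapshots[1:]):
    print(round(jaccard(earlier, later), 2))
```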
Tangherlini said it’s becoming increasingly important to understand how conspiracy theories spread and take hold, in part because stories like Pizzagate have inspired some to take actions that endanger other people.
“The threat narratives found in conspiracy theories can imply or present strategies that encourage people to take real-world action,” he said. “Edgar Welch went to that Washington pizzeria with a gun looking for supposed caves hiding victims of sex trafficking.”
The UCLA researchers have also written another paper examining the narrative frameworks surrounding conspiracy theories related to COVID-19. In that study, which has been published on an open-source forum, they track how the conspiracy theories are being layered onto previously circulated conspiracy theories, such as those about the perceived danger of vaccines, and, in other cases, how the pandemic has given rise to completely new ones, like the idea that 5G cellular networks spread the coronavirus.
“We’re using the same pipeline on COVID-19 discussions as we did for Pizzagate,” Tangherlini said. “In Pizzagate, the targets were more limited, and the conspiracy theory stabilized rapidly. With COVID-19, there are many competing conspiracy theories, and we are tracing the alignment of multiple, smaller conspiracy theories into larger ones. But the underlying theory is identical for all conspiracy theories.”
###