However, today’s disinformation actors use social media to amplify disinformation that users knowingly or, more often, unknowingly perpetuate. Such disinformation spreads quickly, threatening public health and safety. Indeed, the COVID-19 pandemic and recent global elections have given the world a front-row seat to this form of modern warfare.
A group at ORNL now studies such threats, building on the lab’s evolving work in location intelligence, or research that uses open data to understand places and the factors that influence human activity in them. In the past, location intelligence has informed emergency response, urban planning, transportation planning, energy conservation and policy decisions. Now, location intelligence at ORNL also helps identify disinformation, or shared information that is intentionally misleading, and its impacts.
“Up until now, we knew disinformation campaigns existed online, but we did not know how the disinformation flowed,” said Gautam Thakur, leader of ORNL’s Location Intelligence Group. “By bridging a gap between the virtual world and the physical world, we can now help provide insights that agencies and organizations can use to counteract such threats.”
Today’s disinformation campaigns spread fast and deep. Disinformation actors carefully craft messages that target specific audiences, and they often rely on bots, or computer algorithms that emulate human behavior online, to spread those messages. Only a few narratives need to catch on to create vulnerability, build cohesion among extremist groups or erode civil trust.
Some of the group’s latest work involves understanding how to measure the intent of social media users based on tweets sent during the COVID-19 pandemic.
“We discovered we needed an automated method to quantify the intent of all social media users to help keep the public safe from disinformation actors,” said Chathika Gunaratne, a postdoctoral researcher in ORNL’s Computing and Computational Sciences Directorate.
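In broad strokes, quantifying intent at scale means turning each post into scores across a small set of intent categories. The Python sketch below is purely illustrative: the categories, keyword lexicon and scoring rule are assumptions made for this example and are not drawn from the ORNL study. It shows only the general shape of an automated intent scorer, not the team’s actual method.

```python
# Minimal illustrative sketch of scoring tweet "intent" with a keyword lexicon.
# The categories and keywords below are hypothetical, not the ORNL team's.
from collections import Counter

INTENT_LEXICON = {
    "inform":   {"update", "report", "confirmed", "data", "study"},
    "persuade": {"must", "should", "wake up", "truth", "share this"},
    "provoke":  {"hoax", "lie", "scam", "cover-up", "fake"},
}

def score_intent(tweet_text: str) -> dict:
    """Return a count of lexicon hits per intent category for one tweet."""
    text = tweet_text.lower()
    return {intent: sum(kw in text for kw in keywords)
            for intent, keywords in INTENT_LEXICON.items()}

def dominant_intents(tweets: list[str]) -> Counter:
    """Tally the highest-scoring intent category across a set of tweets."""
    tally = Counter()
    for tweet in tweets:
        scores = score_intent(tweet)
        if any(scores.values()):
            tally[max(scores, key=scores.get)] += 1
    return tally

if __name__ == "__main__":
    sample = ["Wake up, the case numbers are a lie!",
              "New study reports updated vaccination data."]
    print(dominant_intents(sample))  # e.g. Counter({'persuade': 1, 'inform': 1})
```

A real system would replace the keyword lexicon with a trained language model and aggregate scores per user rather than per tweet, but the basic idea of mapping raw posts to intent categories is the same.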
With help from computational data engineer Varisara Tansakul and data science researcher Debraj De, the multidisciplinary team tested a new approach on 4.7 million COVID-19-related tweets from more than 14,000 users. The team combined the results of this study with its other studies on intent and disinformation. As a result, it can now correlate breaking news notifications with disinformation actors’ online responses, incorporating information such as the unique spatial patterns of information spread and methods used by social media users to intensify this spread. Some of this work is captured in the group’s latest conference paper published in September 2022.
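To give a flavor of what correlating breaking news with online responses could involve, the short sketch below counts tweets that fall within a fixed time window after each news alert. The window length, timestamps and function names here are assumptions made for illustration; this is not the group’s published pipeline, which also incorporates spatial patterns and amplification behavior.

```python
# Illustrative sketch: align breaking-news timestamps with subsequent tweet
# activity in fixed time windows. All values and names are hypothetical.
from datetime import datetime, timedelta

WINDOW = timedelta(hours=1)  # assumed response window after each news alert

def response_counts(news_times, tweet_times):
    """Count tweets falling within WINDOW after each breaking-news timestamp."""
    counts = []
    for alert in news_times:
        n = sum(alert <= t < alert + WINDOW for t in tweet_times)
        counts.append((alert, n))
    return counts

if __name__ == "__main__":
    news = [datetime(2020, 3, 11, 17, 0)]        # hypothetical alert time
    tweets = [datetime(2020, 3, 11, 17, 5),
              datetime(2020, 3, 11, 17, 40),
              datetime(2020, 3, 11, 19, 0)]
    print(response_counts(news, tweets))         # two tweets land in the window
```

Spikes in such counts, compared against a baseline of ordinary activity, are one simple way to flag accounts or narratives that consistently mobilize around breaking events.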
Thakur began collecting data on human activities when he came to ORNL in 2013, but he soon realized the lab needed the ability to characterize — in real time — the human behaviors that were driving the data. Then, in 2015, ORNL released PlanetSense, a digital platform used to analyze online crowd-sourced data in real time. PlanetSense enables researchers to study human activity through the lenses of economy, culture and social ties, and to illustrate how and why things happen in different places. PlanetSense has since underpinned the development of other digital tools at ORNL used in human dynamics research.
“While we build computing systems that capture data from points of interest across the entire planet, it is really understanding the cultural and social ties that allows us to better understand human activities,” Thakur said. “In order to help agencies and organizations respond to certain events, we need to be able to make sense of the data, which means we must be ready to narrate certain activities as they are happening. This requires a very interdisciplinary approach, which is possible at ORNL given our breadth of foundational sciences.”
For more information: https://www.osti.gov/biblio/1891407