The Unpublished Project: Part V

#Kremlin: Using Hashtags to Analyze Russian Disinformation Strategy and Dissemination on Twitter. #MH17 Twitter Cyborgs Three Years Later.

Photo by Towfiqu barbhuiya on Unsplash

Since 2017 we’ve worked on numerous projects that, for reasons of confidentiality, have not been published. We are featuring five previously unpublished projects, now edited. Looking back, this is also a reflection on what’s the same, what’s changed, and what we learned. Our goal is to bring some of our past efforts out of the shadows. Part I | Part II | Part III | Part IV

________________________________________________________________

The original report, #Kremlin: Using Hashtags to Analyze Russian Disinformation Strategy and Dissemination on Twitter, was a collaborative project with Sarah Oates (Merrill College of Journalism, University of Maryland); we co-presented the paper at the 2019 APSA conference.

Bottom Line Up Front (BLUF)

Out of curiosity, in September 2022 we returned to an original set of 340 Twitter profiles classified as cyborgs (see note #1) and connected to the hashtag #MH17 to see what, if anything, had changed about their activity.

Account attrition was noteworthy: 181 accounts were either suspended, inactive, or no longer existed. Additionally, we categorized 62 accounts as having a pro-Kremlin feed.
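For readers who want to repeat this kind of re-check, here is a minimal sketch of how the attrition tally could be produced once each profile’s current status has been recorded. The handles, status labels, and counts below are illustrative placeholders, not the project’s actual data.

```python
from collections import Counter

# Hypothetical snapshot: handle -> status observed during a re-check.
# These entries are placeholders; the real review covered 340 profiles.
statuses = {
    "example_user_1": "suspended",
    "example_user_2": "active",
    "example_user_3": "inactive",
    "example_user_4": "not_found",
}

# Statuses treated as "inoperable" (no longer posting or reachable).
INOPERABLE = {"suspended", "inactive", "not_found"}

def summarize_attrition(status_by_handle):
    """Tally accounts per status and count how many are inoperable."""
    counts = Counter(status_by_handle.values())
    inoperable = sum(n for status, n in counts.items() if status in INOPERABLE)
    return counts, inoperable

counts, inoperable = summarize_attrition(statuses)
print(counts)
print(f"{inoperable} of {len(statuses)} accounts are no longer operable")
```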

______________________________________________________________

Image from EUvsDisinfo: “Six Years of MH17-Lies: The Kremlin Is Losing Its Own Game”

Background

The following are excerpts from the original paper.

This original paper leverages both social science and data science by using traditional content analysis and Twitter analytics to trace how key aspects of Russian strategic narratives were distributed via #skripal, #mh17, #Donetsk, and #russophobia in late 2018. This work will define how key Russian international communicative goals are expressed through strategic narratives, describe how to find hashtags that reflect those narratives, and analyze user activity around the hashtags. This tests both how Twitter amplifies specific information goals of the Russians as well as the relative success (or failure) of particular hashtags to spread those messages effectively.

This research makes two important assumptions. First, it assumes without further investigation (based on previous media and analytical reports) that the Russian government is choosing to pursue a disinformation strategy on Twitter. Second, it does not attempt to measure the specific effect of the disinformation on the target population, i.e., whether it is causing any change in attitude or behavior. What this paper does is analyze the presence and behavior of Twitter users linked to hashtags that are promoting Kremlin strategic narrative priorities. By doing this, we can provide critical evidence to identify those who are creating or amplifying disinformation in our media ecosystem. By combining the identification of hashtags linked to key Kremlin narrative goals, the Twitter users who are amplifying those hashtags, and the behavior of those users over time, we can identify more precisely how online communities of disinformation function. This allows the discussion to evolve beyond identifying whether a Twitter user is a bot or cyborg, which is useful but only one element of disinformation activity. Rather, we are more broadly interested in how the Kremlin’s strategic narratives are amplified, over time and via different hashtags, on Twitter.

Our research posits that, in the case of Russia, strategic narratives can be defined and linked to specific Twitter hashtags. This is a plausible approach for the Russian case for the following reasons:

1. Russia has strategic narratives that can be readily identified through speeches and documents from the central government, specifically policy concepts and key addresses by President Vladimir Putin. As an authoritarian regime that closely controls key communication nodes (particularly state-run television and foreign broadcasts), the Kremlin provides coherent and consistent messages that can be identified in both its domestic news and its foreign propaganda.

2. There is no viable political opposition in Russia (unlike the deeply divided voices of current U.S. elites), allowing for a unified signal of national strategic messages.

3. Russia has demonstrated consistent and pervasive dedication to promoting its views on social media sites, efforts that have been fairly closely documented due to Russian interference in U.S. election campaigning.

4. Russia has a strong history of a state role in constructing (as opposed to just amplifying) strategic narratives. As a successor to the Soviet regime, Russia was forced to create symbols and narratives in the 1990s, which gave the state latitude to dictate a vision of the state from above, as opposed to reflecting the reality from below. At the same time, the Russian state could take advantage of both the powerful images and the methods of the Soviet propaganda system, which it eventually did after a relatively brief experiment with a more democratic society (Oates, 2006).

5. Russians can also hijack hashtags in an attempt to control narratives, such as when the Russian Foreign Ministry tweeted #UnitedforUkraine in 2014 (see source #1). This was a small-scale, human effort; automated efforts are more sustainable and could arguably be more effective.

______________________________________________________________

#MH17 Cyborgs. September 2022

This was a curiosity project, motivated by an interest in seeing “who” was still active on Twitter almost four years after we started the original research. The attrition is noteworthy, with over 50% of the original group of cyborgs being ‘inoperable.’ Of course, this could already have changed. It is fair to ask which suspended accounts are active again in the spring of 2023, and which previously inactive accounts have sprung back to life. Small projects for another day.

Borrowing from Heraclitus, Twitter’s feed is a similar experience, where “no man ever steps in the same river twice, for it’s not the same river and he’s not the same man.”

______________________________________________________________

Revisiting our research limitations and conclusions

One of our limitations is reconciling our classification system with its reliance on average tweet volume over a week. We have identified three different types of accounts above: cyborg, moderate, and low-volume. Mentionmapp used average activity over a time period (typically a week), but this average hides peaks and valleys in the activities of some accounts. For example, an account that is very active may fall silent, or slide from cyborg to moderate levels of engagement and back again. There are also problems of suspended accounts, deleted profiles, deleted feeds of tweets, and protected tweets that can obscure the findings. Mentionmapp cannot download or analyze private accounts, although it is questionable how effective a private account can be in information warfare.
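To make the volume-based labels, and the blind spot described above, concrete, here is a minimal sketch of a weekly-average classifier. The thresholds are hypothetical, not Mentionmapp’s actual cut-offs, and the peak-day value is returned alongside the label precisely because an average can hide bursts of activity.

```python
from statistics import mean

# Hypothetical thresholds (average tweets per day over the sampled week).
# Illustrative values only, not the cut-offs used in the original study.
CYBORG_MIN = 72
MODERATE_MIN = 10

def classify_account(daily_tweet_counts):
    """Label one account from a week of daily tweet counts.

    Returns (label, weekly average, peak day) so that burst-driven
    accounts are not hidden behind a modest average.
    """
    avg = mean(daily_tweet_counts)
    peak = max(daily_tweet_counts)
    if avg >= CYBORG_MIN:
        label = "cyborg"
    elif avg >= MODERATE_MIN:
        label = "moderate"
    else:
        label = "low-volume"
    return label, avg, peak

# A quiet week with one burst day: the average says "moderate",
# while the peak shows cyborg-level activity on a single day.
print(classify_account([5, 3, 250, 4, 2, 6, 1]))
```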

Conclusions

The challenges posed by the spread of disinformation via social media create a significant threat to democratic discourse. We deployed two approaches that can be replicated by other scholars and analysts. First, we relied on several studies of narrative to take a broader view of media messages, narrowing down to examples of Russian strategic narratives that could reasonably be identified by a single hashtag. In this way, we were able to isolate critical signals from the vast noise of social media, although we concede that a study of Twitter alone is relatively narrow. Second, this study would not have been possible without a robust retrieval and measurement tool for Twitter that had been developed and tested in a range of situations. This research was a partnership between an academic and a technology entrepreneur, an attempt to leverage different types of expertise. We found this partnership quite fruitful. Even when considering the limitations, this study provided three key findings:

1. It is highly useful to know what you’re looking for and what you’re looking at when approaching social media analytics. The pre-selection of hashtags that we feel confident represent Kremlin strategic narratives makes this a much more useful exercise.

2. The Twitter activity identified by Mentionmapp demonstrates the value of moving beyond dichotomies such as bot/human and using a wider range of user behaviors to establish relative roles specific Twitter users play in synthetic audience engagement.

3. It is possible to find disinformation influencers by analyzing user behavior across multiple hashtags linked to Russia’s disinformation priorities. This is an approach that can be applied in a range of situations beyond Russian information operations.
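As a rough illustration of the third point, finding users who are active across several narrative hashtags, here is a minimal sketch. The data shape, lower-casing, and the two-hashtag threshold are assumptions made for the example, not Mentionmapp’s export format or the study’s exact criteria.

```python
from collections import defaultdict

# The four narrative hashtags studied in the original paper (lower-cased).
TRACKED = {"#skripal", "#mh17", "#donetsk", "#russophobia"}

def cross_hashtag_users(tweets, min_hashtags=2):
    """Return users seen on at least `min_hashtags` of the tracked tags.

    `tweets` is an iterable of (user, hashtags_in_tweet) pairs; the shape
    is illustrative, not a specific export format.
    """
    seen = defaultdict(set)
    for user, tags in tweets:
        seen[user].update(t.lower() for t in tags if t.lower() in TRACKED)
    return {user: tags for user, tags in seen.items() if len(tags) >= min_hashtags}

sample = [
    ("user_a", ["#MH17", "#news"]),
    ("user_a", ["#russophobia"]),
    ("user_b", ["#MH17"]),
]
print(cross_hashtag_users(sample))  # {'user_a': {'#mh17', '#russophobia'}}
```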

______________________________________________________________

Notes/Sources

Note #1 — A cyborg is a Twitter user that appears to use both automated and human activity (which will be discussed in more detail below). A bot is an autonomous program on a network that is designed to respond or post in response to programmed cues online.

Source #1- https://www.rferl.org/a/ukraine-us-russia-twitter-trolling/25362157.html
