RuNet Echo has previously written about the efforts of the Russian “Troll Army” to flood social networks and online media websites with pro-Kremlin rhetoric. Twitter is no exception, and multiple users have observed Twitter accounts tweeting similar statements during and around key breaking news and events. Increasingly active throughout Russia’s interventions in Ukraine, these “bots” have been designed to look like real Twitter users, complete with avatars.
Using the open-source NodeXL tool, I collected a complete list of accounts tweeting that exact phrase and imported it into a spreadsheet. From that list, I also gathered an extended community of Twitter users, comprising the friends and followers of each account. It was going to be an interesting test: if the slurs against Nemtsov were just a minor case of rumour-spreading, they probably wouldn’t be coming from more than a few dozen users.
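The assembly step described above can be sketched in plain Python: starting from the accounts that tweeted the phrase, and given their friend/follower lists (retrieved separately, e.g. via NodeXL), we can build an edge list ready for import into a graph tool. All account handles here are made up for illustration; this is not the article's actual dataset or code.

```python
def build_edge_list(seed_accounts, connections):
    """Return sorted (account, connected_account) pairs for the extended network.

    seed_accounts: accounts observed tweeting the phrase
    connections:   dict mapping each account to the set of accounts
                   it follows or is followed by
    """
    edges = set()
    for account in seed_accounts:
        for other in connections.get(account, set()):
            edges.add((account, other))
    return sorted(edges)

# Hypothetical handles, purely for illustration
seeds = ["bot_a", "bot_b"]
conns = {
    "bot_a": {"bot_b", "amplifier_1"},
    "bot_b": {"bot_a"},
}
print(build_edge_list(seeds, conns))
# [('bot_a', 'amplifier_1'), ('bot_a', 'bot_b'), ('bot_b', 'bot_a')]
```

An edge list in this shape (one row per connection) is exactly what spreadsheet-based tools like NodeXL export and what Gephi accepts as input.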
Then I used Gephi, another free data analysis tool, to visualise the data as a network graph. The coloured circles, called nodes, represent Twitter accounts, and the lines connecting them, known as edges, represent follow/follower relationships between accounts. The accounts are grouped into colour-coded community clusters using the modularity algorithm, which detects tightly interconnected groups. The size of each node reflects the number of connections that account has with others in the network.
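What Gephi does in this step, modularity-based community detection plus degree-based node sizing, can be approximated in Python with the networkx library (an assumption for illustration; the article used Gephi's own implementation). The toy graph below contains two tightly knit clusters joined by a single weak link, mimicking the structure such an analysis looks for:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical follow/follower edges between made-up Twitter accounts:
# two triangles (tight clusters) bridged by one edge
G = nx.Graph([
    ("bot_a", "bot_b"), ("bot_b", "bot_c"), ("bot_a", "bot_c"),
    ("user_x", "user_y"), ("user_y", "user_z"), ("user_x", "user_z"),
    ("bot_c", "user_x"),  # the single link between the two clusters
])

# Colour-coded clusters: each frozenset is one detected community
communities = greedy_modularity_communities(G)

# Node size driven by the number of connections (degree)
sizes = {node: G.degree(node) for node in G.nodes}

for i, community in enumerate(communities):
    print(f"community {i}: {sorted(community)}")
```

On this toy graph, modularity maximisation separates the two triangles into distinct communities, which is the same logic behind the colour-coded clusters in the Gephi visualisation.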