
Information operations on Twitter: new data released on election tampering

Back in April, we talked about the wealth of options available to Russian hackers and others launching social engineering campaigns, whether on social networks or through carefully crafted Advanced Persistent Threat attacks. Some of that drew on information published by Twitter at the time about election tampering and interference by so-called “Russian troll farms”, specifically the IRA (Internet Research Agency).

Some of the numbers involved were already impressive: 3,841 accounts were linked to the IRA, and around 1.6 million notifications were sent out to people who had interacted with those accounts in some way. Now, at the tail end of 2018, Twitter has released yet more data related to this particular campaign.

For example, there are now an additional 770 accounts (potentially from Iran) to sit alongside the original 3,841 from Russia. That includes “10 million Tweets and 2 million images, GIFs, videos, and Periscope broadcasts.” Some of the oldest accounts date back to 2009.

All of this has been put onto an “Elections Integrity” portal by Twitter for researchers to investigate further. That’s 1.24GB of Tweet information and 296GB of media data across 302 archives for the IRA, and 168MB of Tweet information and 65.7GB of media across 52 archives for what’s being referred to as “Iran.”
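For anyone planning to dig into the archives themselves, the Tweet data is plain CSV alongside the media files, so a few lines of pandas are enough for a first pass. The sketch below is a minimal, illustrative example only; the file name and column names are assumptions for the sake of the example rather than the portal’s documented schema.

```python
# Minimal sketch of pulling one of the released tweet archives into pandas
# for a first look. The file name and column names below are illustrative
# assumptions, not confirmed against the actual portal layout.
import pandas as pd

# Load only a handful of columns to keep memory usage sane for a large CSV.
columns = ["userid", "tweet_time", "tweet_language", "tweet_text"]
tweets = pd.read_csv(
    "ira_tweets_csv_hashed.csv",   # assumed local copy of one unzipped archive
    usecols=columns,
    parse_dates=["tweet_time"],
)

print(f"{len(tweets):,} tweets from {tweets['userid'].nunique():,} accounts")
print("Date range:", tweets["tweet_time"].min(), "to", tweets["tweet_time"].max())
```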

DFRLab (the Atlantic Council’s Digital Forensic Research Lab) are one of the organisations given access to the data ahead of time, and the story has recently broken elsewhere, so expect many updates and developments over the next few days. As Ben Nimmo puts it:

They were about the home government first
– had multiple goals
– targeted specific activist communities
– apolitical
– opportunistic
– evolved
– not always high-impact

The timeline of the Tweets is fascinating, as are the posting habits of both Russian and Iranian groups. For example, some individual accounts developed a “personality,” while others just attempted to trend fake stories. That thread is going to grow and grow, so you may wish to bookmark it for easy reference.
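Posting habits are one of the easier things to check in the raw data. As a rough illustration, the snippet below (reusing the same assumed file and column names as the earlier sketch) counts tweets per hour of day; pronounced “office hours” peaks are one possible hint of organised, shift-based posting rather than organic activity.

```python
# Rough sketch of a posting-habits check: tweets per hour of day. Uses the
# same illustrative file and column names as the loading example above.
import pandas as pd

tweets = pd.read_csv(
    "ira_tweets_csv_hashed.csv",
    usecols=["tweet_time"],
    parse_dates=["tweet_time"],
)

# Build an hour-of-day profile of posting activity.
hourly_profile = tweets["tweet_time"].dt.hour.value_counts().sort_index()
print(hourly_profile)
```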

Meanwhile, DFRLab are going to be publishing a series of Medium blog posts exploring their findings in more detail. The first is already live, and covers seven key takeaways from the research done so far.

Any doubts you may have had about the likelihood of large-scale, long-term, professional troll campaigns should have just been swept away. There is no doubt: this is indeed a “full fledged influence op,” and it raises many questions about what’s pushed into the social sphere and, more importantly, what we do with it, viewed alongside the response from the platform itself.

We’ve already seen how Russian Facebook ads were used to try to divide opinion in the run-up to the 2016 US elections, and it’s clear no expense was spared and no major platform was ignored in the quest to troll the public at large. Everyone needs to step up their game, from the people unwittingly republishing state-sanctioned social engineering ops to the platforms we use on a daily basis, which have the ability to do something about it.

ABOUT THE AUTHOR

Christopher Boyd

Former Director of Research at FaceTime Security Labs. He has a very particular set of skills. Skills that make him a nightmare for threats like you.