Sunday, May 21, 2023

Senate Intelligence Committee Report, Volume II: Russia's Use of Social Media

Well, this is what you get for procrastinating.  I am finally getting around to posting about the Senate Intelligence Committee's report on Russian interference in the 2016 election, and already Special Counsel John Durham has come out with his report on the investigation.  That will be next, after the Senate Committee report.  In fact, it should make an interesting counterpoint to the Senate Committee and the Inspector General.

But on to the Senate Intelligence Committee Reports.  Volume II (85 pages) deals with Russian activity on social media. 

Its findings are decidedly depressing for anyone who is confident that if free speech is unrestricted, the truth will out.  Rather, it offers actual, empirical evidence for the old saying that the lie is halfway round the world while truth is still putting its boots on.  It quotes an MIT study (p. 10) that tracked over 125,000 news stories on Twitter, shared by three million people over 11 years:

Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information, and the effects were more pronounced for false political news than for false news about terrorism, natural disasters, science, urban legends, or financial information.

The study also found that false stories were 70% more likely to be retweeted than true stories, and that true stories take about six times as long to reach 1,500 people on Twitter as false stories.  This was exacerbated by the frequent use of bots.

The reason for this should not be so hard to understand.  Unfortunately, our brains are not all that well equipped for recognizing truth, flattering delusions to the contrary.  Our brains tend to glom on to things that are simple, unequivocal, catchy, and meet our preconceptions.  Lies can easily be crafted to do these things.  Truth is under no such obligation.  

The report goes on to comment that commercial actors have been manipulating social media for a long time, but with the relatively innocuous intent of selling things.  The techniques the Russians used in the US in 2016 were ones they had been perfecting at home for many years.  Of course, the Russian government has used propaganda since Soviet times (or earlier), but its propaganda was crude by today's standards.  At least two factors have given the Russian government of the 21st century an advantage over its 20th century predecessor.  One is the invention of the internet and, later, social media, which allow much greater diffusion of propaganda.  The other is that while Communist propaganda had ideological constraints and had to provide a consistent message, present-day Russia has no such constraints and will support any viewpoint so long as it is divisive. 

The report (pp. 15-20) found several consistent characteristics of Russian media tactics: high volume and multiple channels to overwhelm the audience; merging overt and covert operations, as with hack-and-leak; speed in putting out (false) information; automated accounts and bots; and paid trolls.  The report also emphasized Russian propagandists' complete lack of commitment to ideology in favor of divisiveness for its own sake.  Russian propagandists have found that introducing radical viewpoints with no widespread support tends to fall flat.  Rather, the more effective approach is to exploit existing divisions and resentments to tear a society apart. 

And on the subject of tearing a society apart, the report also noted that Black people were targeted more than any other demographic, often under the guise of anti-police-brutality activism and imitations of Black Lives Matter.  This posed a problem for Russian intelligence.  While Black people were the most alienated part of American society and the easiest to exploit through social divisions, they were also the least likely to support Donald Trump.  So Russian propagandists worked on encouraging Black people to despair and not vote at all.

Yevgeny Prigozhin
The most interesting part of the report dealt with the Internet Research Agency (IRA),* popularly known as the Troll Farm.  The Troll Farm was run by our friend Yevgeny Prigozhin, now of Wagner Group infamy, and operated out of St. Petersburg.  The report (p. 25) quotes a news article on the operation, which had some 400 employees working two 12-hour shifts in nearly 40 rooms.  Originally targeted toward domestic audiences, the Troll Farm branched out into international operations, including the US.  Employees had a per-shift quota of five political posts and 150 to 200 comments.  Leaked materials (p. 26) revealed that trolls were supposed to post on news articles 50 times a day and to have at least three Facebook and ten Twitter accounts, with quotas as to the number of subscribers.  The non-political posts were essential to lure in unsuspecting followers and build rapport and trust so that scattered political comments would be convincing.  Trolls were expected to constantly create new content and were penalized for copying previous posts.  One employee (p. 27) described working for the Troll Farm as feeling very much like working for the Ministry of Truth in 1984.  

The English-language section was the most prestigious.  Trolls obviously had to have strong English skills; they were briefed on US holidays and directed to synchronize with US time zones.  Trolls were also trained in what was and was not grounds for blocking on various publications, and were expected to read tens of thousands of comments to learn to fit in.  Interestingly (p. 29), they were specifically forbidden from saying anything about Russia, based on the belief that most Americans just don't talk about Russia.  The goal, instead, was to seize on any divisive issue in the US and use it to foment divisions.

As discussed before, most Troll Farm content in the US was apolitical and intended to establish trust.  Russian trolls posted in right-wing and left-wing forums and advocated all manner of political positions so long as they were (1) divisive and (2) anti-Hillary.  (Left-wing posts supported either Bernie Sanders or Jill Stein of the Green Party, or else encouraged not voting at all.)  Right-wing content was pro-Trump.  After the election, left-wing trolls posted strongly anti-Trump comments that had been absent before the election (pp. 32-35).  There was also a spike in activity, particularly on the left, just before WikiLeaks released the John Podesta e-mails (pp. 36-37).  The report speculates that this may have been a sign that the trolls had advance notice, but it has no evidence to support this.  Trump's primary opponents were also targeted (p. 37).  Troll activity increased, rather than decreased, after the election and was geared toward stirring up opposition to Trump (p. 42). 

The report (pp. 43-62) then goes on to give an extended and (to me at least) rather dull account of the ways Russian trolls used the different social media platforms.  I will forbear to go into detail.  Two larger points are significant.  For one, manipulating social media for advantageous posting is older than the Russian trolls.  It has been done by commercial actors, mostly for the relatively harmless purpose of selling their products.  Presumably legitimate political actors have used similar forms of manipulation, although the report does not go into it.  For another, social media was designed for part-time, amateur posters.  Full-time, professional posters of this type were not anticipated.

I may also have been wrong in saying that the Senate Intelligence Committee did not address the hack-and-leak operation, considering the Mueller indictment and report to have said all that needed saying.  In fact, Volume II does have a section (pp. 63-70) devoted to the social media activities of Russian military intelligence (a/k/a the GRU, a/k/a Fancy Bear) in spreading its hacked documents.  What it says remains unknown to the general public, however, because most of the section is blacked out.

Although the existence of the Troll Farm was reported in the US in 2015 (pp. 25, 72), no one in the US, in government or the private sector, appears to have been aware of its activities in the 2016 election until after the election was over.  The report offers various doomsday scenarios of ever more sophisticated micro-targeting and ever more convincing fakes, increasingly difficult to detect (pp. 73-75).  But it is the Committee's recommendations that are most . . . interesting . . . in the clear light of hindsight.  

The Committee, alarmed at the possibility of more trolling activities or more spreading of hacked information, urges greater information sharing between the public and private sectors.  Social media companies should also cooperate more closely with each other and share information on how to spot fake accounts.  The Committee also recommends warning users of such efforts (p. 79).  The report encourages Congress to pass laws establishing formal procedures for information sharing between government and tech companies (p. 80), the executive branch to develop a plan for notifying targets of foreign plans to interfere in an election (p. 81), and campaigns to be judicious in what sources they quote (p. 82).

But here is the thing.  If social media companies are on the lookout for disinformation campaigns of this type, they will inevitably sometimes get it wrong.  That is exactly what the Twitter Files are all about.  Long story short, Hunter Biden left his laptop at a Delaware computer shop and never bothered to reclaim it.  Under Delaware law, the laptop and its contents then became the property of the shop owner, to do with as he pleased.  What the owner pleased was to turn the laptop over to the FBI as evidence of possible criminal activity, keep copies for himself, and share them with Rudy Giuliani.  Giuliani passed the contents on to the New York Post, which ran a story hinting at corruption.  In response, 51 former intelligence officials signed a letter saying that the story looked like Russian disinformation.  Twitter and Facebook both (temporarily) blocked the story, although it remained accessible on Google and other internet searches.  Neither company was acting at the direction of the FBI, but both had had previous conversations with the FBI about the dangers of Russian hacks and disinformation and saw the laptop story as an example of what to look out for.

The Russian disinformation theory was never very convincing.  Hunter Biden's laptop seems like a most unlikely target for either hacking or planting false information.**  In fact, there does not appear to have been any significant foreign interference of any kind in the 2020 election.  This looks very much like an example of generals fighting the last war and getting it wrong. 

But it has led to a world-class freakout by right-wingers, whose view now appears to be that any contact whatever between government officials and social media companies, except as part of a criminal investigation, is government censorship and should be banned.  Can anyone doubt that that, too, will lead to further problems down the line if Republicans manage either to enact it as law or to intimidate social media companies into refraining from all communication with government without a warrant?  Overcorrection for past mistakes is all too common and creates new problems of its own.

If only Republicans could be persuaded to read the Senate reports and see that there were, in fact, past mistakes that called for correction.

____________________
*To people of my generation, IRA will always stand for Irish Republican Army, so I am avoiding the acronym.
**Using Hunter Biden's laptop struck me as an Underpants Gnomes approach to disinformation.  Step 1: Plant disinformation on Hunter's laptop.  Step 2: ???  Step 3: Exposure!
