Freelancers and third-party contractors are being hired to provide disinformation-for-hire services, running social media smear campaigns across 81 countries, according to the 2020 media manipulation survey from Oxford University's Oxford Internet Institute (OII).
The OII team said governments and political parties, working through third-party contractors, are spending millions on private-sector 'cyber troops' who "drown out" other voices on social media.
“Such professional cyber troops have also used citizen influencers to spread manipulated messages. These include volunteers, youth groups and civil society organisations, who support their ideologies,” said the report.
Researchers examined how cyber troops use different communication strategies to manipulate public opinion, such as creating disinformation or manipulated media, using data-driven targeting, and employing abusive tactics such as smear campaigns or online harassment. The report found that:
- 76 countries used disinformation and media manipulation as part of their campaigns
- 30 countries used data-driven strategies to target specific users with political advertisements
- 59 countries used state-sponsored trolls to attack political opponents or activists, up from 47 countries in 2019
- 79 countries used human accounts
- 57 countries used bot accounts
- 14 countries used hacked or stolen accounts
“The manipulation of public opinion through social media remains a growing threat to democracies around the world,” said the report.
Organised social media manipulation campaigns are operating in 81 countries, up from 70 countries in 2019, with misinformation being produced on an industrial scale by major governments, public relations firms and political parties.
Professor Philip Howard, director of the Oxford Internet Institute and the report's co-author, said: "Our report shows misinformation has become more professionalised and is now produced on an industrial scale. Now more than ever, the public needs to be able to rely on trustworthy information about government policy and activity."
Facebook said last year that it found some “co-ordinated inauthentic” online activity likely linked to Turning Point USA, a pro-Donald Trump youth campaign group, and carried out by Rally Forge, a marketing firm, ahead of the US election, reported the Financial Times. Facebook banned Rally Forge from its platform, but Turning Point escaped any penalties, it was reported.
Similar disinformation-for-hire practices have been reported in the UK. The OII report cited the relabelling of the Conservative Party's Twitter account as "factcheckUK" during a political debate ahead of the 2019 general election as an example of "party-led disinformation".
“It noted that the digital communications firm Topham Guerin had been hired by the Conservatives after its success in the Australian election earlier that year,” said an FT report.
Key findings include:
- Private ‘strategic communications’ firms are playing an increasing role in spreading computational propaganda, with researchers identifying state actors working with such firms in 48 countries.
- Almost $60 million has been spent on firms that use bots and other amplification strategies to create the impression of trending political messaging.
- Social media has become a major battleground, with firms such as Facebook and Twitter taking steps to combat 'cyber troops', while some $10 million has been spent on social media political advertisements. The platforms removed more than 317,000 accounts and pages linked to 'cyber troop' actors between January 2019 and November 2020.
Cyber troops are frequently and directly linked to state agencies, according to the report: "In 62 countries, we found evidence of a government agency using computational propaganda to shape public attitudes."
But established political parties were also found to be using social media to ‘spread disinformation, suppress political participation, and undermine oppositional parties’, said the Oxford researchers.
Dr Samantha Bradshaw, a researcher at the OII and the report's co-author, said: "Cyber troop activity can look different in democracies compared to authoritarian regimes. Electoral authorities need to consider the broader ecosystem of disinformation and computational propaganda, including private firms and paid influencers, who are increasingly prominent actors in this space."