2019 started as 2018 ended, with a global deficit of trust in the world’s businesses, leaders, governments, and media. Research by Ipsos MORI underlined this further, revealing clear public confusion: TV newsreaders are trusted to tell the truth by 62% of people, but journalists by only 26%.

The legacy of ‘fake news’ being named word of the year in 2017 carries on. As Alison Flood at The Guardian remarked, it was ‘Trump’s apocryphal invention’, although for those of us who work in reputation management, fake news is a very real motivator for boardrooms to drive communications programmes that address made-up headlines.

Motivations for fake news

The First Draft project, backed by technology giants and charitable organisations, began in 2015 to tackle challenges relating to trust and truth in the media. At the time, Claire Wardle from First Draft identified seven types of mis- and disinformation.

But the real marker of one of my trends for 2019 can be found in her observation:

“If we’re serious about developing solutions to these problems [fake news], we also need to think about who is creating these different types of content and why it is being created.”

Before fake news, we had astroturfing, a term I use broadly to describe an online presence that has been deliberately manipulated to support an individual or organisation, or to maliciously target the competition. Wikipedia offers a more precise definition:

“… the practice of masking the sponsors of a message or organisation to make it appear as though it originates from and is supported by grassroots participants.”

Welcome to the dark arts of communication. A place where business pitches against business, using people and bots to post fake reviews on a Google Reviews listing. Whole networks of spam websites are created to damage the search engine results of competitors. Alternatively, an entire client’s Google presence can be whitewashed into a fairy-tale land of client satisfaction and excellence.

The unspoken world of dark arts

Broadly, this is the unspoken discipline of content creation that flies in the face of ethical industry bodies, but has always been part of the internet’s core foundation. Like those whose minds were trapped into the artificial world of the Matrix, we’ve all seen hints of the truth.

  • In September 2002 SearchKing, a small search engine at the time, supported by the PR Ad Network, was penalised by Google for helping people buy and sell links to climb Google rankings. SearchKing sued Google; Google won, and this very much marks the beginning of Google’s lawful search dominance.
  • In 2013 Interflora was famously penalised by Google for allegedly manipulating links through using advertorials to improve its search rankings.
  • The scale of Wonga’s organic search optimisation, particularly after Google banned payday loan advertising in 2016, could arguably be called an astroturfing job.

Other times it’s difficult to publicly identify when content online isn’t as it seems. Which?’s recent advertising campaign on the London Underground highlights the blight of fake news; it really is tricky identifying genuine reviews. Their investigation found:

  • A network of Facebook groups set up to reimburse Amazon customers in exchange for positive reviews.
  • Sellers offering refunds for high ratings. 
  • Sellers refusing to reimburse costs when ‘honest’ reviews were posted.

In my own experience running reputation management programmes, I’ve seen clients targeted by fake reviews, suspiciously up-weighted stories on Google that highlight negative coverage, social networks of news created out of nowhere to spread stories… the list goes on. It marks a growing trend to look beyond fake news to the motivators behind it.

Beyond political manipulation

Last year the Oxford Internet Institute described the manipulation of public opinion over social media platforms as a ‘critical threat to public life’. Government agencies and political parties were found to exploit social media platforms to spread mis- and disinformation, exercise censorship and control, and undermine trust in the media, public institutions and science. 

The report makes for troubling reading:

  • 48 countries were found to have evidence of organised social media manipulation campaigns. In each, at least one political party or government agency used content to manipulate public opinion domestically.
  • A fifth of these countries had disinformation campaigns operating over chat applications such as WhatsApp, Telegram, and WeChat.
  • Automation is still a staple for social manipulation, but paid advertisements and search engine optimisation are also important contributors. 

To say political manipulation is a well-trodden trend shouldn’t lessen its continuous impact on our public lives. Safeguarding a fair democratic process, particularly around Brexit, should trouble political powers; fair campaigning can never be assured. So where manipulation is suspected, previous democratic processes should be reviewed.

2019 is the year when organisations should have a heightened awareness of their own presence. To look beyond fake news, approaching content consumption as a sceptic and really asking: what is the truth in this matter? The philosopher Bertrand Russell advocated a similar approach, challenging us to ignore what we want to be true and focus instead on the facts at hand.

This isn’t just fake news; it’s mixed with astroturfing, and it extends beyond Google. This is very much the unpoliced wild west of reputation management today.

If you find this article interesting, you can see me speak at the Content Marketing Association Breakfast Event in London on January 23rd 2019.
