Lies Go ‘Round The World Before The Truth Is Out Of Bed (Social Media Series 2/6)

The US, like many other Western nations, has become increasingly politically polarized in recent years, a trend that has only grown more apparent since the 2016 election cycle. It has since come to light that bad actors have manipulated social media, turning the various platforms into propaganda powerhouses.

In the era before digital media and the 24/7 news cycle, journalism operated under commonly accepted guidelines regarding sourcing and ethical practice. With the advent of the internet, however, several things have changed journalism, arguably for the worse. Content is freely and widely available online from an ever-growing number of sources; the walled city has crumbled and the floodgates of information have opened. Agencies that formerly relied on print sales are now forced either to offer content for free and rely on ad revenue or to put up a paywall. Those that choose the former may give in to the temptation to sensationalize their content in pursuit of the all-important “click.”

A recent study found that so-called “fake news” was significantly more likely to be widely shared than verified true stories. False political news was the most likely of all to be shared, likely because it inspires feelings of fear, disgust, or surprise. These findings give new life to the old saying that “lies can be halfway around the world before the truth has time to put its pants on.”

It would appear that people are most likely to share “fake news,” and the more political, the better. This phenomenon would certainly help explain the “success” of the disgraced data analytics firm Cambridge Analytica, which played a crucial role in the run-up to the United Kingdom’s 2016 EU membership referendum. The firm worked with campaign organizations seeking a vote to leave the European Union, manipulating content online and targeting users with political ads to sway public opinion on the issue (Scott, 2019).

Various alarmist ads were shown to the individuals judged most likely to be influenced by them, as determined by scraping Facebook user data. One popular subset of advertisements pushed the false idea that Turkey would imminently join the EU; the suggestion that this would mean more immigration to the UK alarmed a particular slice of the population, often those living in very homogeneous parts of England. In reality, there are no current plans for Turkey to join the European Union, and even if there were, every member state holds a veto over any new nation joining the bloc.


[Image: Turkey-themed referendum advertisement]

Cambridge Analytica worked with groups like LEAVE.EU to target specific individuals online, presenting them with demonstrably false information to influence their vote.

The obvious solution to this problem would be for the social media platforms to enforce fact-checking rules, along with ethical boundaries on what content is acceptable to promote through paid advertising. However, Facebook recently announced that its new ad rules will continue to allow political organizations to promote, via paid ads, any message they wish, even messages that prove to be completely false. If you have the money, Facebook will happily stand by and help you promote deliberately false information (Romm & Stanley-Becker, 2020).

This should worry anyone who values the integrity of the democratic process. Organizations, including foreign bad actors, appear to have a green light to fund fake news and influence public opinion online. The “Brexit” vote is a prime example of how a large swathe of the population can be influenced to vote against its own interests for reasons grounded in neither fact nor logic. When users are more likely to click “share” than to fact-check, lies spread like wildfire. These lies are often accepted at face value and go on to shape opinions and views. Combine that with the ability to create an echo chamber by ignoring any information that goes contrary to one’s preconceived ideas, and voilà.

The most frustrating part of this problem is how easily it could be rendered powerless if people refused to play ball. If people were willing to stop and think before clicking the share button, the whole process would lose its efficacy. Instead, the situation appears poised to get worse before it gets better. Are our choices and views really our own if they are based entirely on fake news paid for by foreign governments and the shadowy organizations in their employ?

I’m afraid the answer could be no. 

Scott, M. (2019, July 31). Cambridge Analytica did work for Brexit groups, says ex-staffer. Retrieved February 23, 2020.
Romm, T., & Stanley-Becker, I. (2020, January 9). Facebook won’t limit political ad targeting or stop false claims under new ad rules. Retrieved February 23, 2020.
