Misinformation seems to be more the norm than actual news on Facebook

In recent times, it has become commonplace for social platforms to serve as distribution tools for fake news, amplifying real-world movements through the sharing and re-sharing of misinformation and outright lies.

Such content has an ever-increasing capacity to infiltrate, and impact, people's lives. For example, both Facebook and Twitter have taken action against anti-vax groups, while just this month, all the major tech giants signed on to the Christchurch Call, which aims to unify their efforts to curb the spread of terrorist and extremist messaging.

Given the dangers of such sharing, this is rightfully a key issue of focus – and this week, a new study conducted by Oxford University has again underlined why it’s such a crucial concern.

An analysis of Facebook data has shown that 'junk news' – content from less reputable sources – gets shared four times more than content from reputable, trusted news outlets across The Social Network.

[Chart: share rates for junk news versus reputable news content]

Of course, a lot hinges on exactly how 'junk news' is defined.

From the report:

“These sources deliberately publish misleading, deceptive, or incorrect information purporting to be real news about politics, economics, or culture. This content includes various forms of propaganda and ideologically extreme, hyper-partisan, or conspiratorial news and information.”

Now, some will argue that this content is, in fact, true news, and that it's the mainstream outlets which publish lies – which, in itself, is a concerning trend of the modern era.

The report's definition may not be enough to convince those who prefer this type of reporting over traditional outlets – but that preference, in itself, points to why such content is so effective, especially on Facebook.

Because it aligns with our established, internal biases, and reinforces entrenched viewpoints – which is a surefire way to boost engagement.

Psychological studies have repeatedly underlined the power of confirmation bias – essentially, the way in which our brains take shortcuts to process information by selectively choosing which parts we'll believe, and which we'll ignore.

That, as noted, is particularly prominent on Facebook, where not only do users have access to more alternative, agreeable information sources, but where the News Feed algorithm works to show users more of what they agree with – and less of what they don’t.

As such, it's little surprise that people make less effort to verify the content they find, particularly when it's shared within their established networks of like-minded peers and friends.

And if someone consistently posts content you don't agree with, you can mute them or shut them out entirely – which makes the platform, really, the ultimate tool for amplifying this type of content, and for reinforcing our inherent biases.

The question, then, is how do you fix it? Facebook is adept at working with human nature to amplify such sharing, and that works to the company's benefit: more sharing means more engagement, and subsequently, more ad dollars.

If Facebook sees that the News Feed algorithm is driving user activity, what motivation would it have to change the system and reduce its effectiveness?

As noted, all the major tech giants are now looking at how to reduce such impacts, but this aspect, in particular, has been key to Facebook's growth – the company weaponises human psychology to manipulate audience action, be that through increased time spent on the platform or specifically targeted ads.

That’s why Facebook has been such a huge success, and it would likely take a significant accumulation of evidence to convince Zuck and Co. to change course.

This may well be the pressing issue of our time – it's arguably part of the reason why issues like climate change struggle to gain significant traction with the populace: people simply don't trust what they read.

Or at least, they don’t trust what they don’t agree with, which may include what’s inconvenient or difficult.

The new age of social media is also the age of confirmation bias, and with money to be made in ‘alternative’ news, a whole new information economy has now been established.

Banning the more extreme examples of such content is one option, but that can also embolden believers, and see them shift their movements to new outlets.

So is there a solution? Your perspective, ironically, will likely come down to what you choose to believe.

source: http://www.socialmediatoday.com

