May 8

Antisocial media: Why we can’t act on climate


Every disaster movie starts with a scientist being ignored and the phrase “We thought we had more time.” In the face of shouty articles about an escalating climate crisis, the term “climate emergency” has become easy to tune out; but it is also an unfortunate reality. Like that Monday morning alarm clock we so easily silence, the climate science really is ringing alarm bells across the globe.

As scientific instruments give us data that looks more and more like the flashing lights in the cockpit of an aeroplane on fire, one wonders: as a society, are we even taking this seriously? As we collectively hurtle toward the mountain of irreversible consequences, are our pilots even paying attention? But as your drunk uncle at the family BBQ is happy to explain – it’s just not happening. It is; but the deeper question is, is it your uncle’s fault he’s telling you this? Why, as the scientific data becomes clearer and more urgent, is it getting harder and harder to find consensus, even with friends and family? The data tells us that the recent surge in global temperatures, surpassing the critical threshold of 1.5°C and edging dangerously close to 2°C, represents a clear and imminent peril. So where might the popular confusion and lack of urgent action be coming from? Yes, there are bad actors: fossil fuel corporations, dictators and sheikhs who profit from the status quo. But a more systemic issue may be at play.

Author Johann Hari, in his book “Stolen Focus”, examines how social media, and specifically the YouTube algorithm, influences divisive issues like climate change. What he describes is of critical concern for anyone with an interest in politics or the environment. An algorithm is simply a programmed, automatic way of selecting which videos we are shown, after a search or after we have finished with our original content. Let’s delve into how recommendation algorithms like YouTube’s inadvertently contribute to misinformation and division. Profit in the online world comes from attention: clicks and viewing time drive advertising revenue. Social media content originally came from users, so irrespective of the source, the algorithm would measure the popularity of content and its ability to attract and retain attention.

Unfortunately this collides with a blind spot in the human mind: we pay more attention to outrage than to bland but correct facts. Things that scare, offend, traumatise and contradict us are harder to look away from. It is this inadvertent, unconscious psychological fault in all of us that may be driving climate change misinformation and divisiveness. Algorithms closely monitor attention, and without really meaning to, the algorithms in our social feeds drive us to focus on dramatic nonsense over boring science. The other psychological flaw in all of us is that we remember (and often believe) whatever is most repeated to us, and the most outrageous and divisive material is often the most memorable.
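To make the mechanism concrete, here is a minimal sketch (with invented video titles, field names and watch-time numbers, not real platform data or any actual platform’s code) of an attention-maximising recommender. The point it illustrates is the one above: if the only signal being optimised is predicted attention, accuracy is never consulted, so the outrageous item rises to the top.

```python
# Toy model of an attention-maximising feed. All data and names are
# invented for illustration; no real platform's algorithm is shown here.

videos = [
    {"title": "IPCC report summary",      "accurate": True,  "avg_watch_minutes": 2.1},
    {"title": "Climate HOAX exposed!!",   "accurate": False, "avg_watch_minutes": 9.4},
    {"title": "Sea level data explained", "accurate": True,  "avg_watch_minutes": 1.7},
]

def recommend(candidates):
    """Rank candidates by the only signal the business model rewards:
    predicted attention (average watch time). Note that the 'accurate'
    field is never read -- truthfulness plays no part in the ranking."""
    return sorted(candidates, key=lambda v: v["avg_watch_minutes"], reverse=True)

for v in recommend(videos):
    print(f"{v['avg_watch_minutes']:>4} min  {v['title']}")
```

Running this puts the denial clickbait first, ahead of both accurate videos — not because anyone programmed it to favour misinformation, but because outrage holds attention longer.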

Unconscious Algorithmic Recommendations: YouTube’s recommendation algorithm plays a significant role in shaping what users see. When users search for terms related to climate change (such as “global warming”), the algorithm suggests videos based on their search history and viewing patterns, with one core economic goal: to keep people watching more videos, or scrolling, for longer. Without any malicious programming, after between one and five fact-based videos, users are routinely sent down a rabbit hole of climate change denial, right-wing conspiracy theories, and dodgy but dramatic false information by their social feeds. Avaaz, a global activism group, conducted an investigation into climate denial online (YouTube algorithms promote climate change-denying videos: report – POLITICO).

They found that for nearly 1 in 5 of the most popular YouTube climate science videos, the next video served up denies the science behind global warming. These denial videos collectively reach millions of viewers worldwide. Pick any topic, whether it is the Holocaust, how government works, or space travel, and soon you will be fed popular clickbait presenting entertaining misinformation. Avaaz also found that advertisements on the same topic are served somewhat randomly by the algorithms on top of any video. Shockingly, climate-change-denying videos were served up with a side of online ads from organisations like Greenpeace and L’Oréal.

Oops: green groups and ethical companies are inadvertently funding the clickbait denial videos and lending them unconscious credibility by association. Greenpeace ads appear beside videos claiming things like “There has been no significant warming trend in the 21st century”. The connection between such videos and reputable brands is not a diabolical plan; it is simply an economic model that prioritises attention over reliable content. What is Algorithmic Responsibility? YouTube reaches billions of people monthly, making its algorithm enormously influential. All other successful online platforms also carefully track user attention, likes, clicks and focus as a core part of their profit model.

However, is it up to the user, the social platform, or the various regulators piloting our flaming social aeroplane? When do we decide to steer society away from the sort of disingenuous but entertaining misinformation that will inevitably lead to climate collapse? When does entertaining nonsense become dangerous propaganda, and who gets to decide? Will our collective intelligence and judgement protect us? Avaaz has documented a trend: the more misinformation videos users watch, the more likely they are to start doubting common sense. By now we have all seen how social media drives a previously sane friend or relative to crazy ideas, right to the point where they move to the comfort of simplistic but society-destroying right-wing populism, or even not-so-secretly believe that the royals and most global leaders are, in fact, 10 ft reptilian shapeshifting aliens in 6 ft human costumes.

Who has not experienced hours wasted as the algorithm serves up ever more extreme, traumatic and engrossing rubbish that gets harder and harder to look away from? Strict ad policies and content filtering could in theory help, but then who decides what is filtered? Filtering could backfire if advertisers with deep pockets, or with their own misinformation agendas, can influence the filters. Bad-faith actors might prefer the continued profits and taxes from endlessly growing our fossil fuel use. Much of the misinformation is better funded, better produced, and deliberately more entertaining than the dry science.

Banning popular misinformation videos that drive clicks and likes is also fundamentally counter to the advertising revenue model that the algorithms so accurately measure and protect. Johann Hari, in “Stolen Focus”, does not blame Google or Facebook for failing to serve only credible voices when recommending videos. Nor does he blame the average user for having a detectable behaviour profile that clicks and scrolls, aiming more for entertainment and distraction than for reliable, credible information.

There is a more systemic issue. For topics like climate change, even if a social media company aims to provide accurate information, the algorithm inadvertently amplifies clickbait and outrageously false climate-denial content, shaping public perception and deepening division on this critical issue. Consider this: the world’s vibrant coral reefs, teeming with life and biodiversity, are rapidly disappearing as temperatures rise. To label it anything less than an emergency would be a disservice to the growing number of people hit by natural disasters each year, the millions who can no longer fish or subsistence farm, and the future generations who may or may not be around to inherit the consequences of our inaction.

But for something requiring popular action and political will, this emergency is also about our fundamental relationship with information technology. It is not merely a matter of preserving picturesque landscapes or charismatic species; it is about safeguarding the very fabric of human civilisation and ensuring a habitable planet for generations to come. In the face of such existential threats, complacency is not an option, but entertaining misinformation will be in your feed. The time for half-measures and empty promises has long passed. Yet all the evidence suggests that, no matter how bad things get, we will collectively prefer something entertaining to distract us from taking action.

