YouTube claims disinformation doesn't dominate the platform. Science says otherwise.

Today, the debate over whether rising temperatures and extreme weather are caused by human civilization is officially put to bed. Scientific consensus has passed the 99% threshold: there's "no doubt" that humans are responsible for the current climate disaster. But that's not the impression you get from YouTube, a breeding ground for climate change deniers who refuse to hang up their conspiracy theories.

Take geoengineering, in which scientists intervene in climate processes to counteract the impact of climate change. The method is, admittedly, controversial—there is concern that large-scale intervention could unintentionally cause more harm than good. But on the first page of YouTube results for the term, some videos give advice on how to detox from evil government chemtrails, while others, claiming to be documentaries on immune-suppressing airborne chemicals, have garnered over a quarter of a million views. These videos are steeped in the same anti-science stance as anti-vaxxers, propagating totally unsubstantiated conspiracy theories to whip up paranoia and reaffirm viewers' beliefs.

Half of the online world visits YouTube at least once a month, and with exposure to videos like that, it's no wonder that a new study published in Frontiers in Communication recommends that scientists team up with YouTube content creators to try to offset the platform's deluge of disinformation. "Searching YouTube for climate-science and climate-engineering-related terms finds fewer than half of the videos represent mainstream scientific views," says the study's author Dr. Joachim Allgaier, Senior Researcher at RWTH Aachen University. "It's alarming to find that the majority of videos propagate conspiracy theories about climate science and technology."

YouTube has always been pretty adamant that the algorithms orchestrating the site's tagging systems and recommended videos don't push viewers toward increasingly extreme content. In fact, just last week the site's new managing director for the UK, Ben McOwen Wilson, told the BBC that YouTube "does the opposite of taking you down the rabbit hole" and that its developers work hard to make sure disinformation doesn't dominate. But that could have something to do with governments around the world dangling the prospect of regulation over the company's head.

So if the platform itself isn't going to put a stop to the spread of disinformation, the task may fall on the shoulders of content creators themselves. Users on other platforms like Twitch have already formed strong climate channels such as Climate Fortnight, a stream where anyone can watch climate scientists playing Fortnite while discussing the more complex nuances of climate science.

Just like Fortnite, YouTube has all kinds of users, but one of its biggest cohorts is kids. A study released by the Pew Research Center this week revealed that videos aimed at children, and that feature children themselves, are the most popular on the site.

With videos actively aiming to manipulate our understanding of the world, the job of keeping conspiracy videos out of everyone's recommended viewing list shouldn't fall on video creators themselves. But until YouTube takes responsibility, we'll be looking to those creators to keep chemtrail content to a minimum.