YouTube to curb conspiracy theory video recommendations

YouTube has announced plans to “reduce” the number of conspiracy theory and “misinformation” videos users are exposed to through its algorithmic recommendations.

While the Google-owned giant has often courted controversy over some of the content that finds its way onto its platform, the company does have policies in place that serve as a guide to what is, and isn’t, allowed. Some of these videos are eventually taken down. But then there is content that YouTube refers to as “borderline” — it doesn’t breach any policies, per se, but at the same time many people would rather not see it.

And that is the content YouTube is now looking to scrub from users’ “up next” queue.

Rabbit hole

Anyone who’s spent even a short time on YouTube knows its addictive nature: what begins as an innocent 30-second session to watch a prank skit sent by a buddy descends into a rabbit hole of never-ending autoplay “recommendations” served up by the data-powered internet gods.

It’s in YouTube’s interest to keep you there, of course: the more time you spend on its platform, the more ads you’re likely to view. The company also recently added swipe gestures to its mobile app, making it easier to skip to the next recommended video.

Bad actors

These recommendations all too often serve up unsavory content: ludicrous conspiracy theories that mass-shooting events were staged, far-fetched proclamations that the moon landing never happened, and harebrained notions that the Earth on which we live is, well, flat.

Moving forward, YouTube promises that you’ll see fewer of those kinds of videos. This is similar to moves it has made in the past to reduce clickbaity recommendations, or videos that are slight variations on something else you’ve watched.

“We’ll continue that work this year, including taking a closer look at how we can reduce the spread of content that comes close to — but doesn’t quite cross the line of — violating our Community Guidelines,” YouTube said in a blog post.

“While this shift will apply to less than one percent of the content on YouTube, we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community.”

Today’s news comes just a week after YouTube won back AT&T, one of the biggest advertisers in the U.S. The carrier had pulled its ads from YouTube in 2017 after they were displayed alongside extremist content, but it said it was now satisfied that YouTube had sorted out its programmatic advertising systems.

The latest changes will apply only to viewers in the U.S. at first; the company said it’s combining human evaluators, subject-matter experts, and machine learning to make these tweaks. More countries will receive the update in the future, according to YouTube.
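YouTube hasn’t shared technical details, but the setup it describes (human raters label examples of borderline content, and machine-learning models generalize from those labels to score videos at scale) is a common moderation pattern. As a purely illustrative sketch, with every name, score, and threshold hypothetical rather than anything YouTube has confirmed, the demotion step could look something like this in Python:

```python
from dataclasses import dataclass

# Hypothetical cutoff: videos scoring above this are treated as
# "borderline" and excluded from recommendations. They are not
# removed from the platform and remain findable via search.
BORDERLINE_THRESHOLD = 0.8


@dataclass
class Video:
    video_id: str
    # Score in [0, 1] from a (hypothetical) classifier trained on
    # ratings supplied by human evaluators and subject experts.
    borderline_score: float


def filter_recommendations(candidates: list[Video]) -> list[Video]:
    """Drop borderline videos from the 'up next' candidate pool.

    Only the recommendation pipeline is affected; the videos
    themselves stay on the platform.
    """
    return [v for v in candidates if v.borderline_score < BORDERLINE_THRESHOLD]


if __name__ == "__main__":
    pool = [
        Video("prank-skit", 0.05),
        Video("flat-earth-proof", 0.93),
        Video("cooking-tutorial", 0.10),
    ]
    for v in filter_recommendations(pool):
        print(v.video_id)  # prints prank-skit and cooking-tutorial
```

The point of the sketch is the design choice YouTube outlined: demotion rather than deletion, so borderline videos simply stop being surfaced by the algorithm while remaining on the site.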
