Fact-checking organizations from around the world are calling on YouTube to take stronger action against misinformation on its platform.
In an open letter published Wednesday through journalism nonprofit Poynter Institute, more than 80 fact-checking organizations addressed YouTube CEO Susan Wojcicki, citing a list of conspiracy theories and misinformation that have spread around the globe in the last few years. The letter said YouTube is being weaponized by unscrupulous actors.
“That is why we urge you to take effective action against disinformation and misinformation, and to elaborate a roadmap of policy and product interventions to improve the information ecosystem — and to do so with the world’s independent, non-partisan fact-checking organizations,” the letter states.
A YouTube spokeswoman said that “fact checking is a crucial tool to help viewers make their own informed decisions, but it’s one piece of a much larger puzzle to address the spread of misinformation.”
“Over the years, we’ve invested heavily in policies and products in all countries we operate to connect people to authoritative content, reduce the spread of borderline misinformation, and remove violative videos,” she added, noting that content that borders on misinformation, or that violates the platform’s rules and is later removed, accounts for less than 1% of all views on YouTube. “We’re always looking for meaningful ways to improve and will continue to strengthen our work with the fact checking community.”
YouTube has more than 2 billion monthly visitors, but it doesn’t disclose how many total views it generates in any given timeframe, making rates of problematic content hard to put in context.
YouTube, like Facebook, Twitter, Reddit and many other internet companies that give people a platform to post their own content, has grappled with how to balance freedom of expression with effective policing of the worst material. Over the years, the platform has contended with misinformation, conspiracy theories, discrimination, hate, harassment, child abuse and exploitation, and horrific violent crimes — all at an unprecedented global scale.
YouTube has modified its policies and enforcement practices to try to adapt to dangerous content on its platform. Recently, it has broadened its ban on vaccine misinformation, tightened its policies on QAnon content and taken down more than a million videos for COVID misinformation. Critics argue that the company’s content moderation efforts still fall short too often.
The letter lists proposed solutions, including taking stronger action against repeat offenders, making greater efforts in languages other than English, and further working with and investing in independent fact-checking organizations.