YouTube to delete videos that claim fraud in US elections

YouTube said it will start deleting videos that mislead people about the outcome of the U.S. presidential election, a move that could include future postings from President Donald Trump.

Online platforms have been under pressure to police misinformation about the election on their sites.

YouTube, owned by Alphabet Inc's Google, was widely seen as taking a more hands-off approach than Facebook Inc and Twitter Inc, which began labeling posts containing election misinformation, though YouTube does attach information labels to all election-related videos.

Democrats have criticised YouTube for not doing enough to take down fake news and conspiracy theories on the platform.

Mr Trump and senior Republicans have repeatedly made unsubstantiated claims that the election was "rigged", but Mr Trump's lawyers have failed to provide evidence for the allegations in court.

YouTube said in a blog post Wednesday that its efforts would apply to content posted as of today, given that enough states have certified their election results to determine a president-elect.

“We will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election,” YouTube said. “We enforce our policies regardless of speaker,” a spokesperson added.

“For example, we will remove videos claiming that a Presidential candidate won the election due to widespread software glitches or counting errors,” the company said in the blog post. “We will begin enforcing this policy today, and will ramp up in the weeks to come.”
