YouTube said on Wednesday it will remove videos containing false claims about coronavirus vaccines from its platform, a day after Facebook announced it would reject ads discouraging people from getting vaccinated.
With the policy update, YouTube, which is used by 2 billion logged-in people each month, is expanding its ongoing effort to fight conspiracy theories about the pandemic. The company said it will ban any content “that contradicts information about those vaccines from the World Health Organization (WHO) or local health authorities”.
YouTube said in a blog post it has removed more than 200,000 videos containing dangerous or misleading COVID-19 information since early February.
Andy Pattison, manager of digital solutions at the World Health Organization, told Reuters that the WHO meets weekly with the policy team at YouTube to discuss content trends and potentially problematic videos. Pattison said the WHO was encouraged by YouTube’s announcement on coronavirus vaccine misinformation.
Content that includes broad discussions of concerns about the vaccines will remain on YouTube.
YouTube also said it will take further steps in the coming weeks to emphasize accurate information about coronavirus vaccines.