YouTube updates medical misinformation policy to curb myths about cancer, COVID, and vaccines

YouTube revises medical misinformation policy. (Image via Unsplash/ Alexander Shatov)

On Tuesday, August 15, YouTube announced that it would revise its guidelines to address medical misinformation. As part of its commitment to providing users with accurate and trustworthy health information, the platform will now actively remove content that promotes false claims about cancer treatments.

Dr. Garth Graham, head of YouTube Health, said that removed content will include:

“[Videos that] promote cancer treatments proven to be harmful or ineffective, or content that discourages viewers from seeking professional medical treatment.”

What is the framework of YouTube’s updated policy?

According to the video-sharing platform, the three categories of its revised policy are prevention, treatment and denial.

Under the "Prevention" category, the video-sharing company states that it will remove any video that "contradicts health authority guidance on the prevention and transmission of specific health conditions, as well as on the safety and efficacy of approved vaccines."

Video-sharing platforms will restrict content promoting misinformation. (Image via Unsplash/ Christian Wiediger)

Regarding "Treatment," the platform will remove material that "contradicts health authority guidance on treatments for specific health conditions, including promoting specific harmful substances or practises," such as explicitly advising viewers to use unproven remedies instead of getting professional medical help.

Under the third category, "Denial," the video-sharing company will remove "content that disputes the existence of specific health conditions," such as claims denying that people have died of COVID-19.

Additionally, creators found to have spread false material about a medical condition will be penalized, and their videos will be removed.


Misinformation related to cancer will be removed

Creating policies is a positive step. (Image via Unsplash/ Sara Kurfess)

The revised policy focuses on removing false information about cancer treatments from the site. The company says that when people search YouTube for information about cancer, it is the platform's responsibility to ensure they can quickly find high-quality content from reliable health sources.

The blog post cited videos claiming that "garlic cures cancer" or urging viewers to "take vitamin C instead of radiation therapy" as examples of content that would be taken down.

The company further clarified in the blog post:

"We’ll continue to monitor local and global health authority guidance to make sure our policies adapt. We want our approach to be clear and transparent, so that content creators understand where the policy lines are, and viewers know they can trust the health information they find on YouTube.”

Although creating policies is a positive step, enforcement is where the real difficulty lies. According to YouTube's announcement, the restrictions on false information about cancer treatments take effect immediately, with enforcement ramping up over the coming weeks.

The platform will analyze videos and their context using both human and automated moderation to make sure the policy is effective. The video-sharing giant also intends to promote cancer-related videos from reliable sources such as the Mayo Clinic.