Public Knowledge Welcomes YouTube Recommendation Changes Targeting ‘Borderline Content’ and Misinformation


    Today, YouTube announced that it would begin reducing recommendations of “borderline content” — materials that stop short of violating the company’s community guidelines but may still be harmful — and content that could misinform users. According to YouTube, this includes “videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”

    The following can be attributed to Charlotte Slaiman, Competition Policy Counsel at Public Knowledge:

    “This is a great first step by Alphabet to address the problems in YouTube’s recommendation algorithm. The internet in general, and YouTube specifically, may have contributed to radicalization and division. I am so glad to see that YouTube is taking responsibility by making this change.

    “Recommendation algorithms like YouTube’s usually recommend the content most likely to attract some form of user engagement, in this case watching the video. But engagement is often not the best measure of quality; sometimes the worst content drives the most engagement. Knocking down the recommendation scores for borderline content, as YouTube is doing, is an important counterweight to these engagement-driven models. I look forward to seeing the impact of this change, and I hope YouTube will continue to work on this important issue.”
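
    As a rough illustration of the idea described above — and not a description of YouTube’s actual system — the toy sketch below shows how a purely engagement-driven ranking can be counterweighted by down-weighting items flagged as borderline. All names, fields, and the penalty value are hypothetical and chosen only for illustration.

        # Toy sketch: engagement-driven ranking with a down-weight for borderline content.
        # Hypothetical example only; not YouTube's real recommendation system.
        from dataclasses import dataclass

        @dataclass
        class Video:
            title: str
            predicted_engagement: float  # e.g., estimated chance the user watches the video
            is_borderline: bool          # assumed to come from a separate review/classification step

        BORDERLINE_PENALTY = 0.1  # assumed multiplier; a real system would tune this very differently

        def recommendation_score(video: Video) -> float:
            """Rank by predicted engagement, then knock down content flagged as borderline."""
            score = video.predicted_engagement
            if video.is_borderline:
                score *= BORDERLINE_PENALTY
            return score

        candidates = [
            Video("Science explainer", 0.40, False),
            Video("Miracle cure claim", 0.70, True),  # higher engagement, but flagged as borderline
        ]
        ranked = sorted(candidates, key=recommendation_score, reverse=True)
        print([v.title for v in ranked])  # the borderline video no longer ranks first

    The point of the sketch is simply that engagement alone would have ranked the borderline video first; applying a penalty to its recommendation score acts as the counterweight the statement describes.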

    You may view our recent blog post, “Application of the ‘Diversity Principle’ in Content Moderation,” for more information.