YouTube says it is updating its recommendations feature to better promote videos that people actually find interesting.
The decision reportedly follows numerous complaints from users that YouTube’s suggested videos are unsuitable or too similar to content they have already watched.
The Google-owned company said this is the first major change to its algorithm since an earlier update designed to stop so-called “clickbait” videos with misleading titles from being surfaced by its recommendation engine; that update weighted viewer ratings more heavily than total view count.
With the latest update, YouTube is attempting to recommend fewer videos that “could misinform users in harmful ways”, it said in a blog post.
The company didn’t say exactly what constitutes a video containing harmful misinformation, but gave examples including content that promotes phony cures for serious illnesses, flat Earth theories, and conspiracy theories around historic events such as 9/11.
YouTube will also try to avoid recommending “borderline content” that comes close to violating its content rules.
YouTube said that less than 1 percent of the videos on its site would be affected by the change.
“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” the company wrote.
They added that the change would only affect recommendations, and would not result in any videos being deleted from the site.
“This change relies on a combination of machine learning and real people. We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations,” the company said. “These evaluators are trained using public guidelines and provide critical input on the quality of a video.”