SAN FRANCISCO — YouTube said Friday it is retooling the recommendation algorithm that suggests new videos to users, in an effort to keep it from promoting conspiracies and false information, reflecting a growing willingness to quell misinformation on the world's largest video platform after several public missteps.

In a blog post that YouTube planned to publish Friday, the company said that it was taking a "closer look" at how it can reduce the spread of content that "comes close to — but doesn't quite cross the line" of violating its rules. YouTube has been criticized for directing users to conspiracies and false content when they begin watching legitimate news.

The change to the company's so-called recommendation algorithms is the result of a six-month technical effort. It will be small at first, applying to less than 1 percent of the content on the site, YouTube said, and only to English-language videos, meaning that much unwanted content will still slip through the cracks.

The company stressed that none of the videos would be deleted from YouTube. They would still be available to people who search for them or subscribe to conspiratorial channels.

"We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," the blog post said.

YouTube, which has historically given wide latitude to free speech concerns, does not prohibit conspiracy theories or other forms of false information. The company does ban hate speech, but defines it somewhat narrowly as speech that promotes violence or hatred of vulnerable groups.

Advocates say those policies don't go far enough to prevent people from being exposed to misleading information, and that the company's own software often pushes people to the political fringes by feeding them extremist content that they did not seek out.

YouTube's recommendation feature suggests new videos to users based on the videos they have previously watched. The algorithm weighs how much time people spend watching a video and how many views it has when deciding whether to suggest it. If a video is viewed many times to the end, the software may treat it as a quality video and automatically start promoting it to others. Since 2016, the company has also incorporated likes, dislikes and other metrics into its recommendations.
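To make those signals concrete, the following is a minimal, purely hypothetical sketch in Python of how a recommender might fold watch time, completion, view counts and like/dislike ratios into a single ranking score. The weights, formula, field names and example numbers are assumptions for illustration only and do not describe YouTube's actual system.

# Purely illustrative scoring heuristic combining the engagement signals the
# article describes: watch time, view count, completion and likes/dislikes.
# The formula and weights are hypothetical, not YouTube's actual model.
from dataclasses import dataclass
import math


@dataclass
class VideoStats:
    views: int
    avg_watch_seconds: float   # average time viewers spend on the video
    duration_seconds: float
    likes: int
    dislikes: int


def recommendation_score(v: VideoStats) -> float:
    """Return a rough 'promote this' score for one candidate video."""
    # Videos watched to the end score higher, per the article's description.
    completion_rate = min(v.avg_watch_seconds / v.duration_seconds, 1.0)
    # Dampen raw view counts so popularity alone cannot dominate.
    popularity = math.log1p(v.views)
    # Smoothed like ratio, reflecting the post-2016 like/dislike signals.
    approval = (v.likes + 1) / (v.likes + v.dislikes + 2)
    return completion_rate * popularity * approval


# Example: a frequently finished, well-liked clip earns a high score.
clip = VideoStats(views=50_000, avg_watch_seconds=540, duration_seconds=600,
                  likes=2_000, dislikes=50)
print(round(recommendation_score(clip), 3))

In a sketch like this, logging the view count and smoothing the like ratio keep any single signal from dominating, echoing the article's point that several engagement metrics are combined rather than any one used in isolation.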

But from a mainstream video, the algorithm often takes a sharp turn to suggest extremist ideas. The Washington Post reported in December that YouTube continues to recommend hateful and conspiratorial videos, fueling racist and anti-Semitic content.

YouTube has also developed software to stop conspiracy theories from going viral during breaking news events. In the aftermath of the Parkland, Florida, school shooting in February 2018, a conspiracy theory claiming that a teenage survivor was a so-called "crisis actor" was the top trending item on YouTube.

YouTube's separate search feature has also been called out for promoting conspiracies and false content.