Facebook Inc. said it will take stronger steps to eliminate false information about COVID-19 and vaccines on its social network, a move that could lead to the removal of prominent groups, accounts and Instagram pages that repeatedly spread misinformation.

The company is acting on advice from the World Health Organization and other groups to expand its list of false claims it deems harmful, according to a blog post on Monday. Facebook will ask administrators of user groups to moderate such misinformation. Facebook-owned Instagram will also make it harder to find accounts that discourage vaccination, and will remove them if they repeatedly violate the rules. This week the company will also add details from local health departments to its COVID-19 information center about when and where people can get vaccinated.

Facebook, the world’s largest social network, had already banned false vaccine claims in ads. The company is working to undo years of momentum gained by the anti-vaccination movement on its platforms, where emotional anecdotes and fear-provoking stories tend to spread more quickly than scientific facts. The changes, which begin this week, will roll out globally in more than 50 languages but may take time to become effective, Facebook said.