The internet giant is promising to fight disinformation and to tweak the recommendations (again) to minimise the lies, distortions and outright dangerous material that's being served up.
That, however, is an after-the-fact reaction which does nothing to explain why the rubbish videos are on YouTube in the first place, or why comments on them are allowed when moderation of what's being said is so poor.
It's not just the videos that can be fake on YouTube. Security vendor RiskIQ recently detailed how easy it is for scammers to impersonate YouTube celebrities with millions of subscribers, and trick those fans into clicking on fraudulent sites.
The impersonation scam's been going on for three years, RiskIQ reckons.
Will Google's tweaks and changes make things better so that YouTube becomes a safer place for everyone, advertisers included? Probably not unless the goal of the algorithm is changed fundamentally.
Former YouTube engineer Guillaume Chaslot of Algotransparency.org worked on the artificial intelligence system powering the recommendations, and he's not hopeful, because the algorithm creates a feedback loop.
For instance, for depressed people who spend a lot of time on YouTube, the site will often recommend terrible material. Such content gets more views, which gives other people an incentive to make more of it, which YouTube duly feeds back to depressed viewers.
Likewise, deleting millions of comments and hundreds of channels is unlikely to make the AI change its mind and stop recommending videos that appeal to child abusers, since it is designed to maximise time spent watching clips, Chaslot noted.
It is a shame, really. YouTube, and other internet video sites, can be great archives, preserving material that would otherwise never be discovered or would simply be forgotten.
There is plenty of useful and informative content on YouTube as well amid the dross.
Even the comments are, at times, funny and informative. The irony here is that without algorithms surfacing content that you might be interested in, it would be very difficult to find what you want. People aren't designed to browse databases.
Google should think about that and change YouTube's AI away from maximising views at all costs.
Whether that happens remains to be seen: it would cost Google ad revenue, so chances are the change will come through regulation, after the next few scandals that are waiting in the wings.