For some time, those of us studying the problem of misinformation in US politics - and especially scientific misinformation - have wondered whether Google could come along and solve the problem in one fell swoop.
After all, if Web content were rated such that it came up in searches based on its actual accuracy - rather than on its link-based popularity - then quite a lot of misleading material might get buried. And maybe, just maybe, fewer parents would stumble on dangerous anti-vaccine misinformation (to cite one highly pertinent example).
It always sounded like a pipe dream, but in the past week, there's been considerable buzz that Google might indeed be considering such a thing. The reason is that a team of Google researchers recently published a mathematics-heavy paper documenting their attempts to evaluate vast numbers of Web sites based upon their accuracy. As they put it:
The quality of web sources has been traditionally evaluated using exogenous signals such as the hyperlink structure of the graph. We propose a new approach that relies on endogenous signals, namely, the correctness of factual information provided by the source. A source that has few false facts is considered to be trustworthy.
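To make the quoted principle concrete, here is a minimal sketch in Python of the most naive version of the idea: score a source by the fraction of its extracted facts that match a reference knowledge base. This is only an illustration, not the researchers' actual model, which relies on probabilistic inference and also accounts for mistakes made by the automatic fact extractors; the knowledge base and the example facts below are hypothetical.

    # Hypothetical reference knowledge base of (subject, predicate, object)
    # facts taken to be true for the purpose of this illustration.
    KNOWLEDGE_BASE = {
        ("Barack Obama", "nationality", "USA"),
        ("Earth", "shape", "oblate spheroid"),
    }

    def trust_score(extracted_facts):
        """Fraction of a source's extracted facts found in the knowledge base."""
        if not extracted_facts:
            return 0.0
        correct = sum(1 for fact in extracted_facts if fact in KNOWLEDGE_BASE)
        return correct / len(extracted_facts)

    # Example: a site asserting one true fact and one false fact scores 0.5.
    site_facts = [
        ("Barack Obama", "nationality", "USA"),
        ("Earth", "shape", "flat disc"),
    ]
    print(trust_score(site_facts))  # -> 0.5

In this toy version, a site with few false facts gets a score near 1, while a site full of claims that contradict the knowledge base scores near 0 - which is the intuition behind treating factual correctness, rather than links, as the signal of quality.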