Google currently uses the number of incoming links to a web page as a proxy for quality, determining where it appears in search results.
So pages that many other sites link to are ranked higher. This system has brought us the search engine as we know it today, but it has a downside: websites full of misinformation can rise up the rankings if enough people link to them and enough spammy optimisation firms promote their keywords.
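To make the distinction concrete, here's a toy sketch of link-count ranking in Python. The site names and link graph are invented, and real PageRank goes further by weighting each link by the rank of the page it comes from; this only shows the basic idea that popularity, not accuracy, drives the score.

```python
from collections import Counter

# Invented link graph: each page maps to the pages it links out to.
links = {
    "siteA.com": ["news.example", "hoax.example"],
    "siteB.com": ["news.example"],
    "siteC.com": ["hoax.example"],
    "siteD.com": ["hoax.example"],
}

# Count incoming links per page.
incoming = Counter(target for targets in links.values() for target in targets)

# The most-linked pages float to the top, regardless of what they say.
for page, count in incoming.most_common():
    print(page, count)
# hoax.example 3
# news.example 2
```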
A Google research team is adapting that model to measure the trustworthiness of a page, rather than its reputation across the web. Instead of counting incoming links, the system – which is not yet live – counts the number of incorrect facts within a page.
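A rough sketch of that fact-based idea, under the same caveat: the page facts, the reference knowledge base and the scoring function below are all invented for illustration. The actual research reportedly extracts facts from pages probabilistically and checks them against Google's Knowledge Vault, which this toy version reduces to a dictionary lookup.

```python
# Tiny made-up knowledge base of (subject, predicate) -> object facts.
knowledge_base = {
    ("Barack Obama", "born in"): "United States",
    ("Eiffel Tower", "located in"): "Paris",
}

def trust_score(extracted_facts):
    """Fraction of a page's checkable facts that agree with the knowledge base."""
    checkable = [f for f in extracted_facts if (f[0], f[1]) in knowledge_base]
    if not checkable:
        return None  # nothing we can verify either way
    correct = sum(1 for s, p, o in checkable if knowledge_base[(s, p)] == o)
    return correct / len(checkable)

# Facts "extracted" from a hypothetical page: one wrong, one right.
page_facts = [
    ("Barack Obama", "born in", "Kenya"),
    ("Eiffel Tower", "located in", "Paris"),
]
print(trust_score(page_facts))  # 0.5
```

A page stuffed with claims that contradict the knowledge base would score low no matter how many sites link to it, which is the point of the proposal.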
It's probably going to explode when it reaches Wikipedia...
Read the whole story at New Scientist