Google’s proposed Knowledge-Based Trust Score is a major attack on shoddy SEO tactics says Jack Peat, from 72 Point
19th January 2016
Since Google became the most popular search engine, the SEO industry has grown exponentially, with companies investing tens of thousands of pounds to ensure they rank above their competitors on the web. Yet of all the big brands jostling for the number-one spot on Google, the one consistently placed at the top has nothing to sell: Wikipedia.
When software company Intelligent Positioning researched a selection of 1,000 search terms back in 2012, it found that Wikipedia was on page one of Google’s search results for 99 per cent of the searches and in the top five positions for 96 per cent of searches. Some claim this is preferential treatment, but it really shouldn’t be construed as anything more than simple guidance.
Google’s goal is to interpret what you want and return what you need. But since there are droves of companies on the web that want to cheat the system in order to give people what they want even if it might not be what they need, separating the wheat from the chaff has become its primary objective.
The Panda algorithm initiated a mass purge of low-quality, spammy sites that relied on tactics such as content farming, keyword advertising and link building to top search results, and it is about to get even smarter in 2016.
The Knowledge-Based Trust (KBT) Score is a major attack on shoddy SEO tactics. Ray Kurzweil, Google’s director of engineering, has been tasked with creating a virtual fact-checker which effectively eliminates Google’s reliance on third-party signals like links. The software will be designed to literally fact-check your content to ensure sites are an objective and accurate source of fact-based content, like, say, Wikipedia.
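The underlying idea, scoring a page by the share of its extracted factual claims that a reference knowledge base confirms, can be sketched as a toy example. This is an illustration of the concept only, not Google's implementation; the knowledge base and the extracted facts here are invented:

```python
# Hypothetical reference knowledge base of (subject, attribute) -> value facts.
KNOWLEDGE_BASE = {
    ("Paris", "country"): "France",
    ("Water", "formula"): "H2O",
    ("Everest", "height_m"): "8849",
}

def kbt_score(extracted_facts):
    """Return the fraction of a page's extracted facts confirmed by the
    knowledge base, or None if nothing on the page is verifiable.

    extracted_facts: list of ((subject, attribute), value) tuples.
    Facts absent from the knowledge base are ignored, not counted as wrong.
    """
    checked = [(key, value) for key, value in extracted_facts
               if key in KNOWLEDGE_BASE]
    if not checked:
        return None
    correct = sum(1 for key, value in checked if KNOWLEDGE_BASE[key] == value)
    return correct / len(checked)

# A page asserting three verifiable facts, one of them wrong:
page_facts = [
    (("Paris", "country"), "France"),    # correct
    (("Everest", "height_m"), "8000"),   # wrong
    (("Water", "formula"), "H2O"),       # correct
]
print(kbt_score(page_facts))  # 2 of 3 verifiable facts correct
```

In a real system the hard parts are the fact extraction and accounting for extraction errors, not this final division; the sketch only shows why a page can score well without any inbound links at all.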
This shift from “exogenous” to “endogenous” signals won’t necessarily penalise sites that lack facts, and it will only form part of the algorithm, but it indicates that Google is moving away from “popularity signals” and towards accuracy, which is good news for good journalism.
Alongside Wikipedia, the sites that frequently rank highly on search results are national news publications. Like Wikipedia, newspapers don’t employ teams of SEO experts, but rather teams of fact-checkers and producers of trustworthy content. A link back from a national news site, or even a name reference, is a tasty dollop of Google juice if you can get it.
As is a link back from a relevant site. Trustworthiness and relevance are high on the agenda at Google, and as I wrote in this blog, SEO whiz kids are placing a big emphasis on getting authoritative links from highly relevant sites. As a case study, the mum’s story that featured on the Mail Online and Mumfidential, among others, would have had a nice mix of reliability and relevance.
For PR companies, SEO could increasingly become a marketable product in 2016. As several “digital” disciplines start to fall under the “content-marketing umbrella”, producing accurate and trustworthy content and distributing it on reliable and relevant sites will be a sought-after service.
Article written by Jack Peat, head of digital at agency 72 Point