We still suffer from the flu!
Decades of medical research and development haven’t destroyed the influenza virus. It evolves, adapts and improves. Successful business entities and organizations must behave in a similar manner if they are to survive and thrive. Google does it – very well.
Every year, the influenza virus changes its structure slightly. That's why many of us come down with seasonal flu year after year. But once in a while, the virus undergoes a more radical "antigenic shift" – creating an almost completely different germ that leaves everyone vulnerable!
Web searches aren’t usually destinations. They are the starting point of a voyage of exploration into a subject or space. People don’t want just a single web page. They want answers, guides and advisories to help them understand, learn or decide.
Google's venture into semantic search is an attempt to meet that need – and then to go further!
Implementation of semantic search will result in SERPs being constructed in a totally different way, with more information being presented within search results themselves about the things being searched for. And when you combine it with ‘near instant’ or ‘real time’ data from social sites, the changes will be dramatic and remarkable.
Google knows a lot about you. And many others. As well as plenty of things, events, places and other stuff. Be it a famous landmark or the latest celebrity, your favorite sports team or the city you live in, a new planet, star or galaxy, drug, microbe or therapy, Google knows about it.
And it is constantly striving to learn even more.
Google bought Metaweb, and its Freebase database added information about 12 million items at a stroke. Why do it? To beef up Google's store of knowledge about the world, and thereby enrich search results and make them more meaningful.
The Knowledge Graph being built up allows users to search for things, people or places and receive relevant information instantly. As the graph grows, evolves and gathers even more data, learning as it goes, it could help the search giant understand the world in a way that humans do. Maybe even better!
When that ties in with ‘instant, live feed’ data from social networks, this Knowledge Graph can monitor activities and events across the world in real time rather than presenting search results as a collection of links to web pages that may or may not satisfy a user’s need.
These changes at Google are expected to impact search results for nearly 20% of all queries, which adds up to tens of billions of searches each month and will affect millions of websites that depend upon the current algorithm of ranking pages on SERPs.
And while the driver behind these changes is that Google expects them to make audiences ‘sticky’ and offer more options to serve them up advertising, there’s no doubt that the shift will have a significant effect upon websites like yours and mine that rely upon organic traffic from search engines.
This has potential to become an amazing opportunity – or a devastating disaster. Which one, will depend upon the preparations and precautions you take in advance of the change.
To harness the power of the fire hose of data now available, search engines must be able to understand information in the way it was intended. In the absence of advanced artificial intelligence, engineers have realized how difficult it is to teach computers to think like humans. It is far easier to teach humans to present data so that machines can collect, organize and analyze it easily.
When this happens, SERPs for specific queries can move away from the traditional, familiar pattern – a list of websites ranked by an algorithm based on keyword density or inbound-link relevance – and become pages that actually answer the questions on the searcher's mind when they typed in the search term.
Not only is information grouped and organized rationally, but it’s also presented in a set of different content types that make for intriguing SERPs.
The method draws on search query logs and knowledge databases like Wikipedia, pulling needles out of the haystack of information out there and laying them out in a logical sequence, aligned with the specific question the user had in mind when seeking the information in the first place.
This may seem magical and impossible – but rudiments of the technology are already in play, and at the rate at which it is growing, this is likely to become reality in the very near future.
As the Knowledge Graph grows and develops, it may even anticipate and answer follow-up queries – ones that haven't yet been voiced! What's fascinating to SEO consultants, however, is that such a presentation will whet the user's appetite for more knowledge – which can then be satisfied through links to 'authority content' from trusted agents.
If you’re a site owner, it has never been more important to be perceived as a trusted authority!
Among the primary goals of semantic search is weeding out irrelevant resources from SERPs and detecting spam to filter it out. Google is relying on trust and proof of authorship to do this job well.
As a web publisher, you have a shot at jumping to the front of the line by semantically marking up your content following open standards. By linking your Google+ profile to your content, you help Google correctly identify you as the author of that content and display your listings as 'rich snippets', while building relationships with users. Help Google out, and you could reap rich rewards.
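As a sketch of what that linkage involves, here is the rel="author" pattern Google has documented for connecting a page to a Google+ profile. The profile ID and author name below are placeholders, not real values:

```html
<!-- In the article byline: point at your Google+ profile.
     The profile ID and name here are placeholders. -->
<a rel="author"
   href="https://plus.google.com/112233445566778899000">Jane Doe</a>

<!-- Then, on the Google+ profile itself, add this site's URL under
     "Contributor to", so the link is reciprocal and verifiable. -->
```

The reciprocal link matters: it is the two-way connection between page and profile that lets Google treat the authorship claim as verified rather than merely asserted.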
Identifying a content author or publisher and validating authenticity also helps Google establish provenance – a record of who was involved in producing or delivering a piece of content, and of its origin and chronology. This record helps produce an "author graph", which could allow future rankings to be based not only on the quality of content, but also on the authority and trustworthiness of the author – a kind of 'Author Rank', if you like.
Remember, Google is in the ‘trust and proof’ business!
So, going forward, it is not only about what you say, but who you ARE that will determine where you rank on search results. Through intelligent use of markup tools and open standards, you can provide verifiable provenance data that helps Google recognize your resource as a good match for a particular search or query. This will lead to trust, and favorable search rankings.
Authorship and ‘Author Rank’ are potential areas for exploitation for big wins by search engine marketers. It has implications for your brand, authority and recognition as a ‘thought leader’, both individually and as part of an organization.
Gone are the days when improving search rankings simply meant getting more (or better quality) links. While Google currently uses a range of signals to evaluate the trust level of inbound links, the coming changes are more complex – and harder to game or abuse.
Verifying the author of a link is a shortcut to such validation. Armed with a rapidly growing Knowledge Graph that accumulates enormous amounts of data on individuals and content-providing entities, Google will place more trust in a link coming from someone it knows and trusts, and less in one arising from an unvetted source.
It can get more complex. Individual links can be weighted (algorithmically!) according to the Author Rank of the source from which they arise. This would make rankings extremely difficult to manipulate artificially – at least, not without significant effort and investment. For your links to count, you would have to become trustworthy in Google's eyes!
That’s why it’s time to start shifting the way you think about links from ‘where’ to ‘whom’ you get them from. True, the conventional links aren’t going away tomorrow. But the writing is on the wall, and a shift has begun. Being prepared ahead of the coming wave will help you surf it safely.
If all of this sounds too abstract, let’s take a case study and see how an online retailer can benefit from semantic search and the knowledge graph.
Retail outlets often present a slew of data, which is usually cluttered and disorganized, stored across diverse databases and formats. Online retailers must change this to make content available as structured data for search engines to easily retrieve and analyze.
Two popular syntaxes for such markup are RDFa (used by GoodRelations) and Microdata (used by Schema.org). Injecting semantic markup into web pages makes the data on those pages easier for search engines to process, which in turn helps them provide better answers to queries.
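As an illustration, here is a minimal Schema.org Microdata sketch for a retail product page. The product name, price and review figures are invented for the example:

```html
<!-- Hypothetical product page marked up with Schema.org Microdata.
     All values are illustrative. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Espresso Machine</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5 by
    <span itemprop="reviewCount">87</span> reviewers
  </div>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="priceCurrency" content="USD">$</span>
    <span itemprop="price" content="199.00">199.00</span>
    <link itemprop="availability" href="http://schema.org/InStock">In stock
  </div>
</div>
```

The same facts could equally be expressed in RDFa using the GoodRelations vocabulary; either syntax turns a human-readable page into an unambiguous, machine-readable description of the product.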
The data may not be in any single format. Business data can include rich media (audio and video), product information, ratings and reviews, contact details, business information for investors, specialty data and more.
Also, the information will need to be retrieved and viewed across a range of devices and surfaces – web applications, search engines, navigation systems, maps, tablets, mobile devices and desktop computers.
Semantic markup makes this modification seamless and automatic for the online retailer, turning content into a format that search engines simply lap up. And because it helps Google and its ilk deliver a richer experience to their users, your content will be rewarded with higher rankings and greater trust.
There are other advantages for retailers in embracing this tagging and optimization approach, such as the ability to send daily offers, menus and business information directly to smartphone apps or other devices – without difficult customization or a separate feed for each.
Just as retailers can get higher visibility on SERPs (with better CTR that translates into bigger profits), every other online business can leverage the benefits of semantic search and Google’s Knowledge Graph by adapting to the need of the hour.
I believe the implications of this shift are quite profound and will impact how we practice our craft, and the skill sets that we will require for SEO.
Under this model, SERPs cannot be easily manipulated through link management – or, indeed, any other conventional SEO technique. To influence even a single aspect of a result, you might need to untangle the thick web of information Google uses to build that SERP, and make your content a more tantalizing match than whatever Google is currently displaying.
This means that once you’re in, the barrier against being unseated is pretty solid – and will take significantly more effort from a competitor. Your authority and trust-backed ranking is easier to defend.
That’s why it is important to start thinking about developing links from beyond a simple paradigm of quality, diversity, quantity and relevance alone. Instead, you need to consider how you might work to build links with an aspect-driven search engine of the future.
Understand that Google (and other search engines) will no longer rely on one-dimensional indicators, but will analyze diverse sources, including social conversations, to deepen their knowledge base and decide whether to trust you and your content. 'Author Rank' will grow in importance, influencing both the construction of SERPs and the order in which results are ranked.
So, not only will it matter that you have quality content and establish yourself as a trusted authority, but you must also identify fellow experts in your subject, associate yourself and your content with those businesses and resources, and thereby consolidate your position and standing in a semantic search driven web.
Positioning yourself to be the provider of answers that people are seeking will be a huge advantage for every business, both online and offline – and the time to start preparing for pole position is today.
With the field open and devoid of much competition, the time to act is right now. If you wait until others crowd you out of the SERPs, then be prepared for a long, hard, bloody battle to claw your place back into the limelight!