Search Engine Optimization (SEO) helps you rank higher in search engine results by providing answers to searchers' queries. Let's look at a brief history of SEO and how it has evolved over the past 20 years, so we can gain perspective on how to optimize our sites and their content better.
Google's founders Sergey Brin and Lawrence Page published a paper in 1998, "The Anatomy of a Large-Scale Hypertextual Web Search Engine". The emphasis on quality search results for users started from then on.
The Early 2000s
The key objective of Google at that time was to ensure its search technology focused on providing quality results to users. Google issued guidelines for creating quality content. A website's PageRank was measured on the basis of inbound links: the more inbound links a website had, the better its PageRank. Google also provided a toolbar for Internet Explorer that let webmasters check PageRank easily. AdWords also began during this time, and paid results appeared alongside organic search results. At this point link quality was not measured, and many webmasters were not following the 'white-hat' practices Google suggested for earning rankings. It was then that Google started rolling out its algorithm updates.
Florida Update – 2003-2004
Google rolled out the Florida update in 2003. This is when quite a few sites lost their search engine rankings; many sites that were keyword stuffing got penalized. 2003 was also the year Google acquired Blogger.com and created the blog monetization platform AdSense.
The development paved the way for a blogging revolution of sorts. At the same time, many blogs were creating poor-quality, thin content and using AdSense to monetize it and make quick money. In 2004 Google started considering geographic intent when providing results for search queries; this was when local SEO was born. The same year, Google also introduced personalized search, using searcher data such as search history and preferences.
No-Follow Links – 2005
This was the year of two key developments: the introduction of the no-follow link attribute and the launch of Google Analytics.
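The no-follow attribute lets webmasters tell search engines not to pass ranking credit through a link. A minimal sketch of the markup (the URLs here are placeholders):

```html
<!-- A normal link passes PageRank to the target page. -->
<a href="https://example.com/">Regular link</a>

<!-- rel="nofollow" asks search engines not to count this link
     as an endorsement. It became common for paid links and for
     untrusted user-generated content such as blog comments. -->
<a href="https://example.com/" rel="nofollow">Nofollow link</a>
```

This gave site owners a practical defense against comment spam, since spammy links posted on their pages no longer passed ranking value.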
A few years later, in 2010, Google would also launch Google Instant – technology that displayed search results as users typed – while social signals became an important ranking factor.
Some of the key updates during this period were Jagger and Big Daddy. Jagger helped Google reduce low-quality link exchange schemes and devalued over-optimized anchor text in backlinks. Big Daddy helped Google better understand the quality and relevance of backlinks.
The Panda Update – 2011
The Panda update from Google aimed at improving the quality of website content. At the time, many sites were running content farms full of low-quality, keyword-stuffed content to achieve higher search rankings, and Google's search results pages were full of irrelevant results. Many of these unauthentic content sites were penalized by Google and saw their rankings vanish because of poor-quality content. Google also published guidelines on how to create quality content; its article on building a high-quality site details quite a few points that, I think, are relevant even today. There were about 15 versions of the Panda update spread between 2011 and 2014.
The Penguin Update – 2012
Many websites were creating spammy, low-quality links for better search rankings. The Penguin update penalized websites with irrelevant and bad-quality links, including sites that purchased links from spammy blogs or directories. Here is a brief history of the Google Penguin updates to date and how you can protect yourself from them:
Penguin 1.1 (March 2012)
This was not exactly an update to Google's algorithm but a refresh. This Penguin release punished spammy sites that had been missed in the original release, while webmasters who had worked on improving their link profiles saw their rankings go up.
Penguin 1.2 (October 2012)
This release, too, refreshed rankings, and penalized English as well as non-English international sites.
Penguin 2.0 (May 2013)
This was a major algorithm update. Google's crawlers now started checking websites more thoroughly, examining suspicious sites beyond their home and category pages.
Penguin 3.0 (October 2014)
This release, too, refreshed rankings and penalized sites that had escaped the earlier updates. Webmasters who had worked on their link profiles saw their rankings improve with this update as well.
Penguin 4.0 (September 2016)
Penguin now became part of Google's core algorithm and ran in real time: if you worked on your link profile after this release, you no longer had to wait for the next update to see better search rankings. From this point on, Penguin would not punish sites with spammy backlinks; rather, it would simply treat spammy backlinks as if they were no backlinks at all.
Things to Remember about Penguin
After Penguin, you need to ensure that your backlinks are relevant and come from sites with high-quality content. Use diverse anchor texts for your backlinks, and avoid stuffing targeted keywords into them – go for anchor texts that make your links look natural. Buying links and automating link building are not advisable either. Aim for quality backlinks and brand mentions; one reliable approach is to consistently pitch interesting story ideas and tips to journalists.
Introduction to the Knowledge Graph – 2012
Google introduced the knowledge graph to further enhance the quality of search results. It started showing knowledge panels, boxes, and carousels to online searchers. This was another of Google’s technical developments towards making search more intelligent. It was the search engine’s move to go beyond keyword-based search – to a more semantic, meaningful approach. Google now started understanding the meaning of search queries and their relevance to real-world entities. This helped the search engine to predict the intent behind searcher queries and provide the right answers to complex questions from online searchers.
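The Knowledge Graph originally drew on sources such as Wikipedia and Freebase, but publishers can also describe real-world entities directly in their pages using schema.org structured data. A minimal sketch in JSON-LD – all the organization details below are placeholders:

```html
<!-- JSON-LD structured data describing a business as a
     real-world entity. Search engines can use this kind of
     markup to help populate knowledge panels. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://example.com/",
  "logo": "https://example.com/logo.png",
  "sameAs": [
    "https://twitter.com/example",
    "https://en.wikipedia.org/wiki/Example"
  ]
}
</script>
```

The `sameAs` links tie the entity on your site to its profiles elsewhere on the web, which is exactly the kind of entity-level connection the Knowledge Graph is built on.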
Hummingbird – September 2013
This algorithm update was designed to address natural-language and conversational search queries. Hummingbird came as an improvement to Google's search technology and was reported to impact 90 percent of searches worldwide. Its key aim was to understand language and voice search in a natural way and provide answers that fulfilled searcher intent. Results for local search queries also improved drastically after this update.
Mobilegeddon – 2015
The first mobile-friendly update was rolled out on 21st April 2015 – dubbed Mobilegeddon. It required webmasters to ensure their web pages were mobile-friendly; from here on, mobile-friendliness would be a key ranking signal. The update affected search rankings on mobile devices only, across all languages globally, and applied to individual pages rather than entire websites. After this update, the rankings of non-mobile-friendly pages did, in fact, begin to fall (on mobile devices).
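One of the basic requirements for a page to count as mobile-friendly is a properly configured viewport. A minimal sketch of the markup:

```html
<!-- Without a viewport declaration, mobile browsers render the
     page at a desktop width and scale it down, which fails
     mobile-friendliness checks. This tag tells the browser to
     match the device's screen width instead. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Alongside the viewport, mobile-friendliness also meant legible font sizes, tap targets spaced for fingers, and content that fits the screen without horizontal scrolling.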
Accelerated Mobile Pages (Feb 2016)
Google launched AMP (Accelerated Mobile Pages) to help websites make their pages load faster on mobile devices.
Mobilegeddon 2.0 (March 2016)
With this update Google aimed to strengthen their mobile-friendly ranking factor, giving even more importance to mobile-friendly pages.
RankBrain – 2015
The RankBrain update was functional from April 2015 but was officially announced in October 2015. RankBrain uses artificial intelligence to make sense of searcher queries, and it applies to all languages and to countries across the globe. The technology is most active for new and previously unseen queries. RankBrain essentially works as a natural-language processor: it attempts to understand the meaning of a query just as humans do and then provides the best results for it.
Possum Update – 2016
This was Google's local algorithm update, rolled out in September 2016. It mainly impacted local search results and results shown through Google Maps. Prior to this update, many businesses located just outside city limits found it very difficult to rank for keywords that included the city name; Google Maps also showed them as being outside the city. All of this changed after the Possum update. The update also ensured that businesses sharing the same address in a business category were filtered out of the search results. One more essential aspect was that local search results started depending on the searcher's physical location. Studies showed that after this update, Google's organic filter and local search filter work independently of each other.
Over to You
The history of SEO, as you can see, has been quite exciting. SEO will continue to remain important in the times to come as well. With the advent of technologies like augmented reality and virtual reality, local search and visual search will become even more relevant. Content quality, backlink quality, and overall user-experience will remain the key to getting better search rankings. Google will continue to invest in technologies like AI and AR in order to provide a better experience to searchers and maintain its leadership position as the preferred search engine globally.