The History of Google Algorithm Updates
If you want to get noticed online, getting to the top of Google is undoubtedly your number one aim. You can, of course, rent pay-per-click ad space at the top of the SERPs, but, ultimately, your long-term goals will only be served by following SEO best practice advice and achieving top place in the organic search results.
So when the king of online search moves the goalposts, you'd better pay attention and, if necessary, adjust your strategy to stay on track.
How Often Does Google Make Updates?
In truth, Google tweaks its ranking criteria on a near-daily basis, making subtle changes in its ongoing mission to display only the most useful, relevant content that matches searcher intent. However, every so often the SEO community goes haywire, with reports of huge shifts in overnight rankings indicating a major algorithm update has been implemented.
Dedicated webmasters need to be on their toes and fall in line whenever such an event occurs, but in reality, if you play by the rules the algorithm updates shouldn’t affect you too much. From day one, Google has always prioritised quality and context, so if you make this your mantra rather than trying to fiddle the system, you’re far likelier to perform well in organic search as the algorithms change.
Core Updates
Several times a year, Google will release a “core update” that implements significant changes to how its algorithms work. Unlike some of the other updates on this list, core updates don’t set out to target singular issues within the SERPs, but rather operate on a broad spectrum. Core updates are too numerous to list here, so for more information, take a look at Google’s search ranking updates.
Often, website owners find that core updates have a negative effect on their rankings. However, it’s important to note that Google specifically advises not to attempt to combat this directly. It doesn’t mean your content is bad, but simply that other websites are outperforming yours in the SERPs. The good news is, Google has remained consistent over the years on what type of content it considers “good” and will rank well. Focus on good SEO practices and continue to generate quality content, and you should see rankings recover over time.
The Early Years (2003 – 2010)
In retrospect, early Google algorithm updates are of little importance to modern day SEO, implemented at a time when the internet was still discovering what a search engine could be. However, a brief overview makes clear what Google’s goals are, and it is interesting to see how those priorities have shifted over time as technology has grown ever more intelligent.
- Boston: Google’s very first update, announced in February 2003, signalled a commitment to monthly updates to Search Engine Results Pages (SERPs).
- Fritz: The Boston update was soon rendered obsolete by Fritz. Announced in July 2003, Fritz switched Google to incremental index updates, meaning rankings changed every day.
- Florida: Florida, launched in November 2003, was the first major algorithm update, and had a tremendous impact on small businesses, many of whom saw rankings plummet. Florida aimed to prevent manipulation of SEO ranking factors, such as link spamming and keyword stuffing, showing how Google has looked down on these tactics from early on.
- Jagger: This 2005 update was again focused on curbing unscrupulous SEO techniques, including penalising unnatural backlinks, hidden text and delayed redirects, and the use of a large number of doorway pages.
- Bigdaddy: The 2005 Bigdaddy update was implemented to combat link spamming.
- Vince: The Vince update came into effect in 2009, and saw Google favouring sites with good authority and trust. In practice, this meant big brands came to dominate keyword searches.
- Caffeine: In 2009, the Caffeine update improved the speed at which Google found and displayed new content.
- Mayday: 2010’s Mayday update saw higher quality, more relevant results for long-tail keyword searches.
2011 – Google Panda
Arguably one of the biggest milestones in internet history, and certainly one that aided the fledgling content marketing industry of 2011, Panda was effectively launched to tackle ‘thin’ content – poorly-written web pages that manipulated search results by stuffing keywords instead of delivering genuine quality.
In the past, despite attempts to halt this with early updates, it had been possible to rank highly with such keyword trickery, but Panda made this grey area of SEO much more black and white, ensuring that sites which create valuable, informative, high-quality material have a better chance of outranking competitors that take shortcuts.
The message was, and still is, clear: If you want to perform well in organic search, your content has to be up to scratch.
For more information, read our detailed guide to Google Panda here
2012 – Google Penguin
Links pointing to your website have always been a huge part of SEO, the theory being that if people make the effort to link to your content, your output must be pretty good. Thus, search engine bots regard hyperlinks as votes cast by the wider Internet community, so the more ‘votes’ you have pointing your way, the higher you’ll rank.
However, once people cottoned onto this, unscrupulous marketers started paying for swathes of links from dodgy websites, resulting in low-quality sites topping the SERPs, going against the Google mantra for exceptional user experience.
Therefore, Penguin was introduced to tackle link spam, paying close attention to the relevance and subsequent trustworthiness of inbound links. For instance, if an Indian website about SEO was linking to an American retailer of wellington boots with keyword-rich anchor text, there’s clearly something amiss; there’s no relevance between the content or a natural reason for one to be ‘voting’ for another, so Penguin would get in a flap.
Link penalties are now issued to sites with dodgy link portfolios, and if you’re concerned you may have been affected you can check out our guide to documenting and removing bad links.
Read our detailed guide to Google Penguin here
2013 – Google Hummingbird
This was a complete overhaul of Google’s search mechanics, which many have likened to a new engine being installed in a vintage car. Essentially, the update aimed to make search results faster and more precise, hence the Hummingbird moniker.
However, the most noticeable refinement was to conversational search; smart devices were on the rise, leading to a huge surge in voice search. This meant searches were becoming more verbose, with people speaking in full sentences rather than the snappy terms we’ve become accustomed to typing.
This meant Google had to consider the context of each word more closely, figuring out how each word fitted into the overall search query.
For example, previously the query “Where’s the best place to eat fish and chips?” would have brought up various results, ranging from helpful restaurant reviews to more generic information about fish and chip recipes, and where fish come from. However, post-Hummingbird, local SEO signals became more important, and this query would now return a map highlighting nearby fish and chip shops.
The algorithm became much smarter at determining what is meant by the whole phrase and how each word within is semantically related to the others, whereas it would previously have focused on the individual keywords “place”, “eat”, “fish” etc., and returned a hodgepodge of results.
Keywords are still important, but ultimately you should be thinking about how best to answer your target audience’s questions rather than what words you can rank for. As mentioned above, producing quality content still allows for keyword optimisation, but your users should come first.
Check out our article on creating better content with Answer The Public to finetune your content strategy.
Read our detailed guide to Google Hummingbird here
2014 – Google Pigeon
As became evident in 2013, local SEO was, and still is, increasingly important. Pigeon took flight in 2014, and the local SERPs saw big changes, with a variety of factors dictating search visibility.
Your NAP consistency is vital, i.e. it’s imperative that your Name, Address, and Phone number are the same throughout your web presence. This means consistency on any directory listings, as well as your website and social profiles. Any mistakes risk confusing the search bots, which means you’ll likely be filtered out of search results.
Completing your Google Business Profile is also key, as this is your route to appearing in the map results. It’s also wise to obtain coverage on local news and business websites, hopefully earning valuable backlinks (‘votes’) to raise your local search relevance. For further insight, check out the ‘Local focus’ section of our article about researching the best websites to publish on.
Read our detailed guide to Google Pigeon here
2016 – Google Possum
This was another local search update that led to webmasters thinking their business profiles had died, when in fact they were ‘playing possum’ – hiding lower down the rankings.
Essentially, the change helped companies on the edge of town compete with city centre rivals. In the past, the algorithm would have favoured those with a BS1 postcode for any Bristol-related searches, but now the general quality of your website is taken into account as well as your geographic location, meaning those on the outskirts can outrank competitors if their content strategy is up to scratch.
Read our detailed guide to Google Possum here
2017 – Google Fred
Fred was a welcome change for anyone who’s ever clicked on a promising-looking search result, only to land on a dodgy website littered with ads and thin content that doesn’t answer their query. In short, the Fred update was popular with everyone apart from webmasters who rely on clickbait to earn a living.
Fred dictates that shallow containers for ads will be demoted in the SERPs, so if you host ad space on your site, you’d better make sure the main body of your content provides genuine quality, and that ads aren’t the main focus of each page.
Read our detailed guide to Google Fred here
2019 – BERT Update
BERT is a natural language processing (NLP) algorithm. In simple terms, this update helped Google better understand and contextualise searches. Before BERT, searches displayed results that matched individual words used in the query. With BERT, Google became more adept at picking up the nuances that provide valuable context and deliver more relevant results.
To put this into practice, a searcher looking to travel from the UK to Egypt might search exactly that – “UK to Egypt”! Before BERT, the algorithm, recognising individual words, might deliver results from Egypt to the UK that provide little value to the searcher. BERT allowed Google to look at the search more meaningfully, understand the intention of the searcher to travel from one location to the other, and deliver better results accordingly.
2022 – Helpful Content Update
2022 saw two updates known as the Helpful Content Update, designed to further streamline the SERPs to show more helpful, relevant content. Around the same time, Google updated its long-standing E-A-T guidelines to add an extra E – standing for experience.
Ultimately, the update helped Google to better rank websites with demonstrable experience, expertise, authority and trust.
Top Tips for Beating Algorithmic Changes
As you may have noticed, these algorithm changes aren’t changing the rulebook. Rather, they are designed to reinforce existing best practice principles and reward websites that abide by them, helping Google improve and build upon standards and guidelines that have been in place since 2003.
Therefore, if you’ve been creating good content and sticking to SEO best practices, you should have little trouble when the next update rolls out.
That being said, there are a few guidelines to bear in mind when working on your SEO strategy.
We recommend service pages of between 300 and 600 words to our web copywriting clients, while blog posts generally perform better when they’re over 1,000 words in length. These parameters aren’t set in stone, but they certainly give you space to say something valuable, while leaving room for natural variations on keywords and phrases.
Always put your users first, and look to create content that answers their queries. If you make quality your policy rather than trying to game the system, Google will find you and you’ll be rewarded with high rankings.
The best way to obtain natural links is to create content that earns attention so, as with the Panda guidelines, quality should be your policy. Producing insightful, helpful resources will see others reference your work, and you can supplement your efforts by submitting guest articles as part of a targeted outreach marketing campaign, which should see you gain citation links from high-quality websites.
In Summary
Ultimately, as the Google algorithm evolves, the emphasis will always be on creating optimal user experience, which means there are no shortcuts; if you want to rank, you have to produce outstanding content that people will find useful and/or enjoyable in some way, which will also naturally earn links.
If you need expert advice on an integrated SEO and content marketing strategy that promises to cut through the clutter on the web, you’ve come to the right place. We know how to get you ranking with quality content and authoritative backlinks that will not fall foul of any future algorithmic updates, so let’s have a chat. Get in touch and ask about our Digital Health Check, offering you a free consultation and initial strategy ideas.
If you have any comments or questions about this post, or would like to discuss a specific issue with your site, please get in touch using the form below.
And connect with us on social media to stay up to date with our latest news: