Late last week Google surprised us with the launch of the December 2020 Broad Core Update, whose effects are already visible in the SERPs, although rankings will take a few more days to stabilize. This seems like a good moment to review the main updates to Google's algorithm, narrowing the field to the last decade, to relive the major changes that transformed how the search engine works and upended SEO along the way.
Google’s algorithmic updates
The search engine is constantly evolving and Google's work never stops: December's broad core update is the third "major" intervention of the year, alongside thousands of minor changes (3,200 in 2019 alone, as Google has publicly stated). Many of these interventions directly affect Google's search ranking algorithms and therefore also impact SEO activity, in more or less obvious ways.
Even those who are newer to this industry will have heard of Panda, Penguin, Hummingbird or RankBrain, which are now milestones in Google's history: so let's find out more about these colossal updates and what they meant, with the support of two different articles on Search Engine Journal.
The Florida update and the “no manipulations” message
One of the first updates to disrupt the SERPs and bring down entire businesses was the Florida Update back in 2003, a name owed to the fact that it rolled out at almost the same time as the Pubcon conference, held, fittingly, in Florida, where professionals first "noticed" the changes.
According to experts, Florida was "a clear message from Google that using SEO techniques to manipulate search results could have consequences", and that was when Google began its battle against blatant manipulation tactics.
Google Panda and the new definition of quality
Many SEO professionals agree that Google Panda was the most impactful update in the search engine's recent history: on February 24, 2011, the new algorithm went live, able to assess the "quality" of content far more effectively.
According to various estimates, the release of the Panda algorithm affected 12 percent of English-language queries and between 6 and 9 percent of Italian-language queries.
The key to its operation was the so-called "quality score", a score awarded to web pages and used as a ranking factor (integrated since 2016 into the list of Google's 200 ranking factors). In concrete terms, Panda set out to combat poor-quality content, such as duplicate, plagiarized or thin content, along with user-generated spam, keyword stuffing and over-optimized texts, and to promote results that met well-defined criteria, summarized in Amit Singhal's famous 23 questions for building quality sites.
From that moment on, SEO had to learn to deal with a new reality, one that included long-tail keywords and variants of the main keyword within a piece of content, because Google had proven, penalty after penalty, that it would no longer reward pages lacking unique, quality content.
Google Penguin and the fight against spam links
Spam links and unnatural backlink profiles were instead the targets of Google's Penguin algorithm update, launched on April 24, 2012, which led to the penalization of many sites whose link building strategies relied on clearly manipulative tactics: spammy or irrelevant links, often with over-optimized anchor texts.
Penguin dealt a serious blow to low-effort link building techniques, such as buying links from link farms and using PBNs (which, however, have never completely disappeared), and by some estimates affected 3 percent of search results.
Here comes Google Hummingbird: the debut of search intent understanding
With Google Hummingbird the algorithm took the first decisive steps toward how it operates today: the hummingbird helps the search engine interpret queries better and provide results that match the search intent, grasping the meaning of the whole sentence rather than just reading the individual terms in the query.
It was the first blow to the old dominance of keywords: thanks to natural language processing (based on latent semantic indexing, recurring terms and synonyms), the Hummingbird algorithm allows a page to rank for a query even if the text does not contain the exact words the user entered. This in turn required rethinking keyword research strategies: extending research to the concepts and needs behind the keywords, and carefully analyzing related queries, synonyms and recurring terms to create varied content that speaks the language of our audience.
The attention to mobile navigation
On April 21, 2015, Google set out on a path that remains relevant today, namely attention to mobile browsing: the Mobile update shifted the focus from the desktop version of a website to its mobile version, responding to a shift already underway globally.
It was the dawn of the mobile-first index, with which Google indexes and ranks websites based on the speed and usability of their mobile versions, and which, starting next year, will get a further boost from the Page Experience Update.
RankBrain and machine learning
Still in 2015, in October, RankBrain debuted: a machine learning system that is part of Google's larger Hummingbird algorithm, helping the engine understand the meaning of queries and deliver the best-matching search results. It was immediately described as "the third most important ranking factor".
While there is no precise information on the exact formula behind this important update, it is generally believed that RankBrain is responsible for personalizing a user's Google search results: essentially, the algorithm manages to go beyond a person's search query and consider the broader context, such as synonyms, implicit words, and personal search history.
The medic update: welcome to the EAT paradigm
August 1, 2018, remains memorable for many SEO experts and sites: it is the day an algorithmic update debuted that was almost immediately christened the "Medic update", because it seemed to hit health sites especially hard, along with others tied to YMYL topics, areas potentially capable of altering people's lives (finance, law, education).
Although it has never been officially confirmed, rumors and hints from Googlers have linked this update to the implementation of EAT signals (expertise, authoritativeness, trustworthiness) at page and site level, as described in the Quality Rater Guidelines. There is no single way to achieve this goal, although there are various schools of thought on what can improve EAT (for instance, proper use of structured data, recognition of the brand as an entity, and other elements).
Google BERT, the latest evolution in the comprehension of human language
We are close to recent history with Google BERT, an acronym for Bidirectional Encoder Representations from Transformers: the algorithm update rolled out in October 2019 for US SERPs and, from the following December, in Italy as well. This system uses natural language processing technology to better understand and interpret the language of queries, with a greater ability to understand each text and to identify entities and the relationships between them.
BERT is (for now) the culmination of the work begun with the Panda, Hummingbird and RankBrain updates, which moved the search engine away from simple keyword "analysis" and toward the ability to grasp far more nuance in both queries and search results.
Today BERT "is used in almost all queries in English" and helps Google deliver "top quality results": this also means that the search engine can genuinely reward good writing and effective copy, offering users content with the right context and the insights they need.
Google’s Broad Core Updates
A separate paragraph should be dedicated to broad core updates, that is, periodic updates to the search engine's main algorithm released several times a year, just like the latest one last week.
In fact, we do not really know what lies behind these updates or which parts of Search they improve: they may be refinements of previous interventions, or even "bundles of smaller updates" released together. What is certain is that, in the weeks following the release, the SERPs dance and many sites see fluctuations (often negative), and so the arrival of a broad core update means it is time to "fasten our seatbelts" and probably roll up our sleeves to figure out what is happening and how to regain lost positions.