Algorithms: what they are, how they work, and where we use them

Understanding algorithms is crucial for anyone aspiring to master the dynamics of SEO and, more generally, digital marketing. Algorithms underlie many automated processes and play a central role in filtering and sorting information on the Web. Search engine algorithms, in particular, may appear to be “mysterious” machines, but technically they are a complex of mathematical formulas, rules and criteria that, working together as a system, determine what most people, ourselves included, see when they browse the Web: they analyze all the web pages in the Index and then rank them, deciding the order in which pages appear in the SERPs, the search engine results pages. In a sense, then, algorithms are the backbone of how search engines such as Google, Bing, and Yahoo! rank the web pages shown in response to a user’s search query. While it is impossible to fully unravel the “mystery,” today let’s try to understand how these systems work and what impact they have on SEO and online visibility optimization.

What are algorithms?

Algorithms are finite sequences of defined instructions or rules that describe a process or calculation to solve a problem or perform a task.

The word “algorithm” comes from the Latinized name of the ninth-century Persian mathematician Al-Khwarizmi, long considered one of the first authors to have described the concept in his book “Rules of Restoration and Reduction.” It was later discovered that the earliest notions of an algorithm predate him and are found in the Ahmes papyrus, also known as the Rhind papyrus, dated to the 17th century BCE. This document contains a collection of problems with solutions, including multiplication problems, which the scribe claimed in turn to have copied from papyri two centuries older still.

To be useful and effective, algorithms must possess certain fundamental characteristics, illustrated by the short example after this list:

  • Finiteness. They must always terminate after a finite number of steps.
  • Unambiguity. Each step must be clear and unambiguous: instructions must not be open to different interpretations.
  • Input and output. Algorithms must accept initial data on which to operate and generate at least one output, i.e., results obtained from executing the instructions.
  • Determinism. Given a specific input, an algorithm always produces the same output through a defined sequence of steps: there is no element of randomness in the computational process.
  • Generality. They must be designed to solve all the problems belonging to a given category, not just a single instance.
  • Efficiency. They should solve the problem in the shortest time and with the least use of resources, in terms of both execution time and memory.
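
To make these properties concrete, here is a minimal Python sketch of one of the oldest known algorithms, Euclid’s method for the greatest common divisor: it accepts an input, proceeds through unambiguous steps, always terminates, and deterministically produces an output.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: greatest common divisor of two integers.

    It shows the properties above in miniature: defined input and
    output, unambiguous steps, guaranteed termination (b shrinks at
    every iteration), and determinism (same input, same output).
    """
    while b != 0:
        a, b = b, a % b  # replace (a, b) with (b, a mod b)
    return a

print(gcd(48, 18))  # -> 6, in a finite number of steps, every time
```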

Their ability to systematize and automate problem solving underlies many modern technologies: in fact, algorithms are used on a daily basis, often imperceptibly, to perform tasks ranging from the simple ordering of a list of names to complex cryptographic calculations required for computer security, via data management and, of course, Web search.

Definition of algorithm: meaning and main characteristics

In the standard definition used in computer science, then, algorithms are a series of well-defined steps to complete a given operation. But what is an algorithm in simple terms? Think of a cooking recipe, where each step must be followed in the correct order to achieve the final dish: similarly, algorithms act as precise guides that tell the computer (or other system) exactly what to do, step by step, to achieve a given goal.

Algorithms can be simple, such as sorting a set of numbers, or extremely complex, such as analyzing large amounts of data to predict market trends. And they can be represented in various ways, such as pseudo-code, flowcharts or directly in a programming language. This representation allows complex tasks to be translated into a format that can be executed by a computer, automating processes that would otherwise require considerable human effort.

Their importance lies in their ability to automate processes and improve the efficiency of operations, making it possible to handle large amounts of data in reduced time. These concepts, and these systems, are fundamental not only in the theory of computing but also in the practice of programming. In computability theory, a problem is considered computable if it can be solved by an algorithm. In programming, devising an algorithm means translating or encoding a complex problem into a sequence of instructions that a computer can execute. This encoding, written in a programming language, represents the logic for processing the problem, making it solvable in an automated way by software.

How many and what algorithms exist and where they are used

Algorithms find application in almost every field imaginable. For example, in software engineering they are fundamental to the development of efficient applications; in the fields of finance, they are used for market forecasting and risk management; in medicine, they support the diagnosis and monitoring of diseases through the analysis of large amounts of clinical data. Algorithms also underpin logistics, optimizing delivery routes and inventory management. And of course, in search engines such as Google, algorithms are central to indexing, ranking, and retrieving information.

Thus, there are countless algorithms, each designed for specific solutions and different applications, and we can classify these systems into different categories depending on their use and the type of problem they solve. In particular, we recognize:

  1. Sorting algorithms. Used to organize data in a certain order; well-known examples are QuickSort and Bubble Sort.
  2. Search algorithms. Used to find an element within a data structure. A well-known example is binary search, which repeatedly halves the portion of the data still to be examined until it finds the required element (sketched in code after this list).
  3. Encryption algorithms. Designed for data security, such as the RSA algorithm for asymmetric encryption or AES for symmetric encryption.
  4. Machine learning algorithms. These include algorithms used to build systems that can “learn” from data, such as artificial neural networks, linear regression models and decision trees.
  5. Genetic algorithms. Inspired by the processes of natural evolution, these algorithms are used to find approximate solutions to complex problems through selection, crossover and mutation techniques.
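
As an illustration of the second category, here is a minimal Python sketch of binary search; note how each iteration halves the remaining search space, which is what makes the algorithm efficient even on very large inputs.

```python
def binary_search(items: list[int], target: int) -> int:
    """Return the index of target in a sorted list, or -1 if absent.

    Each iteration discards half of the remaining portion, so a list
    of a million elements needs at most about 20 comparisons.
    """
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1   # target can only be in the right half
        else:
            high = mid - 1  # target can only be in the left half
    return -1

print(binary_search([2, 5, 8, 12, 16, 23, 38], 16))  # -> 4
```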

Why algorithms are important

Algorithms are crucial in a world increasingly dominated by technology and data: without them, it would be impossible to manage and analyze the immense amount of information available today.

We can say without fear of contradiction that they are the soul of modern technology: they enable extraordinary tasks, from simply sorting a list to predicting future events and understanding natural language, and they become ever more central as technology evolves and applications grow increasingly sophisticated.

Specifically, they are central from the perspective of:

  • Efficiency and automation. Algorithms make it possible to automate tasks that would otherwise require enormous human effort, resulting in systems that can operate around the clock without error.
  • Speed and scalability. Well-designed algorithms solve problems very quickly, which is essential in critical applications such as financial transactions or real-time management of computer networks.
  • Reliability and accuracy. Algorithms reduce the likelihood of human error and can be thoroughly tested and verified to ensure accurate results.
  • Innovation and progress. Continued research and development of new algorithms pushes forward the boundaries of technology. For example, artificial intelligence algorithms are revolutionizing areas such as medicine, finance, transportation and entertainment.
  • Personalization and improvement of user experience. Algorithms used in social media, search engines and e-commerce services improve the user experience by offering personalized content and suggestions based on users’ interests and behaviors.

How algorithms work: an overview of the basics

Algorithms are designed to follow a series of well-defined, sequential steps to solve specific problems or perform certain operations. To put it another way, each algorithm starts with a set of input data and proceeds through a series of instructions to produce an output: each step represents a well-defined, sequential instruction that guides the process until the desired result is achieved.

Developers write algorithms using programming languages that translate these instructions into a computer-understandable format. The clarity and precision of each step are crucial to the effectiveness and reliability of the algorithm: once coded, the algorithm can be executed multiple times, ensuring consistency and reliability in the results. This principle is especially important in search engines, where accuracy and efficiency are essential to process billions of daily queries.

The implementation of an algorithm often begins with the writing of pseudo-code, a detailed description in natural language of the steps the algorithm must follow. This pseudo-code is then translated into a suitable programming language, such as Python, C++ or Java, which allows the computer to execute the code. During this process, developers must ensure that each instruction is precise and unambiguous so that the computer can follow those directives exactly without misinterpretation.
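
As a small, hypothetical illustration of this workflow, here is pseudo-code for computing an average, shown as comments, followed by its direct Python translation; the task is invented purely to show how each natural-language step maps onto an instruction.

```python
# Pseudo-code (natural language):
#   1. read the list of numbers
#   2. if the list is empty, stop with an error
#   3. add up all the numbers
#   4. divide the total by how many numbers there are
#   5. return the result

def average(numbers: list[float]) -> float:
    if not numbers:                     # step 2: guard against empty input
        raise ValueError("empty input")
    return sum(numbers) / len(numbers)  # steps 3-5: sum, divide, return

print(average([3.0, 4.0, 8.0]))  # -> 5.0
```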

A typical algorithm life cycle begins with defining the problem and identifying the requirements needed to solve it. Various approaches or strategies that could lead to the solution are then designed. Algorithms are then implemented, tested and optimized through an iterative process that can include several steps, including:

  • Algorithm design. During this phase, a theoretical framework of the algorithm is developed that details the logical steps required to solve the problem. This phase may include flowcharts and other visual representations to clarify the decision path.
  • Coding and implementation. Once designed, the algorithm is translated into code using a specific programming language. This translation must faithfully adhere to the logic defined in the design phase.
  • Testing and debugging. After implementation, the algorithm is tested with different input data sets to verify that it produces the expected results (a minimal example follows this list). During testing, bugs or inefficiencies may emerge that require correction and optimization.
  • Optimization. In this phase, work is done to improve the efficiency of the algorithm by reducing execution time and the use of computational resources. Optimization techniques may include the use of more appropriate data structures or the adoption of more advanced algorithms.
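
To make the testing phase concrete, here is a minimal sketch using Python’s standard unittest module; the average function and its test cases are invented for illustration, but the pattern, feeding the algorithm varied inputs and checking expected outputs, is the general one.

```python
import unittest

def average(numbers: list[float]) -> float:
    """The algorithm under test (same sketch as above)."""
    if not numbers:
        raise ValueError("empty input")
    return sum(numbers) / len(numbers)

class TestAverage(unittest.TestCase):
    """Feed the algorithm different inputs and check the expected outputs."""

    def test_typical_input(self):
        self.assertEqual(average([2, 4, 6]), 4.0)

    def test_single_element(self):
        self.assertEqual(average([5]), 5.0)

    def test_empty_input_is_rejected(self):
        with self.assertRaises(ValueError):
            average([])

if __name__ == "__main__":
    unittest.main()
```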

The applications of algorithms: where these systems are found

As mentioned, algorithms are key elements in the operation of modern technologies, offering solutions to complex problems in different domains and constantly improving the efficiency, safety and quality of automated operations.

Their mechanism, which transforms inputs into outputs through a structured and repeatable process, ensures consistency and reliability in the results. This ability to automate and standardize operations makes them crucial in numerous applications that affect many aspects of our daily lives. For example, when we use social media, an algorithm decides which posts to display in our feed based on our past interests and behaviors; in streaming services such as Netflix, algorithms suggest content based on our viewing preferences. Even in e-commerce, complex algorithms handle everything from product recommendations to secure payment transactions. In each case, the goal is to optimize the user experience and improve operational efficiency.

Examples of algorithms in daily life

Algorithms are therefore ubiquitous and influence numerous activities we perform every day, often without our realizing it.

  • Algorithms in Social Media

Whenever we access platforms such as Facebook, X or Instagram, algorithms play a crucial role in personalizing our feed. Using machine learning algorithms, these platforms analyze our behaviors, interactions and preferences to show us content that we might be most interested in. This process involves evaluating various parameters such as viewing time, likes, comments and shares to determine the relevance of each post.

  • Algorithms in recommendation systems

E-commerce platforms like Amazon and streaming services like Netflix use complex recommendation algorithms to suggest products or content to us. These algorithms analyze our past activities, such as items we have purchased or movies and series we have watched, to predict what we might want in the future. They use collaborative and content-based filtering techniques to improve the accuracy of recommendations.

  • Algorithms in road navigation software and map services

When we use navigation apps such as Google Maps or Waze, algorithms calculate the optimal route, taking into account factors such as distance, travel time, traffic conditions, and road works. These algorithms use graph theory and pathfinding techniques to find the fastest and most efficient route from point A to point B (a minimal sketch follows). In addition, machine learning algorithms make it possible to update traffic information in real time.
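
As a rough sketch of the underlying idea (not, of course, the actual code of Google Maps or Waze), here is Dijkstra’s classic shortest-path algorithm applied to a tiny invented road graph, where edge weights stand in for travel times.

```python
import heapq

def shortest_path_cost(graph: dict, start: str, goal: str) -> float:
    """Dijkstra's algorithm: cheapest path cost between two nodes.

    graph maps each node to a list of (neighbor, cost) pairs; the
    costs could stand for travel times between road intersections.
    """
    frontier = [(0.0, start)]  # priority queue ordered by cost so far
    best = {start: 0.0}
    while frontier:
        cost, node = heapq.heappop(frontier)
        if node == goal:
            return cost  # first pop of the goal is the optimal cost
        for neighbor, step in graph.get(node, []):
            new_cost = cost + step
            if new_cost < best.get(neighbor, float("inf")):
                best[neighbor] = new_cost
                heapq.heappush(frontier, (new_cost, neighbor))
    return float("inf")  # goal unreachable from start

roads = {
    "A": [("B", 5), ("C", 2)],
    "B": [("D", 1)],
    "C": [("B", 1), ("D", 7)],
}
print(shortest_path_cost(roads, "A", "D"))  # -> 4.0 (A -> C -> B -> D)
```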

  • Algorithms in smartphone cameras

Modern smartphone cameras use image processing algorithms to improve the quality of photos. Features such as autofocus, image stabilization, facial recognition and night mode are all powered by advanced algorithms. For example, machine learning algorithms can automatically identify and optimize faces in photos, while other algorithms reduce noise in images taken in low-light conditions.

  • Algorithms in financial services

Algorithms are also widely used in the financial industry. For example, high-frequency trading algorithms are capable of executing millions of trades in the market in seconds, analyzing huge volumes of data to make investment decisions. Algorithms evaluate market trends, financial news and other economic indicators to predict and react quickly to changes in stock prices.

  • Algorithms in fitness and health applications

Many fitness apps, such as MyFitnessPal or Strava, use algorithms to monitor and analyze our physical activities. The algorithms collect data from our smartphone or smartwatch sensors to calculate calories burned, distance traveled, speed, and other performance metrics. This data is then used to provide personalized recommendations and training plans.

  • Algorithms in machine translation systems

When we use Google Translate or DeepL, advanced deep learning algorithms and neural networks (today typically based on Transformer architectures) work to translate text from one language to another. These algorithms take into account not only individual words but also the context and grammatical structure of the sentence, providing more accurate and natural translations.

  • Algorithms in email filtering

Machine learning algorithms are widely used in email filtering systems to identify and block spam and phishing. By analyzing language patterns, IP addresses, and user behaviors, these algorithms can classify emails into categories such as “inbox,” “spam,” or “promotions,” improving our email experience.
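
Real filters rely on models trained on millions of labeled messages; purely as an invented toy illustration of the idea, here is a keyword-scoring classifier with hard-coded weights standing in for what a learning algorithm would estimate from data.

```python
# Hard-coded "spamminess" weights, invented for illustration; a real
# filter learns these values from millions of labeled messages.
SPAM_SIGNALS = {"winner": 2.0, "free": 1.5, "click": 1.0, "urgent": 1.5}

def spam_score(message: str) -> float:
    """Sum the weights of the suspicious words found in the message."""
    return sum(SPAM_SIGNALS.get(word, 0.0) for word in message.lower().split())

def classify(message: str, threshold: float = 2.5) -> str:
    """Route the message to 'spam' or 'inbox' depending on its score."""
    return "spam" if spam_score(message) >= threshold else "inbox"

print(classify("Urgent winner click here"))  # -> spam
print(classify("Meeting notes attached"))    # -> inbox
```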

  • Algorithms in virtual assistants

Virtual assistants such as Siri, Alexa and Google Assistant use a combination of speech recognition algorithms, natural language processing and machine learning to understand and respond to our requests. These algorithms analyze our questions, extract useful information and formulate relevant answers, making interaction more fluid and natural.

  • Algorithms for encryption and security

Cryptographic algorithms are critical for protecting sensitive information during transmission and storage. Algorithms such as AES (Advanced Encryption Standard) ensure that data such as credit card numbers, passwords and personal information are rendered unreadable to anyone without the decryption key. Another example is RSA (Rivest-Shamir-Adleman), used for asymmetric encryption in secure online communications, such as the HTTPS protocol that protects Web browsing. Hash algorithms, such as SHA-256, play a crucial role in protecting the integrity of data, ensuring that it has not been altered during transmission. Without these algorithms, online financial transactions, email communication, and many other activities would be vulnerable to cyber attacks.
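
Hash functions are easy to demonstrate with Python’s standard hashlib module; the messages below are invented, but the behavior, where any change to the input produces a completely different digest, is exactly how integrity checks detect tampering.

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the SHA-256 fingerprint of the data as a hex string."""
    return hashlib.sha256(data).hexdigest()

original = b"transfer 100 EUR to account 12345"
tampered = b"transfer 900 EUR to account 12345"

# A single changed byte yields a completely different digest, which is
# how a recipient can detect that data was altered in transit.
print(sha256_digest(original) == sha256_digest(original))  # -> True
print(sha256_digest(original) == sha256_digest(tampered))  # -> False
```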

  • Algorithms for industrial automation

Industrial automation relies on complex algorithms to control machinery and production processes, increasing efficiency and reducing downtime. For example, automation control algorithms monitor and adjust machines in real time, ensuring that they operate within specific parameters to avoid breakdowns. Production planning algorithms optimize the order of operations to maximize productivity and minimize downtime. In automotive assembly lines, robotics algorithms coordinate the assembly of vehicles, performing precise tasks such as welding and painting. In addition, predictive algorithms based on machine learning models anticipate maintenance needs, reducing costs and preventing unexpected stoppages.

  • Artificial intelligence algorithms

Deep learning algorithms and artificial neural networks underlie many of the most advanced applications of artificial intelligence. In speech recognition, algorithms such as those used by Siri and Google Assistant turn voice audio into text, enabling virtual assistants to understand and respond to users’ requests. In facial recognition, algorithms such as those used in surveillance systems and social media identify faces and verify their identities. Machine translation benefits from deep learning algorithms, such as those employed by Google Translate, which analyze complex texts and translate them from one language to another, taking context into account. As for autonomous vehicles, artificial intelligence algorithms process huge amounts of data from vehicle sensors (such as LIDAR, radar and cameras) to make real-time decisions, ensuring safe and efficient driving.

What search engine algorithms are

Deserving separate treatment are search engine algorithms, which are the systems that are triggered every time we do a search on Google, Bing or any other search engine to return us the most relevant results.

Search engine algorithms can be defined as the set of rules and procedures used by search engines to determine the relevance and importance of web pages in order to rank them in response to a specific search query. It is a technology that touches our field closely, because it is the unique set of formulas that a search engine uses to retrieve specific information stored within a data structure and to determine the significance of a web page and its content.

Their process thus involves crawling web pages, indexing them, and ranking them according to relevance criteria: algorithms evaluate hundreds of factors, such as keywords, content quality, backlinks, and user experience, to determine the order of search results. They are therefore critical to the operation of search engines, as they help filter the vast amount of information available on the Internet and provide users with the most relevant and useful results.

They are sophisticated tools that help organize and make accessible the enormous amount of information and content found online, supporting the mission of helping users find the best answers to their questions and thus playing a key role behind the scenes in determining what results we see when we search online.

How a search algorithm works

As we know, search engines are essentially answering machines that, prompted by a user’s input, examine billions of pieces of content they have in their repository (the Index) and evaluate thousands of factors to determine which page is most likely to answer the question posed and satisfy the person’s original intent.

In a nutshell, search engine activity consists of three stages:

  • Crawling or scanning. Through special programs called “crawlers” or “spiders,” search engines automatically explore the Web and encounter newly available content (Web pages, PDFs, images, videos, and so on): crawlers follow links from one page to another, gathering information about each resource they encounter.
  • Indexing. Once a page has been discovered, the information gathered by the crawlers is analyzed and cataloged in a huge database, called the Index, hence the name indexing. At this stage, search engine algorithms analyze the page’s content, inbound and outbound links, and other factors to determine what the page is about and how it should be ranked.
  • Ranking. When a user enters a query, the algorithm searches the Index for the pages most relevant to that input: all candidate pages are then sorted, or “ranked,” based on a number of factors, including content relevance, page authority, quality of inbound links, user experience, and many others. The result of this ranking process is the list of search results we see when we perform a search. A toy version of the whole pipeline is sketched below.
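
As a toy illustration of the indexing and ranking stages (real engines weigh hundreds of signals; here the page texts are invented and relevance is just a count of matching query terms), consider this minimal Python sketch.

```python
pages = {
    "page1": "roast beef recipe with gravy and yorkshire pudding",
    "page2": "beef stir fry quick recipe",
    "page3": "history of yorkshire",
}

# Indexing: map every word to the set of pages containing it
# (an inverted index, the core data structure of search engines).
index: dict[str, set[str]] = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

def rank(query: str) -> list[str]:
    """Ranking: score candidates by matched query terms, best first."""
    scores: dict[str, int] = {}
    for term in query.split():
        for url in index.get(term, set()):
            scores[url] = scores.get(url, 0) + 1
    return sorted(scores, key=scores.get, reverse=True)

print(rank("roast beef recipe"))  # -> ['page1', 'page2']
```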

Algorithms are thus used by search engines to analyze and index the content of web pages, and also to determine the relevance and quality of a page with respect to a user’s search query. Each search engine creates, manages, uses and updates its own specific search algorithm.

We can therefore speak of a Google algorithm, a Bing algorithm, or a Yahoo! algorithm, each of which uses particular and complex criteria to do its job, taking into consideration the signals deemed most useful for providing accurate and relevant search results.

Google algorithm or algorithms: how many algorithms are there?

At this point it is appropriate to open a small “lexical” parenthesis, but not only. Often, for the sake of simplicity and convenience, we speak of “Google algorithm” in the singular, even if this means oversimplifying a much more complex reality. Google in fact uses a number of algorithms and machine learning techniques to analyze and rank web pages; they work together, each with a specific task, to provide the most relevant and useful search results.

For example, Google has historically used an algorithm called PageRank to determine the authority and importance of a web page based on the quantity and quality of the links pointing to it from other pages, while another algorithm, Panda, is designed to penalize sites with low-quality content. These are joined by further algorithms specific to image search, news search, local search, and more, forming precisely the powerful set of “Google algorithms” at work every moment to answer users’ queries (a toy version of the PageRank idea is sketched below).
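
The original PageRank idea is public and easy to sketch: a page’s score depends on the scores of the pages linking to it. The following minimal Python implementation uses an invented three-page link graph and is, of course, a historical illustration, not Google’s production system.

```python
# Invented link graph: each page maps to the pages it links to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

def pagerank(links: dict, damping: float = 0.85, iterations: int = 50) -> dict:
    """Power-iteration sketch of the original PageRank formula."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        # every page keeps a small base score...
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share  # ...and passes the rest along its links
        rank = new_rank
    return rank

print(pagerank(links))  # C, linked by both A and B, gets the highest score
```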

For some time now, Google has shared the nomenclature it uses for these processes, introducing the concept of “Google Search ranking systems,” which covers every automated ranking algorithm involved in analyzing signals across the hundreds of billions of web pages and other content in the Index, with the goal of “presenting the most relevant and useful results, all in a split second.” Some of these systems are part of Google’s core ranking systems, the underlying technologies that produce search results in response to queries, while others are involved in specific ranking needs.

Google algorithms: a general look at the features

Of course, search engine formulas are secret and meant to remain so, but we can still try to describe (a little) how the algorithms work, which at least in broad strokes follow fairly standardized processes.

For example, the top priority is to respond promptly to the user’s needs, making their experience smooth and positive. Google has managed to become the most popular search engine on the planet thanks to complex algorithms that improve the search process, using sophisticated techniques to provide users with the information they are looking for in a matter of moments.

Search engine algorithms typically analyze hundreds of factors (specifically, we have seen how Google uses more than 200 ranking signals to determine which results to provide and in what order), including content relevance, keyword usage, page performance, mobile optimization, backlinks and query context, which are weighted globally and “transformed” into complex mathematical formulas so that web pages can be ranked quickly.

Despite their effectiveness, Google’s algorithms are not without criticism. One of the main controversies concerns transparency: many webmasters and SEO professionals complain about the lack of detailed information about how exactly these algorithms work. This opacity can make it difficult to understand why a site is being penalized. In addition, there is the issue of results manipulation: some accuse Google of prioritizing its own services in search results, which has led to numerous legal disputes. Finally, there is the issue of privacy, with criticism related to the amount of data Google collects to improve its algorithms.

The influence of artificial intelligence in search algorithms

The introduction of artificial intelligence (AI) into search algorithms has been a momentous turning point for the field. The ability to learn and improve autonomously makes AI-powered algorithms much more sophisticated and accurate than their static predecessors. This advancement has not only improved the accuracy of search results but also profoundly affected the approach of SEO strategies.

For example, Google has introduced several innovations based on artificial intelligence to improve the functioning of its algorithms. One of the most important is RankBrain, announced in 2015, which uses machine learning to better understand the meaning behind search queries rather than relying solely on keywords. This allows it to interpret and respond to complex and novel queries more effectively, improving the relevance of results. Another significant advance was BERT (Bidirectional Encoder Representations from Transformers), introduced in 2019: this AI model allows Google to understand the nuances of natural language bidirectionally, analyzing the contextual relationships between words within a sentence. With BERT, Google took a huge step forward in understanding user intent, especially for longer, conversational queries.

More broadly, the integration of AI into search algorithms has several important implications for SEO. First of all, optimizing content for AI algorithms requires a more holistic and user-centric approach, as Ivano Di Biasi also points out in his latest book SEO for AI. It is no longer just a matter of inserting the right keywords; rather, it is critical to understand and anticipate users’ search intent.

One key area is semantic search, which becomes even more important with AI. Algorithms now better understand context and semantic relationships between terms, which means that content must be written naturally and consistently, rather than focused on keyword stuffing.

Another consideration is the improved personalization of search results. With AI, algorithms can adapt to users’ individual behavior and preferences, delivering increasingly personalized search results. This implies that SEO professionals must consider a variety of contexts and usage scenarios in their content.

Algorithms and SEO: how and why these systems impact organic visibility

Search engine algorithms are thus a collection of formulas that are used to determine the quality and relevance of a web page to the user’s query. In simpler words, search algorithms are responsible for deciding which web pages to show at the top of search results. And so, if we want our content to have visibility on Google and achieve good rankings, we must try to get into the good graces of the algorithms, try to identify which criteria they reward, and instead steer clear of objectively forbidden tactics that risk having the opposite effect.

To achieve good results, therefore, it is not enough to know the tactics and tricks of SEO: to work more strategically and scientifically, it is also advisable to have a basic understanding of how the whole system behind SERPs works. In other words, it is useful to have at least a smattering of how search engine algorithms operate and, therefore, to understand why some tactics work better than others, even if the topic remains partly “shrouded in mystery.”

Indeed, search engine algorithms have a significant impact on SEO and organic traffic, and there is another aspect not to be overlooked: these formulas and recipes are constantly evolving, with frequent and regular changes that serve to improve the user experience and ensure that users always find the best answer to their queries.

It follows, then, that even those working online cannot stop and must continue to keep up with the latest algorithm updates to ensure that SEO strategies remain effective, aiming to keep the quality and relevance of the landing page content high with respect to the needs and intent of the audience and also monitoring other relevant aspects of the pages.

What algorithm updates are and what types exist

Google itself reports that it changes its algorithms hundreds of times each year, between more sensitive interventions and changes that instead have no particular impact on SERPs and site rankings.

In fact, not all algorithm updates have the power to significantly affect the results pages, reshuffling the ranking of sites, because in truth there are different types of algorithmic updates launched by Google.

In particular, we recognize three of them:

  • Broad core updates. Performed several times a year (usually on a quarterly or broader cadence), they adjust the weight given to different ranking factors, acting on a broad scale and at a global level.
  • Specific updates. These are less frequent interventions that address a particular problem or search system, such as the Helpful Content system for helpful content or the Reviews system for reviews.
  • Minor updates. Implemented even on a daily or weekly basis, these interventions usually do not create major visible changes in site performance or analytics; they are often small refinements that improve the search experience without significantly reshuffling the rankings of high-quality sites.

Although, as mentioned several times, Google keeps the details of its search ranking algorithm private, we can imagine that it regularly updates its formulas to improve the quality of search results to reflect changes in user behavior and technological advances. No less important, it is also a way to try to stay one step ahead of unfair actors or spammers seeking visibility on its platform.

How to prevent the negative effects of algorithm updates

And so, search engine algorithms are complex mechanisms that are, moreover, constantly evolving: this means, first of all, that it is a mistake to think of creating evergreen content that will be successful forever, because subsequent updates risk making our strategy obsolete and therefore ineffective.

On the other hand, however, it is also wrong to focus excessively on search engine algorithms, frantically adapting marketing strategies to every small change: it is more useful and appropriate to look at the broader picture, trying to understand the questions and needs of our audience, because only in this way can we reduce our dependence on algorithm changes and control, at least in part, how much they influence our initiatives.

Updates by search engines and their algorithms can in fact be as unpredictable as the weather: the only thing we know for sure is that they will be there, with effects that can be of various kinds. And so, what we can do is to make sure we are prepared and ready at the right time, working on some key aspects on our pages such as those listed below.

  • Quality first

Despite algorithm updates, one thing remains constant: the quality of content on landing pages has a positive effect on conversions. Search engines, particularly Google, pay close attention to page quality and keyword relevance, and this is the first point to ensure on our site.

  • Beyond keywords

Google algorithm updates tend to shift the focus away from keywords and toward more long-tail phrases and nuances: instead of focusing only on the keyword, we need to consider intent, relevance, and context. So we need to figure out how best to answer the questions posed by our audience and design useful content, working on the entire user journey rather than just individual keywords.

  • Staying informed

In some cases, search engines offer advance notice of an upcoming algorithm update-this gives us time to familiarize ourselves with the new factors and adapt accordingly. In addition, our page also reports all the most relevant Google Updates!

  • Keep calm and analyze the data

When search engines change their algorithms there are moments of chaos for us marketers, although all we can do is implement relevant improvements and follow the latest guidelines. If we use Google Analytics, taking note of when an algorithm update occurred can help us explain any out-of-the-ordinary results, for example, just as SEOZoom monitoring helps us understand if there are sudden fluctuations in ranking.

  • React carefully

If a recent algorithm update has hurt our rankings, it is important not to panic. Launching into a mad scramble to adjust strategies and make quick changes in an attempt to stem the bleeding in the rankings could damage organic visibility even further; the best results come from taking the time to study the goals of the new update and to see where our Web site falls short of the new requirements. Most likely, a few simple steps are all it takes to get back on track, although the recovery may take time.

Google’s algorithm is not a formula

We can explore this topic further by taking advantage of Dave Davies’ work in Search Engine Journal, which tries to explain in a simple and accessible way what the Google algorithm is.

Starting with Wikipedia’s definition, an algorithm “is a process or program that solves a class of problems through a finite number of elementary, clear, and unambiguous instructions,” but there is also a more general meaning: “a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.” In either case, the fundamental point to understand is that an algorithm is not a single formula.

For this, Davies uses the example of a menu: as a good Canadian, the expert tells us, he likes to eat the classic “Roast Beef with gravy” accompanied by other ingredients, the preparation of which requires:

  • Roast Beef
  • Horseradish
  • Yorkshire pudding
  • Green beans
  • Mashed potatoes
  • Gravy

The beef roast needs to be seasoned and cooked perfectly, explains Davies, who then adds more details: “the seasoning matched to the roast would be an example of a formula, that is, how much of each thing is needed to produce a product; a second formula would be the amount of time and the temperature at which the roast needs to be cooked, given its weight,” and the same reasoning is repeated for each item on the list.

Thus, “at a very simple level we would have 12 formulas (6 ingredients x 2: one for measurements and the other for cooking time and temperature based on volume), creating an algorithm with the goal of creating one of Dave’s favorite meals,” to which we would add “another formula, to consider the amount of each food I would want on my plate,” he writes.

And that’s without even including “the various formulas and algorithms needed to produce the ingredients themselves, like raising a cow or growing potatoes.”

But that’s just the algorithm tailored to Davies’ preferences: “We have to consider that each person is different and will want different amounts of each ingredient and may want different seasonings.” One can then customize that algorithm to give everyone the ideal result by adding a formula for each person.

So many algorithms, one algorithm

This long parenthesis seems to have taken us away from the theme of our article, but in fact an algorithm and a dining table have many things in common, and to demonstrate this Davies lists some of the main features of a Web site (limiting himself to 6 for direct comparison):

  • URL
  • Content
  • Internal links
  • External links
  • Images
  • Speed

As already seen with the dinner algorithm, “each of these areas can be further subdivided using different formulas and, in fact, different sub-algorithms.” And so, the first step in understanding how the system works is to consider the presence of not just one algorithm, but multiple algorithms.

However, it is important to keep in mind that while there are “many algorithms and countless formulas at play, there is still only one algorithm”: its job is to determine how these others are weighted to produce the final results we see on the SERP, the expert explains.

According to Davies, “it is perfectly legitimate to recognize that at the top there is one kind of algorithm, the one algorithm to rule them all, so to speak (quoting Lord of the Rings), but without forgetting that there are countless other algorithms,” which are generally the ones “that we think about when we’re evaluating how they affect search results.”

Google’s algorithm is a perfect recipe

Returning to the analogy, there are a multitude of different features of a Web site that are evaluated, “just as we have a number of food items that we get on our plate.”

To produce the desired result, “we must have a large number of formulas and sub-algorithms to create each element on the plate and the master algorithm to determine the quantity and placement of each element.”

Similarly, when we use the expression Google algorithm we are actually referring to “a massive collection of algorithms and formulas, each set to perform a specific function and collected together by a master algorithm or, dare I say it, core algorithm to place the results.”
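
We can sketch that idea in a few lines of Python: hypothetical sub-scores, each standing in for the output of a sub-algorithm, combined by invented weights into a single final score used for ordering. Every signal name and number here is made up for illustration.

```python
# Hypothetical sub-scores and weights, invented for illustration: each
# sub-score stands in for the output of a dedicated sub-algorithm, and
# the "master algorithm" here is simply their weighted sum.
WEIGHTS = {"content_relevance": 0.5, "link_authority": 0.3, "speed": 0.2}

def final_score(signals: dict[str, float]) -> float:
    """Combine the sub-algorithms' outputs into one ranking score."""
    return sum(WEIGHTS[name] * value for name, value in signals.items())

candidates = {
    "page1": {"content_relevance": 0.9, "link_authority": 0.4, "speed": 0.7},
    "page2": {"content_relevance": 0.6, "link_authority": 0.9, "speed": 0.9},
}

# Order the candidates as a SERP would: highest combined score first.
ranking = sorted(candidates, key=lambda p: final_score(candidates[p]),
                 reverse=True)
print(ranking)  # -> ['page2', 'page1'] with these invented numbers
```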

Examples of Google’s algorithms

The article then mentions some types of Google algorithms to exemplify the discussion:

  • Algorithms such as Panda, which help Google judge, filter, penalize, and reward content based on specific characteristics (and Panda itself probably included a myriad of other algorithms within it).
  • The Penguin algorithm, which judges links and identifies spam. It certainly requires data from other pre-existing algorithms responsible for evaluating links, and probably from new algorithms charged with recognizing the common characteristics of link spam, so that the larger Penguin algorithm can do its job.
  • Activity- and task-specific algorithms.
  • Organizational algorithms.
  • Algorithms responsible for collecting all the data and putting it into a context that produces the desired result: a SERP that users will find useful.

The role of entities for algorithms

In this context, one element that is receiving increasing attention is the analysis of entities, which we know to be “singular, unique, well-defined and distinguishable things or concepts.”

Returning to the dinner analogy, Davies shows us what the entities involved are: himself, each member of his family and even his household (an entity in its own right), and then again the roast and each ingredient of which it is composed are other singular entities, and so is the Yorkshire pudding and the flour used to make it.

Google sees the world as a collection of entities, and this was a crucial step in the process of improving search algorithms.

Continuing his argument, Davies explains that at his dining table there are “four individual entities that are eating, and a number of entities that are being eaten: classifying us all in this way has many advantages for Google over simply evaluating our activities as a series of words.” Each eating entity can be assigned the entities on its plate (roast beef, horseradish, green beans, mashed potatoes, Yorkshire pudding, and so on).

Google uses this same type of classification to judge a Web site. Each Web page is an entity, just like each person sitting at the dining table.

The global entity representing all of them (the “Davies” household) could be labeled “roast beef dinner,” but each individual entity representing a person, or a page, is different. In this way, Google can easily classify and judge the interconnectedness of Web sites and of the world at large.

The work of algorithms with entities

Fundamentally, search engines are not tasked with judging a single Web site: they must rank them all. Thus, they focus not only on the Davies entity, seen as a roast beef dinner, but also on the neighbors’ entity, the Robinsons, and their stir fry. Now, if an external entity known as a Moocher wants to know where to eat, the options can be ranked for the Moochers based on their preferences or queries.

All this is important for understanding search algorithms and how entities work: rather than just seeing what a website is about as a whole, Google understands “that my beef roast and beef stir fry are related and, in fact, come from the same main entity.”

And this is also decisive in understanding what it takes to rank, as the article further explains. If Google understands that one web page is about roast beef, while another page that “links to it is about beef dip (typical Canadian sandwich made from leftover roast beef),” it is absolutely important that “Google knows that roast beef and beef dip are derived from the same main entity.”

Applications to determine page relevance

In this way, algorithms “can assign relevance to this link based on the connection of these entities.”

Before the idea of entities entered Search, engines could only assign relevance based on parameters such as word density, word proximity and other easily interpreted, and easily manipulated, elements.

Entities are much more difficult to manipulate: “either a page is about an entity, or it is not relevant.”

By crawling the Web and mapping the common ways in which entities relate, search engines can predict which relationships should carry the most weight.

Understanding the basics of operation

Ultimately, it is important to understand how algorithms work to add context to what we are experiencing or reading.

When Google announces an algorithm update (as in the case of the Google Page Experience), what is being updated is likely a small piece of a very large puzzle; in contrast, broad core updates are significant and far-reaching changes to Google’s core engine, a kind of periodic trimming to ensure that results are always effective.

Getting into this perspective can help to interpret “what aspects of a site or the world are being adjusted in an update and how that adjustment fits into the engine’s big picture goal.”

In this sense, it is crucial to understand how much entities matter in search algorithms today: they have enormous relevance, which is bound to grow, and they in turn rely on algorithms that identify them and recognize the relationships between them.

To name a couple of advantages, “knowing this mechanism is useful for us not only to understand which content is valuable (i.e., that which is closest to the entities we are writing about), but also to know which links are likely to be judged most favorably and heavily.”

It’s all about the search intent

Search algorithms function “as a vast collection of other algorithms and formulas, each with its own purposes and tasks, to produce results that a user will be satisfied with,” Davies writes in the closing lines.

This vast system includes algorithms specifically designed to understand entities and how entities relate to each other in order to provide relevance and context to other algorithms.

In addition, according to the expert, “there are algorithms to monitor just this aspect of the results and make changes where the ranking pages are deemed not to meet the search intent based on how the users themselves interact,” i.e., algorithms that analyze search behavior, as we noted when talking about the search journey.

Because, as even Mountain View’s public voices often remind us, Google’s job is to provide accurate answers to users.
