“In my opinion, search engines like Google or Bing – as we know them today – are already dead, and are still being used only because there is not yet a high-impact consumer alternative that gives everyone access to AI capabilities.” This is one of the introductory, and strongest, passages of “SEO FOR AI. Let’s invent the SEO of the future,” the new book by Ivano Di Biasi, just published by Palladino Editore, with a foreword by Alessio Pomaro.
We publish here a short excerpt from the book, in which our CEO explains his vision of the future of search engines, and therefore of SEO, in light of the acceleration driven by Generative Artificial Intelligence systems.
How SEO is changing with AI, and how the landscape is shifting
“Already today, with new models such as GPT-4o, AIs browse the web looking for documents to analyze in real time, to provide answers they do not already know.
Current AI models use search engine results (Bing, in particular) to find the information they need. There is already internal functionality that interprets our questions, translates them into keywords, runs the search, and processes the results in real time to give us a current and truthful answer.
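The loop described above – question in, keywords out, search, then synthesis over the retrieved documents – can be sketched in heavily simplified form. All function bodies here are illustrative stubs (not any vendor’s real API): a real system would use the language model itself to rewrite the query, call an actual search API such as Bing’s, and feed the retrieved documents back to the model as context.

```python
# Minimal sketch of the "search-augmented answering" loop described above.
# Every function body is a stand-in stub, not a real vendor API.

def question_to_keywords(question: str) -> str:
    """Interpret the user's question and reduce it to search keywords.
    (A real system would use the language model itself for this step.)"""
    stopwords = {"what", "is", "the", "a", "an", "of", "in", "how", "do", "i"}
    return " ".join(w for w in question.lower().split() if w not in stopwords)

def web_search(keywords: str) -> list[str]:
    """Stand-in for a real search API call (e.g. to Bing)."""
    return [f"Document matching '{keywords}'"]  # stubbed result list

def answer(question: str) -> str:
    """Question -> keywords -> search -> synthesize from fresh documents."""
    keywords = question_to_keywords(question)
    documents = web_search(keywords)
    # A real system would pass `documents` back into the model as context
    # and generate the final answer from them.
    return f"Answer based on {len(documents)} fresh document(s)."

print(answer("What is the future of SEO?"))
```

The point of the sketch is only the shape of the pipeline: the model never answers from its frozen training data alone, but from documents fetched at question time.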
[…]
There are many players in this very rapid process of change, and each has its own interests to protect.
Search users will have no doubt: they will use a voice or text assistant to find solutions to any of their problems. Many people already do, and for them the new technology brings only benefits.
More than 90 percent of searches happen on smartphones: why open the browser, go to Google, type in keywords, choose a result and read it – without knowing whether it will answer our question – when we could simply ask our smartphone by voice and get an immediate answer?
What’s more, without having to browse websites overcrowded with invasive advertising.
AI service providers will certainly have to contend with legal issues, and I believe that every website will have a way to deny AI models the right to learn from its content, just as sites can already do with Google’s or Perplexity’s crawlers via a Disallow rule in the robots.txt file, although I see this as a counterproductive choice.
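As an illustration, a robots.txt along these lines asks AI crawlers to stay away from a site while leaving ordinary crawling untouched. GPTBot and PerplexityBot are the user-agent tokens published by OpenAI and Perplexity respectively; note that robots.txt compliance is voluntary on the crawler’s side.

```
# Ask OpenAI's and Perplexity's crawlers not to fetch any page
User-agent: GPTBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

# All other crawlers may continue as usual
User-agent: *
Allow: /
```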
The various AI chatbot