SEO journalism: how information is changing in the age of AI
Inform, report, document: journalism was born to provide tools for understanding reality. But what happens when it is not only people who read, but also machines? And when it is the machines themselves that decide who gets read, and how content is shown? Today it is no longer enough to write for the public: writers have to deal with new interlocutors, algorithms and artificial intelligence, which select, synthesize, and decide what content to make visible.

The entire paradigm of visibility has changed. SEO, artificial intelligence, and advanced language models are no longer external elements; they have become the filters through which every attempt to inform must pass, or be blocked. For writers, this means radically rethinking their work, going beyond the classic “well-constructed piece” to make it readable for search engines, understandable for AI models, and suitable for new formats and languages.

The challenge is twofold: to regain the attention of increasingly distracted readers and to earn the trust of increasingly demanding algorithms. Two different audiences, but both decisive. The survival of journalism does not depend solely on the quality of content: it depends on the ability to make it visible, accessible, and relevant in a changing ecosystem. And there are no shortcuts: it takes method, it takes strategy, and it takes a deep understanding of the new rules of the game.
SEO and journalism: two worlds that chase each other (and collide)
For years, SEO and journalism were considered two disciplines with different objectives. Journalism has the task of informing, reporting on the present, and offering interpretations of reality as it happens. SEO, on the other hand, began as a set of techniques for optimizing digital content, with the task of making it visible through search engines, starting from an analysis of questions already asked by users.
In short, two parallel worlds destined to meet but unable to truly understand each other for a long time. On the one hand, journalism, anchored in the tradition of rigor, reporting, and in-depth analysis. On the other, SEO, which follows quantitative logic, analyzes the past, and moves along the paths of already formulated queries.
The distance was accentuated by mutual distrust: journalism saw SEO as an invasion of technicalities, a threat to the autonomy of writing; SEO considered journalism too slow to adapt to the needs of a web governed by algorithmic search.
When these two universes began to touch, the result was not a harmonious fusion: it was a collision. The first newsrooms to venture into the world of SEO often did so with ill-concealed distrust and superficiality. The logic was simple: clicks were needed, traffic was needed, and to get them, all you had to do was stick keywords everywhere. This gave rise to a mechanical approach, lacking in strategy and often detrimental to the very quality of the information. It was a cultural misunderstanding that left its mark.
Yet, over time, the need to find a balance has become apparent—not through sudden enlightenment, but out of necessity. Because in the digital ecosystem, quality of information and the ability to reach audiences can no longer travel separately. With the gradual transformation of news consumption habits and the growth of the digital ecosystem, journalism has also had to ask itself how to adapt without losing its essence. And even if the integration between SEO and journalism is still imperfect today, there are concrete signs of a cultural change underway, further heightened by the latest technological developments and the artificial intelligence hurricane.
The difficult entry of SEO into newsrooms
It was around the early 2000s when the first optimization practices arrived in Italian newsrooms and, as mentioned, were greeted as a technical nuisance rather than an editorial opportunity. SEO was seen as a contraption, a set of abstruse rules, often imposed by external consultants—self-styled “SEO specialists” or something similar—who applied mechanical criteria without any real understanding of journalistic logic.
The result was a period dominated by glaring errors. Keyword stuffing, the obsessive accumulation of keywords within articles, weighed down texts and undermined their readability. Headlines were designed solely to satisfy algorithms, with broken sentences and unnatural repetitions. Over-optimization ran to extremes: stop words were eliminated, vocabulary was impoverished, and syntactic fluidity was sacrificed in the name of a supposed SEO purity.
But perhaps the most serious mistake was reducing SEO to a simple technical exercise, entrusted to marginal figures or outsourced to external consultants, without strategic integration with editorial work. No thought was given to responding to users’ search needs or building a solid editorial positioning. The only goal was to bend articles to formulas that promised immediate visibility, chasing numbers, clicks, and timely traffic metrics. The result was predictable: a loss of journalistic quality, disaffected readers, and eroded credibility.
Towards a more mature integration
Today, after years of trial and error, some newsrooms have begun to change their approach. Not all of them, not everywhere, and not always systematically. But the signs of cultural growth are real.
According to the 2025 Report of the Order of Journalists, a new awareness is emerging: SEO is no longer seen as just a technical frill, but as a natural component of journalistic writing for the web. It is not a matter of bending content to algorithms, but of designing articles that are clear, relevant, and useful for people and search engines.
The most advanced newsrooms have understood that SEO and quality are not antithetical. On the contrary, well-done SEO—based on real query research, logical text structuring, and natural use of keywords—also improves the reader’s experience. And it is this reader, today as yesterday, who is the true target of journalism.
Another sign of maturity is the gradual extension of SEO beyond the individual article: there is increasing focus on content architecture, topic cluster strategies, and well-thought-out coverage of topics. Writing is not just about attracting random clicks, but about building authority, ongoing visibility, and trust.
The road ahead is still long, of course. Many newsrooms, especially smaller or less digitized ones, are struggling to break out of old patterns. But change has begun. And it will be inevitable, because in an information ecosystem governed by algorithms and artificial intelligence, those who cannot make themselves visible are destined to disappear.
Artificial intelligence and journalism: challenge or alliance?
As a new awareness of the value of SEO finally grew in newsrooms, another element began to emerge, capable of calling everything into question: artificial intelligence. No longer just an external tool, but a real player in the information process, destined to profoundly change the way news is written, distributed, and read.
Artificial intelligence—broadly understood as the set of automation, analysis, and prediction technologies—has long played a role in information flow management: automatic systems select sources, aggregate data, and monitor events in real time. But with the emergence of generative AI — capable of writing texts, summarizing information, and responding to complex queries — the very nature of journalism has entered a phase of transformation.
Writing is no longer the exclusive prerogative of humans. Advanced language models such as those underlying AI Overview or automatic summarization platforms are influencing not only the distribution of content, but also its production and selection. AI is changing audience expectations, shortening consumption times, and imposing new standards of accessibility and immediacy. Journalism, as a critical watchdog of reality, risks becoming a cog in an automated system, where the value of human verification and mediation must be continually reaffirmed.
Journalism therefore finds itself in a new situation: it is not enough to produce quality information, it must also be made interpretable by the machines that now filter access to readers. This raises profound new questions: can the speed of machines be reconciled with the slowness required to verify facts? Can editorial independence be maintained when algorithms begin to mediate content production as well?
In this scenario, artificial intelligence is neither an automatic ally nor an enemy to be fought. It is a factor of structural change, requiring us to rethink methods and goals while safeguarding the fundamental functions of journalism: verification, contextualization, and responsibility. The challenge is not to resist change, but to govern it.
A traditionalist sector faced with innovation
Journalism’s distrust of technological innovation is not a recent phenomenon. Just a few years ago, the typewriter was still used for the professional journalism exam in Italy—a legacy that speaks volumes about how the profession has experienced change more as a threat than as an opportunity.
This slowness has never been solely technical, but cultural. Journalism has historically defended its autonomy through its rejection of technological fads. From initial mistrust of the web in the 1990s, to the late adoption of social networks as a distribution channel, to reluctance to use data analysis and automation tools, every technological leap has been experienced as an erosion of the human function of the profession.
With the arrival of artificial intelligence, this tension has become more acute. Algorithms do not just distribute or suggest content: they now create, synthesize, and personalize it. Tools such as Google’s AI Overview or conversational chatbots are no longer simple accessories, but have become intermediaries of information itself. And this frightens those who have built their professional identity on authority, independent investigation, and human mediation of facts.
The fear is clear: losing control over the narrative, seeing human judgment replaced by an automatic assembly of data. Yet, rejecting technology outright would be a mistake perhaps more serious than adopting it uncritically. The game is being played on the ability to govern AI, to bend it to the goals of good journalism, not to suffer it or reject it on principle.
AI as a lever to improve editorial work
Despite fears, artificial intelligence can become a powerful tool for journalism, provided it is used wisely. Not everything should be seen as a threat: in many cases, AI can free up resources, enhance editorial capabilities, and improve workflows.
Today, there are systems capable of:
- Automating the production of news based on structured data (such as sports results or financial reports)
- Suggesting headlines and summaries optimized for online visibility
- Analyzing large volumes of documents for patterns and anomalies
- Supporting fact-checking and source verification activities
- Personalizing content based on the real interests of the audience
The real advantage is not in automation for its own sake, but in the possibility of freeing journalists from repetitive tasks, allowing them to focus on what makes a difference: in-depth reporting, analysis, and original storytelling.
But every opportunity comes with a risk. Excessive or superficial use of AI can lead to standardization of content, reducing journalism to an assembly line of prepackaged news. It can encourage intellectual laziness, lower the quality of information production, and undermine the very authority of the media.
There is also a significant ethical and legal issue: who is responsible for content generated or synthesized by an algorithm? Who guarantees accuracy, verification, and transparency?
According to the aforementioned 2025 Report by the Order of Journalists, the right path is not to reject AI, but to embrace it as a support tool, while maintaining human supervision and responsibility at all times. It is the journalist, not the algorithm, who must remain at the center of the editorial process. In other words, AI can improve editorial work. But it cannot replace journalism.
The publishers’ perspective: declining traffic and sustainability at risk
The relationship with technology is not only complicated within newsrooms, but also in the publishers’ offices, because the centrality of news sites and information blogs has been under pressure for some time. With the introduction of Google News, many publishers saw their direct traffic erode as news items were aggregated and previewed, often without the need to visit the original source. More recently, phenomena such as Google Discover have made traffic even more unstable and volatile: large volumes of visits, but unpredictable, difficult to control, and with little loyalty.
The emergence of generative artificial intelligence has only accelerated a process that was already underway, with prospects that risk redefining the dynamics of online visibility even more radically.
Tools such as Google’s AI Overview and AI Mode summarize search results directly on the results page, further reducing the number of clicks to the source sites. Users, increasingly accustomed to getting immediate and complete answers, stop at the information offered in the SERP, without continuing to explore further.
In this scenario, the very concept of an editorial website or news blog risks losing its traditional function as a central reference point in the digital presence of a publication or author. The question is inevitable: does it still make sense to invest in an editorial website? Or is it time to rethink the way information is built, distributed, and made visible online?
The answer is not simple, but one thing is clear: the website alone is no longer enough. And those who provide information must radically rethink their role in the new digital ecosystem.
Google News: the first paradigm shift in distribution
When it was launched in 2002, Google News introduced a model of online news distribution that marked a turning point: a vertical aggregator, capable of selecting and displaying previews of articles from multiple publications, facilitating user access to information but, at the same time, reducing direct traffic to the original sites, because often the user’s needs were met within the platform itself.
This dynamic redefined the concept of editorial visibility, shifting control of distribution from content producers to technology platforms.
The tension between publishers and Google was evident from the outset. Copiepresse in Belgium (2006) and Agence France-Presse in France (2005-2007) were among the first to challenge the unpaid use of news.
In Italy, in 2009, the FIEG reported Google News to the Antitrust Authority for abuse of a dominant position, and it was not until 2012 that an agreement was reached to temporarily end the conflict, giving publishers the option to select or remove their content and increasing transparency in advertising revenue sharing.
The real rifts came in Germany and Spain. In 2013, Berlin passed a law on related rights (Leistungsschutzrecht) that required platforms such as Google to pay for the use of article excerpts. In response, Google limited the visibility of publishers who did not expressly waive compensation. The controversial law was then declared unenforceable by the European Court of Justice in 2019, because the draft had not been notified to the EU Commission.
In Spain, the situation was even more drastic: the government approved the Canon Aede (2014), imposing the mandatory payment of a royalty even for short quotations. Google preferred to shut down Google News completely in the country in December 2014. The effects were immediate: traffic to news sites, especially those of small publishers, plummeted by up to 20%, and the publishers themselves, a few months later, asked the government to review the legislation to limit the damage.
Meanwhile, France was also trying to regulate the relationship between publishers and platforms and, after lengthy negotiations, the European ancillary right for press publishers was introduced, establishing an obligation to pay compensation for the use of content. The controversy reignited in November 2024, when Google launched a new test in eight European countries, including Italy, Spain, and the Netherlands, excluding editorial content from the European Union from Google News, Discover, and even Search results for 1% of users. The initiative was presented as an experiment to measure the impact of news on user experience and traffic to news sites, but it sparked strong reactions in the industry. After a few weeks, the Paris Commercial Court ordered Google to suspend the experiment, imposing a penalty of €900,000 per day in case of non-compliance, for violating the commitments Google had made under the European Copyright Directive.
More than an isolated incident, this case highlights a structural problem: the growing dependence of publishers on traffic generated by Google. In many countries, including Italy, a significant portion of news outlets’ online visibility still comes through News, Discover, and Search. Temporary exclusion from results, even for a fraction of the audience, shows how fragile this model based on algorithmic intermediation is.
We cannot ignore the facts (as revealed by the US company itself): every month, over 24 billion clicks are directed to news sites around the world, generating an estimated average value of 7 to 9 euro cents per click in European markets. For many publications, especially local or independent ones, these flows represent a vital component of advertising revenue and online presence.
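Taken at face value, those figures imply a sizeable monthly sum. A quick back-of-envelope calculation, purely illustrative and using only the ranges quoted above, makes the order of magnitude concrete:

```python
# Back-of-envelope estimate of the monthly value of Google-referred news clicks,
# using only the figures quoted above (illustrative, not an official calculation).
clicks_per_month = 24_000_000_000       # "over 24 billion clicks" to news sites
value_per_click_eur = (0.07, 0.09)      # estimated 7-9 euro cents per click

low = clicks_per_month * value_per_click_eur[0]
high = clicks_per_month * value_per_click_eur[1]

print(f"Estimated monthly value: {low / 1e9:.2f}-{high / 1e9:.2f} billion EUR")
# With these inputs: roughly 1.68 to 2.16 billion EUR per month
```

Even as a rough sketch, this shows why temporary exclusion from Google's surfaces is an existential question for publishers rather than a technical detail.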
The problem, which emerged in the early 2000s, is structural: publishers have become increasingly dependent on the traffic volumes generated by Google. The closure of Google News in Spain and the restriction of snippets in Germany have shown how fragile the balance can be. And the 2024 test seems to confirm that, without the support of platforms, the economic autonomy of digital publishing is at risk.
The challenge for publishers is no longer just to obtain compensation for the content used, but to rethink the entire distribution model: diversify sources of visibility, reduce dependence on a single player, and build editorial strategies capable of adapting to an ecosystem where traffic can no longer be taken for granted.
Google’s AI solutions and the new blow to visibility for publishers
Fast forward to today, and the situation looks even worse: after Google I/O 2025, the company released AI Mode in the US, an evolution of the earlier Search Generative Experience that pushes even harder on the accelerator. It is a mode in which users receive AI-generated answers directly on the search page, with (almost) no need to consult external links.
Unlike AI Overview — which provides short summaries with links to sources — AI Mode aims to offer detailed answers, reasoning, and in-depth comparisons, making the interaction similar to an intelligent conversation.
For news sites, this is a game-changer. Historically, search engines have provided a significant portion of organic traffic to publishers: according to the Wall Street Journal, 40% of traffic to major news sites comes from Google Search. But with AI Mode, Google aims to answer users’ questions directly, drastically reducing the need to click on external links. Estimates suggest a possible loss of between 20% and 40% of traffic for many publishers.
Data collected from various sources confirm a worrying trend: already today, platforms such as Wikipedia, YouTube, and Reddit are seeing a drop in traffic of between 2% and 6%, while AI-based services such as ChatGPT are growing—with a 15% increase in visits in April 2025, exceeding 5 billion hits and becoming the fifth most visited site in the world.
The ongoing transformation is not only affecting large publishers. The most structured groups — such as News Corp, Axel Springer, and Financial Times — have begun to sign multi-million dollar deals with companies such as OpenAI, securing new channels of monetization: $250 million over five years for News Corp, $30 million over three years for Axel Springer, and between $5 million and $10 million per year for the Financial Times. In contrast, small publications and independent editorial projects remain excluded from these negotiations, exposed to the risk of digital extinction in an ecosystem dominated by a few large players.
The problems, however, are not limited to the loss of traffic. According to an analysis by the Columbia Journalism Review, the artificial intelligence chatbots currently in use—from ChatGPT to Gemini and Claude, from Perplexity to Grok—make significant errors: in 60% of cases, the answers are wrong or lack correct sources. Some models, such as Grok 3, generate up to 154 incorrect URLs out of 200 attempts. And often, they ignore robots.txt protocols, violating the instructions of sites that prohibit indexing or use of their content.
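The robots.txt protocol mentioned here is simply a plain-text file at a site’s root that lists crawler directives. A publisher wishing to opt out of AI crawling while remaining indexable typically adds entries like the following sketch; the user-agent tokens shown are the publicly documented names of real crawlers at the time of writing, but each vendor’s documentation should be checked before relying on them:

```text
# robots.txt - example directives a publisher might use to opt out of AI crawlers
# (crawler names current at the time of writing; verify against vendor docs)

User-agent: GPTBot            # OpenAI's training crawler
Disallow: /

User-agent: Google-Extended   # opt-out token for Google's AI training
Disallow: /

User-agent: CCBot             # Common Crawl
Disallow: /

# Regular search indexing remains allowed:
User-agent: *
Allow: /
```

The point raised by the Columbia Journalism Review analysis is precisely that these directives are advisory: a crawler that ignores them faces no technical barrier, only reputational and legal exposure.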
This raises other crucial questions concerning, on the one hand, the quality and reliability of AI-generated responses and, on the other, the sustainability of an information system in which original content risks being cannibalized without proper recognition.
The biggest risk is the progressive standardization of information: fewer visitors mean less revenue, less diverse content, and less pluralism. This is a downward spiral that could compromise not only the digital publishing economy but also the overall quality of online public debate.
News sites and blogs: why they remain essential
For years, websites were a certainty: the reference point for all publishing activities, the gravitational center of a publication’s digital presence. Today, that certainty is wavering. New search tools, such as Google’s AI Overview and AI Mode, summarize answers without requiring clicks, and SEO seems to have lost its “meaning” if we keep judging it by the old logic of visibility on the classic SERP alone, rather than in its GEO (Generative Engine Optimization) evolution. In short, the very idea of editorial websites and informative blogs is under pressure, and we are led to wonder whether it still makes sense to have a website at all.
The answers, however, are reassuring, because despite the changes, websites remain an essential asset. Not only for content ownership—which is never truly guaranteed on social media and third-party platforms—but because they are still the main tool for building trust and authority.
A well-structured website is an identity platform: a place where the editorial brand can tell its story consistently, without being subject to third-party algorithms. It is also a space for in-depth analysis, allowing for the creation of long, detailed content designed for a more demanding and knowledgeable audience.
According to recent data from Audiweb and analysis by SparkToro, organic traffic from search engines still dominates over generative chatbots. Google generates over 1,600 billion visits per month, while the leading AI chatbots stop at around 47 billion — a huge difference. Even considering the expansion of AI Overviews, the numbers remain clear: 97% of interactions with Google still occur through classic search, with only 3% attributable to AI interactions. As for the decline in visibility caused by Google’s new AI features, data shows that the average CTR of the top organic positions has fallen by around 15-20% for informational queries most covered by overviews — especially for publishers, news sites, and in-depth blogs. But this is only a partial decline: traffic still exists, albeit redistributed and more competitive.
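The gap between those two volumes is easy to sanity-check. Using only the round figures quoted above (illustrative, not an independent measurement), chatbot visits amount to just a few percent of Google’s volume, consistent with the roughly 97/3 split cited:

```python
# Sanity check of the search-vs-chatbot gap using the round figures above
# (illustrative only; the underlying estimates come from the cited sources).
google_visits = 1_600_000_000_000   # ~1,600 billion visits per month
chatbot_visits = 47_000_000_000     # ~47 billion visits across leading AI chatbots

total = google_visits + chatbot_visits
chatbot_share = chatbot_visits / total

print(f"AI chatbot share of combined volume: {chatbot_share:.1%}")
# With these inputs: about 2.9%, in line with the ~3% quoted above
```

The calculation does not predict where the trend goes, but it grounds the claim that, for now, classic search still dwarfs conversational AI as a traffic source.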
This means that the future is not without websites, but without static websites. It is no longer enough to “be there”: you need to build a website that is designed to be integrable, readable by AI, and optimized to maintain visibility even in a fragmented and competitive environment.
Strategies for a more modern and sustainable publishing model
Dependence on large distribution channels is not just a problem of website traffic: it is a question of editorial autonomy and economic sustainability. If algorithm changes or the introduction of new AI-based search methods can drastically reduce the visibility of content, it is essential for editorial websites and news blogs to build strategies that reduce exposure to risk, with a focus on diversification, adaptation, and consolidation.
Diversifying traffic sources is the first step. It is not so much a question of monitoring other search engines besides Google — such as Bing, DuckDuckGo or, above all, the new AI-based engines — but of thinking about a distributed presence on emerging platforms: TikTok, YouTube, podcasts, newsletters. Not all content is suitable for every format, but adapting language and structure to different channels is becoming a crucial skill.
At the same time, it is necessary to promote direct access to content. Building a stable relationship with the public through proprietary channels — dedicated apps, free newsletters, or online communities — allows you to maintain control over the relationship with your readers, without external mediation. The growth of newsletter subscriptions, a model practiced by platforms such as Substack but also by many traditional publications, shows how loyalty can become a concrete resource.
Many publishers are also rethinking their business model, focusing on paywall or mixed subscription formulas. Publications such as Le Monde and Il Post have built a clear and recognizable editorial offering around premium content, increasing perceived value and user loyalty.
For those operating in smaller niches, or for independent publications, there is growing interest in membership and community models, which transform readers into active supporters through exclusive benefits programs, online events, and early access to content.
Building a more solid publishing future does not come through technical shortcuts, but through the ability to diversify channels, adapt content, and strengthen relationships with readers. Only in this way will it be possible to remain relevant in an information landscape that is changing more rapidly than we have ever experienced before.
Mistakes to avoid in a multichannel approach
Faced with the crisis in traditional traffic, many publishers and bloggers have reacted by chasing their audience everywhere: opening social media profiles on every platform, multiplying formats, and chasing every new digital trend. The result is often fragmentation: many channels, little identity.
The mistake is to think that being everywhere is enough to be visible. In reality, publishing success comes from a coherent multichannel strategy: occupying a few spaces but in a recognizable way, adapting content to the languages of different platforms without distorting the brand.
On a more practical level, another crucial element in ensuring visibility and sustainability for editorial sites is the ability to build a balanced editorial plan: it is no longer useful to focus exclusively on news or trends, but it is advisable to know (and master) the different types of content. Namely:
- Evergreen: articles designed to maintain interest and traffic over time, such as guides, in-depth analyses, and thematic FAQs.
- Hype: content related to current events or hot news, capable of generating immediate traffic spikes.
- Seasonal: articles that tap into recurring themes throughout the year, such as holidays, sporting events, and anniversaries.
An effective strategy alternates these three types of content, ensuring both traffic stability (thanks to evergreen content) and the ability to attract new users at key moments (thanks to hype and seasonal articles).
And let’s not forget that experience matters. According to the aforementioned 2025 Report by the Order of Journalists, a fragmented and inconsistent digital presence weakens the editorial brand and undermines public trust. On the contrary, those who manage to build a strong and consistent identity, including through their website, are rewarded by both users and algorithms.
The real challenge is therefore to build a distributed editorial system, in which the website is the hub, but not the only point of contact. A digital presence that combines website, social media, newsletters, videos, and podcasts in an integrated and synergistic ecosystem.
The editorial website is not dead. It is just changing its skin. And those who know how to renew, adapt, and integrate it will still play a central role in tomorrow’s information landscape.
SEO for journalism: established techniques and new AI challenges
Returning to the starting point of our article, journalism today can no longer ignore SEO—although a new awareness is needed to keep pace not only with traditional search engines, but also with voice assistants, conversational search, and new generative models.
Strategic writing, as practiced by copywriters, is not a mere accessory skill: it is an integral part of the online writer’s craft. But the logic has changed. Whereas once the obsessive search for exact keywords dominated, today positioning is based on entities, structured data, search intent, and the overall quality of information. And while journalists have been adapting to these new requirements, artificial intelligence has added a further level of complexity: it is no longer just Google we are talking to, but also its generative and synthetic models.
Clear structure, the use of semantic markup, and the production of synthesizable and reliable content are becoming central elements for remaining visible in an ecosystem where direct access to sites can no longer be taken for granted.
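“Semantic markup” here usually means structured data such as schema.org annotations embedded in the page. A minimal sketch of NewsArticle markup in JSON-LD might look like the following; all values are placeholders, while the property names come from the schema.org vocabulary:

```json
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example article headline",
  "datePublished": "2025-06-01T08:00:00+02:00",
  "dateModified": "2025-06-02T10:30:00+02:00",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "url": "https://example.com/authors/jane-doe"
  },
  "publisher": {
    "@type": "Organization",
    "name": "Example News",
    "logo": {
      "@type": "ImageObject",
      "url": "https://example.com/logo.png"
    }
  },
  "image": ["https://example.com/photo.jpg"],
  "mainEntityOfPage": "https://example.com/news/example-article"
}
```

Embedded in a `<script type="application/ld+json">` tag, markup like this lets search engines and AI systems identify the headline, dates, author, and publisher without having to infer them from the prose.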
How to apply traditional SEO to journalistic content
Writing for the web has never been easy—despite common misconceptions—and writing well for the web has become an even more complex balancing act today. Those involved in information must contend with an ecosystem that has seen a proliferation of filters between producers and readers: search engines, algorithms, artificial intelligence. Yet the challenge remains essentially the same: to reach readers with useful, relevant, and credible content.
SEO is not (and in its “pure” form never has been) a technical formula for tricking search engines, but a discipline that supports the creation of texts capable of standing out in a landscape dominated by competition and information overload.
And today, doing SEO in journalism means going back to basics, but with method.
Let’s start with the most obvious aspect: it is no longer enough to fill articles with keywords repeated ad nauseam. You need to speak the language of users and search engines at the same time, without distorting your writing. It is a question of mindset rather than technique.
As Google’s public voice Danny Sullivan pointed out years ago, “good SEO is writing in a language that meets the language of your readers.” This principle still holds true: effective optimization comes from content designed to respond precisely to real needs, not to chase an algorithm’s approval.
The work begins with strategic keyword research: not a blind list of keywords, but a reasoned investigation into what users are really looking for and how they talk.
After the research comes the choice of focus: every article must have a clear direction, a clear answer to a latent question in the audience.
Writing for the web today also means knowing how to read the signals coming from users, interpreting their questions, and anticipating their information needs, as suggested by the principles of Readers Driven Journalism. Unlike traditional editorial models, where the agenda was set from above, Readers Driven Journalism starts by listening: it builds content by analyzing what the audience is looking for, asking, and discussing. It is journalism oriented toward real demand, not pre-packaged supply.
This does not mean chasing the latest fad or bowing to the logic of easy clicks. Rather, it means adopting a reader service approach, creating content that has a real impact on people, based on behavioral data and concrete interests. In other words: returning to the original function of journalism, updating it with the tools of the present.
The writing phase must then follow principles of clarity and consistency. The inverted pyramid, typical of classic Anglo-Saxon journalism, remains a golden rule on the web: open with the key information and continue with secondary details and insights. Structuring the text into clear paragraphs with effective subheadings helps both the reader and Google navigate the content, and it matches the way people actually read online: with limited attention, often focusing mainly or only on the beginning of an article, which makes it essential to convey the core message immediately. From an SEO perspective, this approach also offers a concrete advantage: it increases the likelihood that the first few lines of the article (those most visible to Google’s algorithms and AI models) will contain the strategic keywords and the informational core of the page. It is a technique that combines human readability and algorithmic effectiveness without sacrificing editorial quality.
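As a rough illustration of that last point, a small editorial helper could verify whether an article’s opening lines actually carry its strategic keywords. This is a hypothetical sketch (the function name, the 60-word “lead” threshold, and the sample text are our own assumptions, not a SEOZoom feature):

```python
def keywords_in_lead(text: str, keywords: list[str], lead_words: int = 60) -> dict[str, bool]:
    """Report which strategic keywords appear in the article's opening (its 'lead')."""
    lead = " ".join(text.lower().split()[:lead_words])
    return {kw: kw.lower() in lead for kw in keywords}

# Invented sample article following the inverted pyramid.
article = (
    "SEO journalism is changing how newsrooms write for the web. "
    "Search engines and AI models now reward articles whose key information "
    "comes first, following the inverted pyramid of classic reporting. "
    "Further down, the piece can expand with background and analysis."
)

print(keywords_in_lead(article, ["SEO journalism", "inverted pyramid"]))
# both strategic keywords fall within the first 60 words of this sample
```

A check like this simply operationalizes the rule above: if a focus keyword only shows up deep in the piece, the part of the text most visible to crawlers and AI summarizers is not doing its job.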
An often overlooked element is the dynamism of an article. Informative content must live and breathe, adapting to the evolution of searches and interests: even a news item can become lasting, ranked content if you understand how to intercept a growing search trend.
Finally, internal linking remains crucial: build a network of links between related articles and make intelligent use of authoritative external sources.
SEOZoom tools for SEO journalism
All this can sound confusing and daunting, given the amount of work involved. But SEOZoom’s tools support every stage of the online writing process, including from a journalistic perspective, and some of them deserve a place in the daily toolbox of anyone who manages (or writes for) editorial websites.
The first reference is the Editorial Assistant, an increasingly advanced writing support tool: it analyzes the SEO performance of the text in real time compared to competitors, suggests strategic keywords and topics, and integrates generative AI tools to refine the quality of the article. Thanks to the combination of SEO data and artificial intelligence, it is possible to align content with search intent, enrich it with relevant elements such as FAQs and insights, and build more effective and competitive texts that better respond to the needs of users and search engines.
Tools such as Suggest Article Keywords or the more advanced Investigate Industry allow you to quickly map the semantic territory, identifying the most suitable queries for building targeted content—always remembering to insert keywords organically, respecting the journalistic tone and avoiding forced language.
Through the Question Explorer, you can understand what questions users are asking about a topic and build articles that directly answer those questions, increasing the chances of being considered relevant by Google and — today — by AI Overview.
The necessary change in approach
SEO and journalism are not conflicting worlds. When done well, optimization and good writing reinforce each other. And they help content not only to exist, but to last.
But a different approach is needed, starting with a fundamental aspect: editorial work can no longer focus solely on publishing articles that are “left to die” the next day, following the incessant flow of news. It is necessary to develop a broader vision, capable of observing query behavior even over time. This is where the logic of the rearview mirror comes into play, an effective metaphor for describing the work of retrospective analysis of content performance.
Looking back helps us understand whether a seemingly outdated news item is generating new search trends, whether an isolated article can be transformed into a more solid thematic cluster, or whether it makes sense to build a category hub that organizes and amplifies the visibility of related content. It also means monitoring whether user queries are evolving: if new keywords or specific phrases emerge that the original piece did not pick up on, it is time to reorient the title, update the text, and adapt the content to the new search interest.
But it is not just a matter of individual articles. Advanced and conscious content management also helps to optimize the entire architecture of the site. Too often, editorial sites let articles follow the chronological flow of the CMS, quickly ending up in the inaccessible depths of the most remote pages. Without strategic intervention, articles that are still relevant can lose visibility, and the entire site structure ends up penalizing valuable content in favor of more recent but less searched pieces—the typical problem of a lack of awareness of Google’s crawl budget.
Dynamically managing content exposure priority—updating the homepage and categories with the most searched-for material at that moment—allows you to present Google and users with the best possible version of the site, month after month. It’s no longer just about adding new articles: it’s about keeping quality information alive and building a strategy that can guide the continuous change of the site based on real data and demand trends.
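As a toy illustration of this “exposure priority” idea, a newsroom script might pick homepage slots by current search demand instead of publication date. Everything here is invented for the example (article titles, volumes, and the helper name are assumptions, not a real SEOZoom API):

```python
# Hypothetical articles, each with the monthly search volume of its focus query.
articles = [
    {"title": "Election results explained", "monthly_searches": 900,  "published": "2023-04-01"},
    {"title": "How AI Overviews work",      "monthly_searches": 5400, "published": "2024-11-10"},
    {"title": "Local council budget 2022",  "monthly_searches": 50,   "published": "2022-02-15"},
    {"title": "Guide to crawl budget",      "monthly_searches": 2900, "published": "2023-09-20"},
]

def homepage_picks(articles: list[dict], slots: int = 2) -> list[dict]:
    """Fill homepage slots with the currently most searched-for content,
    overriding the CMS default of newest-first ordering."""
    return sorted(articles, key=lambda a: a["monthly_searches"], reverse=True)[:slots]

for art in homepage_picks(articles):
    print(art["title"])
# prioritizes "How AI Overviews work" and "Guide to crawl budget",
# even though older pieces exist in the archive
```

The design point is the sort key: ranking by live demand data rather than recency is exactly what keeps still-relevant articles from sinking into the site’s remote pages.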
Combining SEO and journalism does not mean betraying the craft of information: it means extending its life, increasing its impact, and amplifying its relevance over time.
How to write AI-optimized content
While Google has always demanded clarity and quality, artificial intelligence is even more demanding. New models of summarization and automatic response—AI Overview and Google’s future AI Mode, SearchGPT, and Perplexity—select content based on criteria that are very different from keyword density alone: consistency, syntactic simplicity, and semantic clarity matter.
Writing for AI basically means:
- Using short sentences and precise vocabulary.
- Eliminating ambiguity and rhetorical superstructures.
- Structuring content so that it can be easily broken down, extracted, and summarized.
This is where a markup-oriented approach comes into play: the use of FAQ sections, ordered lists, tables, and diagrams, all solutions that help AI better read and interpret content. No less important is attention to thematic consistency. Content must remain focused, avoiding digressions or dispersion.
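The FAQ sections mentioned above can also be exposed to machines through schema.org FAQPage structured data. The vocabulary (FAQPage, Question, acceptedAnswer) is the real schema.org one; the question-and-answer pairs below are placeholders, and generating the JSON-LD with a script is just one possible workflow:

```python
import json

# Placeholder FAQ pairs drawn from a hypothetical article.
faqs = [
    ("What is SEO writing applied to journalism?",
     "Constructing informative articles optimized for readers, search engines, and AI."),
    ("What does Readers Driven Journalism mean?",
     "An editorial model based on actively listening to readers' information needs."),
]

def faq_jsonld(pairs: list[tuple[str, str]]) -> dict:
    """Build schema.org FAQPage structured data from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }

# Serialize for embedding in a <script type="application/ld+json"> tag.
print(json.dumps(faq_jsonld(faqs), indent=2))
```

Markup like this does not replace a clear, focused text; it simply gives summarization systems an unambiguous map of which question each block of the article answers.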
Today more than ever, the goal is twofold: to write articles that inform and engage people, but that are also understandable, readable, and usable by those—such as AI—who process text without intuition but with formal rigor.
Here too, SEOZoom already offers a complete ecosystem of solutions for strategic writing. The AI Writer allows you to create content from scratch, optimized already during the generation phase thanks to integration with the platform’s SEO data. The AI Assistant accompanies the writing process by offering real-time suggestions to refine text structure, improve empathy, align content with search intent, and correct any imbalances, while the Generative Filling features help expand sections or integrate new information without distorting the article.
Added to this is the new AI Engine, the first semantic analysis engine designed to verify whether content is compatible with AI engine logic. Thanks to an advanced vector model, AI Engine analyzes text by transforming it into semantic representations and comparing it with real search intent. This way, writers can know in advance if content is likely to be selected by a generative AI system, not just a traditional search engine.
The goal is no longer just to rank well on Google, but to ensure that content is relevant, in-depth, and effectively responds to the expressed—and unexpressed—needs of users, in whatever form it is processed: SERP, AI Overview, chatbot.
This approach transforms the very way we think about SEO: no longer just classic optimization, but work based on concepts, semantic relationships, and multichannel strategies. Writing for AI means building content that is rigorous, engaging, and immediately interpretable by new information distribution systems.
It means writing better, more clearly, with greater attention to the real needs of the reader, whether human or algorithm.
The challenge of relevance in digital information
SEO, artificial intelligence, increasingly sophisticated algorithms: today’s writers must contend with a digital environment where competition is fierce and public attention is an increasingly rare commodity. But there is an even more radical challenge awaiting journalism: the loss of its cultural centrality.
In recent years, news has become just one type of content among many. Alongside in-depth articles, there are short videos, memes, and instant comments. And often, it is not those who inform best who win, but those who entertain the most.
This transformation calls into question the very foundations of journalism: its role as a mediator, its public service function, and its credibility built on rigor and professional ethics. Today, journalists must compete with lighter formats, faster narratives, and the culture of oversimplification that dominates social media.
In this context, quality and relevance can no longer be taken for granted. They must be earned, day by day, piece by piece. And the good news is that it is still possible to make a difference by focusing on what journalism has always been able to offer: comprehensive, authoritative, and transparent information.
Information vs. entertainment: the digital crossroads
As journalists and newsrooms learn to navigate the new logic of SEO and rewrite their content to adapt it to artificial intelligence, an even more radical challenge is emerging. It is not enough to be visible. It is not enough to be read. We need to regain the central role that journalism has gradually lost, overwhelmed by an information ecosystem in which news competes daily with entertainment, oversimplification, and the constant hyperstimulation of social media.
Over the past decade, the perception of journalism has changed profoundly. Whereas traditional media once held a monopoly on credibility, today they must compete for attention with content that rewards immediacy, emotion, and ease of consumption.
Social networks have accelerated this process. Platforms such as TikTok, Instagram, and Facebook offer content designed for quick and superficial consumption: short videos, powerful images, and catchy headlines. In-depth analysis and complexity seem to be penalized: the average user is trained to prefer content that is easy, fast, and immediately gratifying.
Journalism thus finds itself at a crossroads: give in to the logic of entertainment or reaffirm its role as a guardian of quality information.
The data speaks for itself. According to the 2025 Report of the Italian Order of Journalists, the trust rating of traditional news outlets has fallen to 36%, a drop of more than ten percentage points in the last five years. At the same time, the consumption of non-journalistic content for informational purposes is growing: videos by opinion leaders, influencers, and digital creators.
The risk is clear: a disintermediated audience that no longer distinguishes between verified information and opinion content, between news and personal storytelling.
Regaining trust and authority
Regaining public trust is possible, but it requires a profound change in approach. The race for traffic has often led many newsrooms to prioritize quantity over quality; the web has become filled with mass-produced content, optimized for metrics but lacking in real informational value. This drift has contributed to eroding public trust, fueling a widespread feeling that it is increasingly difficult to find reliable and well-curated information online.
Serious journalism must stand out from this indistinct mass. It is no longer enough to publish: you have to publish well. Investing time in researching sources, verifying data, and editing texts is not a luxury, but a necessity. If the landscape is increasingly crowded with automatically generated news and superficial articles, quality can become the lever, the differentiating factor.
Regaining trust also means rejecting the industrial approach to content production and returning to a craft model, where each piece is carefully thought out, constructed, and verified. Only in this way can journalism regain its space and value.
The first strategy is to return to the roots of the profession: completeness, accuracy, clarity. The quality of information cannot be sacrificed on the altar of speed or viral appeal: that means no longer chasing hype at all costs and remembering to build articles that are effective, accurate, and well-documented answers to real questions.
Alongside the quality of content, it is essential to work on transparency. Disclosing sources, explaining methods of gathering and verifying information, and declaring potential conflicts of interest are practices that build trust and distinguish serious journalism from the background noise of the infosphere.
Another key aspect is editorial cohesion. It is not enough to publish authoritative content: you need to build a recognizable identity and a distinctive voice, continuously covering topics within your area of expertise, monitoring competitors, and staying up to date on trends.
Finally, journalism must learn to communicate its authority. Not in a self-referential way, but by leveraging the trust it has built with the public. Newsletters, podcasts, events, social media forums: every channel can be a lever for reestablishing direct, human, credible contact.
Information and democracy: the difficult balance in the age of machines
Turning briefly to a more ‘political’ reflection: journalism is called upon to perform a task that goes beyond simply reporting facts, in a world where access to information is filtered by algorithms and artificial intelligence. Today more than ever, informing means mediating between quantity and quality, between speed and depth, between visibility and truth. It is a delicate role, because the way news is selected, processed, and disseminated directly affects the very health of democracy.
Search engines, social networks, and generative AI models influence what content we see, what voices we hear, and what issues end up in the public eye. And these filters do not always reward the most accurate or balanced information. The responsibility of journalism does not end with the production of content: it becomes a critical safeguard, a tool of cultural resistance against disinformation, superficiality, and manipulation.
The difficult balance to be achieved is between adapting to digital logic and maintaining a civil and democratic function: informing in an accessible way without betraying the complexity of the facts; using technology without succumbing to its excesses. And this balance is not just a question of method. It is, first and foremost, a question of responsibility.
Digital disintermediation has weakened the bond of trust between those who inform and those who are informed. Artificial intelligence, by accelerating this process, poses new challenges. It is no longer enough to guarantee that a news item is true: its traceability, attribution, and contextualization must also be guaranteed.
Today, AI can generate, rewrite, and manipulate texts in a matter of seconds, impacting the role of journalists, which is being redefined from content producers to guardians of documentable truth. Verifying sources becomes a continuous and dynamic process. Methodological transparency—making visible the paths through which a news story is constructed—becomes a requirement for credibility.
According to the 2025 Report of the Italian Order of Journalists, one of the main risks associated with the spread of AI in editorial processes is the loss of human control over published content. Generative models can suggest plausible but inaccurate texts, plausible but unverified reconstructions. Journalism, in order to remain journalism, must reaffirm the primacy of human editorial control: no machine can replace the responsibility that comes with signing an article.
This responsibility is all the more important as the public is exposed to information generated or filtered by technologies that tend to simplify, homogenize, and select on the basis of opaque criteria. Journalists must know how to use AI as a tool, not as a shortcut: they must ally themselves with technology without giving in to the temptation to delegate to it the task of reporting on the world.
Educating for critical reading and quality
In addition to producing quality information, journalism today has an additional mission: to educate the public to recognize it.
Digital literacy, considered a secondary issue until a few years ago, has become a democratic imperative. Teaching people to read news critically, to recognize an authoritative source, and to distinguish between documented information and manipulated narratives is essential to maintaining an informed and aware public space.
This educational responsibility requires precise editorial choices: giving space to fact-checking, explaining working methods, and building content that helps the public navigate a complex and fragmented context.
The goal is not only to combat misinformation, but also to educate citizens to be more attentive, more demanding, and more resistant to the superficiality that automation tends to encourage.
Because without an informed audience, even the best journalism risks losing its audience.
The future of information—and of democracy itself—will also depend on this: on the ability to rebuild a culture of critical reading, verification, and complexity. And in this task, journalism still has a lot to say. And to do.
The transformation we are experiencing is not only technological: it is cultural. Journalism, confronted with SEO and artificial intelligence, faces a challenge that is not only about traffic or positioning, but about its very social function. Knowing how to write for the web is not a simple technical exercise; it is a form of responsibility.
SEO and AI should not be seen as constraints, but as tools. SEO teaches us to be visible while remaining faithful to the need to inform. Artificial intelligence, if governed and not endured, can become a support, not a threat. At the core remains the most important goal: to serve readers by offering quality, verified content designed to foster a critical understanding of reality.
The good news is that, in this complex scenario, there is still room for those who know how to combine method and talent, technique and passion. Journalism is not over. It is just changing shape. And those who know how to interpret this change will still have a central role in building the information of tomorrow.
FAQ on journalism and SEO
- What is SEO writing applied to journalism?
SEO writing applied to journalism is the art of constructing informative articles optimized not only for human readers, but also for search engines and, today, for artificial intelligence. It means writing clearly, answering real questions, structuring texts in a way that facilitates indexing and readability, while maintaining rigor, accuracy, and depth.
- How do you write an SEO-friendly article for journalism?
Writing a journalistic article optimized for SEO requires a balance between informational rigor and visibility strategy. It all starts with the choice of topic: it is not enough to identify an interesting subject; you need to understand what questions users are asking and which keywords best represent that search intent. The title must be clear, relevant, and in line with how people search for information.
The internal structure also matters: well-organized paragraphs, informative subheadings, natural use of keywords, and attention to synonyms and semantics reinforce the coherence of the text. A good SEO article is not a list of keywords, but quality content designed to stand the test of time and interact with search engines without losing the authenticity of the journalistic narrative.
- What is SEO language and how does it fit into journalism?
SEO language is a form of writing optimized to make content easier for search engines to understand. In journalism, adapting this language means using a natural style, avoiding unnecessary technicalities, maintaining accuracy, and following the rules of good writing, without falling into the trap of writing solely to “please” algorithms.
- What mistakes should be avoided when writing SEO for journalism?
The main mistake is to believe that SEO and journalism are two incompatible worlds and to continue treating them as separate entities. Writing with SEO in mind does not mean sacrificing journalistic quality, but integrating it with strategic visibility techniques. Common mistakes include keyword abuse, forced and unnatural writing, obsessive repetition of terms at the expense of fluidity, and underestimating the importance of good editorial structure and effective headlines. It is often forgotten that an article must be designed to be found, not just written. Neglecting to update content or not working on internal links to build solid thematic paths are other weaknesses that undermine the visibility of a piece over time.
- What does Readers Driven Journalism mean and how is it related to SEO?
Readers Driven Journalism is an editorial model based on actively listening to readers’ information needs. SEO helps interpret these needs by analyzing online searches, allowing journalists to produce content that responds in a targeted and relevant way to the real needs of the audience.
- How do you do SEO for a blog or editorial website?
Doing SEO for a blog or editorial website means creating content that accurately answers users’ questions without compromising journalistic quality. It is not a one-off operation, but a continuous process of care and optimization, requiring a strategic approach that combines technical expertise and editorial vision. In concrete terms, this means no longer focusing only on the most searched keywords, but developing a coherent information ecosystem with a clear structure, well-defined categories, and thoughtful internal links. Each article must have an effective title, a relevant meta description, and writing that uses keywords and synonyms in a natural way. It is also essential to keep content up to date, monitor traffic, and intervene with revision and relaunch strategies.
- What are the most useful SEOZoom tools for those with an informative blog or editorial website?
SEOZoom offers many tools designed to help journalists, publishers, and digital strategists improve information management and article writing, tailored to each stage of the process: tools to analyze content performance, identify current trends, and optimize editorial efforts.
The Editorial Assistant helps you write texts that are already optimized during the drafting stage, while AI Writer and AI Assistant support content creation and refinement by combining SEO data and artificial intelligence. The AI Engine allows you to check the compatibility of texts with generative-AI-based search engines. To identify user needs, Question Explorer explores the most frequently asked questions on each topic, and the SERP monitoring module lets you observe the performance of competitors and social content in searches. Using these tools is not simply a matter of optimizing content: it means building comprehensive editorial strategies based on real data and geared toward ensuring visibility over time, even in a rapidly changing landscape such as generative search.
- How can you regain the trust of online readers?
By focusing on quality, transparency, and editorial consistency. Creating well-researched content, declaring sources and working methods, and building a recognizable editorial identity are key strategies. Trust cannot be demanded: it must be earned. And in a fragmented ecosystem, it is the real capital that distinguishes serious news outlets from those that are only chasing visibility.
- What impact is AI having on editorial and news sites?
AI is radically changing the landscape of editorial sites. With tools such as AI Overview and AI Mode, Google and other platforms summarize answers directly on the results page, reducing the number of clicks to the original sites. This translates into a decline in traditional organic traffic, particularly for news and information articles. However, data shows that the “classic” search engine still generates 97% of interactions: only 3% go through conversational AI models. The challenge for editorial sites is twofold: to adapt their content to be selectable by AI and, at the same time, to work on building strong and recognizable brands that can attract direct traffic.
- How are SEO strategies for news sites changing with the arrival of AI Overview?
The introduction of AI Overview and, in the future, AI Mode, is radically changing the way content is selected and displayed to users. News sites need to review their SEO strategies, focusing on texts that respond clearly and directly to audience questions and are easily extractable and summarizable by artificial intelligence models. It is essential to structure content with FAQs, data markup, and formats suitable for extraction, as well as working on increasing overall authority according to Google’s E-E-A-T principles (experience, expertise, authoritativeness, and trustworthiness).
- How is journalism changing in the age of AI?
Artificial intelligence is transforming journalism in many ways: automating part of news production; simplifying data research and analysis; changing content distribution logic. But the most important change is cultural: journalism must reaffirm human control, source verification, and editorial responsibility, using AI as a tool and not a substitute.
- What is the future of news blogs in the age of AI?
Informative blogs will not disappear, but they will have to evolve. It will become increasingly important to focus on diversified approaches, such as verticalization of topics, quality and continuous updating of content, and integrated multi-channel strategies (website, social media, newsletters, podcasts). Blogs that offer valuable content that is readable by both users and AI will still be able to thrive, albeit in a more competitive environment.
- Will AI replace journalists?
No, but it will change the way we work. AI is a powerful tool for automating repetitive tasks and analyzing large amounts of data. However, fact-checking, critical judgment, and ethical responsibility remain human skills. The journalism of the future will be increasingly hybrid: human in essence, technological in its tools.