Dear reader, first I apologize for bringing to your attention another post with predictions for SEO and Search in 2025.
However, I ask you to be patient, because in my list of “prophecies” I have tried, as far as possible, to avoid proposing things that have already been presented in other articles.
On the contrary, following the philosophy of The Inbounder – the event on Search and Digital Marketing that I organized between 2015 and 2018 – I will try to present ideas that point straight to the future.
Therefore, some of the things I will say may not be very current in the short term, but (trust me) they will be in a much closer future than you might think.
And, as the elderly (and doctors) say, prevention is always better than cure. Starting now to think and act according to even the most futuristic predictions I present in this article is the best way not only “to take care of your health”, as we say in Spain, but also to gain a competitive advantage over other SEOs and companies that are “lazier” and less inclined to take risks.
Search behaviour (and expanded Messy Middle).
In a sort of unexpected back-to-the-future situation, Search reminds me more and more of the late 1990s.
Even though Google is still the biggest player in Search, people have started to use more sources to find information that can satisfy their needs.
In other words, search is conquering and spreading into more environments: Generative Search, social media platforms like TikTok, Pinterest, Instagram and YouTube, and private messaging platforms like Discord.
This behaviour, which started a few years ago, will consolidate and be adopted by a larger group of people outside of Gen Z, many of whom are still reluctant to adopt new ways of searching.
Companies will need to abandon the established synonymy between SEO and Google, because the Messy Middle no longer exists only within the Google ecosystem, and abandon the inertial behaviour that has driven them over the past two years.
But this will not be enough.
In fact, we will have to analyze the different search behaviours on multiple platforms to understand people’s real search journeys, because the so-called butterfly effect (for example, a viral informational search on TikTok generating an impact on transactional searches on Google Search and traffic from the Merchant feed) could become even more common in the coming months.
Google UX and the hyper-personalization of Search.
Nowadays, Google already has a strong “web portal” look and user experience, especially in the case of transactional searches where the Merchant/Shopping features occupy most of the space.
In its need to keep people spending most of their Search Journey inside Search (or its ecosystem), Google will expand the “portal experience” even more.
We should expect to finally see true personalized SERPs as described in the last Google I/O relatively soon in 2025.
Two things make me suspect this:
The recent announcement of the possibility to see a SERP in its non-personalized version is not a new feature at all: the archaeological “pws=0” parameter has existed since 2009.
But why refresh people’s minds about this option? With my tin foil hat on, I think it is probably because the SERPs will be almost totally tailored to our search data.
Sundar Pichai’s announcement that “Search will profoundly change in 2025” (“agentic” is the keyword in this case, and it was also a repeated word at Google I/O 2024).
“Deep Research” and Agentic Search.
The very recent rollout of Gemini 2.0 presented some interesting information related to search.
For instance, in the announcement article, Google says that it is “bringing the advanced reasoning capabilities of Gemini 2.0 to AI Overviews to tackle more complex topics and multi-step questions, including advanced math equations, multimodal queries and coding. We started limited testing this week and will be rolling it out more broadly early next year. And we’ll continue to bring AI Overviews to more countries and languages [my note: EU countries too?] over the next year”.
In another article published about Gemini 2.0, Google presented Deep Research, which is nothing but a sophisticated agent that assists us in our Search Journey by cutting out all the steps, often unfruitful, of looking for information and resources.
In other words, as this video shows better than words, we can ask Deep Research to do the work for us.
The interesting difference with respect to SearchGPT or other AI-based systems is that Deep Research asks us to validate the multi-search plan it elaborates.
Once we refine and finally approve it, the agent will visit the resources it found and create a report based on those sources, also presenting the links so we can review them directly.
Monosemanticity and unequivocal clarity.
The more websites become sources for AI, the more important a concept such as “Monosemanticity” will become.
Monosemanticity can be defined like this (I ask experts in linguistics not to judge me too harshly): the property of a word, phrase, or token having a single, clear, and unambiguous meaning, reducing the risk of multiple interpretations in language models.
We all know how words tend to have more than one meaning, i.e., they are polysemic (think of the classic example of “apple”).
Context is the only way to minimize the risk of the natural polysemic nature of words.
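To make the role of context concrete, here is a deliberately toy sketch (not a real NLP pipeline, and the context-clue lists are invented for illustration) of how surrounding words can resolve the polysemy of “apple”:

```python
# Toy illustration (not a real NLP pipeline): how surrounding context
# can disambiguate a polysemic word such as "apple".
CONTEXT_CLUES = {
    "apple (fruit)": {"pie", "orchard", "eat", "juice", "tree"},
    "Apple (company)": {"iphone", "mac", "stock", "cupertino", "ios"},
}

def disambiguate(sentence: str) -> str:
    """Pick the sense whose context clues overlap most with the sentence."""
    words = set(sentence.lower().split())
    scores = {sense: len(words & clues) for sense, clues in CONTEXT_CLUES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "ambiguous"

print(disambiguate("I baked an apple pie from the orchard"))    # apple (fruit)
print(disambiguate("Apple released a new iPhone in Cupertino"))  # Apple (company)
print(disambiguate("I bought an apple"))                         # ambiguous
```

Real language models disambiguate through learned representations rather than keyword lists, but the principle is the same: the less context a passage offers, the more senses remain plausible, which is exactly the risk monosemantic writing tries to reduce.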
This is not something new, because the importance of context has always been in the guidelines of good SEO.
However, it will be even more important in the age of SEO for Generative Search (please, do not call it GEO).
Secondly, Monosemanticity also requires clarity in the very act of writing content, which means that an overly creative, “poetic”, metaphor-rich tone of voice could cause a piece of content to struggle more than one written in a simpler, “Hemingwayan” style (although allegories can still be used if finely built).
Is this the “death” of commercial copywriting made of puns and complex rhetorical figures? Not really. There is space for both, as when we create a page for AdWords that is not indexable for organic search.
Link graph and Linked Data.
Be aware that “link graph” is synonymous with “link profile” in the context of my predictions, so that you do not misunderstand what I am going to say.
Backlinks are implicit in the very nature of the World Wide Web.
Furthermore, they were (and still are) a central element of how Google works.
Finally, we all know that the concept of Authority is largely tied to the link profile that a website has.
That said, the relative weight of link profiles in the sense of authority transferred between different websites (the classic “endorsement” effect of a citation with a follow link) is destined to decrease as Search becomes more generative.
In other words, it will still be important but no longer “so” decisive.
On the contrary, the relevance of the other nature of backlinks, also implicit in the very essence of the web, will increase: the connection between related information points, i.e., linked data.
The use of links, whether follow or nofollow, as long as they remain crawlable by bots, to create associations between different websites (think, for example, of the function of the “sameAs” property of structured data) will increasingly determine the belonging of a web document (and its entities) to a specific knowledge domain.
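To illustrate the `sameAs` mechanism, here is a minimal JSON-LD sketch built and serialized with Python’s standard library; the brand name and profile URLs are placeholders, not a recommendation of any specific set of properties:

```python
import json

# Minimal JSON-LD sketch (placeholder brand and URLs) showing how the
# schema.org "sameAs" property links an entity to its other web presences,
# helping machines consolidate them into a single knowledge-graph node.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "ExampleBrand",            # hypothetical brand
    "url": "https://www.example.com",
    "sameAs": [
        "https://en.wikipedia.org/wiki/ExampleBrand",
        "https://www.linkedin.com/company/examplebrand",
        "https://x.com/examplebrand",
    ],
}

# This string would go inside a <script type="application/ld+json"> tag.
print(json.dumps(organization, indent=2))
```

The point is not the markup itself but the association it declares: every URL in `sameAs` is a crawlable information point that helps tie the document and its entities to one knowledge domain.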
Be careful! This is not actually new either, because it has always been said that the best backlinks are those that come from websites thematically similar to ours.
I’m simply saying that the weight will tend to lean more on the side of their semantic value than on the pure and hard side of “PageRank”, creating a much more balanced value relationship between the two, and the feedback between both natures will be stronger.
A corollary of this reasoning is that factors such as co-citations and co-occurrences will also take on a renewed weight in creating semantic connections between separate entities (websites, concepts, and named entities).
The Return of Ontologies.
Another prediction featuring something that is not new but that will now resume the role it should always have had: setting the foundations of a correct content strategy, architecture, and marketing actions in general, not only in the SEO field.
In fact, if we think about it, when we talk about Topical Authority, or when Google invites us with good means (articles on its blog and posts by Danny Sullivan) or bad means (manual penalties) to be consistent with the central scope of our “business”, at the root of these themes we find nothing other than “ontology”.
For those who need it, here is a short definition of Ontology (once again, I ask experts to pardon me if it looks superficial):
Ontology is a branch of philosophy that studies the nature of being, existence, and reality, focusing on the categories and relationships of entities within a particular domain or the universe.
In a broader context that matters the most for us here, such as computer science or knowledge management, ontology refers to a structured framework for organizing information, defining entities, their attributes, and the relationships between them, often to enable better understanding, reasoning, and data interoperability.
Without defining the limits of the ontology and the “sub-ontological” elements of the domain related to our field or niche, it is difficult to fully design taxonomies and to create connections between the entities related to our business “domain”.
As you can understand, this work of defining our ontology is even more necessary in generative search, because the better it is defined, the clearer the context we offer for it to understand our content and how it relates to others and, therefore, to use it as a source in its responses to searches.
Finally, within the frame of Topical Authority, you can easily understand the importance of Ontology: if Ontology serves as the backbone for structuring, connecting, and representing knowledge within a domain, then it is essential for achieving topical authority in both human and machine-readable contexts.
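A toy way to see why an explicit ontology helps: once relationships between entities are written down, we can mechanically check whether our content covers a topic’s neighbourhood. All entity names and relations below are hypothetical, chosen only for illustration:

```python
# A toy domain "ontology" as subject-predicate-object triples (all names
# hypothetical), showing how explicit relationships let us audit whether
# our content covers a topic's neighbourhood, one ingredient of topical authority.
TRIPLES = [
    ("espresso", "is_a", "coffee drink"),
    ("cappuccino", "is_a", "coffee drink"),
    ("coffee drink", "made_with", "coffee beans"),
    ("coffee beans", "has_variety", "arabica"),
    ("coffee beans", "has_variety", "robusta"),
]

def related_entities(entity: str) -> set:
    """Entities directly connected to `entity` in either direction."""
    out = set()
    for s, _, o in TRIPLES:
        if s == entity:
            out.add(o)
        if o == entity:
            out.add(s)
    return out

covered_pages = {"espresso", "cappuccino", "arabica"}  # pages we have written
neighbourhood = related_entities("coffee drink") | related_entities("coffee beans")
gaps = neighbourhood - covered_pages - {"coffee drink", "coffee beans"}
print(sorted(gaps))  # entities the ontology says we have not covered yet
```

Real ontology work uses richer tooling (taxonomies, schema.org types, knowledge graphs), but the principle scales: the ontology defines the domain’s entities and relations, and content strategy becomes a matter of covering and connecting them.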
Brand SEO.
There is a lot of talk about the renewed importance of creating stronger bonds between the concept of Brand and SEO.
Renewed because, after all, it was important before too. However, in the context of the evolution of traditional search and new search systems, creating clear associations between the Brand entity and the entities we want to be associated with is even more important.
If you notice, this point – after all – is nothing more than a derivative of the previous one on the importance of rediscovering the value of Ontology in the SEO field.
I won’t go into further details, and I invite you to read my guide on SEO and Brand.
The risk that Search will suffer its version of mad cow disease.
This is the prediction that worries me the most not only as a professional but also as a user.
The irruption of AI into content creation has been huge and, unfortunately, most of the content created in a substantially automatic and simplistic way with ChatGPT and similar systems is terrible.
The problem is that this garbage has flooded the web and, as such, has entered the LLM + RAG ecosystem.
If low-quality content enters the system, then it could eventually be used to generate new content of horrible quality on average and, therefore, start a negative flywheel in terms of the reliability of the information received from these systems.
In other words, we are increasingly feeding the cows (Google, Bing, SearchGPT et al.) with protein supplements based on dead cows (AI-generated content), increasing the possibility of a cascading deterioration in quality due to self-contamination loops, with obvious consequences for people’s trust in the truthfulness and reliability of information, especially in the fields of education, health, financial information, and journalism in general.