This study is built around a theory: that Google AI Mode is not just a test feature; it’s the future of Search. And while that may sound bold, there are now enough signals pointing in the same direction to make a strong case.

Some of these signs are obvious — others are scattered or subtle — but when looked at together, they suggest something bigger: Google is getting ready to replace the traditional search results page with a conversational, personalised, AI-powered experience.

All of them point to a clear direction: Google is not experimenting, it’s evolving. AI Mode isn’t a widget. It’s the next search interface.

Now, some might say this all sounds like another classic SEO conspiracy theory, and, yes, we sometimes wear our tin foil hats a little too proudly. But in this case, the pattern is real, the signals are public, and the implications are too big to ignore.

Google previewed the future of Search at Google I/O 2024, but the early presentation was overshadowed by the clumsy AI implementation we still face today. It seems, however, that Google is genuinely working to turn what felt like a promotional video for investors into reality.

Follow me as I explore how and why this shift is happening and what it means for users, businesses, and the future of SEO.

The signs

AI Mode in Search Labs (SGE → AI Overviews → AI Mode)

Google uses Search Labs as a staging environment for paradigm shifts in Search UX. SGE’s lifecycle shows how features tested quietly there can become mainstream defaults, and AI Mode is now undergoing the same lifecycle.

Impact:

  • Websites: Loss of control over the first-click experience; fewer direct visits.
  • Businesses: Users might resolve queries within the AI interface, skipping sites.
  • Traffic: Expect further erosion of non-branded organic clicks.
  • Visibility: Must focus on structured content and being cited in AIO/AI Mode.
  • SEO: Move beyond keyword SEO → answer-level, source-level, and entity SEO.

Gemini as foundation (moving from 2.0 to 2.5)

Gemini isn’t “just a model”; it’s Google’s AI operating system, meant to unify behaviours across Workspace, Android, Chrome, and now Search. With 2.5, we expect:

  • Better multi-modal understanding.
  • Tighter app integrations.
  • Faster personalisation inference.

Impact:

  • Websites: Need to ensure content is legible to multimodal AI (text + images).
  • Businesses: Integration points (Gmail, Maps, Search) become AI surfaces.
  • Traffic: Some journeys may never touch your site if Gemini completes them (agentic potentialities).
  • Visibility: Importance of entity-level clarity and context.
  • SEO: Emphasis on feedable, explainable content (structured, annotated).

Search Journeys + Hyper-Personalisation

Search is no longer a session of one-off queries.

It’s a journey: a memory graph of history, preferences, and inferred goals. Gemini’s “Memory” lets AI Mode recall what users have searched for or interacted with.

Impact:

  • Websites: Repeat traffic may decline unless the site becomes part of the journey loop.
  • Businesses: CRM alignment with Google data becomes vital.
  • Traffic: The one-click, one-session traffic model is dying.
  • Visibility:
  • SEO: Structured guides, tutorials, and goal-based content win over generic posts.

Gemini Memory + Contextualization

Gemini’s “Memory” allows persistent knowledge about a user to shape responses, offering context-aware, goal-based suggestions.

It means future queries aren’t isolated but interconnected.

Impact:

  • Websites: Your past interaction with a user may determine future AI mentions.
  • Businesses: Brand storytelling and recurring content gain more weight.
  • Traffic: New users are harder to reach unless you enter their journey early.
  • Visibility: High-intent, contextual entry points become more valuable.
  • SEO: Optimize for long-term relationships, not just one-time rankings.

AI Overviews rolled out globally (with translated sources too)

AI Overviews are already translating English-language sources into localised results (e.g., in Google.es).

This creates a semantic layer over the multilingual web.

Impact:

  • Websites: Content in English can still be surfaced in other languages.
  • Businesses: Multilingual SEO becomes fully integrated with semantic SEO.
  • Traffic: English-dominant sites get backdoor reach, but also lose the brand trail.
  • Visibility:
  • SEO: Geo-localization signals and perfect content localization become even more important.

ccTLDs Sunset + Geolocation by settings

Google’s decision to deprecate ccTLDs (e.g., .es, .fr) as search determiners signals that Search is now user-centric, not geo-IP centric. AI Mode will show content based on user settings and behaviours.

Furthermore, as any international SEO can confirm, consolidating visibility into a single domain name is an operation that offers enormous savings in terms of management, maintenance, updating, and costs.

Impact:

  • Websites: Local SEO without local TLDs = more reliance on structured data.
  • Businesses: Need Business Profiles + content targeted by intent and use.
  • Traffic: Local traffic may become more volatile.
  • Visibility: Region-based SERPs disappear, replaced by user-context SERPs.
  • SEO: Go beyond hreflang → lean into location + intent signals.
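With ccTLDs no longer determining the serving country, the remaining explicit language/region signals live at markup level. As a minimal sketch (the domain and URLs are purely illustrative), an hreflang cluster on a single consolidated domain might look like this:

```html
<!-- Illustrative hreflang cluster for one consolidated domain.
     Each language/region variant should list all alternates, including itself. -->
<link rel="alternate" hreflang="en" href="https://example.com/en/running-shoes/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/zapatillas-running/" />
<link rel="alternate" hreflang="es-MX" href="https://example.com/mx/zapatillas-running/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/running-shoes/" />
```

The `x-default` entry declares the fallback page for users whose settings match none of the listed variants, which matters more once serving is driven by user context rather than domain.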

Performance Max and AI Max for Search Campaigns + monetisation alignment

AI Mode isn’t a monetisation problem for Google; it’s a monetisation opportunity. Performance Max is already AI-native, and Google recently announced AI Max for Search Campaigns with these words:

To help you maximize performance now, and get your campaigns ready for the ever-evolving Search experiences of tomorrow, it’s key to adopt the latest AI-powered ads solutions. That’s why we’re introducing a new, one-click feature suite — AI Max for Search campaigns.

AI Mode provides far richer context for targeting and conversion.

Finally, Google will keep the “Sponsored” label in AI Mode, not least for legal reasons. That does not mean it will stop experimenting with the sometimes questionable designs of that label.

Impact:

  • Websites: Direct conversions may bypass your site entirely (e.g. local actions).
  • Businesses: Ads may appear as part of the AI conversation, not just results.
  • Traffic: Less organic room if AI absorbs intent → more “pay to play”.
  • Visibility: Non-paying sites may see steeper declines in certain verticals.
  • SEO: Integrate SEO with PPC strategy (hybrid visibility model).

MUM + Shopping Graph + Lens Integration

MUM (Multitask Unified Model) supports cross-modal search: voice, image, video, and text combined.

Gemini and AI Mode are its real-world implementations. This brings visual and shopping queries into the AI flow.

Impact:

  • Websites: Visual-heavy content becomes searchable by AI, not just by users.
  • Businesses: The Shopping Graph favors feed-driven, structured product listings.
  • Traffic: E-commerce sites without rich schema risk being filtered out.
  • Visibility: Shopping and visual brands must optimize for entity-based discovery.
  • SEO: Product schema, image metadata, and object-detection optimization for visual search.
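To make the Product-schema point concrete: a minimal JSON-LD sketch (all identifiers and values here are hypothetical) that exposes the attributes the Shopping Graph feeds on could look like this:

```html
<!-- Minimal, illustrative Product markup; GTIN, prices, and URLs are invented. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trail Runner 2",
  "image": "https://example.com/img/trail-runner-2.jpg",
  "gtin13": "0001234567890",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "offers": {
    "@type": "Offer",
    "price": "89.95",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

The identifiers (GTIN, brand) are what let an AI surface disambiguate your product from near-identical listings, which matters more than ranking position in an entity-based flow.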

Disappearance of “10 blue links” even from AI Overviews

Classic Search has already been marginalised:

  • AI Overviews rarely show 10 clickable results (an average of about five, according to this study by Advanced Web Ranking).
  • Most links are embedded as citations or side-references.
  • AI Mode pushes organic links below the fold.

This reflects Google’s shift from position-based ranking (Page 1, Position 3) to semantic relevance + citation-level trust.

Impact:

  • Websites: Need to earn citation-level trust in Gemini, not just top-10 status.
  • Businesses: Branded content may only be visible when explicitly asked for.
  • Traffic: Long-tail traffic and “discovery clicks” reduced sharply.
  • Visibility: Focus on being cited vs. being clicked.
  • SEO: New goal: “Be the source Gemini quotes”: high E-E-A-T, clean markup.

Chrome, Gmail, and Workspace unify around Gemini

Google is orchestrating a convergence across its ecosystem:

  • Gemini sidebar now lives in Chrome, Gmail, Docs, and Sheets.
  • Prompts in Gmail (“draft reply”) or Docs (“summarise meeting”) feed into Gemini.
  • These behaviours start to shape what users expect in Search.

This means Search becomes part of a broader assistant workflow; not the first stop, but a supporting action inside a multi-app journey.

Impact:

  • Websites: Less chance to “intercept” top-of-funnel discovery.
  • Businesses: Need to structure assets for use inside Gemini (e.g. product sheets, templates).
  • Traffic: High-funnel searches may move out of Search into Assistant and Workspace.
  • Visibility: Influence occurs earlier (in Gmail or Docs, for example), not just via SERPs.
  • SEO: Content must be reusable, modular, and embeddable in workflows; closer collaboration with email marketing.

Discover on desktop: maybe the most relevant sign of all

The rollout of Google Discover on desktop is not just a UI feature: it signals a strategic convergence of:

  • Predictive content delivery (Discover).
  • User intent modelling and memory (Gemini/AI Mode).
  • Search interface redefinition.

It turns the desktop homepage into a push-based experience, much like the mobile Discover feed, which has been quietly training Google’s predictive recommendation systems for years.

Why does this matter for AI Mode?

Discover is pre-search

It operates without a query, suggesting topics, stories, and videos before the user types anything.

This is exactly how:

  • Gemini Assistant is being trained to anticipate user needs.
  • Memory and context inform proactive help.
  • Hyper-personalised journeys are launched even before intent is explicitly expressed.

Desktop is where high-value search happens

Rolling this out on desktop, where users perform complex, high-converting tasks, means:

  • Predictive systems are no longer limited to entertainment or news.
  • They now shape research, buying decisions, and professional tasks.

This builds the behavioural base for AI Mode to feel natural when it begins prompting you mid-task (e.g. “Do you want me to summarise this doc?” or “Want to add this to your itinerary?”).

Predictive Search is now central

Google is quietly making search predictive by default, and Discover on desktop is a training layer for that.

Combined with Gemini’s memory and cross-app presence:

  • AI Mode can eventually replace search sessions with micro-suggestions.
  • Intent will be inferred before the query.
  • Results will be surfaced as conversations or nudges.

Feedback loop: Discover → Memory → AI Mode

  1. Discover shows a news story about Japan.
  2. You click and read → Google logs interest.
  3. Gemini now infers a travel journey.
  4. AI Mode will later pre-fill suggestions: “Flights to Tokyo”, “Best time to visit Japan”, “Currency conversion tips”.

Impact:

  • Websites: Content must be snackable, visual, and click-worthy for Discover surfacing.
  • Businesses: Branded content may enter users’ journeys before they even search.
  • Traffic: A search-independent traffic source grows, bypassing SERPs entirely.
  • Visibility: You must earn presence in both pull (search) and push (Discover).
  • SEO: Optimize for interest alignment, topical freshness, and visual signals (titles, images, schema).
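One of the visual signals above already has a documented, mechanical lever: Google requires pages to allow large image previews for the large-image presentation in Discover, opted into via the robots meta tag. A minimal fragment (the rest of the page head is omitted):

```html
<!-- Opts the page into large image previews, which Google's Discover
     documentation lists as a requirement for large-image presentation. -->
<meta name="robots" content="max-image-preview:large" />
```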


What might a map of the search journey look like?

Charting the search journey

User intent layer

Where input begins, shaped by personalisation and context.

  • Expressed query: Text, voice, image (Lens), gesture, or mixed-mode input.
  • Contextual memory: Past searches, preferences, location, devices.
  • Multimodal input: Uploading photos, describing needs, using camera (Google Lens).

Implication

Businesses must make content searchable not just by text, but by image, voice intent, and contextual signals (localisation, previous interests).

AI processing layer (Gemini + MUM)

Where the user signal is interpreted, enriched, and modelled.

  • Goal interpretation: Gemini infers the task (e.g., planning, comparing, booking).
  • Memory application: It tailors response to past history (e.g., preferred stores, past answers).
  • Multimodal understanding: Combines query + image + product need (e.g., “what charger for this?” with photo input).

Implication

Content must map clearly to intent types and goals, not just keywords. Semantic clarity, product graph integration, and schema matter deeply.

AI Mode response layer

The user sees a task-aware, conversational interface, not a list of links.

  • Conversational response: AI explains, suggests, and clarifies through dialogue.
  • Dynamic follow-ups: The system prompts further refinements or questions.
  • Citation-based answers: Sites appear if they’re deemed trustworthy, not ranked.

Implication

To be surfaced, a site must become part of the trusted response flow. That requires strong E-E-A-T signals, authority, and entity clarity.

Business impact layer

How does this shift affect your strategy?

  • Visibility shifts: From position-based (Page 1) to trust/citation-based.
  • Traffic changes: Discovery drops, but conversions may rise from deeper user intent.
  • Winning requires: Being embedded in the journey, not just available at the start.

Implication

SEO must evolve to Entity + Intent Optimisation, focus on being reusable by Gemini, and invest in structured content that aligns with tasks, not just topics.


Gemini Memory + Contextualization + Hyper-personalisation, and the risk of the “search echo chamber”

Memory of an Artificial Intelligence

In a recent interview with the Financial Times, Elizabeth Reid (Head of Google Search) said that Search “will get more personalised over time, not just in the results, but in how you learn as well.”

The radical change of AI Mode is represented by the following three characteristics:

Gemini Memory stores:

  • User preferences (e.g., “I prefer vegetarian options”).
  • Past interactions (e.g., “I’ve already researched Greece travel in March”).
  • Task progression (e.g., “I’m planning a move”).

Contextualization layers real-time information:

  • Device, location, time.
  • Recency of previous queries.
  • Session continuity.

Hyper-personalisation:

  • Google doesn’t want you to just get “the best answer”; it wants you to get your best answer.
  • Over time, AI Mode can learn to speak in your style, recall your routines, and prioritise content aligned with your values or history.

If left unchecked, this system could:

  • Reinforce confirmation bias (e.g., only showing fitness advice from keto sources if that’s what you’ve clicked in the past).
  • Limit exposure to alternative viewpoints, publishers, or minority knowledge.
  • Prevent the discovery of new or disruptive information.
  • Create gated informational realities, where two users with similar queries get radically different content with no visibility into what’s omitted.

If this is already a problem in classic search, one well known to every player in the LLM sphere (see this research by Anthropic), it would be even worse in AI Mode, because it would not be just the ranking that is personalised, but the very generation of the answer.

How Google could mitigate the echo-chamber risk

This is a long-standing worry for Google. As early as 2009, it filed a patent (analysed by Bill Slawski in this post) introducing the concept of Search Entities that can produce a “probability score”, which determines whether a web document from a domain absent from our search history can still be presented to us, in order to mitigate personalisation.

Then, in 2022, DeepMind presented this study: “Building safer dialogue agents”.

DeepMind’s Sparrow is an AI chatbot designed to provide helpful, accurate, and harmless responses. To ensure safety, Sparrow follows a set of predefined rules, such as avoiding threatening or hateful language and not impersonating humans.

Therefore, it is reasonable to think that Google will continue its policy of mitigating the echo-chamber risk.

Here’s how Google might architect protections — some of which we already see emerging:

Transparent source citations

AI Mode increasingly shows sources with summaries. Google could:

  • Require visible citations in every AI answer.
  • Let users expand to see “alternative sources” (e.g., opposing views, different domains).

This provides epistemic accountability: users can check where the info came from, even if it was summarised or synthesised.

“Global vs Personal” toggle

Just as YouTube lets you pause Watch History and Discover lets you tune recommendation settings, Google could:

  • Offer an explicit toggle: “Use Personal Context” vs “Show Neutral Results” (as Google already does with the “Try without personalisation” link at the bottom of the SERPs).
  • Let users view results through someone else’s lens (e.g., incognito-like or, via settings, regionally calibrated).

This gives users agency over personalisation, which is key to avoiding blind spots.

Memory Settings + Editable Profile

Google already allows users to:

  • Review and delete Gemini Memory items.
  • Customise preferences (e.g., brands, dietary restrictions).

Future features could include:

  • Bias monitoring dashboards.
  • Warnings like “You tend to see sources from X. Want to explore others?”.

This makes personalisation visible and tunable, which reduces its hidden bias effect.

Inbuilt diversity functions (like Knowledge Panels)

Google could:

  • Ensure that AI answers always include contrasting perspectives.
  • Inject fact-checking or opinion flags into AI Mode (like the “About This Result” feature).

This allows for multi-polar representation in AI answers, like showing both scientific consensus and community scepticism in health searches.

Personalisation fallback hierarchy

In mission-critical queries (e.g., health, finance, law), Gemini may:

  • Override personalisation if it would lead to dangerous or misleading outcomes.
  • Use a “least biased” answer pool for these topics.

This mirrors how Google’s core algorithm applies different quality standards to YMYL (Your Money, Your Life) topics.

Another feature that is both a symptom of Google moving toward AI Mode as the new default and a possible safeguard against echo chambers is the “Web” search filter, present in the Search menu for some time now.

If this filter, the “without personalisation” link, and the other measures hypothesised here are maintained (and I think they will be) or introduced, the implications for SEO and content strategy are these:

  • “Web” filter: Sites must retain strong HTML-based crawlable pages, not just feeds or APIs.
  • “Without personalisation” link: Your content must compete in a neutral, non-contextual SERP, so foundational SEO (E-E-A-T, schema, link equity) remains essential.
  • Toggle = trust choice: Brands that are too narrow may vanish in depersonalised views.
  • Dual-track optimization: Optimize for both personalised inclusion (via context) and general web ranking (via semantic clarity and links).


Search Features likely to be integrated or rebuilt inside AI Mode


Featured Snippets

Integration Likely? YES

Why: Gemini is a featured snippet generator, just on steroids. It already synthesises top answers into conversational summaries.

AI Mode behaviour: Featured snippets become part of multi-source citations or dynamic summaries that adjust to your follow-up prompts.

People Also Ask (PAA)

Integration Likely? YES, but transformed

Why: PAA reflects user curiosity and follow-ups. That’s core to how AI Mode operates.

AI Mode behaviour: PAAs become conversational follow-up options, shown inline or suggested interactively based on memory and inferred journey.

Knowledge Panels / Entity Boxes

Integration Likely? YES

Why: Gemini needs strong entity understanding to answer accurately. Knowledge Panels provide that structured info.

AI Mode behaviour: Info from Panels becomes part of AI’s internal grounding but also appears in sidebars or as visual cards when answering about people, places, brands.

Images (with Lens integration)

Integration Likely? YES, and Lens is already integrated

Why: AI Mode supports multimodal input. Lens queries (images, scans) feed directly into AI reasoning.

AI Mode behaviour: You can upload or photograph something, and AI Mode provides answers, translations, shopping results, or summaries.

Google Translate

Integration Likely? YES, silently built-in.

Why: Gemini already translates AI Overview sources in the background. Translate powers global accessibility in AI Mode.

AI Mode behaviour: Content is translated on the fly, with no user action required.

Search Filters/Search Menu (Images, News, Videos, Books)

Integration Likely? YES, but redesigned

Why: Vertical filters serve different content types; AI Mode will still need to distinguish them.

AI Mode behaviour: These become “follow-up actions” rather than tabs (e.g., “Want to see images?” or “Would you like current news on this topic?”).

Topic Filters (a.k.a. Dynamic Search Refinements)

Integration Likely? YES, but deeply transformed

Why: These filters are dynamically generated based on Google’s understanding of user intent and content relationships. They help users narrow, broaden, or shift their search focus within a topic cluster, often without typing a new query.

In AI Mode, this behaviour maps well to conversational refinement and intent disambiguation.

AI Mode behaviour: Instead of UI tabs, Topic Filters will likely evolve into suggested follow-up prompts or contextual nudges, such as:

  • “Are you looking for symptoms or treatment?”
  • “Do you want to compare prices?”
  • “Would you like to explore similar destinations?”

These are already emerging in Gemini and AI Overviews as intent-splitting scaffolds or exploration suggestions.

Example evolution:

  • Before: [mental health] → [topic filter: “for teens”, “at work”, “articles”]
  • After: “You asked about mental health. Would you like resources for teenagers or workplace stress?”

Shopping Filters (in SERP sidebar for transactional intent)

Integration Likely? YES, but deeply adapted into AI-driven product exploration

Why: These filters (e.g. Brand, Price, Material, Size, Ratings) are tied to structured product data in Google’s Shopping Graph. They help users refine choices quickly in high-intent purchase journeys. These aren’t just UI elements; they represent product taxonomy and commerce ontology, which AI Mode still needs for decision support.

AI Mode behaviour: In AI Mode, traditional filters will likely:

  • Be replaced or supplemented by natural-language refinement prompts (e.g., “Are you looking for something under €100?” or “Prefer organic cotton or synthetic?”).
  • Appear as context-aware follow-up questions in shopping journeys.
  • Be dynamically tailored by Gemini based on your preferences, history, or inferred persona.

This mirrors how tools like Gemini in Shopping already ask guiding questions and show cards like “Compare by material” or “Most-reviewed under €50”.

Example evolution:

  • Before: [running shoes] → sidebar filters for Brand, Price, Colour.
  • After (AI Mode): “Here are the top-rated running shoes. Want something from Nike or Adidas? Or filter by price range?”

Organic Merchant Features in Search Results

Popular Products

Integration Likely? YES, as structured cards or AI-generated rankings

Why: This is an entity-powered module that aggregates products across retailers based on popularity, reviews, and availability. It already draws from the Shopping Graph.

AI Mode behaviour: Popular Products will likely evolve into:

  • AI-ranked product carousels with conversational context (e.g. “Here are popular eco-friendly running shoes under €100”).
  • Dynamic follow-up prompts like “Want to compare by cushioning or durability?”

Fast Pickup or Delivery

Integration Likely? YES, but more personalised

Why: This is based on real-time inventory and merchant feed data (e.g., via Google Merchant Center and Local Inventory Ads).

AI Mode behaviour:

  • Integrated into AI answers: “You can get this at Decathlon in Valencia with same-day pickup”.
  • Combined with memory/personalisation, AI could prioritise retailers or fulfilment options based on past behaviour or preferences.

Explore Brands

Integration Likely? YES, as brand-driven clusters or cards

Why: These help users navigate product discovery via brand affinity. Useful in ambiguous, early-stage shopping journeys.

AI Mode behaviour:

  • Presented as cards or AI-generated summaries (e.g., “Nike is best known for cushioning tech, while Hoka focuses on long-distance comfort”).
  • Follow-up prompts: “Want to see Adidas alternatives?”.

Product Knowledge Graph Panels

Integration Likely? YES, and supercharged.

Why: These panels consolidate structured product data (GTINs, specs, variants, reviews, images, availability). They are essential for product disambiguation, a key AI challenge.

AI Mode behaviour:

  • Will likely persist as interactive product entity cards within AI Mode.
  • Possibly enriched with multi-modal content (images, videos, even user reviews or comparison tables).
  • Used to ground product comparisons in AI-generated responses.

Local Pack (Maps), Hotels Module, Top Sights, and Places Sites

Local Pack (Maps)

Integration Likely? YES, core to local intent, but likely more conversational.

Why: The Local Pack is powered by Google Business and the Maps platform. It’s critical for any query with local or near-me intent (restaurants, services, stores).

AI Mode behaviour:

  • Still shown, but more integrated into natural responses: “Here are three highly rated Italian restaurants nearby. Do you want directions or a reservation?”.
  • May include rich cards with ratings, photos, hours, and booking links.
  • It will depend heavily on real-time location data and personal preferences (e.g., previous visits).

Hotels Module (Google Travel vertical)

Integration Likely? YES, but deeply embedded in AI trip planning

Why: Hotel results are already structured and entity-based, often pulled from Google Travel. Users expect filters like dates, rating, amenities, etc.

AI Mode behaviour:

  • Becomes part of a travel-planning conversation: “Looking to stay in Madrid in June? These 4-star hotels with breakfast included are popular. Want to filter by price or location?”.
  • Likely paired with suggested itinerary planning, “places to eat nearby,” etc.

Top Sights

Integration Likely? YES, reframed as AI-curated discovery prompts

Why: “Top Sights” is a list of prominent attractions surfaced from Google’s Local Search Graph, based on popularity, reviews, and user behaviour. It appears in tourism and “things to do in…” queries, especially on mobile.

AI Mode behaviour:

  • Will appear as AI-generated recommendations, integrated into conversational trip planning. Example: “Top sights in Lisbon include Belém Tower and Jerónimos Monastery; would you like opening hours, or to add them to an itinerary?”.
  • May evolve into clustered suggestions based on interests: “Are you more into history, viewpoints, or family-friendly attractions?”

Places Sites

Integration Likely? POSSIBLE, but may be de-emphasised or reframed

Why: “Places Sites” is a carousel of third-party discovery platforms (e.g., TripAdvisor, Culture Trip, Booking) offering alternatives to Google Maps or Hotels. It appears when Google detects a need for external recommendations or user reviews beyond Google’s ecosystem.

AI Mode behaviour:

  • May be reframed as “Want to explore more reviews or guides?” prompts.
  • Could be integrated as citations in AI-generated summaries, especially if AI aims to reflect a variety of sources.
  • However, Google may prefer to retain users within its own Travel/Maps ecosystem, so this module could be less prominent in AI Mode unless needed for trust or diversity (or legal reasons in the European Community).

Discussions and Forums

Integration Likely? YES, possibly more important in AI Mode, but selectively surfaced

Why: This module surfaces content from platforms like Reddit, Quora, Stack Overflow, and niche forums. It’s Google’s response to:

  • Growing user demand for “authentic, human” answers.
  • Rise of queries with added modifiers like “Reddit,” “forum,” or “real opinions”.
  • The limitations of thin affiliate/commercial content in answering nuanced or trust-based queries.
  • It’s also a strategic hedge against AI hallucinations: by grounding AI responses in real human discussions, Google builds perceived trust (well… this will hold only if Google improves retrieval from forums).

AI Mode behaviour: Will persist and expand, but as grounding or justification layer.

  • Forum discussions will likely be cited in AI Overviews when opinion, diversity of experience, or authenticity is important: “Many Reddit users suggest using X for better results…”.
  • Google may integrate them as “Perspectives”, FAQ cards, or follow-up questions: “Want to see how others troubleshoot this?” → leads to curated threads.
  • Expect a tighter focus on “trusted communities” with moderation and structured conversations.

Risks and constraints:

  • The SEO concern is valid: these modules often obscure brand or publisher results, especially for long-tail queries.
  • Forum content may be prioritised over blog posts that lack original experience or community validation.
  • AI Mode will hopefully suppress low-quality or off-topic discussions, giving space only to high-signal, high-engagement threads.

Videos & Shorts

Videos (standard YouTube + embedded video content)

Integration Likely? YES. It will remain an essential content format, but the interaction model will shift

Why:

  • Videos are a primary way users learn, compare, and experience (especially in how-to, reviews, and tutorials).
  • Videos are entity-rich and highly structured via timestamps, captions, chapters, etc.
  • They are directly owned by Google (YouTube), making them strategically integrated.
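The timestamp and chapter structure mentioned above is exposed to Search through VideoObject markup with Clip entries (“key moments”). A hedged sketch, with every URL, title, and offset invented for illustration:

```html
<!-- Illustrative VideoObject with Clip "key moments"; all values are hypothetical. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "How to replace a bike chain",
  "description": "Step-by-step chain replacement for road bikes.",
  "thumbnailUrl": "https://example.com/thumb.jpg",
  "uploadDate": "2025-01-15",
  "hasPart": [
    { "@type": "Clip", "name": "Remove the old chain",
      "startOffset": 30, "url": "https://example.com/video?t=30" },
    { "@type": "Clip", "name": "Size the new chain",
      "startOffset": 120, "url": "https://example.com/video?t=120" }
  ]
}
</script>
```

Each Clip needs a name, a start offset, and a URL that jumps to that moment; this is the kind of structure an AI layer can use to deep-link into the relevant timestamp rather than surfacing the whole video.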

AI Mode behaviour:

  • Videos will be shown as “interactive answers” or “explainer follow-ups”: “Want to watch how this is done?” or “See a video review?”.
  • Gemini may auto-summarise video content, extract steps, or jump to a relevant timestamp (already tested).
  • Featured videos will likely appear inline, attributed, and contextual within AI answers rather than in isolated carousels.

YouTube Shorts

Integration Likely? YES, especially for mobile, exploration, and quick answers

Why:

  • Shorts represent Google’s answer to TikTok and Reels in mobile-first discovery.
  • They are bite-sized answers for queries like product demos, hacks, opinions, or trends.
  • Shorts are a new type of vertical that merges video with social + search intent.

AI Mode behaviour:

  • Likely to be integrated in conversational UX: “Want a quick video on this?” → shows Shorts.
  • May appear in carousel or swipe formats, especially in lifestyle, fashion, food, beauty, DIY, etc.
  • Gemini could even say: “Here’s what creators are saying in <60s clips”.
  • Shorts could be clustered thematically (e.g., “3 tips in 3 Shorts”).

Recipe Rich Results

Integration Likely? Yes, with transformation

Why:

  • Recipes are highly structured content, which makes them ideal for Gemini to parse and summarise reliably.
  • Google has heavily promoted Recipe schema for years, and many sites already comply.
  • Cooking is a task-based intent (e.g., “make vegan lasagna”), which aligns perfectly with AI Mode’s focus on goal completion.
  • Gemini can already generate step-by-step instructions from structured inputs, turning Recipe pages into conversational walkthroughs.
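The Recipe schema referenced above is precisely what gives an AI those parseable attributes. A minimal sketch, with all names and values invented:

```html
<!-- Illustrative Recipe markup; dish, times, and nutrition values are hypothetical. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Quick Vegan Lasagna",
  "prepTime": "PT20M",
  "cookTime": "PT40M",
  "recipeYield": "4 servings",
  "nutrition": { "@type": "NutritionInformation", "calories": "420 calories" },
  "recipeIngredient": ["lasagna sheets", "tofu ricotta", "tomato sauce"],
  "recipeInstructions": [
    { "@type": "HowToStep", "text": "Layer sauce, sheets, and tofu ricotta in a baking dish." },
    { "@type": "HowToStep", "text": "Bake at 200°C for 40 minutes." }
  ]
}
</script>
```

Note that `prepTime` and `cookTime` use ISO 8601 durations (`PT20M` = 20 minutes), and each instruction is a discrete `HowToStep`: exactly the granularity a conversational walkthrough needs.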

AI Mode Behaviour: Instead of showing a carousel or card stack, Gemini will likely:

  • Summarise the best-matched recipe in natural language (e.g., “Here’s a quick vegan lasagna with 20 mins prep time”).
  • Include key attributes inline (time, calories, rating).
  • Let users ask follow-ups: “Can I replace ricotta with tofu?” or “What’s a gluten-free version?”.
  • Possibly present multiple options as conversational suggestions like “Would you like a 30-minute version or one with fewer ingredients?”.

What could disappear:

  • Traditional recipe carousels may be downgraded or buried.
  • Less visible brand presence, because users may interact with the AI’s summary without clicking through to your site.
  • “Position” becomes less important than being the one Gemini quotes.

Top News

Integration Likely? YES, it will be core to real-time and sensitive queries, but more contextualised in AI answers

Why:

  • “Top stories” blocks are driven by Google News and rely on publisher trust signals, index freshness, and structured data (e.g., NewsArticle schema).
  • It’s Google’s way of grounding answers in journalistic sources, especially for volatile, controversial, or evolving topics (e.g., politics, disasters, health updates).
  • The feature helps satisfy user intent for recency, which static AI-generated content alone cannot reliably fulfil.
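Part of the publisher trust signal mentioned above travels through NewsArticle markup. A minimal, illustrative sketch (property names from schema.org; every value is a placeholder):

```python
import json

# Minimal schema.org NewsArticle markup — the structured data that
# "Top stories" eligibility leans on. Values are illustrative only.
news_jsonld = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline for a developing story",
    "datePublished": "2025-01-15T08:00:00+00:00",
    "dateModified": "2025-01-15T10:30:00+00:00",   # freshness signal
    "author": [{"@type": "Person", "name": "Jane Doe"}],
    "publisher": {"@type": "Organization", "name": "Example News"},
    "image": ["https://example.com/lead-image.jpg"],
}

print(json.dumps(news_jsonld, indent=2))
```

The `datePublished`/`dateModified` pair is exactly the recency information a static AI answer cannot supply on its own, which is why grounding in news sources stays necessary.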

AI Mode behaviour: Will remain but appear as news-grounded components inside AI Overviews or as “current updates” prompts.

Examples:

  • “As of today, the conflict has escalated; according to Reuters and BBC…”.
  • “Would you like to see recent headlines about this?” → leads to inline or expandable news cards.

X/Twitter: “Latest posts from…” 

Integration Likely? YES, conditionally, but it will depend on licensing, trust, and AI grounding needs.

Why:

  • This feature pulls in real-time social commentary directly from X (formerly Twitter) accounts, often shown in a carousel format or inline stream.
  • It addresses recency, public opinion, and “what’s happening now” — things that static content or even traditional news sites may lag on.
  • It’s used for celebrity queries, political figures, live events, and brand sentiment.

AI Mode behaviour:

Conditional integration, likely to appear:

  • As part of grounded citations in AI answers where public reaction matters (“Following the announcement, many users on X expressed concerns; here are a few perspectives…”).
  • In “Live Reactions” cards or summary clusters for trending events (e.g., debates, product launches, disasters).
  • Possibly filtered or curated to emphasise verified, high-authority accounts (e.g., official organisations, influencers).

However, it is not guaranteed, due to:

  • Licensing tensions between Google and X.
  • Quality variability (misinformation, toxic content, spam).
  • Gemini possibly favouring Discussions and Forums where X content is unreliable or volatile.

The measuring problem: current metrics don’t map to AI Mode / AI Search

How to measure AI Mode?

In the classic SERP:

  • Impressions = appearance in ranked results.
  • Clicks = explicit user action.
  • CTR = signal of relevance.
  • Queries = keyword-level user intent.

In AI Mode:

  • No ranking positions.
  • No clear “pages” or “clicks”.
  • Answers may cite a site, but the user never visits it.
  • Interactions are multi-turn, memory-driven, and contextual.

That means Search Console in its current form won’t be enough (frankly, it already isn’t, as we SEOs have told Googlers at every Google Search Central Live event).

Below, I present my thoughts and “desires”.

How Search Console would need to adapt

Citations instead of Impressions

  • Metric: “Appeared in Gemini response” or “Cited in AI Answer”.
  • Type: Textual, inline, or reference-style mention.
  • Impact: Like featured snippet appearances, but without clicks.

New “Surface” dimensions

  • Dimension: Surface = Classic SERP / AI Mode / Overview / Discover / News / Images & Lens.
  • Purpose: Helps isolate where the user interacted with your content.

Memory-based Discovery

  • Metric: “Appeared due to personalisation/memory”.
  • Purpose: Tracks when your content is selected not because of query relevance, but because Gemini remembered it as useful to the user.

Query-less interactions

  • Metric: Prompt-triggered discovery (e.g. from Discover, Workspace, Gemini chat).
  • Purpose: Content exposure without a search query (zero-click intelligence).

Structured Content Utilisation

  • Metric: “Schema used in AI summary”, e.g. FAQ, HowTo, Product, Review.
  • Purpose: Helps webmasters understand which structured formats are fuelling Gemini answers.
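Purely to make the wishlist concrete — none of this exists in Search Console today, and every field name below is my invention — a row of such a “schema utilisation” export might look like:

```python
from dataclasses import dataclass

# Hypothetical sketch of the proposed "schema used in AI summary" report.
# Neither the report nor these field names exist in Search Console today.
@dataclass
class SchemaUtilisationRow:
    page: str
    schema_type: str       # e.g. "FAQPage", "HowTo", "Product", "Review"
    surface: str           # the proposed dimension: "AI Mode", "AI Overview", ...
    cited_in_answer: bool  # appeared in a Gemini response, with or without a click

rows = [
    SchemaUtilisationRow("/vegan-lasagna", "Recipe", "AI Mode", True),
    SchemaUtilisationRow("/faq/shipping", "FAQPage", "AI Overview", False),
]

# Which pages fuelled an AI answer, regardless of clicks:
cited = [r.page for r in rows if r.cited_in_answer]
print(cited)  # → ['/vegan-lasagna']
```

Combining the schema type with the Surface dimension proposed above is what would let us see, per format, where our content actually shows up.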

Will Google update Search Console?

Most likely yes, but incrementally, and not transparently at first.

At least, this is what I understood when attending the Search Central Live event in Madrid and hearing Moshe Samet (Product Manager at Google) say: “AI Overview data is not going to be available yet”.

That “yet” is the keyword that makes me hope for the introduction of this data point.


AI Mode is the natural evolution of Google Search


It is not just a reaction, but a continuation of a trajectory that began over a decade ago.

Why?

Since 2015, Google has consistently shifted from keyword matching to semantic understanding:

  • RankBrain → semantic intent
  • Neural Matching → concept-to-concept connections
  • BERT & successors → contextual comprehension of language
  • Passage Indexing → fine-grained information access
  • MUM → multimodal, multitask search
  • Gemini → integration of generation, memory, and multimodal interaction

Each of these steps incrementally replaced the classic retrieval stack with something closer to how humans process questions, context, and tasks.

Therefore, AI Mode is the inevitable next layer of this progression: a dialogue-based, personalised, task-aware, and predictive interface.

The Role of ChatGPT, Perplexity, and Claude

They were a shock to the system, but not a course change. Rather, they forced acceleration.

What their rise did:

  • Reset user expectations about what “search” could look like (conversational, contextual, instant).
  • Applied pressure on Google to productize Gemini quickly and publicly.
  • Created a narrative that Google had to counter, not with theory, but with a functioning LLM-powered interface (AI Mode).

But even before their appearance, Google already:

  • Invented the Transformer architecture, without which ChatGPT and the other LLMs could not have been built.
  • Had MUM in production.
  • Had trained multiple versions of BERT and T5.
  • Was deploying Transformer models in Workspace, Search, and Ads.

The AI Search shift was underway. What OpenAI and others did was compress Google’s timeline from “maybe 2026–27” to “now or never.”

Ray Kurzweil

AI Mode was already implied in Ray Kurzweil’s words from a 2014 interview with The Guardian:

When you write an article, you’re not creating an interesting collection of words. You have something to say, and Google is devoted to intelligently organising and processing the world’s information. The message in your article is information, and the computers are not picking up on that. So we would like to have the computers read. We want them to read everything on the web and every page of every book, then be able to engage in an intelligent dialogue with the user to be able to answer their questions […] We are going to encode that, really try to teach it to understand the meaning of what these documents are saying.

AI Mode is not a pivot, it’s a culmination.

What began with semantic indexing and neural understanding has matured into a search interface powered by memory, personalisation, and generation.

The LLM race didn’t cause this. It simply made it urgent.

A final note

I cannot end this long post without warmly thanking Garrett Sussman, both for clarifying and deepening certain aspects of the section dedicated to Echo Chamber risk and, thanks to his acute observations, for reviewing other parts of the article.
If you don’t already know him, Garrett is among the best experts in Behavioral Search (I recommend you read this article and watch this video).

Share if you care