The predictions I made in December 2024 have materialised into a stabilised but divided reality.
Google retains 95% of the desktop search market, yet user behaviour has fundamentally shifted. 37% of US desktop users now also turn to ChatGPT for complex tasks, not to replace Google but to use both tools in tandem. ChatGPT functions as a “second brain” for synthesis, while Google remains the go-to for navigation.
The core thesis for 2026 is “integration, not replacement.” Users delegate complex synthesis to agents while using Google for navigation. Success requires optimising for both the infrastructure (Google) and the synthesiser (AI agents), moving beyond keywords to managing your organisation’s entire knowledge graph.
The 6 Strategic Predictions for 2026
Based on the market evolution observed in 2025, these are the six defining shifts that will shape the search landscape.
1. Expanded Agentic Search (Personalised & Integrated)
We are moving beyond “chatbots” to deeply integrated personal agents. Just as Google integrates Gemini into Workspace, users expect search to “know” them. For instance:
- SaaS/B2B Impact: Agents will ingest documentation to answer technical queries (e.g., “How do I configure this API for my specific tech stack?”).
- Publisher Impact: Discovery agents will curate personalised “morning briefings” summarising content, challenging subscription models (for this reason, Google is signing deals with publishers).
2. The Necessity of Proprietary Knowledge Graphs
To feed these agents, brands must speak their language. Agents do not “read” pages like humans; they ingest structured relationships between entities.
The Shift: A website is no longer just a destination; it is a structured database. You must maintain a proprietary knowledge graph that explicitly maps how your products, authors, and solutions relate to broader industry concepts.
3. Brand Signals as the Ultimate Trust Heuristic
In a world flooded with synthetic content, “Brand” becomes the primary filter for AI quality. Agents will not evaluate information based on backlinks alone, but on entity authority.
Data Reality: With 27% of US searches ending in zero clicks, the “impression” is the conversion. If an agent summarises an answer, your brand must be the trusted source it cites.
4. The Dawn of “AI Core Updates”
Just as Google launched Panda and Penguin, AI model providers will launch “AI Core Updates” to combat programmatic SEO spam. These updates will weight information genealogy, rewarding the originator of a fact, not the synthesiser.
5. Google “Web Guide” Integration (The New Default)
Google will likely graduate its “Web Guide” experiment to the main search view, a hybrid interface blending AI-organised topic clusters with organic results.
Yes! I wrote Why AI Mode Will Replace Traditional Search as Google’s Default Interface in May, but things have evolved in the four or five months since. Web Guide, blended with classic SERP features, is a good compromise between AI Search and classic search, and, crucially, a better option for Google because it does not overly disrupt its advertising model.
6. Multimodal & Omnichannel Visibility
Visibility is no longer text-based. Discovery happens across video, social, and visual search simultaneously.
Data Reality: YouTube and Reddit are among the top downstream destinations from Google Search in both the US and Europe, proving that “human” and “visual” content now drive significant post-search engagement.
Download the infographic with my six predictions for AI Search in 2026!
Actionable Strategy & Tactics
How do you translate these predictions into specific, high-impact actions for 2026?
A. Infrastructure Strategy: “The Proprietary Knowledge Graph”
(Addressing Predictions 1 & 2)
Strategic Context:
Structured data (Schema.org) is just the label; the knowledge graph is the map. Agents need to understand the logic of your business, not just the syntax.
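To make that distinction concrete, here is a minimal sketch of the “map” at page level: schema.org JSON-LD (emitted from Python here) that connects an organisation, an author, an article, and a topic through @id references, rather than labelling each page in isolation. All names and URLs are hypothetical placeholders.

```python
import json

# Hypothetical example: entities linked by @id references, not isolated labels.
knowledge_graph = {
    "@context": "https://schema.org",
    "@graph": [
        {
            "@type": "Organization",
            "@id": "https://example.com/#org",
            "name": "ExampleSoft",
        },
        {
            "@type": "Person",
            "@id": "https://example.com/authors/jane#person",
            "name": "Jane Doe",
            "worksFor": {"@id": "https://example.com/#org"},
            "knowsAbout": "CRM migration",  # author -> topic
        },
        {
            "@type": "Article",
            "@id": "https://example.com/blog/crm-migration#article",
            "headline": "CRM Migration Guide",
            "author": {"@id": "https://example.com/authors/jane#person"},
            "about": "CRM migration",  # article -> same topic
        },
    ],
}

# Paste the output into a <script type="application/ld+json"> tag.
print(json.dumps(knowledge_graph, indent=2))
```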
- Build Your Internal Ontology (The Logic Layer):
- Action: Map your internal data relationships before you touch code.
- Example SaaS: Link Feature → enables → Use Case → solves → Pain Point.
- Example Publishers: Link Author → has expertise in → Topic → cited in → Article.
- Tactic: Use graph databases (like Neo4j) or vector stores to structure your content internally. This ensures that when an agent crawls you, it sees a coherent web of logic, not just isolated pages (a minimal sketch follows below).
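As a minimal sketch of that logic layer, using networkx as a lightweight stand-in for a full graph database like Neo4j (all entity names below are hypothetical):

```python
import networkx as nx  # pip install networkx

# Build the ontology as a directed graph of typed relationships.
ontology = nx.DiGraph()

# SaaS pattern: Feature -> enables -> Use Case -> solves -> Pain Point
ontology.add_edge("Bulk Import Feature", "CRM Migration", relation="enables")
ontology.add_edge("CRM Migration", "Vendor Lock-in", relation="solves")

# Publisher pattern: Author -> has expertise in -> Topic -> cited in -> Article
ontology.add_edge("Jane Doe", "Data Privacy", relation="has expertise in")
ontology.add_edge("Data Privacy", "GDPR Deep Dive", relation="cited in")

# An agent-style traversal: which pain points does a feature ultimately solve?
for _, use_case, d1 in ontology.out_edges("Bulk Import Feature", data=True):
    if d1["relation"] == "enables":
        for _, pain_point, d2 in ontology.out_edges(use_case, data=True):
            if d2["relation"] == "solves":
                print(f"Bulk Import Feature -> {use_case} -> {pain_point}")
```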
- The “Agent API” Endpoint:
- Context: Agents prefer structured feeds over scraping HTML.
- Tactic: Create a public-facing API or specific “Agent Feed” that delivers your core data (pricing, specs, headlines) in clean JSON format (see the sketch below).
- Example B2B: Make your “Integrations” list machine-readable so an agent knows exactly what software you connect with.
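A minimal sketch of such an endpoint, assuming FastAPI; the route name, fields, and data are hypothetical illustrations, not a standard that agents currently expect:

```python
from fastapi import FastAPI  # pip install fastapi uvicorn

app = FastAPI()

# Hypothetical core facts an agent might want without scraping HTML.
AGENT_FEED = {
    "product": "ExampleSoft CRM",
    "pricing": [
        {"plan": "Starter", "price_usd_per_month": 29},
        {"plan": "Pro", "price_usd_per_month": 99},
    ],
    "integrations": ["Slack", "Salesforce", "Zapier"],  # machine-readable list
    "docs": "https://example.com/docs",
}

@app.get("/agent-feed")
def agent_feed() -> dict:
    """Serve core data as clean JSON (run with: uvicorn main:app)."""
    return AGENT_FEED
```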
B. Content Strategy: The Architecture of Authority
(Addressing Predictions 3 & 5)
Strategic Context:
To survive query fan-out (where Google/AI breaks a query into sub-intents), you cannot rely on disconnected blog posts. You must engineer a content hub that mirrors the AI’s understanding of a topic.
- Optimising for query fan-out:
- Adopt the Architecture of Authority workflow to build high-performance content hubs that satisfy AI sub-intents.
- Step 1 (360-Degree Audit): Analyse the “AI Answer” for your target topic. What sub-topics does Gemini/ChatGPT always include? (e.g., if searching “CRM migration,” AI always mentions “Data Mapping” and “Downtime”). If you lack content for these essentials, you have an expertise gap.
- Step 2 (Micro-Moments Blueprint): Map your content to user intents using the micro-moments framework.
- I want to know: (Definitions, Theory) → Cluster Page A
- I want to do: (Strategy, Process) → Cluster Page B
- I want to go: (Exploration) → Subpillar
- I want to buy: (Comparison, Pricing) → Pillar Page
- Step 3 (Linking Matrix): Engineer a cluster-to-cluster linking strategy. Don’t just link randomly. Link logically to guide the AI through the funnel (e.g., link “Data Migration Guide” → “API Documentation” → “Pricing”). This teaches the AI the relationships between your concepts (a sketch of this matrix follows after the examples below).
- Example SaaS & B2B: “Documentation as SEO”:
- Tactic: Your help centre is your most valuable asset. Rewrite documentation to be “Agent-Readable” (Q&A format, clear prerequisites). Agents use docs to solve user problems; if your docs are good, the agent recommends your tool.
- Example Publishers: “Primary Source” Pivot:
- Tactic: Stop churning commoditised news summaries. Pivot to Investigative Data and Unique Commentary. An AI can summarise a press release, but it cannot interview a whistleblower or analyse a private dataset.
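Returning to the linking matrix from Step 3, here is a minimal sketch (networkx again; the page names are hypothetical) that checks every cluster page offers a funnel path to the pillar page:

```python
import networkx as nx  # pip install networkx

# Model the hub's internal links as a directed graph (edge = hyperlink).
links = nx.DiGraph()
links.add_edge("Data Migration Guide", "API Documentation")
links.add_edge("API Documentation", "Pricing")
links.add_edge("CRM Comparison", "Pricing")
links.add_node("Orphan Post")  # a page with no onward links

PILLAR = "Pricing"

# Every cluster page should guide the reader (and the AI) to the pillar.
orphans = [page for page in links.nodes
           if page != PILLAR and not nx.has_path(links, page, PILLAR)]
print("Pages with no funnel path to the pillar:", orphans or "none")
```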
C. Omnichannel Strategy: “Platform-Native” SEO
(Addressing Prediction 6)
Strategic Context:
Discovery is fragmented. You must optimise for the platforms where users actually spend time.
- Example B2B & SaaS: The LinkedIn & Reddit Strategy:
- Data Reality: Reddit is among the top downstream destinations from Google Search.
- Tactic: Treat Reddit threads and LinkedIn Pulse articles as “landing pages.” Optimise them with the same rigour as your blog.
- Action: Have product experts answer technical questions on Reddit using your official brand account. These threads often rank higher than your own FAQ pages.
- Example Publishers: Video & Visual Entry Points:
- Data Reality: YouTube is among the top downstream destinations from Google Search.
- Tactic: Turn every major report into a YouTube video. Video cannot be fully summarised by text LLMs yet, forcing the user to engage with your brand asset.

Download the infographic of my recommended strategy and tactics for 2026
Part 3: Deep Dive — The “AI Core Update” & Governance
(Addressing Prediction 4)
We are approaching a significant anti-spam correction in the AI space. Here is what that looks like and how to survive it.
The Problem: Programmatic AI Spam
By 2026, vast swaths of the web will be programmatic content generated by AI to game AI. Agents will struggle to differentiate truth from hallucination.
The Solution: Provenance Scoring
AI models will likely introduce updates that weight information genealogy.
- Origin Verification: Does this fact originate here? Or is it a summary of a summary?
- Penalty: Sites that only summarise other content will see their citation weight drop to zero.
- Reward: Sites that provide primary data (original surveys, proprietary datasets, human-verified tests) will be treated as seed nodes.
Defensive Strategy: The Human Verification Protocol
- Authorship Verification (Publishers/B2B):
- Rigorously link every piece of content to a real human expert with a LinkedIn profile and digital footprint. Anonymity is a trust killer for agents.
- Primary Data Publication (SaaS/B2B):
- Publish data that only you have (e.g., “State of the Industry” reports based on your user data). LLMs can hallucinate generic facts, but a unique dataset can only be cited from its source.
- C2PA / Content Credentials:
- Where possible, implement cryptographic provenance or verified human markers to distinguish your content from synthetic noise.
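Full C2PA relies on certificate-based signing infrastructure, but a simplified sketch of the underlying idea, a tamper-evident record binding content to an author and a timestamp, might look like this (HMAC stands in for real certificate signatures, and all values are hypothetical):

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

SIGNING_KEY = b"replace-with-a-managed-secret"  # hypothetical; use real key management

def provenance_record(content: str, author: str) -> dict:
    """Build a tamper-evident record binding content to an author and time."""
    record = {
        "content_sha256": hashlib.sha256(content.encode()).hexdigest(),
        "author": author,
        "published": datetime.now(timezone.utc).isoformat(),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return record

print(json.dumps(provenance_record("Original survey results ...", "Jane Doe"), indent=2))
```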
Download the infographic of my suggestions for preparing for future “AI Core Updates”
Closing Perspective
The era of “tricking the algorithm” is going to end because the algorithm is becoming an agent. In 2026, you cannot easily manipulate an agent that cross-references facts against multiple sources and surfaces inconsistencies.
The winners will be the entities, aka real brands, with real data, validated by real humans across multiple platforms. Classic organic traffic may decline, but the influence of these entities will shape the answers the world sees.
Visibility is the first conversion; traffic becomes the second. Influence translates into awareness, recognition, memorability, and trust, which ultimately drives both branded search and direct engagement, whether that’s a user visiting your site or asking an agent to complete a transaction on your behalf.
