    Google research points to a post-query future for search intent

By Awais | January 26, 2026

    Google is working toward a future where it understands what you want before you ever type a search.

Now Google is pushing that intent inference onto the device itself, using small AI models that perform nearly as well as much larger ones.

    What’s happening. In a research paper presented at EMNLP 2025, Google researchers show that a simple shift makes this possible: break “intent understanding” into smaller steps. When they do, small multimodal LLMs (MLLMs) become powerful enough to match systems like Gemini 1.5 Pro — while running faster, costing less, and keeping data on the device.

    The future is intent extraction. Large AI models can already infer intent from user behavior, but they usually run in the cloud. That creates three problems. They’re slower. They’re more expensive. And they raise privacy concerns, because user actions can be sensitive.

    Google’s solution is to split the task into two simple steps that small, on-device models can handle well.

    • Step one: Each screen interaction is summarized separately. The system records what was on the screen, what the user did, and a tentative guess about why they did it.
    • Step two: Another small model reviews only the factual parts of those summaries. It ignores the guesses and produces one short statement that explains the user’s overall goal for the session.

By keeping each step focused, the system avoids a common failure mode of small models: breaking down when asked to reason over long, messy histories all at once.
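The two-step split described above can be sketched as a small pipeline. This is an illustrative mock-up, not Google's implementation: the function names, the `InteractionSummary` fields, and the string-based "model calls" are all stand-ins for what would really be prompts to small on-device MLLMs.

```python
from dataclasses import dataclass

@dataclass
class InteractionSummary:
    """Step one output: one summary per screen interaction."""
    screen: str   # what was on the screen
    action: str   # what the user did
    guess: str    # tentative "why" -- speculative, dropped in step two

def summarize_interaction(screen: str, action: str) -> InteractionSummary:
    # Stand-in for a small model summarizing one interaction.
    # Only this single interaction is in context, keeping the task easy.
    return InteractionSummary(screen, action, guess=f"maybe exploring {screen}")

def extract_session_intent(summaries: list[InteractionSummary]) -> str:
    # Step two: a second small model sees only the factual fields
    # (screen + action); the speculative guesses are stripped out here.
    facts = [f"{s.action} on {s.screen}" for s in summaries]
    # Stand-in for the second model producing one short goal statement.
    return "User goal inferred from: " + "; ".join(facts)

# Example session: each interaction is summarized separately, then merged.
session = [
    summarize_interaction("flight search results", "tapped cheapest flight"),
    summarize_interaction("seat map", "selected aisle seat"),
]
print(extract_session_intent(session))
```

The key design point the sketch captures is that no single model call ever reasons over the full raw history: step one sees one interaction at a time, and step two sees only short factual summaries.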

    How the researchers measure success. Instead of asking whether an intent summary “looks similar” to the right answer, they use a method called Bi-Fact. Using its main quality metric, an F1 score, small models with the step-by-step approach consistently outperform other small-model methods:

    • Gemini 1.5 Flash, an 8B model, matches the performance of Gemini 1.5 Pro on mobile behavior data.
    • Hallucinations drop because speculative guesses are stripped out before the final intent is written.
    • Even with extra steps, the system runs faster and cheaper than cloud-based large models.

How Bi-Fact works. The metric breaks an intent statement into small pieces of information, or facts, then measures which facts are missing and which were invented. This:

    • Shows how intent understanding fails, not just that it fails.
    • Reveals where systems tend to hallucinate meaning versus where they drop important details.
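A simplified version of this fact-level scoring can be written in a few lines. The real Bi-Fact metric uses a model to judge whether two facts match; the sketch below substitutes exact string matching so the precision/recall/F1 arithmetic is easy to follow, and the example facts are invented for illustration.

```python
def bi_fact_f1(predicted_facts: set[str], gold_facts: set[str]) -> dict:
    """Fact-level scoring in the spirit of Bi-Fact (simplified: exact
    string match instead of model-judged fact equivalence).
    Invented facts hurt precision; missing facts hurt recall."""
    matched = predicted_facts & gold_facts
    precision = len(matched) / len(predicted_facts) if predicted_facts else 0.0
    recall = len(matched) / len(gold_facts) if gold_facts else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {
        "precision": precision,
        "recall": recall,
        "f1": f1,
        "invented": predicted_facts - gold_facts,  # hallucinated meaning
        "missing": gold_facts - predicted_facts,   # dropped details
    }

gold = {"user compared flight prices", "user chose an aisle seat"}
pred = {"user compared flight prices", "user booked a hotel"}  # one invented
scores = bi_fact_f1(pred, gold)
print(scores["f1"], scores["invented"], scores["missing"])
```

Because the score separates invented facts from missing ones, it shows *how* an intent summary fails, which a single similarity score cannot.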

    The paper also shows that messy training data hurts large, end-to-end models more than it hurts this step-by-step approach. When labels are noisy — which is common with real user behavior — the decomposed system holds up better.

Why we care. If Google wants agents that suggest actions or answers before people search, it needs to understand intent from user behavior (how people move through apps, browsers, and screens). This research moves that idea closer to reality. Keywords will still matter, but the query will be just one signal. In that future, you'll have to optimize for clear, logical user journeys — not just the words typed at the end.

    The Google Research blog post. Small models, big results: Achieving superior intent extraction through decomposition


    Search Engine Land is owned by Semrush. We remain committed to providing high-quality coverage of marketing topics. Unless otherwise noted, this page’s content was written by either an employee or a paid contractor of Semrush Inc.


Danny Goodwin

    Danny Goodwin is Editorial Director of Search Engine Land & Search Marketing Expo – SMX. He joined Search Engine Land in 2022 as Senior Editor. In addition to reporting on the latest search marketing news, he manages Search Engine Land’s SME (Subject Matter Expert) program. He also helps program U.S. SMX events.

Goodwin has been editing and writing about the latest developments and trends in search and digital marketing since 2007. He previously was Executive Editor of Search Engine Journal (from 2017 to 2022), managing editor of Momentology (from 2014 to 2016) and editor of Search Engine Watch (from 2007 to 2014). He has spoken at many major search conferences and virtual events, and has been sourced for his expertise by a wide range of publications and podcasts.

