Google Unveils Next-Gen AI Mode in Search at I/O 2025

At its annual I/O 2025 developer conference, Google announced a sweeping upgrade to its AI Mode in Search, reimagining the traditional query box as a dynamic, multimodal assistant. Powered by a customized version of Gemini 2.5, AI Mode now offers an array of advanced features designed to enhance how users interact with information, bridging the digital and physical worlds through natural language and visual inputs.

Initially rolling out to U.S. users via Search Labs, the enhanced AI Mode introduces an upgraded version of AI Overviews and a new feature called Deep Search, which runs hundreds of background queries to generate a richly detailed, fully cited report, enabling nuanced answers to complex, multi-layered questions.

Among the most notable innovations is Search Live, a real-time feature that uses the device’s camera to analyze surroundings and deliver spoken responses. Whether identifying landmarks or evaluating storefronts, users can engage conversationally with the AI as it interprets live scenes.

Further advancing Search as a proactive assistant, Google is integrating agentic capabilities through its Project Mariner initiative. These tools support real-world actions such as making restaurant reservations, booking appointments, and preparing event ticket purchases. While the AI does not yet finalize transactions autonomously, it handles the preparatory steps—filling out forms, comparing prices, and navigating platforms such as Ticketmaster, StubHub, Resy, and Vagaro.

Google emphasized a more personalized and action-oriented future for Search, with AI Mode leveraging user history and connected Google apps to offer tailored suggestions. Users will retain control through robust privacy settings, including options to disable contextualization.

The update also introduces powerful shopping functionalities. AI Mode now supports visual product searches, real-time price tracking, and a new virtual try-on feature. By uploading a full-body image, users can see how clothing items from various retailers would fit their frame. This is powered by a proprietary AI model trained to accurately reflect body proportions and fabric behavior. Backed by Google’s Shopping Graph—comprising over 50 billion listings—AI Mode is positioned to deliver high-precision results even for vague or complex product queries.

While these features are currently limited to U.S.-based Search Labs participants, Google has confirmed plans to integrate select capabilities into the global core Search experience in the near future.
