
Key Points
- Gemini Live will now highlight objects on your screen
- Integration with Messages, Phone, and Clock coming soon
- New audio model adds human-like speech and tones
- Rolling out with Pixel 10 on August 28, Android first
Google is giving its AI assistant, Gemini Live, a big update that will make it more interactive, more useful, and more human-like than ever.
Starting next week, Gemini Live will gain the ability to visually highlight things in real time using your phone's camera.
On top of that, it's getting better at speaking, more connected to your phone's core apps, and rolling out new features with the launch of the Pixel 10 on August 28.
Source: Google – Techtoken
This is not just another AI upgrade. It shows how fast the AI assistant race is moving, and how companies like Google are racing to keep up with growing demand, much like Meta's new AI translations for Facebook creators (Meta AI Translations for Creators) or OpenAI's expanding monetization strategy with the ChatGPT Go Plan.
Let's break down what's new.
Gemini Live is becoming an even more helpful, natural and visual assistant:
- New visual guidance: Now, when you share your camera, @GeminiApp not only sees what you see, but can highlight things directly on your screen
- More natural and expressive speech, with improved… pic.twitter.com/J2ObmfIL8L
— Google (@Google) August 20, 2025
Gemini Live Can Now Highlight Items Using Your Camera
The star of this update is visual guidance.
Imagine you're trying to find a specific item in a cluttered toolbox. Now, instead of just describing the item, Gemini Live will visually highlight it for you using your phone's camera.
It places a box on your screen to show you exactly what it's talking about. This takes AI assistance to a whole new level of clarity and speed.
Google Gemini: Google's AI assistant. It replaces the Google Assistant on your Android phone, allowing you to get help writing, summarizing information, and planning with Google Maps and Google Flights. pic.twitter.com/NGNXpla9iO
— Edward (@whosEFM) August 20, 2025
This feature launches on Pixel 10 devices on August 28, with a wider Android rollout happening at the same time. iPhone users can expect the update "in the coming weeks," according to Google.
It's a step closer to real-world interaction, much like how AI models are starting to shape business valuations, such as Cohere's $6.8B valuation with AMD's backing (Cohere Valuation Soars).
Source: Google – Techtoken
Seamless App Integration with Messages, Phone, and Clock
Gemini Live is also getting deeper access to your phone's most-used apps.
Let's say you're using Gemini to ask for directions, but realize you're running late. Instead of exiting and manually opening your messaging app, you can just say:
"This route looks good. Now, send a message to Alex that I'm about 10 minutes late."
Gemini Live will then draft and send the text for you. Soon, it will also be able to make phone calls, set alarms, and manage your reminders.
Gemini Live will soon be able to highlight exactly what you need to know, empowering you to learn and accomplish more
Learn more: https://t.co/XjeZ8ZbK2n
— Android (@Android) August 20, 2025
This type of multitasking is what makes AI assistants truly valuable, not just for tech-savvy users, but for anyone who needs help organizing their day. It's part of a broader trend of AI tools becoming more user-centric and built into daily workflows.
Just as ChatGPT's mobile app is bringing in millions in revenue thanks to voice features and smart upgrades (ChatGPT App Revenue), Google is aiming to boost user engagement with similar tools inside the Android ecosystem.
Rick is explaining a new Gemini Live feature: you can now share your camera feed to Gemini Live, and you can get visual overlays to point out, for instance, which exact title in a pile of books you should be pulling out. Say you're pointing your camera (with Gemini Live) at your… pic.twitter.com/HKyLQU2WoW
— Engadget (@engadget) August 20, 2025
A Voice That Sounds More Human Than Ever
To make everything feel smoother and more natural, Google has upgraded Gemini Live's audio model. Now, it can understand the tone of your questions and respond in a way that matches your mood or the topic.
Key upgrades include:
- Adjustable speaking speed (faster or slower based on your preference)
- Natural rhythm and pitch that mimic human speech
- Accents and characters for storytelling (yes, even pirates and historical figures)
If you're stressed, Gemini might speak more calmly. If you want a fun story, it could even change its voice to match the vibe.
This adds a level of personality and presence that makes the assistant feel more real, something ChatGPT and others are also exploring through emotion-aware models.
You can now share your camera in conversations with Gemini Live to chat back-and-forth about what you see and get real-time advice. @GeminiApp can also point things out too. #MadeByGoogle pic.twitter.com/XjU8RSFMgG
— Saurabh Tiwari (@saurabh_ai_news) August 20, 2025
With rising global interest in conversational AI, this upgrade positions Gemini Live to compete strongly in the growing AI assistant market.
However, not everyone is convinced about the explosive AI growth; some leaders, like OpenAI's Sam Altman, have even warned about a possible AI bubble (AI Bubble Altman Warning). Whether or not that happens, Google is pushing forward.
What It Means for Android and iOS Users
Gemini Live is moving beyond the role of a simple chatbot.
Now it's a context-aware, camera-powered, real-time assistant that can talk, point, explain, and act, all in one smooth experience.
If you're an Android user, you're in luck. These features will start arriving with the Pixel 10 launch on August 28, and will then expand to more Android devices.
iPhone users won't have to wait too long either: Google says iOS support is coming in the weeks that follow.
As AI becomes a more natural part of our lives, whether through messaging, work tasks, creative tools, or storytelling, it's clear that Gemini Live is designed to fit right in.
It's smart, it's visual, and now, it's starting to sound a lot more like you and me.