
6 new Gemini Live features Google announced at I/O 2025

Published May 20th, 2025 1:45PM EDT
Gemini Live. Image: Christian de Looper for BGR


Google unveiled Project Astra last year at I/O, showing off nascent AI technology that allows mobile users to talk to Google’s AI in real time using conversational language. You might ask the AI to find stuff on the web for you or share your camera and screen so it can see what you see and provide guidance.

Some of those features are available via Gemini Live, Google’s AI-powered assistant for Android and iPhone. But Google isn’t stopping there. It announced several new Project Astra tricks coming to Gemini Live soon, in addition to making its best feature free for Android and iPhone users.

Camera and screen-sharing go free

Having the AI look at what you see in real life or on your screen is a key function for any AI-powered assistant. Google wants Gemini to be more powerful than ever, so it’s making the camera and screen-sharing features in Gemini Live free for all Android and iPhone users. The features will start rolling out on Tuesday.

Gemini Live features. Image source: Google

Gemini Live will also integrate with more Google apps soon, starting with Google Maps, Calendar, Tasks, and Keep.

Google demoed new Project Astra capabilities at I/O 2025 by showing a video of Gemini Live helping with everyday activities, like fixing a bike.

Finding manuals online and scrolling for information

In the video, the user asks Gemini Live to look for the manual for the bike he’s repairing. The AI browses the web, finds the document, and asks what the user wants to see next.

The user then tells Gemini Live to scroll the document until it finds a section about brakes. The Android phone’s screen shows Gemini Live doing just that and finding the information.

This kind of agentic behavior suggests Gemini Live will be able to access specific information online, even within documents.

Find the right YouTube clip

The user then asks Gemini Live to find a YouTube video that shows how to deal with a stripped screw. Gemini delivers.

Look up information online while sharing your camera

The chat continues with the user asking Gemini Live to find information in Gmail about the hex nut size he needs, all while showing the AI the available parts in his garage through the camera.

Gemini Live: Camera-sharing and multimodal functionality. Image source: Google

Gemini Live surfaces the information and highlights the correct part in the live video feed. It’s a mind-blowing feature to have at your fingertips.

Making calls on your behalf in the background

Next, the user asks Gemini Live to find the nearest bike shop and call them to ask about a specific part.

The AI doesn’t respond immediately, since the call involves a third party. But Gemini Live tells the user it will follow up with the info once it has it.

Gemini Live: Making a call on behalf of the user. Image source: Google

The user keeps talking to Gemini Live while the AI handles the call in the background.

Once the call is done, Gemini Live provides the needed info while continuing to manage other tasks in parallel.

Handling multiple speakers without losing focus

While Gemini Live is on the phone with the bike shop, the user asks a follow-up question about the manual. At the same time, someone else asks the user if they want lunch.

Gemini Live pauses but keeps track of everything. Once the user replies to the lunch question, the AI resumes the conversation about the manual without missing a beat.

Context-aware online shopping

At the end of the clip, the user asks Gemini Live to find dog baskets for his bike. The AI surfaces suggestions that would fit his dog, clearly recognizing the pet from Google Photos or past interactions.

Gemini Live can also make purchases, likely through Project Mariner. We don’t see a purchase completed in the demo, but when the AI confirms the bike shop has the needed tension screws, it offers to place a pickup order.

These new Gemini Live features won’t roll out immediately. Google is collecting feedback from trusted users first. Once ready, the features will be available on mobile devices and Android XR wearables.

As a longtime ChatGPT user, I can already say I envy these capabilities. Hopefully, OpenAI is working on something similar.

Chris Smith Senior Writer

Chris Smith has been covering consumer electronics ever since the iPhone revolutionized the industry in 2007. When he’s not writing about the most recent tech news for BGR, he closely follows the events in Marvel’s Cinematic Universe and other blockbuster franchises.

Outside of work, you’ll catch him streaming new movies and TV shows, or training to run his next marathon.