Google has officially announced the rollout of a powerful Gemini AI feature that means the AI can now see.
This started in March, when Google began to show off Gemini Live, but it’s now become more widely available.
Before you get too excited, though, at this stage at least it’s only available on the Google Pixel 9 and Samsung Galaxy S25.
Up until now Gemini has been a little limited, albeit in an impressive way. It’s been able to understand voice, images, PDFs and even YouTube videos. Now, thanks to Project Astra, Gemini can see what’s on your screen too.
This means you can simply give the AI access to your screen and ask questions about what’s on it, and Gemini will be able to understand and answer.
Perhaps even more usefully, you can share your rear camera feed with Gemini to talk about what you’re seeing in the physical world too.
Sound familiar? Yup, this is very similar to the tech Apple Intelligence was teased as getting last year. Yet Apple is rumoured to be struggling with the feature, and we may have to wait until iOS 19, or longer, before we see it arrive on iPhones.
How to activate Gemini Live on your phone
One way to start screen sharing is to launch the Gemini overlay and select “Share screen with Live”.
Another is to launch Gemini Live and then select the screen-share icon.
In either case, a small red timer icon at the top of the screen shows that you’re being viewed and listened to by Gemini Live; you can tap it for more details.
The whole experience is a bit like being on a call with a real person – blurring the lines between human and AI ever further.