
Camera and Screen Sharing is rolling out to iPhone and Android devices from today.
It allows you to point your phone at an object in the real world and have a conversation with Gemini about it.
Google has introduced a new feature for Gemini Live that will describe your surroundings using just your phone.
Camera and Screen Sharing enables Android and iPhone users to point their phone cameras at items in the real world and ask Gemini questions about them. Gemini Live then explains the object in conversational form, and you can ask follow-up questions to better understand the subject.
The feature was first announced in April, but it was revealed during Google I/O 2025 that it's finally available on Android devices and iPhones.
What is Google's Camera and Screen Sharing?
If you grant Gemini Live access to your phone's camera, you just need to point your device at an item, a building or anything else you want information on. You can then ask Gemini a question about it, such as an opinion on the colour of an item of clothing you want to buy, and you should receive an answer.
Gemini will recognise the object, so you can then chat about it in natural conversation.
Screen sharing is also possible, so you could share your screen while browsing a shopping site, say, and ask questions about gift choices or price matches.
Is there anything else from Google I/O I need to know about?
Google I/O is still underway at the time of writing, but there have already been some notable announcements about Gemini and enhancements to the AI system.
We've also learned about real-time translation from Spanish to English and vice versa, which is rolling out to Google Meet. More languages are being added in the coming weeks.