Feature Story
Google's Gemini Live Can Now Read Phone Screens, Camera Feeds And Answer Queries
Mar 24, 2025 · ndtvprofit.com
The new capabilities allow users to share their screens with Gemini Live and start live video streams, letting the AI scan their surroundings and answer questions about what it observes. Gemini Live's conversational side was demonstrated through its ability to hold organic discussions, adapting to new questions and building on previous answers. However, some AI features may not be available on the Gemini Nano-powered Google Pixel 9a.
Key takeaways
- Google's Gemini Live can read phone screens and camera feeds and answer queries in real time, but is currently available only to select subscribers of the Google One AI Premium plan.
- The features were developed under Google's Project Astra, led by Demis Hassabis, which aims to create a multimodal AI assistant.
- The AI can process audio, video, images, and text, providing real-time answers and suggestions.
- Google demonstrated these capabilities at Mobile World Congress 2025 and released videos showcasing Gemini Live's conversational abilities.