
Google has rolled out new artificial intelligence (AI) features to Gemini Live that allow the assistant to "see" your screen or the world through your smartphone camera and answer questions about either in real time.
Speaking to The Verge, Google spokesperson Alex Joseph confirmed that the features come nearly a year after Google first demonstrated the “Project Astra” work that powers them.
Related: Gemini Live expands Android support with Astra-powered camera, screen sharing
According to Reddit user Kien_PS, the feature showed up on their Xiaomi phone. On Monday, March 24, the same user shared a video demonstrating Gemini's new screen-reading ability.
It's one of two features Google announced in early March would "start rolling out to Gemini Advanced subscribers as part of the Google One AI Premium plan" later in the month.
The other Astra capability rolling out is live video, which lets Gemini interpret the feed from your smartphone camera in real time and answer questions about it.
For now, both features are available only to Gemini Advanced subscribers, and the company has not said whether or when they will be expanded to the free tier.
Related: Google rolls out ‘Canvas’ and ‘Audio Overview’ to Gemini