Today’s Google Pixel event showered users with helpful AI features. The phones come with the Tensor G4 processor, built with AI in mind, and Google’s own Gemini models, which are clearly being put to good use.
Capture the best selfie despite poor vision or blindness
The first of these four new accessibility features is called Guided Frame, designed to help people who are blind or have low vision take the best selfie. Guided Frame uses speech to guide the user, making sure they’re pointing the camera at themselves and framing the image in the best possible way. In addition, the feature will suggest whether the user needs to tilt their face up or down for a great shot, before auto-capturing it.
On top of that, the user will be informed if the lighting is poor, or if any other adjustments need to be made, in order to take the best selfie.
The feature will be available from the camera settings for quick access.
Use your Pixel phone’s camera to understand your surroundings
New features are coming to the Pixel-exclusive Magnifier app, which was initially designed to help users with poor vision read small or distant text, or see details in their surroundings more clearly, by zooming in.
Now the Magnifier app can also be used to search for and find specific words in your surroundings, be it an item on a menu or flight information, for example.
In addition, Google says the Magnifier app will also let users:
- Use a picture-in-picture format so you can see both the bigger context of what you’re looking at along with the details on your screen. For example, if you’re at a deli counter and want to take a closer look at the menu board, snap a photo and use picture-in-picture to check out all of your options without losing your place.
- Choose the best lens for the moment in your app settings, whether you’re zooming in with a macro lens to read the fine print or busting out the wide-angle lens to get more context.
- Put the camera on you! Turn on Selfie Illumination to use your front-facing camera as a mirror for your touch-up.
A new Live Transcribe mode – only on foldable phones, though
Android foldable phones are getting a new dual-screen mode for Live Transcribe, which gives users real-time transcriptions of the speech and sounds around them, now displayed on both screens so people on either side of the phone can follow along.
So today, we’re launching a new dual-screen mode for foldable phones, including the new Pixel 9 Pro Fold. With dual-screen mode, you can easily set your phone in a tabletop posture on any surface for better visibility of transcriptions. Now everyone around the table can follow the conversation — whether you’re attending a meeting or having a dinner conversation with friends. – Google
More Live Caption and Live Transcribe languages and supported locations
Korean, Polish, Portuguese, Russian, Chinese, Turkish and Vietnamese are coming to Live Caption, which delivers captions for anything your phone hears, whether from the world around you or a movie you’re watching.
Even in Airplane mode and without an internet connection, Google says it’s possible to use Live Transcribe in up to 15 languages. With an internet connection, that’s up to 120!