Google expands its AI accessibility features across Pixel and Android

Summary

  • Google’s Pixel 9 event showcased the company’s commitment to accessibility with AI updates for Pixel and Android devices.
  • New features like Guided Frame and Magnifier app enhance user experience for visually impaired individuals.
  • Live Caption and Live Transcribe tools expand to include dual-screen mode and support for seven new languages.




Google’s Pixel 9 event is underway, and apart from the annual bombardment of buzzwords and the hardware to go with it, the company is also demonstrating its commitment to making tech accessible to all. In a new blog post, the tech giant has detailed four new AI-powered updates for Pixel and Android accessibility, including enhanced camera support and more languages for its Live Caption and Live Transcribe features.


More options have arrived for low-vision and hard-of-hearing users


Google’s Guided Frame feature, first released with the Pixel 7 series back in 2022, is part of Pixel’s TalkBack accessibility function and offers audio and gesture-based instructions to help you frame the perfect shot.

The feature is especially useful for visually impaired Pixel users, helping them frame not just themselves for selfies, but also pets, food, documents, electronic devices, and vehicles.

Previously limited to TalkBack, the tool can now be enabled from directly within the camera settings, and according to Google, it should now be able to “better focus on subjects even in complex scenes,” offer face filtering in group shots, and provide an overall improved object recognition experience.


Elsewhere, Google’s Magnifier app, which uses your phone’s camera to enhance and enlarge the world around you, is also getting smarter. Now, right within the app’s camera viewfinder, users will be able to find specific words. This can be especially helpful if you want to look at something specific on a food menu, or “if you’re looking for your flight’s departure time at the airport.”

A new picture-in-picture (PiP) mode in the app will also let you see a zoomed-in view while keeping the rest of the scene visible on screen, paired with an option to select which lens you’d like to use for magnification.


Live Caption and Live Transcribe expansion


Google’s Live Caption and Live Transcribe accessibility tools, although similar, serve different purposes: Live Caption provides real-time captions for audio playing on your device, while Live Transcribe transcribes sounds around you, such as in-person conversations.


For Live Transcribe, Google is launching a dual-screen mode exclusive to foldables, which should help everyone in a conversation see their own transcriptions. “With dual-screen mode, you can easily set your phone in a tabletop posture on any surface for better visibility of transcriptions. Now everyone around the table can follow the conversation — whether you’re attending a meeting or having a dinner conversation with friends,” wrote the tech giant.

Live Transcribe’s dual-screen mode side-by-side on a Pixel 9 Pro Fold.

Source: Google



Elsewhere, Google is expanding Live Caption support to seven new languages: Korean, Polish, Portuguese, Russian, Chinese, Turkish, and Vietnamese. These seven languages can also be used with offline Live Transcribe.

Screenshots of Live Caption’s supported languages and the feature in action.

Source: Google


