Global Accessibility Awareness Day takes place this week, and Apple is one of the companies using the occasion to announce a series of new accessibility features for its products and services, focused on inclusion and innovation.

This year, the company’s announcements span a variety of areas, from eye tracking to music haptics, aiming not only to improve the experience for users with disabilities but also to enrich the experience for users in general.

Built-in eye tracking on iPhone

One of the biggest new features presented by Apple is integrated eye tracking for iPhones and iPads. With this technology, people with motor disabilities gain a new way of interacting with their devices, using only eye movement to navigate the software. This innovation eliminates the need for additional hardware, making accessibility more widely available and integrated into existing devices.

The system uses the front camera of compatible devices, such as iPhones and iPads with an A12 chip or later, to track the user’s eye movements. Through “Dwell Control”, users can look at elements on the screen, such as apps and menus, and select them simply by holding their gaze on the desired item for a moment. The feature is designed primarily to make life easier for people with motor disabilities, but it also offers a more intuitive, hands-free way of interacting for all users.
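Apple has not published how Dwell Control is implemented, but the basic gaze-and-hold selection logic can be sketched in a few lines of Swift. The snippet below is purely illustrative; names such as DwellSelector and the one-second threshold are assumptions, not Apple’s code:

```swift
import Foundation

// Illustrative sketch of dwell-based selection: a target is "selected"
// once the gaze has rested on it for a set amount of time.
final class DwellSelector {
    private var gazeStart: Date?
    private var currentTarget: AnyHashable?
    let dwellThreshold: TimeInterval = 1.0  // assumed hold time of ~1 second

    /// Call on every gaze sample; returns true when a selection fires.
    func update(target: AnyHashable?) -> Bool {
        guard let target else {          // gaze left all targets: reset
            gazeStart = nil
            currentTarget = nil
            return false
        }
        if target != currentTarget {     // gaze moved to a new element
            currentTarget = target
            gazeStart = Date()
            return false
        }
        guard let start = gazeStart else { return false }
        if Date().timeIntervalSince(start) >= dwellThreshold {
            gazeStart = nil              // fire once, then wait for a new dwell
            return true                  // select the focused element
        }
        return false
    }
}
```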

Additionally, Apple is working to ensure that eye tracking works across third-party apps, providing a consistent and accessible experience throughout iOS and iPadOS. It is worth remembering that iOS and iPadOS already supported eye tracking, but only through connected eye-tracking hardware; what is new is the ability to do this without any extra equipment.

Apple introduces personalized Vocal Shortcuts


Another highlight of Apple’s announcements is custom Vocal Shortcuts, which aim to offer a more convenient and efficient way to interact with iOS and iPadOS devices hands-free. With this feature, users can create personalized voice commands to perform a variety of tasks, from opening apps to triggering specific actions within supported apps.

The system uses on-device artificial intelligence to create personalized models for each user, ensuring accurate recognition adapted to individual needs. These shortcuts can be configured to respond to specific keywords, phrases or even custom utterances, offering a highly personalized way of interacting with the device.

When the user speaks one of these phrases, Siri understands the command and executes it immediately, without needing to be invoked first.
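Apple did not detail the developer-facing side of Vocal Shortcuts. As a hedged sketch, an app could expose an action through the existing App Intents framework, which custom voice phrases can then trigger; the LogWaterIntent below is a made-up example:

```swift
import AppIntents

// Sketch only: exposing an in-app action that a user-defined voice
// phrase could trigger. The "log water" action is hypothetical.
struct LogWaterIntent: AppIntent {
    static var title: LocalizedStringResource = "Log a Glass of Water"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // app-specific logic would run here, e.g. saving the entry
        return .result(dialog: "Logged one glass of water.")
    }
}
```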

Additionally, Apple is introducing the “Listen for Atypical Speech” feature, which uses machine learning to recognize a user’s unique speech patterns and adapt speech recognition accordingly. This not only improves recognition accuracy for users with speech impairments or atypical speech, but also provides an experience tailored to each person. The idea is very similar to Google’s Project Relate, which likewise uses technology to better understand people with atypical speech.

This feature was built in partnership with the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign, which also collaborates with other technology giants, such as Google itself and Amazon, on the development of resources and products for this purpose.
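Apple has not said how “Listen for Atypical Speech” works internally. For comparison, the sketch below shows the baseline it improves on: standard on-device speech recognition using Apple’s Speech framework:

```swift
import Speech

// Baseline sketch: ordinary on-device transcription with the Speech
// framework. "Listen for Atypical Speech" personalizes beyond this.
func transcribe(fileAt url: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.isAvailable else { return }

    let request = SFSpeechURLRecognitionRequest(url: url)
    request.requiresOnDeviceRecognition = true  // keep audio on the device

    recognizer.recognitionTask(with: request) { result, _ in
        if let result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}
```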

Music Haptics brings a new dimension to the musical experience

GIF showing how Apple’s Music Haptics feature works

For people who are deaf or hard of hearing, music can be a powerful form of expression and connection. With this in mind, Apple is introducing Music Haptics in Apple Music and other apps, offering a richer, more immersive music experience.

This technology uses haptic vibrations synchronized with the audio to create a more complete sensory experience, allowing people who are deaf or hard of hearing to experience music in a whole new way, through taps, textures and vibrations. Additionally, Apple is making the technology available as an API, allowing developers to bring music haptics to their own apps and services and further expanding the reach of this innovation.
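The announcement did not include details of the new Music Haptics API. As a rough sketch of the underlying idea, audio-synchronized vibration patterns can already be built with Apple’s existing Core Haptics framework:

```swift
import CoreHaptics

// Rough sketch: four haptic "beats", one every half second, mimicking
// a simple rhythm. Real music haptics would follow the actual audio.
func playBeatHaptics() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    let events = (0..<4).map { beat in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
            ],
            relativeTime: Double(beat) * 0.5
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```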

Apple also brings more accessibility to CarPlay

For drivers with disabilities, the driving experience can be challenging. With this in mind, Apple is introducing a series of updates to CarPlay, aiming to make the user experience more accessible and inclusive.

One of the main new features is improved Voice Control, which allows drivers to control CarPlay simply by speaking. Additionally, Apple is introducing Color Filters along with support for bold and larger text, making it easier for drivers with low vision to read on-screen menus and alerts.

Another innovation is the introduction of Sound Recognition, which alerts drivers to external noises, such as sirens or horns, to promote safer and more aware driving. When the system identifies one of these sounds, it displays a visible alert at the bottom of the screen, so a driver who cannot hear the sound can still see the warning.

CarPlay showing a noise alert
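Sound Recognition in CarPlay is a built-in system feature, but the same kind of detection is possible in an app today with Apple’s SoundAnalysis framework and its built-in classifier. In the sketch below, the 0.8 confidence threshold is an arbitrary assumption:

```swift
import AVFoundation
import SoundAnalysis

// Sketch: flag sirens detected in a live audio stream using the
// built-in sound classifier from the SoundAnalysis framework.
final class SirenObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        if top.identifier.contains("siren"), top.confidence > 0.8 {
            print("Siren detected: show a visible alert")
        }
    }
}

func makeAnalyzer(format: AVAudioFormat, observer: SirenObserver) throws -> SNAudioStreamAnalyzer {
    let analyzer = SNAudioStreamAnalyzer(format: format)
    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    try analyzer.add(request, withObserver: observer)
    // Feed microphone buffers via analyzer.analyze(_:atAudioFramePosition:)
    return analyzer
}
```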

For those who experience motion sickness when using an iPhone or iPad in a moving vehicle, Apple also announced a new feature called Vehicle Motion Cues, which promises to partially relieve that discomfort. Since motion sickness generally results from a sensory conflict between the static content being viewed and the movement of the vehicle, the feature aims to improve the synchrony between the senses involved by displaying animated dots on the screen.

Image showing how Apple’s Vehicle Motion Cues work

When activated, the feature positions these dots along the four edges of the screen, where they oscillate in response to detected movement. For example, if the vehicle moves forward or accelerates, the dots shift backward, simulating a reaction to the increase in speed in that direction.
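Apple has not described how Vehicle Motion Cues is implemented. As a purely illustrative sketch, the dot behavior could be approximated by feeding accelerometer readings from Core Motion into an on-screen offset; the 40-point scale factor is an arbitrary assumption:

```swift
import CoreMotion

// Illustrative sketch: map device acceleration to an offset for
// on-screen "motion cue" dots. Not Apple's implementation.
let motionManager = CMMotionManager()

func startMotionCues(updateDots: @escaping (_ x: Double, _ y: Double) -> Void) {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0  // 60 Hz updates

    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let accel = motion?.userAcceleration else { return }
        // Dots shift opposite to acceleration: braking and accelerating
        // move them along one axis, cornering along the other.
        let scale = 40.0  // arbitrary points-per-g for illustration
        updateDots(-accel.x * scale, -accel.y * scale)
    }
}
```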

Apple heads toward a more inclusive future

In addition to the features already announced, the company revealed that it is working on more capabilities that will arrive in its products later to further improve accessibility and inclusion. Among them are, for example, Live Captions in visionOS, a Reader Mode in Magnifier, new Braille input options and a virtual trackpad for AssistiveTouch.

It is not yet known exactly when the new features will reach Apple devices, but they generally arrive in future versions of the company’s operating systems. With Apple’s WWDC developer conference taking place in a few weeks, it is very likely the company will share more details and demonstrations of these tools there.

Source: Engadget

Source: https://www.hardware.com.br/noticias/2024-05/apple-anuncia-controle-de-iphone-so-com-os-olhos-e-mais-recursos-de-acessibilidade.html


