Sunday, December 3, 2023

Apple reveals early iOS 17 features, including creating a duplicate of your voice

Apple today announced a series of new accessibility features for the iPhone and iPad that, according to the company, will arrive next year, most likely alongside iOS 17. These are, therefore, the first features the company has revealed of the next iPhone update. The new capabilities are intended to improve the day-to-day lives of people with cognitive, visual, hearing, or mobility impairments, and they combine hardware and software features with machine learning.

One of the new accessibility features arriving with iOS 17 is Assistive Access. It is a mode that “distills” apps, keeping only their essential functions and adjusting elements such as the size of buttons and text. With Assistive Access, for example, the Phone and FaceTime apps merge into a single app with both features, the Camera app is reduced to a single mode with a large shutter button, and Messages shows only an emoji keyboard and the option to send a video.

Apple has also announced Live Speech for iOS 17, a new accessibility feature for people who cannot speak or are at risk of losing the ability to speak, such as those diagnosed with ALS (amyotrophic lateral sclerosis). With Live Speech, users simply type on the screen and the iPhone speaks the text aloud. The feature also includes a section where frequently used phrases can be saved and activated instantly, making conversation more fluid.

Live Speech in iOS 17 will also allow users to create a “personal voice” that the iPhone can later use to reproduce the user’s own voice. To do this, the user reads and records a series of phrases for about 15 minutes, training the iPhone so it can then speak any other phrase in that voice.

Other accessibility improvements coming soon to iOS 17

The iPhone’s Magnifier app in iOS 17, meanwhile, will be able to read aloud the buttons a user points at with their finger. This is a feature for people with visual impairments, and it makes use of “input from the camera app, LiDAR scanner, and machine learning,” Apple says. The user only has to aim the camera at a button and indicate it with their finger; the iPhone then detects the text and reads it aloud.

There are a number of other accessibility features coming soon to the iPhone and iPad as well. They are the following:

  • In iOS 17, users with hearing impairments will be able to pair Made for iPhone hearing devices with a Mac.
  • Voice Control in iOS 17 will add phonetic suggestions for text editing, so users can select the correct word when it sounds similar to another.
  • The switches used with Switch Control can be used as a game controller.
  • It will be possible to pause moving images, such as GIFs, in apps like Messages and Safari.
  • VoiceOver will allow users to adjust the speed at which Siri speaks, choosing between 0.8x and 2x.
