Apple brings AI to iPhone, iPad and Mac that speaks and detects people

  • The devices can generate artificial voices from text that sound so realistic they seem to be the phone's owner speaking

With the iOS 17 operating system, Apple is introducing new accessibility features, including an “Assistive Access” mode that benefits users with cognitive or physical disabilities.

Visual and cognitive assistance
Assistive Access streamlines core functions behind a simple interface, enlarging images and text and reducing the amount of information on screen, for example in the photo gallery, when taking photos, making and receiving calls, listening to music, and sending text messages.

On both iPhone and iPad, fonts and app icons appear larger. These adjustments also apply during video calls, making video communication easier.

Users can also choose between a grid layout and a list layout for displaying apps on the home screen, according to their preference.

Speech assistance
Apple designed a feature called “Live Speech” that lets iPhone, iPad and Mac users type text and have the device read it aloud.

The feature also works during phone calls and FaceTime conversations.
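Reading typed text aloud, as Live Speech does, relies on Apple's on-device speech synthesis. A minimal sketch of that underlying capability, using the public `AVSpeechSynthesizer` API from AVFoundation (this is an illustration of the system's text-to-speech engine, not Apple's Live Speech implementation itself):

```swift
import AVFoundation

// Keep a reference to the synthesizer; speech stops if it is deallocated.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    // Wrap the typed text in an utterance and pick a system voice.
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate

    // The device reads the text aloud asynchronously.
    synthesizer.speak(utterance)
}

speak("Hello, this message was typed, not spoken.")
```

In a real app, a delegate conforming to `AVSpeechSynthesizerDelegate` could track when each utterance starts and finishes, for example to highlight the words as they are read.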

Similarly, for users at risk of losing the ability to speak, an additional feature called Personal Voice analyzes the user’s voice and creates a synthetic version that closely resembles the original.

To create it, the owner of the phone reads around 150 phrases aloud; these recordings then serve as training data for the artificial intelligence that reproduces the characteristics of the person’s voice.
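Apps can request access to a user's trained Personal Voice through a public API introduced in iOS 17. A hedged sketch, assuming the user has already created a Personal Voice in Settings (the filtering step uses the `isPersonalVoice` voice trait; whether any personal voice is found depends on the device):

```swift
import AVFoundation

// Ask the user for permission to use their Personal Voice.
AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else {
        print("Personal Voice access not granted")
        return
    }

    // Personal voices appear alongside the system voices,
    // marked with the .isPersonalVoice trait.
    let personalVoices = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.voiceTraits.contains(.isPersonalVoice) }

    guard let voice = personalVoices.first else { return }

    // Speak with the user's own synthesized voice.
    let utterance = AVSpeechUtterance(string: "This is my voice.")
    utterance.voice = voice
    AVSpeechSynthesizer().speak(utterance)
}
```

The voice itself is trained and stored on the device by the system; third-party apps only receive a ready-to-use `AVSpeechSynthesisVoice` after authorization.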

For users whose vision is impaired or lost entirely, iOS 17 will offer a feature called “Point and Speak”: the phone’s camera opens and the system reads aloud the text the user points at with a finger.

This feature will be built into the Magnifier app and can be combined with its other tools, such as people detection, door detection and image descriptions, to help users navigate their surroundings.