Apple products will imitate our voice, too, but for a good cause

A host of new features make iPhone and iPad more accessible.

Accessibility matters today not only in the design of our physical environment but also in the design of electronic devices: these gadgets are an integral part of our lives, and in many ways they can help people with disabilities even more than other users.

That’s why Android and iOS smartphones already offer a wide range of accessibility features to help break down barriers, and Apple is now expanding that repertoire with a series of brand-new options.

The Cupertino company announced the software innovations, developed in cooperation with members and representatives of the disability community, ahead of the WWDC developer conference, which runs from June 5 to 9. The features themselves are expected to launch later in the year; for now, Apple has only given the public a brief preview.

The lineup opens with Assistive Access, intended for users with cognitive disabilities, which radically simplifies the iPhone and iPad user interface by condensing the devices’ basic functions into five applications, giving the target audience easier access to calls, messages, the camera, photos, and music. Importantly, these applications are not merely enlarged clones of their standard versions but also offer unique features; Messages, for example, includes an emoji-only button row for simple communication.

Live Speech offers a solution for people with speech impairments: on demand, it reads aloud typed sentences during phone and FaceTime video calls. Users can even save frequently used phrases in advance, making them easier to reuse. Most impressive of all, the Personal Voice feature that complements Live Speech will be able to imitate the user’s own voice after just 15 minutes of read-aloud setup.

Apple intends this artificial intelligence-based option for people at risk of losing their voice to a disease such as ALS, so they can preserve it for later use. Point and Speak, which extends the Magnifier’s capabilities, is similarly impressive: using the camera and the LiDAR scanner found in higher-end iPhones and iPads, it announces what the buttons of another electronic device do as the user points at them.

Beyond these, Apple is preparing further innovations. For example, hearing aids with the Made for iPhone certification can be connected directly to Mac computers, any button on the iPhone and iPad can be turned into a virtual game controller with Switch Control, and the speaking rate of the Siri voice assistant can be fine-tuned (0.8x to 2x).

You can read more about the company’s accessibility developments in the relevant Apple Newsroom blog post.

Source: PC World Online Hírek by pcworld.hu.