iOS 14 lets deaf users set alerts for important sounds, among other clever accessibility perks – TipsClear
The latest version of iOS adds some smart features aimed at people with hearing and vision impairments, but some of them may be useful to just about anyone.
The most compelling new feature is probably sound recognition, which creates a notification whenever the phone detects one of a long list of common noises that users may want to know about: sirens, dog barks, smoke alarms, car horns, doorbells, running water, appliance beeps, and more. A company named Furenexo made a standalone device that did something similar, but it is good to have the capability built in.
Users can have the notifications sent to their Apple Watch as well, so someone waiting for the oven to reach temperature doesn't have to keep checking their phone. Apple is working on adding more people and animal sounds, so there is room for the system to grow.
The utility of this feature for people with hearing impairments is obvious, but it is also good for anyone who gets lost in their music or podcasts and forgets that they let the dog out or are expecting a package.
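At its core, the feature maps recognized sound labels to notifications. The sketch below illustrates that idea only; the label names, confidence scores, and threshold are assumptions for the example, not Apple's actual implementation.

```python
# Minimal sketch of a sound-recognition alert loop, assuming a hypothetical
# classifier that yields (label, confidence) pairs for each chunk of audio.

ALERT_SOUNDS = {"siren", "dog_bark", "smoke_alarm", "car_horn",
                "doorbell", "running_water", "appliance_beep"}

def alerts_for(detections, threshold=0.8):
    """Return notification messages for alert-worthy sounds detected
    above a confidence threshold."""
    messages = []
    for label, confidence in detections:
        if label in ALERT_SOUNDS and confidence >= threshold:
            messages.append(f"Sound detected: {label.replace('_', ' ')}")
    return messages

# Only confident detections of alert-worthy sounds produce a message:
print(alerts_for([("dog_bark", 0.93), ("speech", 0.99), ("doorbell", 0.42)]))
# → ['Sound detected: dog bark']
```

In the example, ordinary speech is ignored and the low-confidence doorbell detection is filtered out, which is roughly the behavior you would want to avoid spamming the user.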
Also new in the audio department are what Apple is calling “personal audiograms,” custom EQ settings based on how well the user hears different frequencies. It is not a medical device (it is not intended to diagnose hearing loss or anything like that), but a handful of audio tests can tell whether certain frequencies need to be boosted or attenuated for you. Unfortunately, the feature only works with Apple-branded headphones, for some reason.
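As a toy illustration of the idea behind an audiogram-driven EQ, here is a sketch that turns per-band hearing thresholds into boost amounts. The band names, reference level, and boost cap are invented for the example; Apple's actual algorithm is not public.

```python
# Toy "personal audiogram" EQ: bands where the quietest audible level (in dB)
# is worse than a reference get a compensating boost, capped at a safe maximum.
# REFERENCE_DB and MAX_BOOST_DB are assumed values, not Apple's.

REFERENCE_DB = 20   # assumed "normal" hearing threshold
MAX_BOOST_DB = 12   # cap so compensation never gets extreme

def eq_gains(thresholds):
    """Map {band: measured threshold in dB} -> {band: EQ boost in dB}."""
    gains = {}
    for band, measured in thresholds.items():
        deficit = measured - REFERENCE_DB       # how much worse than reference
        gains[band] = min(max(deficit, 0), MAX_BOOST_DB)
    return gains

print(eq_gains({"250Hz": 20, "1kHz": 28, "4kHz": 45}))
# → {'250Hz': 0, '1kHz': 8, '4kHz': 12}
```

The shape of the output matches what the feature promises: bands you hear fine are left alone, while bands you struggle with get turned up.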
Real-time text (RTT) conversation is an accessibility standard that basically sends text chat over voice-call protocols, allowing seamless conversation, and access to emergency services, for people who don't speak. iPhones have supported it for some time, but users no longer need to stay in the call app for it to work: the conversation now appears in notifications while you play games or watch videos.
A final feature intended for the hearing impaired is an under-the-hood change to group FaceTime calls. Usually the video switches automatically to whoever is speaking, but of course sign language is silent, so the view never focuses on signers. As of iOS 14, the phone will recognize those motions as sign language (though not any specific sign) and switch the view to that participant.
Apple’s accessibility features for those with little or no vision are already solid, but there is always room to grow. VoiceOver, the smart screen-reading feature that has been around for over a decade now, is enhanced with a machine learning model that can recognize more interface items, even when they are not properly labeled, including in third-party apps and content. These improvements are making their way to the desktop as well, but not just yet.
iOS’s descriptive chops have also been upgraded: by analyzing the contents of a photo, it can now describe them in richer ways. For example, instead of saying “two people sitting,” it might say “two people sitting at a bar having a drink,” or instead of “dog in a field,” “a golden retriever playing in a field on a sunny day.” (I’m not 100% sure it can get the breed right, but you get the idea.)
Magnifier and rotor controls have also been improved, and large blocks of Braille text will now auto-pan.
Developers who are blind or have low vision will be happy to hear that Swift and Xcode have received many new VoiceOver options, ensuring that common tasks such as code completion and navigation are accessible.
Back tappin’
“Back Tap” is new to Apple devices but will be familiar to Android users who have seen similar features on Pixel phones and others. It lets users trigger shortcuts by tapping the back of the phone two or three times, making it super easy to invoke a screen reader while your other hand is holding a dog leash or a cup of tea.
As you can imagine, the feature is useful to just about anyone, since you can customize it to trigger all kinds of shortcuts or tasks. Unfortunately, for now it is limited to phones with Face ID, which leaves iPhone 8 and SE users, among others, out in the cold. It is hard to imagine that any secret tap-detection hardware is involved; the feature almost certainly uses the accelerometers that have been fitted to iPhones from the beginning.
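To illustrate how accelerometer-based tap detection could work in principle, here is a toy sketch that counts sharp spikes landing close together in time. The thresholds, sample rate, and debounce window are all invented for the example; a real implementation would filter sensor noise far more carefully.

```python
# Toy double/triple tap detector over accelerometer magnitude samples.
# All constants (threshold, 100 Hz rate, 50 ms debounce, 0.4 s gap) are
# illustrative assumptions, not anything Apple has documented.

def detect_tap_gesture(samples, spike_threshold=2.5,
                       sample_rate_hz=100, max_gap_s=0.4):
    """Return 2 or 3 if samples contain a double or triple tap, else 0."""
    spike_times = [i / sample_rate_hz
                   for i, a in enumerate(samples) if abs(a) > spike_threshold]
    # Collapse consecutive over-threshold samples into single tap events.
    taps = []
    for t in spike_times:
        if not taps or t - taps[-1] > 0.05:   # 50 ms debounce
            taps.append(t)
    # A gesture is a run of taps, each within max_gap_s of the previous one.
    count = 1 if taps else 0
    for prev, cur in zip(taps, taps[1:]):
        count = count + 1 if cur - prev <= max_gap_s else 1
    return count if count in (2, 3) else 0

quiet = [0.0] * 100                 # one second of stillness at 100 Hz
double = quiet[:]
double[10] = double[30] = 3.0       # two spikes, 0.2 s apart
print(detect_tap_gesture(double))   # → 2
```

A single bump (say, setting the phone down) produces only one spike and is rejected, which is why grouping taps in time matters more than the spike threshold itself.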
Apple is no stranger to arbitrarily withholding features from older or cheaper devices, such as notification expansions that are not available even on the brand-new SE. But it is unusual to do so with an accessibility feature. The company did not rule out the possibility that Back Tap will make its way to button-bearing devices, but would not commit to the idea. Hopefully this useful feature will become more widely available soon, but only time will tell.