Apple revealed it has been working on improving the cognitive, hearing, and vision accessibility of its products and previewed new features designed to help people with disabilities. The first of these is Live Speech, which allows nonspeaking people to type what they want to say during calls instead of speaking.
Personal Voice lets users create a synthesized model of their own voice, while Detection Mode is aimed at people who are blind or have low vision.
Apple said the features will arrive later this year but did not provide a detailed schedule. We expect them to make their way to iOS 17 and iPadOS 17, as well as some…