
I checked the Magnifier app on my iPhone 6s and couldn't find anything related to narrating what it sees.

It's very possible that recognition happens locally; Apple has been making efforts in this direction for other features (Photos). But maybe the 6s is just too old to support it.



As far as I understand, the narration is handled by the VoiceOver feature, which is a generic accessibility feature, the same one that reads screen contents and controls aloud. In the demo, Kristy taps the screen each time to invoke the VoiceOver description.

In fact, I'm not sure the Magnifier is necessary at all for the descriptions: this page says that VoiceOver simply does this in the Camera app: https://support.apple.com/en-gb/guide/iphone/iph37e6b3844/io...

And this page says that 'Image Descriptions' in the VoiceOver settings has to be turned on for that: https://support.apple.com/en-us/HT211899

It seems that VoiceOver has been available since at least iOS 12: https://support.apple.com/en-gb/guide/iphone/iph3e2e415f/12....

The iPhone 6s shipped with iOS 9 but is upgradable to the current iOS 15.6, though features may indeed depend on the particular phone, i.e. the hardware.

Also, VoiceOver apparently changes how the phone is controlled, replacing the standard gestures with its own set, so it's not a feature you'd enable just for the camera. Still, it sounds to me like blind users would find it helpful anyway.



