>You may have a much better time with an iphone/ipad/ipod touch
The biggest problem with modern smartphones, especially Apple ones, is that there is zero tactile feedback on the screen compared to good old buttons or a keyboard.
While this is true, it turns out the lack of buttons isn't as limiting as you might think. You may not have haptic feedback, but you can still explore the screen using your spatial and tactile senses. By touching the screen and dragging your finger around, you can explore what is displayed; the phone reads aloud the label of whatever item your finger is touching.
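At its core, this explore-by-touch behavior is a repeated hit test: as the finger moves, find which on-screen element contains the current touch point and speak its label. Here's a minimal sketch of that idea (the element layout and function names are made up for illustration, not the real VoiceOver internals):

```python
def element_under_touch(elements, x, y):
    """Return the label of the element containing (x, y), or None."""
    for label, (left, top, width, height) in elements:
        if left <= x < left + width and top <= y < top + height:
            return label
    return None

# A hypothetical home-screen layout: (label, (left, top, width, height)).
home_screen = [
    ("Mail",     (0,   0,   100, 100)),
    ("Safari",   (100, 0,   100, 100)),
    ("Settings", (0,   100, 100, 100)),
]

# Dragging a finger across the screen repeatedly hit-tests the current
# position; whatever label is found is what gets spoken aloud.
print(element_under_touch(home_screen, 150, 50))   # Safari
print(element_under_touch(home_screen, 50, 150))   # Settings
```

The interesting design consequence is that spatial layout still matters to a blind user, even without physical buttons: elements have stable positions that can be learned and revisited by touch.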
For a much more in-depth and eloquent discussion of touch screens and accessibility, check out the Slide Rule paper, which is the research that the VoiceOver design is based on.
Thanks for the reference. Has there been research on radial menus for this use case? E.g. touch anywhere in the middle of the screen to define the center of the radial menu, then move in a circle to discover the radial menu options. This would reduce the surface area to be traversed. If we ever get haptic feedback on mobile devices, this could be used to signal transition between "pie slices" of the radial menu.
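The slice lookup for a radial menu like that reduces to simple angle math: take the vector from the initial touch (the menu center) to the current finger position, and map its angle onto one of N pie slices. A rough sketch of just that mapping, with gesture handling omitted and all names illustrative:

```python
import math

def radial_slice(center, touch, num_slices):
    """Map a touch position to a pie-slice index in 0..num_slices-1.

    Slice 0 starts at the positive x axis; slices proceed
    counter-clockwise, each spanning 360/num_slices degrees.
    """
    dx = touch[0] - center[0]
    dy = touch[1] - center[1]
    angle = math.atan2(dy, dx) % (2 * math.pi)   # normalize to 0..2*pi
    return int(angle / (2 * math.pi / num_slices))

# Finger lands at (200, 200), defining the center of a 4-slice menu.
print(radial_slice((200, 200), (300, 200), 4))  # moving right -> slice 0
print(radial_slice((200, 200), (200, 300), 4))  # moving "up" -> slice 1
```

With haptics, firing a pulse whenever the returned index changes would give exactly the slice-transition feedback described above.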
Have you used VoiceOver? Both iOS and OS X versions have a rotor menu that activates with a two finger touch and twist motion to give access to various sets of items.
I've enabled VoiceOver for testing but had not seen that gesture, thanks for the pointer. It is useful but slightly awkward as you need both fingers to retain contact with the screen while rotating, which is challenging for a 360 degree rotation :)
You don't have to do the full rotation in one go. It's hard to explain, but you basically need to do this (assuming your right hand): put your thumb and index finger on the screen, then do a swipe-left motion with your index finger while keeping your thumb in place. Then lift the index finger, move it back to the starting position, and repeat. With that gesture you can advance the rotor one position at a time.
I was surprised by this too, but iDevices are extremely popular with blind people.
This is one of the few places where I give iDevices a lot of credit. Apple knew that touchscreens have certain disadvantages in that regard, and put a lot of thought into features that overcome those disadvantages. The result is something that's in many ways more usable for a blind person than a full-sized computer.
Source: A family member volunteered to help blind students for a while.
Don't take this the wrong way, but it's obvious from this comment that you've never spent any time working with a blind person on an interface. This idea that a blind person would need tactile feedback comes from a sighted person imagining what it would be like to be blind, not from the experience of actual blind people with buttons vs touchscreens.
This is just not as big of a problem as you think it is. Most blind folks I know rave about their iPhones and generally don't feel limited by them the way they might have with their BrailleNotes or mediocre screenreaders/a11y tools on Android. Many iOS apps also have a fairly accessible default state, assuming devs didn't go overboard with custom everything up the wazoo with no care for accessibility.
I used to work with some blind individuals, and one thing that's hard to relate to is how much more attention they give to things that sighted individuals do not, such as sounds, spatial relationships, etc. I wouldn't think they'd have any problem, for example, opening a specific app on their iPhone solely by its position on the home screen.
Wouldn't you just use Siri for that? I have enough apps that it's mostly too hard remembering where all but the most commonly used live, so I just tell Siri to open the app. Works like a charm.