Thursday, October 28, 2010

The Poor State of Android Accessibility

The Android mobile platform really excites me. It is open (which cannot be said of the iPhone) and is incredibly successful in many respects. I would almost certainly choose an Android phone... except for the poor state of Android accessibility.

Note: I will primarily discuss access for blind users here, since that is what I am most familiar with. However, some of this applies to other disabilities as well.

In the Beginning

In the beginning, there was no accessibility whatsoever in Android. Designing with accessibility in mind from the start would have made things much easier, but as is sadly so often the case, this wasn't done. Nevertheless, many other platforms have managed to recover from this oversight, some with great success.

Eyes-Free Project

Then came the Eyes-Free Project, which created a suite of self-voicing applications to enable blind users to use many functions of the phone. Requiring blind users to use these special applications limits the functionality they can access and completely isolates them from the experience of other users. This is just a small step away from a device designed only for blind users. I guess this is better than nothing, but in the long-term, this is unacceptable.

Integrated Accessibility API and Services

With the release of Android 1.6 came an accessibility API integrated into the core of Android, as well as a screen reader (Talkback) and other accessibility services. A developer outside Google also began working on a screen reader called Spiel. This meant that blind users could now access standard Android applications just like everyone else.

Unfortunately, the Android accessibility API is severely limited. All it can do is send events when something notable happens in the user interface. An accessibility service such as a screen reader can query these events for specific information (such as the text of an object which has been activated), but no other interaction or queries are possible. This means it isn't possible to retrieve information about other objects on the screen unless they are activated, which makes screen review impossible among other things. Even the now very dated Microsoft Active Accessibility (the core accessibility API used in Windows), with its many limitations and flaws, allows you to explore, query and interact with objects.
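To make the limitation concrete, here is a toy model in plain Java (not real Android code; all class and method names are hypothetical) contrasting an event-push-only API, like the one described above, with a queryable API in the MSAA mould:

```java
import java.util.List;

public class EventOnlyApiSketch {

    // Hypothetical stand-in for an accessibility event: it describes the one
    // object that just did something, and nothing else.
    static class Event {
        final String sourceClass;
        final String text;
        Event(String sourceClass, String text) {
            this.sourceClass = sourceClass;
            this.text = text;
        }
    }

    // An event-only screen reader can announce whatever is pushed at it...
    static String announce(Event e) {
        return e.sourceClass + ": " + e.text;
    }

    // ...but holds no reference back into the UI. A query-capable API would
    // additionally expose something like this, letting the screen reader walk
    // and read every object on screen, which is what screen review needs:
    interface QueryableNode {
        List<QueryableNode> children(); // enumerate other objects
        String text();                  // read any object, not just the active one
    }

    public static void main(String[] args) {
        // The service only ever sees the stream of events; from this event
        // alone there is no way to reach the other objects on the screen.
        System.out.println(announce(new Event("android.widget.Button", "Send")));
    }
}
```

The sketch isn't Android's actual API; it only illustrates the architectural gap: without something like `QueryableNode`, everything not currently raising events is invisible to the screen reader.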

Inability to Globally Intercept Input

In addition, it is not possible for an accessibility service to globally intercept presses on the keyboard or touch screen. Not only does this mean that an accessibility service cannot provide keyboard/touch screen commands for screen review, silencing speech, changing settings, etc., but it also makes touch screen accessibility for blind users impossible. A blind user needs to be able to explore the touch screen without unintentionally activating controls, which can't be done unless the screen reader can provide special handling of the touch screen.

Inaccessible Web Rendering Engine

The web rendering engine used in Android is inaccessible. In fact, it's probably impossible to make it accessible at present due to Android's severely limited accessibility framework, as a user needs to be able to explore all objects on a web page. This means that the in-built web browser, email client and most other applications that display web content are inaccessible. This is totally unacceptable for a modern smartphone.

IDEAL Apps4Android's Accessible Email Client and Web Browser

IDEAL Apps4Android released both an accessible email client and web browser. The accessibility enhancements to the K9 email client (on which their application is based) have since been incorporated into K9 itself, which is fantastic. However, access to the web still requires a separate "accessible" web browser. While other developers can also integrate this web accessibility support into their applications, it is essentially a set of self-voicing scripts which need to be embedded in the application. This is rather inelegant and is very much "bolt-on accessibility" instead of accessibility being integrated into the web rendering engine itself. This isn't to criticise IDEAL: they did the best they could given the limitations of the Android accessibility API and should be commended. Nevertheless, it is an unsatisfactory situation.

More "Accessible" Apps

There are quite a few other applications aside from those mentioned above that have been designed specifically as "accessible" applications, again isolating disabled users from the normal applications used by everyone else. Again, this isolation is largely due to Android's severely limited accessibility framework.


Unfortunately, even though Android is open source, solving this problem is rather difficult for people outside the core Android development team because it will require changes to the core of Android. The current accessibility framework needs to be significantly enhanced or perhaps even redesigned, and core applications need to take advantage of this improved framework.


While significant headway has been made concerning accessibility in Android 1.6 and beyond, the situation is far from satisfactory. Android is usable by blind users now, but it is certainly not optimal or straightforward. In addition, the implementation is poorly designed and inelegant. This situation is only going to get messier until this problem is solved.

I find it extremely frustrating that Android accessibility is in such a poor state. It seems that Google learnt nothing from the accessibility lessons of the past. This mess could have been avoided if the accessibility framework had been carefully designed, rather than the half-done job we have now. Good, thorough design is one of the reasons that iPhone accessibility is so brilliant and "just works".


  1. Hi, interesting post. I agree with your overall assessment of the bolt-on apps that the Eyes-Free project started with. I much prefer a true, robust accessibility API and using the same applications as everyone else.
    I did find your comment about the API lacking enough abilities to enable touch screen accessibility interesting. I am not familiar enough with Android's API to agree or disagree; however, I have a question. If this is so, how does the "touch exploration" feature on Motorola's Droid2 work? I've not used it, only read about it on the Eyes-Free blog, and the post does point out that it works in only limited contexts. But perhaps there is potential here after all?

  2. Today I gave a presentation on mobile a11y for web apps and thought I'd give a demo of TalkBack or other eyes-free stuff. However, I couldn't get anything working on my Dell Streak tablet (no keyboard). I got basic operation with native apps but nothing useful in the browser. I admit I spent little time on it, and Android 1.6 may be lacking (though the a11y API came in then). @ppatel was critical of Android a11y at a11yDC recently, so I was interested in your assessment here.

    The gaps are the a11y API, apps and the browser, as you say.

    So perhaps we should get some community action and not just rely on Google? But then Google don't encourage it with their show-and-tell release process. Still, I'd like to see great switch and other alternative input support for projects like Tekla. I'd be interested in working on it.

  3. @Travis: Very good question. I wondered this myself when I saw the post about the touch exploration feature on the Droid2 a while ago. Manufacturers can customise the UI, so I suspect they've modified some parts of the UI to allow this, although I have no way of knowing for sure. The disadvantage of doing it this way is that it may only work in limited contexts and of course won't work on all Android phones. Still, kudos to Motorola for doing this.

    @Steve: The problem is that anyone who wants to work on this would need a developer phone, as they'd need to customise the main firmware. We could just use the emulator, but this isn't conducive to "real world" testing and development. Also, getting such changes back into the core may be fairly difficult due to Google's processes. Regarding alternative input, Android already supports the implementation of alternative keyboards by third parties, so alternative input methods can already be added without changes to the core.

  4. I'm not surprised one bit about the lack of accessibility on Android phones. I tried out a Motorola Charm - a nice phone with both a touch screen and a keyboard - hoping it would replace my ever-reliable Motorola Q (which is totally accessible with Mobile Speak). No luck at all; I tried for over two weeks, to no avail, and sent it back, my hopes crushed. Since then I have settled for an 8GB Apple iPhone 3GS, which I sort of love. At least it has VoiceOver, but still, not all apps are made accessible with it; I've found several that aren't.

  5. Thanks, Jamie, for posting this. It is important that this post came from an experienced programmer, like yourself, and is in no way a bunch of ungrounded complaints.
    Let's see what Android 3.0 brings to the world of accessibility.

  6. @Jamie re dev phone - good point, and I'm forgetting my background in embedded. Sure, some of the stack can be worked on in an emulator, and the emulator might be enhanced for a11y, but yes, much would need a real device. Sponsorship?

    re changes to core - quite. Hence my comments about the 'show and tell' release process. For open source it's pretty closed development from Google.

    re alt input - yes basic HID is there, but I'm not sure it's good enough for all use cases. It's high on my list of things to explore. And then the events need to map through to the browser.

    re browser - I wonder how Mozilla Fennec will pan out. I can't try it as I'm stuck on Android 1.6 till Dell manage to get their act together or I root it. Another thought is that perhaps a WebKit-based browser would be good, using the work Joanie has been doing for Orca a11y?

    re API - for goodness' sake, we need something in the AT-SPI/IA2 mould. That should have been obvious.

    @vic I'm not holding my breath. Do you know something? As Jamie indicates, it's a big problem and complex, and we've no evidence that Google are into it. Sure, Charles and TV have a few sweeteners - but we could have a great open source platform for a11y and AT, giving people mainstream devices that meet their a11y needs and not 'special' kit that sucks.

    My view is that web apps are a critical part of the a11y future for many reasons. Sorting that problem needs a big stack solution (markup -> browser -> API -> AT -> user) and solving that should hit many native app issues too.

    One thought is that I could start a Mozilla Drumbeat project for folks to rally around and let Google know there is a need and an opportunity here. That might just get some action? Any thoughts?

  7. @Steve:
    As far as I know, Android's in-built web rendering engine is based on Webkit. Webkit has a core accessibility abstraction, but this needs to be exposed to the platform accessibility framework.
    Re API: I think we can do better than IA2/ATK/AT-SPI. Binary interfaces are a pain, as changes cause backwards compatibility issues. However, it needs to expose at *least* that much info.

  8. @James re: APIs - interesting thoughts there. I recall the first rule of COM (binary): don't morph an interface - create a new one.

  9. @Steve: Very true. However, the problem is that if you want to replace a method that was badly designed in the first place, you inevitably end up subclassing the old interface, so you have a lot of never-used methods lying around. Also, every time you add a new interface, you have to create a new COM proxy dll, etc. It just gets messy.

    Anyway, the specific implementation details aren't so important as long as the API has all of the required functionality. My point was that while we want all of the functionality of ATK/AT-SPI/IA2, there's no reason to be restricted by those, either.

  10. @James 'My point was that while we want all of the functionality of ATK/AT-SPI/IA2, there's no reason to be restricted by those, either.'

    Agreed - nothing is fixed after all. Thinking of new touch interactions like pinch.

  11. Just found this blog.
    What a great post.
    We are in the UK and design and make products for blind and visually impaired people, and have been investigating Android/Linux for use in the VI world.
    All our work is on Apads, not phones. Currently, quite a few phone apps work well on the pad, but fail the basic accessibility requirement of object voicing.
    There is not even a shut-down function in TalkBack!
    The iPhone's VoiceOver appears to be the ideal solution, but as Jamie says, the Android accessibility API is just not up to the level needed for application development.
    As it seems impossible to contact anyone on the Android developer API team, we will look into whether we can pull in some Linux to work the problem.

  12. What do you say? It is too good for the blind, so it is not good at all?

  13. Interesting article, and a pity that the framework cannot support a11y correctly. It would be best for Google to abandon the existing 'bolt-on' framework and use a better API.

  15. It's not that Google didn't have an a11y team at all. They just haven't yet built an interface for screen readers that is as powerful as even MSAA, let alone IA2/ATK or Microsoft UI Automation. I was hoping I might be able to help them build such infrastructure, but I couldn't even get an interview. I have to wonder if they saw my vision impairment as a reason not to go forward with the process. This sort of inability to hire solid vision-impaired programmers might be part of the reason Google a11y is where it's at today.