Accessibility Technology

The CSUN 2013 Web Track Mega Post

Greetings, fellow web accessibilistas!  (not to be confused with accessiballistas, the little-known and even less-documented accessible siege engine of yore).

As you may have gathered if you followed my live blog posts a couple weeks ago, my interest in attending the CSUN 2013 conference was almost exclusively web-related.  Now that it’s been a couple weeks and I’ve had some time to reflect, I figured it would be a good idea if I consolidated everything into one mega-list.  This year, there were several times when I wished I could have been in two places at once.  Hopefully this gives you a pretty representative sampling of what was on offer web-wise this year.  Follow me @paulschantz for more web-related topics, including accessibility, project management, web development and design philosophy, thoughts on working in higher education, bad clients, off-color humor, and other ephemera.  Enough self-promotion…on with the list!

Pre-Conference Seminar:  Google Accessibility

Day One:  February 27, 2013

Day Two:  February 28, 2013

Day Three:  March 1, 2013


Google Accessibility – Partially Digested Observations

Holy moly, that was an information-packed session today!  And what a difference from the last time I saw Google at #CSUN.

I saw Google’s presentation on accessibility when I attended the #CSUN conference (I believe) four years ago.  At that time, I got the impression that Google was “phoning it in.”  The reps they sent at that time were clearly lower-level, more tech-oriented staff and didn’t present their story to the #CSUN audience in a compelling or memorable way.  If I were cynical, I’d say their approach at that time smacked a bit of technical smugness.

Fast forward four years…

Today, there were no fewer than 15 people from Google, including product managers from their core applications, internal accessibility evangelists, and development staff.  They’re not messing around now.  Here are some of the standout items, from my perspective:

  1. Google’s development team is working hard to standardize keyboard navigation across their core applications.  This is huge, and will pay big dividends for all users in the very near future.
  2. For obvious reasons, Calendar was not mentioned much.  To Google’s credit, they did not evade critical questions.  Calendaring is freakin’ hard – my team made a public web calendar for the CSUN campus a few years back, and I can assure you that effort was no joyride.
  3. Google acknowledges the problem of keyboard shortcut collisions.  Sorry folks, there are no “standard” keyboard shortcuts other than the ones that have come about through the historical cruft of the software industry.  People using niche apps will unfortunately be left in the lurch.  This isn’t all bad though, because…
  4. …Google’s larger plan is to have their entire ecosystem in the cloud.  Like it or not, this is the future of computing.  This hearkens back to a conversation I had with my Computer Science colleagues regarding “the cloud” about five years ago.  My question back then was “what happens when everything is available in the cloud?”  Answer:  “we pick and choose those services that we need and trust.”  Google is building those services today, and from what I can see, I trust that they’re working on it.  BUT…we have to continue to push for more accessibility.  If we don’t evangelize and make it a priority, it just won’t happen.
  5. Speaking of evangelism, I get the distinct sense that the push for accessibility within Google is an uphill battle at times, but the organization is really starting to “get it.”  Working as a director of a web development team in the CSU with responsibilities around ensuring accessibility on my campus, I can relate.
  6. The advances in accessibility built into the Android OS (Accessibility Service Framework and APIs) are downright impressive.  The work around creating an intuitive “navigation language” alone merits a gold star in my opinion.
  7. Google’s position of supporting the current browser version “minus two” is a goddamn blessing and should be shouted from the mountaintops.  I feel very strongly about this, and have written a browser support statement to clarify why I take a similar position with my team’s work.
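For the curious, the “minus two” policy boils down to a simple version check.  Here’s a quick Python sketch of the idea – the browser names and version numbers are purely illustrative, not Google’s actual support matrix:

```python
# Hypothetical sketch of a "current version minus two" browser support
# policy.  The data below is made up for illustration only.

# Latest known major version per browser (illustrative numbers, not real data)
LATEST_MAJOR = {"chrome": 25, "firefox": 19, "safari": 6}

def is_supported(browser: str, major_version: int, window: int = 2) -> bool:
    """Return True if the browser version is the current major release
    or within `window` major releases behind it."""
    latest = LATEST_MAJOR.get(browser.lower())
    if latest is None:
        return False  # unknown browsers fall outside the support policy
    return latest - window <= major_version <= latest

print(is_supported("chrome", 24))  # one behind current -> True
print(is_supported("chrome", 22))  # three behind current -> False
```

The nice thing about stating the policy this way is that it’s a moving window: as new releases ship, old ones age out automatically, and nobody has to argue about supporting some ancient browser forever.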

Maybe it’s just me being cynical again, but I could kind of sense a faint hint of technical smugness today.  Its character was different though, and I think that comes from the audacious scope of what Google is trying to do as a company.  When you throw around statements like “the web is our platform,” I guess it’s hard to be humble.


Google Accessibility Update – Live Blog Post 9 – Fireside Chat Q & A

  • Will docs provide icons for people with cognitive disabilities?  It is being prioritized “in context” with all the other basic needs of blind and low-vision folks.
  • What is the level of accessibility on Android for calendars when scheduling events?  Currently, the best option is to use the agenda view.  All basic features work with TalkBack.  Gestures also work…nothing needs to be done differently to make things work here.
  • TalkBack settings can be set to “audio duck” to give audio focus to events (say, new message notification) when listening to music, television, etc.
  • Are there plans to release tools to help developers automate analysis / do accessibility testing?  We’re looking at doing this in the future, possibly through tools that would cause builds to fail.
  • When will all “default” Android apps be TalkBack accessible?  CASEY BURKHARDT answered this – we evangelize heavily inside Google, and we know Gallery in particular is a problem.  Most of the apps are good out-of-the-box.
  • How do you work with people who are recently vision-impaired or have limited computer abilities?  We’re not requiring people to have previous knowledge of interacting with machines, but are instead making interfaces that are intuitive for humans.  For example, iOS simplicity being used by children (like my 5 year old).
  • Will Android support other physical devices such as wireless (bluetooth, NFC, security dongles, etc.) devices?  If there are accessibility APIs, then we can bind to them.  Our goal is to enable all of these things, but we probably won’t scale it up on our own…tell us what you need and we can extend the platform to make it happen.  For example, there is NATIVE OTG support for many Android devices.
  • There was a vague question about date / time support in HTML5 (sorry, I didn’t catch the entire question):  support for HTML5 date / time controls is in the latest OS version.
  • Can you talk about CourseBuilder?  TV RAMAN:  our team works in research, but we are working with that team, especially the component that allows users to create their own content.  (Paul’s 2 cents:  you can build a content management system, but people can abuse it and put completely inaccessible content into it…can’t really help that.)

TV RAMAN closed the event, thanking everyone for attending.


Google Accessibility Update – Live Blog Post 8 – The Android Platform

Android Platform

  • Gabe Cohen, Product Manager
  • TV Raman, Lead, Google Accessibility
  • Casey Burkhardt, Software engineer, TalkBack

TV RAMAN Background: we want to make it easy for mobile app devs to make their apps accessible.

  • First Android handset delivered in 2008
  • Accessibility API version 1.0 and TalkBack published in August 2009
  • Use what we build (I call this “eating your own dog food”)
  • Was originally focused on devices with a physical affordance on the device itself (i.e. keyboards, buttons).
  • Added gestures and actions
  • Accessibility development is proceeding at the same velocity as each new OS release
Talked briefly about some important dates and things that happened in 2012, highlighted momentum of the platform (1.3 million activations a day and climbing).  Wow, isn’t that an impressive number?

June 2012

  • Been on team since June 2012
  • Nexus 7 launch
  • Jelly Bean 4.1
  • Project Butter (performance improvements in touch and visual frame rates)
  • Google Now (shows you things the OS thinks you may need at any given time – i.e. “context aware” information and functions)

October 2012

  • Nexus 4
  • Nexus 10
  • Jelly Bean 4.2
  • Multiple Users
What were Google’s goals as they pertained to accessibility in 2012?
  1. Enable completely unassisted setup
  2. Faster system navigation
  3. Reliable UI navigation
  • Easy Activation: 2 finger long-press in setup wizard
  • Accessibility Focus:  persistent focus for next action / navigation, double-tap anywhere to launch focused item
  • Reliable navigation:  navigate forward and backward with a swipe, invoke system functions with gestures
  • Fine-grained text traversal control
  • TalkBack support for the above
  • Support for Braille accessibility services  (get BrailleBack from Google Play)
  • Accessibility shortcut:  toggle talkback from anywhere using power menu
  • Screen magnification:  triple tap to zoom in and interact while panning, triple tap and hold to temporarily zoom
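To make the “accessibility focus” model above concrete, here’s a little conceptual sketch in Python – emphatically not Android code, just my own illustration of how a persistent focus plus swipe navigation plus double-tap activation fit together:

```python
# Conceptual sketch (not Android code) of the "accessibility focus" model:
# swipes move a persistent focus forward/backward through on-screen items,
# and a double-tap anywhere activates whatever currently holds focus.

class AccessibilityFocus:
    def __init__(self, items):
        self.items = items
        self.index = 0  # persistent focus survives between gestures

    def swipe_forward(self):
        self.index = min(self.index + 1, len(self.items) - 1)
        return self.items[self.index]

    def swipe_backward(self):
        self.index = max(self.index - 1, 0)
        return self.items[self.index]

    def double_tap(self):
        # Activates the focused item regardless of where the tap lands
        return f"activated:{self.items[self.index]}"

focus = AccessibilityFocus(["Back button", "Title", "Settings"])
focus.swipe_forward()      # focus moves from "Back button" to "Title"
print(focus.double_tap())  # activated:Title
```

The key point is that focus is state the system remembers for you: the double-tap can land anywhere on the screen, which is a huge deal when you can’t see exactly where a control sits.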
Android Accessibility Platform (from slide)
Accessibility Service Framework

  • Software navigation of app controls and text
  • Enables:  talkback, brailleback, test automation
  • Open APIs for developers
Accessibility APIs for Apps
  • Standard controls and layouts “just work”
  • Allow custom traversal of app content
  • Allow custom UI controls
  • Design guide, checklist
  • Evangelism
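The slide’s point about standard controls “just working” while custom controls need developer effort is worth unpacking.  Here’s a toy Python model of that contract – every class and method name here is my own invention for illustration, not the actual Android API:

```python
# Illustrative model of the API contract from the slide: standard controls
# get accessibility metadata "for free" from the framework, while custom
# controls only work if the developer supplies a description.
# (All names here are invented; this is not the Android API.)

class StandardButton:
    def __init__(self, label):
        self.label = label

    def accessibility_description(self):
        # Standard controls "just work": the framework derives speech output
        return f"{self.label}, button"

class CustomControl:
    def __init__(self, content_description=None):
        self.content_description = content_description

    def accessibility_description(self):
        # Custom controls depend on the developer providing a description
        return self.content_description or "unlabeled control"

print(StandardButton("Send").accessibility_description())       # Send, button
print(CustomControl("Volume slider").accessibility_description())
print(CustomControl().accessibility_description())              # unlabeled control
```

That last line is exactly the failure mode the evangelism bullet is about: a custom control with no description is just “unlabeled control” (or silence) to a screen reader user.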
APIs are “built-in” to allow developers to build apps that work with Google Now.  Not designed specifically for people with disabilities, but for everyone.  He showed how tools can be accessed via “linear access” and “random access” methods via touch and gesture.
Accessibility can be enabled and disabled quickly via “breakout menus,” which looks like a circle with 3 settings arranged around it (“radial menus”), plus there are landmarks in the corners of the device.  Force feedback – that is, shaking the device – can pause TalkBack.  This option is also available via the breakout menus.
“Local menus” are like context menus.  They allow access to types of structured content, sort of like navigating via headers (h1, h2, etc.) in web pages.  Types of content might include blocks of text, lists of controls, list item sections, and so on.
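If you’ve never used this style of navigation, here’s a rough Python sketch of the idea – jumping between just the elements of one content type, the way you’d jump between headings on a web page.  The content and function below are illustrative only:

```python
# Sketch of the "local menu" idea: pick a content type (headings, links,
# list items) and jump between only those elements, much like navigating
# a web page heading-by-heading.  Data and names are illustrative.

content = [
    ("heading", "Chapter 1"),
    ("text", "Some body text..."),
    ("link", "Footnote"),
    ("heading", "Chapter 2"),
    ("list_item", "First point"),
]

def next_of_type(nodes, start, node_type):
    """Return the index of the next node of `node_type` after `start`,
    or None if there isn't one."""
    for i in range(start + 1, len(nodes)):
        if nodes[i][0] == node_type:
            return i
    return None

i = next_of_type(content, 0, "heading")
print(content[i][1])  # Chapter 2
```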
TV Raman believes that the Android platform has only touched on about 20% of its potential.  There’s tons of potential for growth here.
Showed off some neat gestures, which demonstrated how the device “remembers” navigation.  This feature helps eliminate the need to remember how to traverse deeply nested menus after you’ve left that option tree.  No additional developer input is needed to make features like this work, which I must admit is pretty cool.
Provided a demonstration of docs with mobile device, using a Google doc of questions that had been asked earlier in the day.
MOBILE DRIVE AND DOCS (directly from slide)
Accessible versions of mobile drive
  • Drive is accessible with Braille support
  • Native versions of Docs & Sheets launched in the last year
Still rely on platform support for building most accessibility functionality
  • Best experience on Android is with JellyBean
  • Experience is sometimes constrained by the platform APIs
Live demonstration of Google Drive on an Android (Galaxy Nexus)
  • Can explore through both touch and swiping gestures to:  traverse, select, and open items
  • Navigation through lists works the same way
  • Showed how to share doc with others and provide sharing options (read only, edit, etc.)
  • Showed basic editing of docs, with “full control” of native doc
  • Table was listed at the top of the document to demonstrate table editing feature and TalkBack reading level options (which can themselves also be controlled by swiping gestures)
  • Basically, you don’t want to edit massive documents on a mobile device, but it’s definitely possible to make simple- to medium-sized edits.
  • SpreadSheet editor is coming along
NEAR TERM AND LONG TERM (directly from slide)
Near Term
  • Ensure new features are accessible and conform to protocol on each platform
  • Integration testing to prevent regressions
Long Term
  • Continue to build out mobile functionality to match Drive & Docs on the web
Braille Device Support (this feature is pretty badass in my opinion)
  • Makes devices accessible to deaf-blind users
  • Functions as both a screen and a keyboard
  • Accessibility actions can be triggered via Braille keyboard (BrailleBack)
  • Will be available on ChromeOS soon
  • New version coming out tomorrow on Play, lets devs see “on screen” what the Braille output looks like
  • Also has TTY and high-contrast support
CASEY BURKHARDT demonstrated low vision support
  • Settings to enable magnification feature and turned on gesture support
  • Enabled a specific gesture to toggle magnification on and off
  • Can temporarily enable these features to scan things quickly
  • Explicitly does NOT magnify the keyboard (becomes a usability nightmare)


  • Are improvements being made to the main Google Voice interface?
  • When will all default Android apps be TalkBack accessible?  Gallery and Calendar are notoriously bad.
  • What’s the name of the feedback listserv you mentioned earlier?

Google Accessibility Update – Live Blog Post 7 – Accessibility Guides for End Users

Brand New Accessibility Guides for end users announced:

  • Best practices
  • Keyboard shortcuts – not in table format, so it’s easier to navigate

Product managers specifically asked the audience to provide feedback on these guides; they will update based on your feedback!  Contact them here:


  • How often will this document be updated?  Pretty much all the time, but definitely when there are new feature releases.
  • Where do we find this published?  Help Center, Google list serves.
  • Can changes to this document be announced to organizations rolling out Google Apps?  (I think this was a GREAT question).  We will review this with our deployment teams.
  • Any other screen readers beyond ChromeVox?  We’re working on it.
  • Are keyboard shortcuts specific to ChromeVox?  Those with ChromeVox keys are noted in the help guide.
  • Pressing the “Shift-Question Mark” key will provide a list of keyboard shortcuts

TV Raman commented on use of online help:  it should be available BEFORE user begins using the tool.  We will do our best to keep this documentation as up-to-date as possible.


  • What does near and long term mean?  Near = now.  Longer term = we’re working on it, 6-12 months.
  • What about Calendar updates?  This was not covered earlier. These updates fit into the longer term bucket.

Betsy Roman was introduced – she runs the literacy program at Benetech, and described the Bookshare reader tool, which helps the commercial industry make textbooks accessible.  In short, it’s a reader tool.  I have to admit that I was completely unaware of this effort.  There is work coming out of the W3C on adding speech APIs to HTML.  As this is “baked in,” many exciting new features and technologies will be made available in Google products.  Chrome is the only browser thus far to have implemented any of these APIs.  Demonstrated a biology textbook using the Bookshare web reader (Chrome extension).  Still working on MathML.


  • How many titles are available?  All Bookshare titles (apparently a thousand-ish).
  • Is there voice search?  Not yet.
  • Is there rich text navigation?  Yes
  • Can you change speed of reader?  Yes
  • Are there bookmarks available?  Not yet
  • For Betsy:  how well has this BookShare tool been embraced by the publishing industry?
  • Can email be deleted in Gmail with one keystroke?  Yes, it’s the pound key, or A for archive.
  • Will ChromeVox be built as a desktop app?  No, it will probably not be built as a standalone screen reader app.
  • Which combinations of OS / browsers are tested?  The last 2 versions on Windows, Mac, and Linux, including the screen readers JAWS, NVDA, and most recently Window-Eyes.  We work to make Chrome and ChromeVox most accessible first.
  • Is there a mode that can spell out words?  Reading level can be changed on both phones and desktop.
  • How do you avoid keyboard shortcut collisions?  We try to use same shortcuts other productivity software uses.  Trouble is that we compete with a lot of different things on your computer.  We sometimes have to deviate from “standards.”
  • Can you talk about standalone Google forms?  It’s accessible in the same way as our 3 editors.
  • Can you talk about App Engine?  Can a blind developer develop accessible apps using this platform?  Yes.  However, content creators still have the ability to make content that’s NOT accessible.
  • Is accessibility only for core apps?  Google is committed to building accessibility into all apps.  For practical purposes, it seems that the answer now is yes.  Engineers are now building automated testing into the build process.
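The shortcut collision problem from the Q&A is easy to illustrate.  Here’s a hedged Python sketch of a shortcut registry that refuses colliding bindings – the reserved keys are just examples I picked, not any browser’s actual list:

```python
# Sketch of the collision problem discussed in the Q&A: a web app's
# shortcuts can clash with browser or screen-reader bindings, so a
# registry can detect conflicts and force the app to deviate.
# The reserved combos below are examples only.

RESERVED = {"ctrl+t": "browser: new tab", "ctrl+w": "browser: close tab"}

def register_shortcut(registry, combo, action):
    """Bind `combo` to `action`, refusing combos that collide with
    reserved bindings or earlier registrations."""
    combo = combo.lower()
    if combo in RESERVED:
        raise ValueError(f"{combo} collides with {RESERVED[combo]}")
    if combo in registry:
        raise ValueError(f"{combo} already bound to {registry[combo]}")
    registry[combo] = action
    return registry

shortcuts = {}
register_shortcut(shortcuts, "ctrl+b", "bold")
try:
    register_shortcut(shortcuts, "ctrl+w", "wrap text")
except ValueError as e:
    print(e)  # ctrl+w collides with browser: close tab
```

This is exactly why “we sometimes have to deviate from standards”: when the obvious combo is already claimed by the browser, the OS, or a screen reader, the app has to pick something else.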

TV Raman:  in the web app world, we need to standardize on keyboard combinations.  Which web app will want which keys?  This is an important question.

UPDATE:  Shawn from Google Docs tweeted this while he was sitting two chairs away from me: “Clarification: we also support and test with VoiceOver, not just JAWS, NVDA, ChromeVox.”


