Accessibility Technology

Google Accessibility Update – Live Blog Post 9 – Fireside Chat Q & A

  • Will Docs provide icons for people with cognitive disabilities?  It is being prioritized “in context” with all the other basic needs of blind and low-vision folks.
  • What is the level of accessibility of calendars on Android when scheduling events?  Currently, the best option is to use the agenda view.  All basic features work with TalkBack.  Gestures also work…nothing needs to be done any differently to make things work here.
  • TalkBack settings include “audio ducking,” which gives audio focus to events (say, a new message notification) when you’re listening to music, television, etc.
  • Are there plans to release tools to help developers automate analysis / do accessibility testing?  We’re looking at doing this in the future, (possibly) through tools that would cause builds to fail.
  • When will all “default” Android apps be TalkBack accessible?  CASEY BURKHARDT answered this – we evangelize heavily inside Google, and we know Gallery in particular is a problem.  Most of the apps are good out-of-the-box.
  • How do you work with people who are recently vision-impaired or who have limited computer abilities?  We’re not requiring people to have previous knowledge of interacting with machines; instead, we’re making interfaces that are intuitive for humans.  For example, iOS is simple enough to be used by children (like my 5-year-old).
  • Will Android support other physical devices, such as wireless (Bluetooth, NFC, security dongles, etc.) devices?  If there are accessibility APIs, then we can bind to them.  Our goal is to enable all of these things, but we probably won’t scale it up on our own…tell us what you need and we can extend the platform to make it happen.  For example, there is native OTG support for many Android devices.
  • There was a vague question about date / time support in HTML5 (sorry, I didn’t catch the entire question):  support for HTML5 date / time controls is in the latest OS version.
  • Can you talk about CourseBuilder?  TV RAMAN:  our team works in research, but we are working with that team, especially on the component that allows users to create their own content.  (Paul’s 2 cents:  you can build a content management system, but people can abuse it and put completely inaccessible content into it…can’t really help that.)

TV RAMAN closed the event, thanking everyone for attending.


Google Accessibility Update – Live Blog Post 8 – The Android Platform

Android Platform

  • Gabe Cohen, Product Manager
  • TV Raman, Lead, Google Accessibility
  • Casey Burkhardt, Software engineer, TalkBack

TV RAMAN Background: we want to make it easy for mobile app devs to make their apps accessible.

  • First Android handset delivered in 2008
  • Accessibility API version 1.0 and TalkBack published in August 2009
  • Use what we build (I call this “eating your own dog food”)
  • Was originally focused on devices with a physical affordance on the device itself (e.g., keyboards, buttons).
  • Added gestures and actions
  • Accessibility development proceeds at the same velocity as each new OS release
Talked briefly about some important dates and things that happened in 2012, highlighted momentum of the platform (1.3 million activations a day and climbing).  Wow, isn’t that an impressive number?
June 2012
  • Been on team since June 2012
  • Nexus 7 launch
  • Jelly Bean 4.1
  • Project Butter (performance improvements in touch and visual frame rates)
  • Google Now (shows you things the OS thinks you may need at any given time – i.e. “context aware” information and functions)

October 2012

  • Nexus 4
  • Nexus 10
  • Jelly Bean 4.2
  • Multiple Users
What were Google’s goals as they pertained to accessibility in 2012?
  1. Enable completely unassisted setup
  2. Faster system navigation
  3. Reliable UI navigation
  • Easy Activation: 2 finger long-press in setup wizard
  • Accessibility Focus:  persistent focus for next action / navigation, double-tap anywhere to launch focused item
  • Reliable navigation:  navigate forward and backward with a swipe, invoke system functions with gestures
  • Fine-grained text traversal control
  • TalkBack support for the above
  • Support for Braille accessibility services  (get BrailleBack from Google Play)
  • Accessibility shortcut:  toggle TalkBack from anywhere using the power menu
  • Screen magnification:  triple tap to zoom in and interact while panning, triple tap and hold to temporarily zoom
Android Accessibility Platform (from slide)
Accessibility Service Framework

  • Software navigation of app controls and text
  • Enables:  TalkBack, BrailleBack, test automation
  • Open APIs for developers
Accessibility APIs for Apps
  • Standard controls and layouts “just work”
  • Allow custom traversal of app content
  • Allow custom UI controls
  • Design guide, checklist
  • Evangelism
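The “standard controls and layouts just work” point is worth a concrete illustration.  For standard Android widgets, TalkBack can usually describe a control automatically from its visible text; image-only controls need a developer-supplied label.  A minimal sketch of that idea – the `android:contentDescription` attribute is standard Android, but the specific IDs, drawables, and strings here are my own, not from the session:

```xml
<!-- Hypothetical layout fragment.  A standard Button is announced by
     TalkBack automatically from its text, while an image-only control
     needs an explicit android:contentDescription to be spoken. -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:orientation="horizontal">

    <!-- "Just works": TalkBack reads the visible text. -->
    <Button
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Share" />

    <!-- Image-only control: label it for the screen reader. -->
    <ImageButton
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:src="@drawable/ic_star"
        android:contentDescription="Star this document" />
</LinearLayout>
```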
APIs are “built-in” to allow developers to build apps that work with Google Now.  Not designed specifically for people with disabilities, but for everyone.  He showed how tools can be accessed through “linear access” and “random access” methods using touch and gestures.
Accessibility can be enabled and disabled quickly via “breakout menus,” which look like a circle with 3 settings arranged around it (“radial menus”); there are also landmarks in the corners of the device.  Force feedback – that is, shaking the device – can pause TalkBack.  This option is also available via the breakout menus.
“Local menus” are like context menus.  They allow access to types of structured content, sort of like navigating via headers (h1, h2, etc.) in web pages.  Types of content might include blocks of text, lists of controls, list item sections, and so on.
TV Raman believes that the Android platform has only touched on about 20% of its potential.  There’s tons of potential for growth here.
Showed off some neat gestures, which demonstrated how the device “remembers” navigation.  This feature helps eliminate the need to remember how to traverse deeply nested menus after you’ve left that option tree.  No additional developer input is needed to make features like this work, which I must admit is pretty cool.
Provided a demonstration of docs with mobile device, using a Google doc of questions that had been asked earlier in the day.
MOBILE DRIVE AND DOCS (directly from slide)
Accessible versions of mobile Drive
  • Drive is accessible with Braille support
  • Native versions of Docs & Sheets launched in the last year
Still rely on platform support for building most accessibility functionality
  • Best experience on Android is with Jelly Bean
  • Experience is sometimes constrained by the platform APIs
Live demonstration of Google Drive on an Android (Galaxy Nexus)
  • Can explore through both touch and swiping gestures to:  traverse, select, and open items
  • Navigation through lists works the same way
  • Showed how to share doc with others and provide sharing options (read only, edit, etc.)
  • Showed basic editing of docs, with “full control” of native doc
  • Table was listed at the top of the document to demonstrate table editing feature and TalkBack reading level options (which can themselves also be controlled by swiping gestures)
  • Basically, you don’t want to edit massive documents on a mobile device, but it’s definitely possible to make simple- to medium-sized edits.
  • Spreadsheet editor is coming along
NEAR TERM AND LONG TERM (directly from slide)
Near Term
  • Ensure new features are accessible and conform to protocol on each platform
  • Integration testing to prevent regressions
Long Term
  • Continue to build out mobile functionality to match Drive & Docs on the web
Braille Device Support (this feature is pretty badass in my opinion)
  • Makes devices accessible to deaf-blind users
  • Functions as both a screen and a keyboard
  • Accessibility actions can be triggered via Braille keyboard (BrailleBack)
  • Will be available on ChromeOS soon
  • New version coming out tomorrow on Play, lets devs see “on screen” what the Braille output looks like
  • Also has TTY and high-contrast support
CASEY BURKHARDT demonstrated low vision support
  • Opened settings to enable the magnification feature and turned on gesture support
  • Enabled a specific gesture that turns on magnification
  • Can temporarily enable these features to scan things quickly
  • Explicitly does NOT magnify the keyboard (becomes a usability nightmare)


  • Are improvements being made to the main Google Voice interface?
  • When will all default Android apps be TalkBack accessible?  Gallery and Calendar are notoriously bad.
  • What’s the name of the feedback listserv you mentioned earlier?

Google Accessibility Update – Live Blog Post 7 – Accessibility Guides for End Users

Brand New Accessibility Guides for end users announced:

  • Best practices
  • Keyboard shortcuts – not in table format, so it’s easier to navigate

Product managers specifically asked the audience to provide feedback on these guides; they will update them based on your feedback!  Contact them here:


  • How often will this document be updated?  Pretty much all the time, but definitely when there are new feature releases.
  • Where do we find this published?  The Help Center and Google listservs.
  • Can changes to this document be announced to organizations rolling out Google Apps?  (I think this was a GREAT question).  We will review this with our deployment teams.
  • Any other screen readers beyond ChromeVox?  We’re working on it.
  • Are keyboard shortcuts specific to ChromeVox?  Those with ChromeVox keys are noted in the help guide.
  • Pressing the “Shift-Question Mark” key will provide a list of keyboard shortcuts

TV Raman commented on the use of online help:  it should be available BEFORE the user begins using the tool.  We will do our best to keep this documentation as up-to-date as possible.


  • What do “near term” and “long term” mean?  Near = now.  Long term = we’re working on it, 6-12 months.
  • What about Calendar updates?  This was not covered earlier. These updates fit into the longer term bucket.

Betsy Roman was introduced – she runs the literacy program at Benetech – and described the Bookshare reader tool, which helps the commercial industry make textbooks accessible.  In short, it’s a reader tool.  I have to admit that I was completely unaware of this effort.  There is work coming out of the W3C on adding speech APIs to HTML.  As this gets “baked in,” many exciting new features and technologies will be made available in Google products.  Chrome is the only browser thus far to have implemented any of these APIs.  She demonstrated a biology textbook using the Bookshare web reader (a Chrome extension).  Still working on MathML.


  • How many titles are available?  All Bookshare titles (apparently a thousand-ish).
  • Is there voice search?  Not yet.
  • Is there rich text navigation?  Yes
  • Can you change speed of reader?  Yes
  • Are there bookmarks available?  Not yet
  • For Betsy:  how well has this BookShare tool been embraced by the publishing industry?
  • Can email be deleted in Gmail with one keystroke?  Yes, it’s the pound key, or A for archive.
  • Will ChromeVox be built as a desktop app?  No, it will probably not be built as a standalone screen reader app.
  • Which combinations of OS / browsers are tested?  The last 2 versions on Windows, Mac, and Linux, including the screen readers JAWS, NVDA, and most recently Window-Eyes.  We work to make Chrome and ChromeVox most accessible first.
  • Is there a mode that can spell out words?  Reading level can be changed on both phones and desktop.
  • How do you avoid keyboard shortcut collisions?  We try to use same shortcuts other productivity software uses.  Trouble is that we compete with a lot of different things on your computer.  We sometimes have to deviate from “standards.”
  • Can you talk about standalone Google forms?  It’s accessible in the same way as our 3 editors.
  • Can you talk about App Engine?  Can a blind developer develop accessible apps using this platform?  Yes.  However, content creators still have the ability to make content that’s NOT accessible.
  • Is accessibility only for core apps?  Google is committed to building accessibility into all apps.  For practical purposes, it seems that the answer now is yes.  Engineers are now building automated testing into the build process.

TV Raman:  in the web app world, we need to standardize on keyboard combinations.  Which web app will want which keys?  This is an important question.

UPDATE:  Shawn from Google Docs tweeted this while he was sitting two chairs away from me: “Clarification: we also support and test with VoiceOver, not just JAWS, NVDA, ChromeVox.”




Google Accessibility Update – Live Blog Post 6 – A Couple Questions

TV Raman continues…

Quotes:  “Building accessibility for the web platform” and “we’re getting close to an inflection point” where things will get much easier to use.

A couple of my questions for the planned “fireside chat” with the Google crew later:

  1. Do all the features described today require use of Chromebook hardware, ChromeOS, and ChromeVox?
  2. Are keyboard shortcuts consistent across OS and browser platforms?

Google Accessibility Update – Live Blog Post 5 – Drive and Docs Suite

Drive and Docs Suite

Ajay Surie and team (slide went by too quickly)

“These are rich web applications that allow creation of rich content within a web browser”

TEXT FROM SLIDE:  “Building accessibility for complex web applications is still hard.  Docs is built on JavaScript; working with ChromeVox has allowed us to iterate rapidly and identify challenges with today’s accessibility toolset.  Goal is to learn from our implementation and evolve current web accessibility standards where necessary.”  Also want to make apps work with NON-CHROMEVOX screen readers.  Building in dictation features.

The document editor’s cursor is drawn and positioned dynamically.  Traditional screen readers don’t really work with these kinds of web apps.  They have a way to have the app talk to the screen reader (this was not specifically described).
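The speakers didn’t say how Docs actually does this, but the standard web technique for an app-drawn cursor is an ARIA live region: the app mirrors state changes into a visually hidden element that assistive technology monitors.  A sketch of that general pattern – the element ID, class, and helper function are mine, not Docs internals:

```html
<!-- General ARIA live-region pattern (not Google's actual implementation):
     the editor updates this visually hidden element whenever the custom
     cursor moves, and the screen reader announces the new content. -->
<div id="a11y-announcer" aria-live="polite" class="visually-hidden"></div>

<script>
  // Hypothetical helper: called by the editor when the cursor moves.
  function announce(text) {
    document.getElementById("a11y-announcer").textContent = text;
  }
  // e.g. announce("Line 4, column 12");
</script>
```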

The dev team put out another call for feedback on Docs.  The Docs team has incorporated a process during development where accessibility is built in from the start.

Some Drive changes highlighted:

  • Completely redesigned keyboard interaction model in Drive
  • Keyboard navigation across the list easier and more consistent
  • Document list is a single focusable element:  shortcuts to navigate the document list
  • Improved focus management
  • Better verbalization of instructions and application state

Like Alex with Gmail (see post 4), they’re looking at making navigation more consistent.

  1. Search area at top
  2. Folder list area on left
  3. Document list in the middle
  4. Details pane (right side of the page for sighted users, basically a preview area with editable settings such as sharing)

Demonstrated “starring” a doc using ARIA within the Document list, then listing all starred items.  A lot of the keyboard accessibility features and the new keyboard model are useful for people without disabilities as well.
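For context on what “using ARIA” means here: a starred/unstarred state can be exposed to screen readers with standard ARIA toggle semantics.  A minimal sketch of that pattern – the markup is illustrative, not Drive’s actual markup:

```html
<!-- Illustrative ARIA toggle, not Drive's real markup: aria-pressed tells
     the screen reader whether the item is starred, and flipping it on
     click keeps the spoken state in sync with the visual star. -->
<button aria-pressed="false"
        aria-label="Star this document"
        onclick="this.setAttribute('aria-pressed',
                 this.getAttribute('aria-pressed') === 'true' ? 'false' : 'true')">
  ★
</button>
```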

Google Slides

Made slide editor “completely accessible.”  This demonstration should be interesting…

Opened presentation inside a new tab; on open, it reads # of slides, title, text, and a lot of other details.

  1. Canvas view is where slide content is created.  Items on slide are tab-navigable.  Opening a slide makes ChromeVox verbalize what’s on the canvas.
  2. Filmstrip is where organization of slides happens.
  3. Notes panel

Slides Demo

  • Demo’ed selection of text on a slide and bolding
  • Added a slide in the filmstrip with Control-N
  • Control-Shift-Backspace to activate the context menu (equivalent of right-click), then selected a slide layout

Docs Editor

  • Showed a ton of keyboard shortcuts (I didn’t catch all of them, unfortunately)
  • Comments feature
  • “Alt-forward slash” key combo brings up search features within the documents editor, i.e. any feature in the visible menus
  • Move between headings: “Control-Alt-P” and “Control-Alt-H”
  • Demonstrated moving back and forth between footnotes
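Shortcut handling like the heading-navigation combos above usually comes down to normalizing each key event into a binding key before the browser (or a screen reader) applies its default.  A minimal sketch of that general pattern – the function and action names are hypothetical, not Google’s code, and which combo maps to “previous” vs. “next” is my guess:

```typescript
// Normalize a key event into a string like "Ctrl+Alt+P" so an app can
// look up its own bindings before falling through to browser defaults.
// Binding and action names here are illustrative, not Docs internals.
interface KeyLike {
  ctrlKey: boolean;
  altKey: boolean;
  shiftKey: boolean;
  key: string;
}

function shortcutOf(e: KeyLike): string {
  const parts: string[] = [];
  if (e.ctrlKey) parts.push("Ctrl");
  if (e.altKey) parts.push("Alt");
  if (e.shiftKey) parts.push("Shift");
  // Single characters are uppercased; named keys ("Enter") pass through.
  parts.push(e.key.length === 1 ? e.key.toUpperCase() : e.key);
  return parts.join("+");
}

// The two heading-navigation shortcuts mentioned in the session.
const bindings = new Map<string, string>([
  ["Ctrl+Alt+P", "previousHeading"],
  ["Ctrl+Alt+H", "nextHeading"],
]);

function dispatch(e: KeyLike): string | null {
  // Unbound combos return null so default handling still happens.
  return bindings.get(shortcutOf(e)) ?? null;
}
```

The null fall-through is the interesting design point: it is how an app can claim only its own shortcuts while leaving everything else to the browser and the screen reader, which is exactly the collision problem the speakers described earlier.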

Google Sheets (Excel-alike product)

Focus has been on giving the user more contextual information about what they’re working on.  Moving around a sheet’s rows and columns provides a verbalized summary of what’s on the sheet (this demo had a minor glitch with verbalization).

Selecting range of cells provides details on cell content as well (this could get interesting if selecting a large number of cells…this was not demonstrated).

PAUL’S QUESTION:  are keyboard shortcuts consistent across OS and browser platforms?
