
All About Google Chrome

This is my fifth session from the first day at the CSUN conference.  This session covers “…the built-in accessibility features of Chrome, Chrome OS and Chromebooks” (description from the conference event guide).  I attended Google’s pre-conference seminar in 2013, and it was very informative (my 10-part blog post can be accessed here).  I hope they pack in the juicy details this year too 🙂

Presenters:

  • Dominic Mazzoni, Software Engineer on the Google Chrome Team
  • Peter Lundblad, Software Engineer on the Google Chrome Team
  • David Tseng, Software Engineer on the Google Chrome Team


David Tseng showed off a remote control that comes with ChromeVox built in.  It’s meant for video conferencing.  David used the tool to join a Google Hangout (a kind of video call).  It worked well in the demonstration, at least from the perspective of selecting and joining an existing Hangout.

 

Dominic Mazzoni talked briefly about the importance of the web as the world’s largest open platform.  The Chrome browser was originally introduced with the following three principles/priorities in mind:

  • Speed:  re-introduced competition into the browser market
  • Simplicity:  create a browser that doesn’t distract from the content you’re looking at.  Also, updates happen automatically.
  • Security:  updates resolve security holes as soon as possible

Dominic jumped into ChromeOS and showed some of the accessibility features available, including on-screen keyboard, screen magnifier, large mouse cursor, high contrast mode, sticky keys, tap-dragging, and ChromeVox itself.


Peter Lundblad demonstrated ChromeVox, a screen reader made especially for ChromeOS.  Support for voices in multiple languages has recently been added; Peter demonstrated this with both German and British female voices.  Refreshable braille device support has also been added to ChromeOS.  This particular demonstration was interesting to me because I’d never actually seen one of these devices in action.  There is a “Growl-like” on-screen display of the braille output so sighted users can see what the braille device itself is showing.  Peter added a bookmark using the braille device.


Dominic then took over and talked about synchronized bookmarks (and other settings) that “follow the user” to whatever device they may be using.  He demonstrated this using an Android phone, which displayed the bookmark Peter had set on the Chromebook a few minutes before.  Dominic then activated the local search control (a circular control with links to phone functions) by swiping up and to the right.

Dominic then demonstrated the Chromecast, which lets you “cast” content from any Chrome browser to a display the Chromecast is plugged into.  Laura Palmero shared her personal experience using the Chromecast.  Laura has a vision disability that makes it difficult for her to view things in the center of her field of view, so she relies on high-contrast displays that are close to her (like her phone).  The Chromecast has made it much easier for her to interact with her large-screen television at home…she now controls it using her phone, which she uses all the time.


Question:  what about the accessibility of Google Docs?  Answer:  there is a session tomorrow (Thursday) that goes into great detail about Google Docs.

Question:  what is the strategy with Chromebook?  It seems like just an interesting toy.  Answer:  it’s not a general-purpose computing device that’s meant to replace all computers.  It’s a device that’s made to work with the web.

Question:  what tools are you providing so developers can have access to things like view source, that sort of thing?  Answer:  we know we have some work to do with this, but there are workarounds.  Please speak with us after the session.

Question:  how well does it support ARIA?  Answer:  we make extensive use of ARIA in our web apps, and we rely on open standards and participate in working groups.


The CSUN 2013 Web Track Mega Post

Greetings, fellow web accessibilistas!  (not to be confused with accessiballistas, the little-known and even less-documented accessible siege engine of yore).

As you may have gathered if you followed my live blog posts a couple weeks ago, my interest in attending the CSUN 2013 conference was almost exclusively web-related.  Now that it’s been a couple weeks and I’ve had some time to reflect, I figured it would be a good idea to consolidate everything into one mega-list.  This year, there were several times when I wished I could have been in two places at once.  Hopefully this gives you a pretty representative sampling of what was on offer web-wise this year.  Follow me @paulschantz for more web-related topics, including accessibility, project management, web development and design philosophy, thoughts on working in higher education, bad clients, off-color humor, and other ephemera.  Enough self-promotion…on with the list!

Pre-Conference Seminar:  Google Accessibility

Day One:  February 27, 2013

Day Two:  February 28, 2013

Day Three:  March 1, 2013


Google Accessibility – Partially Digested Observations

Holy moly, that was an information-packed session today!  And what a difference from the last time I saw Google at #CSUN.

I saw Google’s presentation on accessibility when I attended the #CSUN conference (I believe) four years ago.  At that time, I got the impression that Google was “phoning it in.”  The reps they sent at that time were clearly lower-level, more tech-oriented staff and didn’t present their story to the #CSUN audience in a compelling or memorable way.  If I were cynical, I’d say their approach at that time smacked a bit of technical smugness.

Fast forward four years…

Today, there were no fewer than 15 people from Google, including product managers from their core applications, internal accessibility evangelists, and development staff.  They’re not messing around now.  Here are some of the standout items, from my perspective:

  1. Google’s development team is working hard to standardize keyboard navigation across their core applications.  This is huge, and will pay big dividends for all users in the very near future.
  2. For obvious reasons, Calendar was not mentioned much.  To Google’s credit, they did not evade critical questions.  Calendaring is freakin’ hard – my team made a public web calendar for the CSUN campus a few years back, and I can assure you that that effort was no joyride:  www.csun.edu/calendar
  3. Google acknowledges the problem of keyboard shortcut collisions.  Sorry folks, there are no “standard” keyboard shortcuts other than the ones that have come about through the historical cruft of the software industry.  People using niche apps will unfortunately be left in the lurch.  This isn’t all bad though, because…
  4. …Google’s larger plan is to have their entire ecosystem in the cloud.  Like it or not, this is the future of computing.  This hearkens back to a conversation I had with my Computer Science colleagues regarding “the cloud” about five years ago.  My question back then was “what happens when everything is available in the cloud?”  Answer:  “we pick and choose those services that we need and trust.”  Google is building those services today, and from what I can see, I trust that they’re working on it.  BUT…we have to continue to push for more accessibility.  If we don’t evangelize and make it a priority, it just won’t happen.
  5. Speaking of evangelism, I get the distinct sense that the push for accessibility within Google is an uphill battle at times, but the organization is really starting to “get it.”  Working as a director of a web development team in the CSU with responsibilities around ensuring accessibility on my campus, I can relate.
  6. The advances in accessibility built into the Android OS (Accessibility Service Framework and APIs) are downright impressive.  The work around creating an intuitive “navigation language” alone merits a gold star in my opinion.
  7. Google’s position of supporting current browser version “minus two” is a goddamn blessing and should be shouted from the mountaintops.  I feel very strongly about this, and have written a browser support statement to clarify why I take a similar position with my team’s work:  http://www.csun.edu/sait/web/browsers.htm

Maybe it’s just me being cynical again, but I could kind of sense a faint hint of technical smugness today.  Its character was different though, and I think that comes from the audacious scope of what Google is trying to do as a company.  When you throw around statements like “the web is our platform,” I guess it’s hard to be humble.


Google Accessibility Update – Live Blog Post 9 – Fireside Chat Q & A

GETTING RIGHT INTO Q&A HERE
  • Will Docs provide icons for people with cognitive disabilities?  It is being prioritized “in context” with all the other basic needs of blind and low-vision folks.
  • What is the level of accessibility of the Android calendar when scheduling events?  Currently, the best option is to use the agenda view.  All basic features work with TalkBack.  Gestures also work…nothing needs to be done any differently to make things work here.
  • TalkBack settings can be set to “audio duck” to give audio focus to events (say, a new message notification) when listening to music, television, etc.  (A sketch of the underlying audio focus API follows this list.)
  • Are there plans to release tools to help developers automate analysis / do accessibility testing?  We’re looking at doing this in the future, possibly through tools that would cause builds to fail.
  • When will all “default” Android apps be TalkBack accessible?  CASEY BURKHARDT answered this – we evangelize heavily inside Google, and we know Gallery in particular is a problem.  Most of the apps are good out-of-the-box.
  • How do you work with people who are recently vision-impaired or have limited computer abilities?  We’re not requiring people to have previous knowledge of interacting with machines, but are instead making interfaces that are intuitive for humans.  For example, iOS simplicity being used by children (like my 5-year-old).
  • Will Android support other physical devices such as wireless (Bluetooth, NFC, security dongles, etc.) devices?  If there are accessibility APIs, then we can bind to them.  Our goal is to enable all of these things, but we probably won’t scale it up on our own…tell us what you need and we can extend the platform to make it happen.  For example, there is NATIVE OTG support for many Android devices.
  • There was a vague question about date / time support in HTML5 (sorry, I didn’t catch the entire question):  support for HTML5 date / time controls is in the latest OS version.
  • Can you talk about CourseBuilder?  TV RAMAN:  our team works in research, but we are working with that team, especially the component that allows users to create their own content.  (Paul’s 2 cents:  you can build a content management system, but people can abuse it and put completely inaccessible content into it…can’t really help that.)
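
The “audio duck” behavior mentioned above rides on Android’s audio focus API.  Here’s a minimal sketch of the app side – the class and method names are mine, not Google’s – showing how a short announcement can request transient focus so that other audio lowers instead of stopping:

    import android.content.Context;
    import android.media.AudioManager;

    public class DuckingSketch {
        // Request transient audio focus so other apps lower ("duck")
        // their volume while a short announcement plays.
        public static void announceWithDucking(Context context) {
            AudioManager am =
                    (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);

            AudioManager.OnAudioFocusChangeListener listener =
                    new AudioManager.OnAudioFocusChangeListener() {
                        @Override
                        public void onAudioFocusChange(int focusChange) {
                            // Called if another app later takes focus from us.
                        }
                    };

            int result = am.requestAudioFocus(
                    listener,
                    AudioManager.STREAM_MUSIC,
                    AudioManager.AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK);

            if (result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
                // Play the announcement here, then release focus:
                am.abandonAudioFocus(listener);
            }
        }
    }

The key constant is AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK:  well-behaved media apps respond by lowering their volume rather than pausing, which is exactly the ducking behavior described above.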

TV RAMAN closed the event, thanking everyone for attending.


Google Accessibility Update – Live Blog Post 8 – The Android Platform

Android Platform

  • Gabe Cohen, Product Manager
  • TV Raman, Lead, Google Accessibility
  • Casey Burkhardt, Software Engineer, TalkBack

TV RAMAN Background: we want to make it easy for mobile app devs to make their apps accessible.

  • First Android handset delivered in 2008
  • Accessibility API version 1.0 and TalkBack published in August 2009
  • Use what we build (I call this “eating your own dog food”)
  • Was originally focused on devices with a physical affordance on the device itself (i.e. keyboards, buttons).
  • Added gestures and actions
  • Accessibility development is proceeding at the same velocity as each new OS release

GABE COHEN

Talked briefly about some important dates and things that happened in 2012, and highlighted the momentum of the platform (1.3 million activations a day and climbing).  Wow, isn’t that an impressive number?

June 2012

  • Been on team since June 2012
  • Nexus 7 launch
  • Jelly Bean 4.1
  • Project Butter (performance improvements in touch and visual frame rates)
  • Google Now (shows you things the OS thinks you may need at any given time – i.e. “context aware” information and functions)

October 2012

  • Nexus 4
  • Nexus 10
  • Jelly Bean 4.2
  • Multiple Users

What were Google’s goals as they pertained to accessibility in 2012?

  1. Enable completely unassisted setup
  2. Faster system navigation
  3. Reliable UI navigation

1 – ENABLE COMPLETELY UNASSISTED SETUP (directly from slide)

  • Easy Activation: 2 finger long-press in setup wizard
  • Accessibility Focus:  persistent focus for next action / navigation, double-tap anywhere to launch focused item
  • Reliable navigation:  navigate forward and backward with a swipe, invoke system functions with gestures
  • Fine-grained text traversal control
  • TalkBack support for the above
  • Support for Braille accessibility services (get BrailleBack from Google Play)

JELLY BEAN 4.2 CHANGES

  • Accessibility shortcut:  toggle TalkBack from anywhere using the power menu
  • Screen magnification:  triple tap to zoom in and interact while panning, triple tap and hold to temporarily zoom

Android Accessibility Platform (from slide)

Accessibility Service Framework

  • Software navigation of app controls and text
  • Enables:  TalkBack, BrailleBack, test automation
  • Open APIs for developers (a minimal sketch follows this list)
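
To make the “open APIs” bullet concrete, here’s a minimal sketch of an accessibility service – the same framework TalkBack and BrailleBack are built on.  The class name is hypothetical, and a real service must also be declared in AndroidManifest.xml and enabled by the user under Settings:

    import android.accessibilityservice.AccessibilityService;
    import android.view.accessibility.AccessibilityEvent;

    public class MinimalScreenReaderService extends AccessibilityService {

        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) {
            // Delivered by the framework as the user navigates:
            // focus changes, clicks, text changes, and so on.
            if (event.getEventType()
                    == AccessibilityEvent.TYPE_VIEW_ACCESSIBILITY_FOCUSED) {
                CharSequence text = event.getContentDescription();
                // TalkBack would hand this to text-to-speech here;
                // BrailleBack would render it on a braille display.
            }
        }

        @Override
        public void onInterrupt() {
            // Stop any feedback in progress (e.g., silence speech).
        }
    }

Because any registered service receives these events, the same framework powers TalkBack, BrailleBack, and test automation alike.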

Accessibility APIs for Apps

  • Standard controls and layouts “just work”
  • Allow custom traversal of app content
  • Allow custom UI controls (see the sketch after this list)
  • Design guide, checklist
  • Evangelism
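
A hedged sketch of both sides of that list:  for a standard widget, a content description is usually all an app needs, while a custom control reports its own state through the framework.  The control below is hypothetical (mine, not from the session):

    import android.content.Context;
    import android.view.View;
    import android.view.accessibility.AccessibilityEvent;
    import android.view.accessibility.AccessibilityNodeInfo;

    // A made-up custom control: a view that toggles between two states.
    public class RatingToggle extends View {

        private boolean checked;

        public RatingToggle(Context context) {
            super(context);
            // For standard widgets, this one line is often all it takes:
            setContentDescription("Favorite");
        }

        public void setChecked(boolean isChecked) {
            checked = isChecked;
            // Tell accessibility services our state changed.
            sendAccessibilityEvent(AccessibilityEvent.TYPE_VIEW_SELECTED);
        }

        // Custom views describe themselves to services like TalkBack by
        // filling in the AccessibilityNodeInfo the framework hands them.
        @Override
        public void onInitializeAccessibilityNodeInfo(AccessibilityNodeInfo info) {
            super.onInitializeAccessibilityNodeInfo(info);
            info.setCheckable(true);
            info.setChecked(checked);
        }

        @Override
        public void onInitializeAccessibilityEvent(AccessibilityEvent event) {
            super.onInitializeAccessibilityEvent(event);
            event.setChecked(checked);
        }
    }

With that in place, TalkBack can announce the control’s state without any further work from the app.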
DEMONSTRATION BY TV RAMAN OF GOOGLE NOW
APIs are “built in” to allow developers to build apps that work with Google Now.  Not designed specifically for people with disabilities, but for everyone.  He showed how tools can be accessed through both “linear access” and “random access” methods using touch and gestures.
Accessibility can be enabled and disabled quickly via “breakout menus,” which look like a circle with 3 settings arranged around it (“radial menus”), plus there are landmarks in the corners of the device.  Force feedback – that is, shaking the device – can pause TalkBack.  This option is also available via the breakout menus.
“Local menus” are like context menus.  They allow access to types of structured content, sort of like navigating via headers (h1, h2, etc.) in web pages.  Types of content might include blocks of text, lists of controls, list item sections, and so on.
TV Raman believes that the Android platform has only touched on about 20% of its potential.  There’s tons of potential for growth here.
Showed off some neat gestures, which demonstrated how the device “remembers” navigation.  This feature helps eliminate the need to remember how to traverse deeply nested menus after you’ve left that option tree.  No additional developer input is needed to make features like this work, which I must admit is pretty cool.
Provided a demonstration of Docs on a mobile device, using a Google doc of questions that had been asked earlier in the day.
MOBILE DRIVE AND DOCS (directly from slide)
Accessible versions of mobile Drive
  • Drive is accessible with Braille support
  • Native versions of Docs & Sheets launched in the last year
Still rely on platform support for building most accessibility functionality
  • Best experience on Android is with Jelly Bean
  • Experience is sometimes constrained by the platform APIs
Live demonstration of Google Drive on an Android phone (Galaxy Nexus)
  • Can explore through both touch and swiping gestures to:  traverse, select, and open items
  • Navigation through lists works the same way
  • Showed how to share doc with others and provide sharing options (read only, edit, etc.)
  • Showed basic editing of docs, with “full control” of native doc
  • A table was placed at the top of the document to demonstrate the table editing feature and TalkBack reading-level options (which can themselves be controlled by swiping gestures)
  • Basically, you don’t want to edit massive documents on a mobile device, but it’s definitely possible to make simple- to medium-sized edits.
  • Spreadsheet editor is coming along
NEAR TERM AND LONG TERM (directly from slide)
Near Term
  • Ensure new features are accessible and conform to protocol on each platform
  • Integration testing to prevent regressions
Long Term
  • Continue to build out mobile functionality to match Drive & Docs on the web
Braille Device Support (this feature is pretty badass in my opinion)
  • Makes the device accessible to deaf-blind users
  • Functions as both a screen and a keyboard
  • Accessibility actions can be triggered via the Braille keyboard (BrailleBack) – see the sketch after this list
  • Will be available on ChromeOS soon
  • New version coming out tomorrow on Play, lets devs see “on screen” what the Braille output looks like
  • Also has TTY and high-contrast support
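
As a sketch of how a braille keyboard can trigger accessibility actions:  the framework exposes the same action APIs to every accessibility service, so a key chord can be mapped to an action on the focused node.  The helper below is hypothetical, not BrailleBack’s actual code:

    import android.view.accessibility.AccessibilityNodeInfo;

    public class BrailleKeyHandler {
        // Activate whatever node currently has accessibility focus,
        // as if the user had double-tapped it with TalkBack running.
        public static boolean activateFocusedNode(AccessibilityNodeInfo root) {
            AccessibilityNodeInfo focused =
                    root.findFocus(AccessibilityNodeInfo.FOCUS_ACCESSIBILITY);
            if (focused == null) {
                return false;
            }
            return focused.performAction(AccessibilityNodeInfo.ACTION_CLICK);
        }
    }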

CASEY BURKHARDT demonstrated low vision support

  • Went into Settings to enable the magnification feature and turned on gesture support
  • Used a specific gesture to turn magnification on
  • Can temporarily enable these features to scan things quickly
  • Explicitly does NOT magnify the keyboard (becomes a usability nightmare)

FIRESIDE CHAT Q & A

QUESTIONS FROM MY TWEEPS

  • Are improvements being made to the main Google Voice interface?
  • When will all default Android apps be TalkBack accessible?  Gallery and Calendar are notoriously bad.
  • What’s the name of the feedback listserv you mentioned earlier?