This is my fifth session from the first day at the CSUN conference. This session covers “…the built-in accessibility features of Chrome, Chrome OS and Chromebooks” (description from the conference event guide). I attended Google’s pre-conference seminar in 2013, and it was very informative (my 10-part blog post can be accessed here). I hope they pack in the juicy details this year too 🙂
- Dominic Mazzoni, Software Engineer on the Google Team (@)
- Peter Lundblad, Engineer on the Google Chrome Team (@)
- David Tseng, Software Engineer Google Chrome Team (@)
David Tseng showed off a remote control that comes with ChromeVox built-in. It’s meant for video conferencing. David used the tool to join a Google Hangout (a kind of video call). It worked well in the demonstration, at least from the perspective of selecting and joining an existing Hangout.
Dominic Mazzoni talked briefly about the importance of the web as the world’s largest open platform. The Chrome browser was originally introduced with the following three principles/priorities in mind:
- Speed: re-introduced competition into the browser market
- Simplicity: create a browser that doesn’t distract from the content you’re looking at. Also, updates happen automatically.
- Security: updates resolve security holes as soon as possible
Dominic jumped into ChromeOS and showed some of the accessibility features available, including on-screen keyboard, screen magnifier, large mouse cursor, high contrast mode, sticky keys, tap-dragging, and ChromeVox itself.
Peter Lundblad demonstrated ChromeVox, a screen reader made especially for ChromeOS. Support for voices in multiple languages was recently added; Peter demonstrated this with both German and British female voices. Refreshable braille display support has also been added to ChromeOS. This particular demonstration was interesting to me because I’ve never actually seen one of these devices in action. There is a Growl-like on-screen display of the braille output so sighted users can see what the braille device itself is showing. Peter added a bookmark using the braille device.
Dominic then took over and talked about synchronized bookmarks (and other settings) that “follow the user” to whatever device they may be using. He demonstrated this using an Android phone, which displayed the bookmark that Peter had set on the Chromebook a few minutes earlier. Dominic then activated the local search control (a circular control with links to phone functions) by swiping up and to the right to activate the link.
Dominic then demonstrated the Chromecast, which lets you “cast” content from any Chrome browser to a display the Chromecast is plugged into. Laura Palmero shared her personal experience using the Chromecast. Laura has a vision disability that makes it difficult for her to see things in the center of her field of view, so she relies on high-contrast displays that are close to her (like her phone). The Chromecast has made it much easier for her to interact with her large-screen television at home…she now controls it using her phone, which she uses all the time.
Question: what about the accessibility of Google Docs? Answer: there is a Google Docs session tomorrow (Thursday) that goes into great detail about Google Docs.
Question: what is the strategy with the Chromebook? It seems like just an interesting toy. Answer: it’s not a general-purpose computing device that’s meant to replace all computers; it’s a device made to work with the web.
Question: what tools are you providing so developers can have access to things like view source? Answer: we know we have some work to do here, but there are workarounds. Please speak with us after the session.
Question: how well does it support ARIA? Answer: we make extensive use of ARIA in our web apps, and we rely on open standards and participate in working groups.