Categories
Technology

…and the 2015 Kurogo Conference Begins!

After an uneventful flight from LAX to O’Hare and a two-hour drive, I arrived at Notre Dame University yesterday afternoon at about 4:00 PM local time.  The Kurogo folks put on a nice mixer for attendees at the Morrison Inn last night from 6:00 to 9:00 PM, and I got the opportunity to meet a number of folks from around the US and Canada who are using the platform.  I’m eager to see how people are using and extending it to meet their campus needs!

The facility that we’re using for the event is the Notre Dame Innovation Center, which I’ll post pictures of in a future update.  A very cool space, it’s really ideal for meetups, hackathons, and incubator-type activities.  I wish we had one of these at CSUN!  The reception desk at the center has a whole set of Staples-inspired “that was easy” buttons.  If you’re wondering: yes, I pressed every one of them 🙂


This conference marks a number of “firsts” for me:  first time at Notre Dame, first time at the Kurogo conference, and first time at a mobile-focused conference.  I’ll do my best to capture what’s being presented here, along with a bit of the event’s flavor.

Watch this space!

Categories
Accessibility Technology

CSUN 2014 Web Track Mega Post

As usual, I like to make a post that sums up my entire conference experience…I call this the “Mega Post.”  As you may have guessed from the titles of the sessions I attended, I’m interested in the web track.  If the web is your bag, you just might find all this helpful.

Enjoy!

 

Friday, March 21

 

Thursday, March 20

 

Wednesday, March 19

 

Tuesday, March 18

 

Monday, March 17

Categories
Accessibility Technology

iOS vs. Android, a Web and Native Application Accessibility Comparison

This is my first session from the second day at the CSUN conference.  This session covers “a comparison of accessibility and WAI-ARIA support in Android and iOS.  Which ARIA features work in one, both, or neither of these mobile platforms.”  I met Paul at CSUN in 2013, and he’s both an enthusiastic and opinionated developer.  This makes his Twitter feed fun to watch at times 😉

Presenter:

  • Paul Adam, Accessibility Evangelist at Deque (@pauljadam)

 

RESOURCES

 

Google has done a great job improving Android’s accessibility API, but it’s still a few years behind Apple’s.  Paul showed Apple’s UIKit framework and some of the accessibility properties available when using Xcode, including:

  • accessibilityViewIsModal()
  • accessibilityPerformMagicTap()
  • UIAccessibilityPostNotification()

Sending focus to items is much easier in native apps versus mobile web apps (web is not a nice controlled environment and still has a way to go).

Adding descriptions to images in Android:  android:contentDescription

Xcode allows you to add content descriptions via the GUI or programmatically.

 

WAI-ARIA Live Demos

There’s a WAI-ARIA attribute support matrix available from Paul’s presentation (see resources link above).  The matrix is not exhaustive, but let’s give Paul a break, eh?

ARIA is generally well-supported in current browsers.

Paul then opened up a web page made by Andrew Kirkpatrick that had some pictures with ARIA tags for people to test how different browsers handle the tags.

With Live Regions you can make the screen reader read entries on-the-fly.  However, you need to have a container painted on the page to receive updatable content.
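To make that concrete, here’s a minimal live-region sketch of my own (not from Paul’s demo): the aria-live container already exists in the DOM before any content is injected, otherwise many screen readers won’t announce the update.

```html
<!-- The live-region container is painted on the page up front, empty. -->
<div id="status" aria-live="polite"></div>

<button id="save">Save</button>

<script>
  // Injecting text into the pre-existing container makes screen readers
  // read the new content on the fly.
  document.getElementById('save').addEventListener('click', function () {
    document.getElementById('status').textContent = 'Your changes were saved.';
  });
</script>
```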

 

Paul then demonstrated a couple of simple forms with VoiceOver, showing how aria-describedby, aria-required, aria-invalid, and jQuery’s .focus() behave.  Paul has an iOS WAI-ARIA “fail list” on his web site which is worth checking out:  http://www.pauljadam.com/demos/aria-expanded.html
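Here’s a hedged reconstruction of the kind of form he demoed (my own sketch, not Paul’s actual markup): aria-describedby ties the hint and error text to the field, aria-required/aria-invalid expose its state, and jQuery’s .focus() sends VoiceOver to the offending field on a failed submit.  It assumes jQuery is already loaded on the page.

```html
<form id="signup">
  <label for="email">Email</label>
  <input type="email" id="email"
         aria-required="true"
         aria-invalid="false"
         aria-describedby="email-help email-error">
  <span id="email-help">We never share your address.</span>
  <span id="email-error" role="alert"></span>
  <button type="submit">Sign up</button>
</form>

<script>
  $('#signup').on('submit', function (e) {
    var $email = $('#email');
    if (!$email.val()) {
      e.preventDefault();
      $('#email-error').text('Please enter your email address.');
      // Mark the field invalid and send screen-reader focus to it.
      $email.attr('aria-invalid', 'true').focus();
    }
  });
</script>
```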

 

Apple does not support HTML5 form validation, but Google does.   HTML5 input types are recognized, though, so context-appropriate keyboards are presented, i.e. numeric keypads for telephone fields, date pickers for date fields, etc.
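For example, markup along these lines (my own illustration) is enough to trigger the matching keyboards on mobile, even where HTML5 validation itself isn’t supported:

```html
<!-- type="tel" brings up a numeric keypad on most mobile browsers. -->
<label for="phone">Phone</label>
<input type="tel" id="phone" name="phone">

<!-- type="date" brings up the platform's date picker. -->
<label for="birthday">Birthday</label>
<input type="date" id="birthday" name="birthday">

<!-- type="number" gets a numeric keyboard as well. -->
<label for="quantity">Quantity</label>
<input type="number" id="quantity" name="quantity" min="1">
```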

All mobile browsers’ default placeholders and control outlines fail contrast tests.

 

Paul then gave a demonstration of how captions work on both Android and iOS.  Interesting fact: in iOS, you can’t make an HTML5 button say whatever you want.

 

As a user, you probably want to use iOS.  As a developer, you probably want to play with Android.  I captured a few of the platform pros/cons Paul listed:

  • A big con for iOS developers:  Apple doesn’t do a very good job of documenting fixes to VoiceOver.
  • A con for Android users:  zoom level isn’t maintained between apps.
  • Facebook and Twitter are much more accessible on iOS.

 

Q & A

Question:  what are the user stats for each platform…which gets used more?  Answer: iOS is far and away the most used, according to WebAIM’s screen reader survey.

Categories
Accessibility Technology

A Pattern Library as a Foundational Sketch for Web Accessibility Efforts

This is my seventh session from the first day at the CSUN conference.  This session covers LinkedIn’s DaVinci UI Pattern library, which “…demonstrates accessible web patterns that designers and developers can combine into a work of art.”  I’m a big believer in the idea that having published standards can help guide an organization to be consistent with how they do things across channels.  Very much looking forward to this session.

Presenters:

 

RESOURCES

 

SLIDE ONE

Some details about LinkedIn and their mission statement…

 

SLIDE TWO

Accessibility Task Force

  • 12 cross-functional team members
  • Design, web dev, iOS engineering
  • Weekly meetings
  • Review designs and code in progress
  • Working a prioritized backlog of enhancements in existing products

They follow a Scrum-style process and are building out a group of full-time professionals to help with accessibility.

 

SLIDE THREE

A set of pictures that highlight LinkedIn’s mission to help everyone

 

SLIDE FOUR

Getting leverage

  • Company has 1000+ developers
  • Size of dev team has tripled in 2 years
  • Leverage methods

A driving question:  how can one person (or a small team) effectively communicate with such a large development team?

 

SLIDE FIVE

The DaVinci Pattern Library (brings together art & science)

  • Online library of UX patterns for web and mobile
  • Design variants documented
  • Dev implementation documented
  • Accessibility notes (when appropriate)
  • A common language

LinkedIn is considering open-sourcing this…their “dust” framework and “archetype” are already available.

 

SLIDE SIX

Library overview

Their library has several sections, with separate areas for mobile and desktop.  Each section has samples with “exemplifiers” that highlight what the pattern actually looks like in action.

“Patternable things” can be contributed to the library after at least two development teams have built something that looks like a pattern and it has been identified as one.

 

Drew took over and reviewed the iconography the site uses.  They’re switching from sprites to webfonts because they scale better and are much smaller in size.  They also have an accessibility note that explains how the icons should be coded so they’re properly represented by screen readers (and not read out loud as Unicode).  Drew then spoke a bit about dialogs and how they should look (modeless and modal).  Dialogs are given focus, are dismissed by the escape key, and place focus back on the previous element after the dialog is dismissed.
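As an illustration of the icon guidance (my own sketch, not the actual DaVinci markup): the glyph is hidden from screen readers with aria-hidden and an off-screen text label carries the meaning, so the private-use Unicode character is never read aloud.  The class names here are hypothetical, and the visually-hidden rule is the usual off-screen text utility.

```html
<style>
  /* Assumed off-screen text utility (any standard implementation works). */
  .visually-hidden {
    position: absolute;
    width: 1px; height: 1px;
    overflow: hidden;
    clip: rect(0 0 0 0);
  }
</style>

<button class="like-button">
  <!-- The icon-font glyph is decorative as far as assistive tech is concerned. -->
  <span class="icon icon-thumbs-up" aria-hidden="true"></span>
  <!-- Off-screen text provides the accessible name instead of the Unicode glyph. -->
  <span class="visually-hidden">Like this update</span>
</button>
```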

 

Sarah then talked about the slider control, an emerging pattern at LinkedIn.  It can be used with numeric value ranges and also with discrete values, e.g. privacy settings (more open, default, private).  Arrow keys move it one increment, page up/down moves it 10, and the home/end keys move to the lowest/highest values respectively.  Both HTML samples and Dust partials are available to internal developers for implementation.
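Here’s a rough sketch of what that slider pattern might look like in markup and script (my own reconstruction, not LinkedIn’s Dust partial): role="slider" with aria-valuemin/max/now, plus keyboard handling for the arrows, Page Up/Down, and Home/End.

```html
<div id="privacy-slider" role="slider" tabindex="0"
     aria-valuemin="0" aria-valuemax="100" aria-valuenow="50"
     aria-label="Profile visibility"></div>

<script>
  var slider = document.getElementById('privacy-slider');
  slider.addEventListener('keydown', function (e) {
    var value = Number(slider.getAttribute('aria-valuenow'));
    var min   = Number(slider.getAttribute('aria-valuemin'));
    var max   = Number(slider.getAttribute('aria-valuemax'));

    switch (e.key) {
      case 'ArrowRight':
      case 'ArrowUp':   value += 1;  break;  // one increment
      case 'ArrowLeft':
      case 'ArrowDown': value -= 1;  break;
      case 'PageUp':    value += 10; break;  // bigger jump
      case 'PageDown':  value -= 10; break;
      case 'Home':      value = min; break;  // lowest value
      case 'End':       value = max; break;  // highest value
      default: return;                       // ignore other keys
    }
    e.preventDefault();
    // Clamp to the range and expose the new value to assistive tech.
    slider.setAttribute('aria-valuenow', Math.min(max, Math.max(min, value)));
  });
</script>
```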

 

Drew then described how forms are implemented.  Stacked forms are preferred to “sided forms” because some languages are more verbose than others.  They’re also replacing JavaScript select boxes, which is cutting down load times.
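For illustration, here’s a hedged sketch of the stacked layout with a plain native select in place of a scripted dropdown (my own example, not LinkedIn’s markup): each label sits above its field so longer translations can wrap freely.

```html
<form class="stacked-form">
  <div class="field">
    <!-- Label stacked above the input, not beside it. -->
    <label for="headline">Professional headline</label>
    <input type="text" id="headline" name="headline">
  </div>
  <div class="field">
    <!-- A native select needs no JavaScript to be keyboard- and
         screen-reader-friendly. -->
    <label for="industry">Industry</label>
    <select id="industry" name="industry">
      <option>Higher Education</option>
      <option>Software</option>
      <option>Design</option>
    </select>
  </div>
</form>
```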

 

SLIDE SEVEN

Sarah talked a bit about Future Patterns

  • User cards
  • Drop down/overlay controls with full aria
  • Global and page-specific keyboard shortcuts
  • iOS7-style toggle to represent a checkbox

 

Q & A

Question:  How old is DaVinci?  Answer:  about a year and a half.

Question:  This is a lot of overhead, is it worth it?  Answer:  yes; developers are expected to deliver value horizontally and contribute to this.

There were a number of really great questions about practical implementation details as well, sorry but I didn’t catch ’em all 🙁

Categories
Accessibility Technology

All About Google Chrome

This is my fifth session from the first day at the CSUN conference.  This session covers “…the built-in accessibility features of Chrome, Chrome OS and Chromebooks.”  Description comes from the conference event guide.  I attended Google’s pre-conference seminar in 2013, and it was very informative (my 10-part blog post can be accessed here).  I hope they pack in the juicy details this year too 🙂

Presenters:

  • Dominic Mazzoni, Software Engineer on the Google Team (@)
  • Peter Lundblad, Engineer on the Google Chrome Team (@)
  • David Tseng, Software Engineer on the Google Chrome Team (@)

 

David Tseng showed off a remote control that comes with ChromeVox built in.  It’s meant for video conferencing.  David used the tool to join a Google Hangout (a kind of video call).  It worked well in the demonstration, at least from the perspective of selecting and joining an existing Hangout.

 

Dominic Mazzoni talked briefly about the importance of the web as the world’s largest open platform.  The Chrome browser was originally introduced with the following three principles/priorities in mind:

  • Speed:  re-introduced competition into the browser market
  • Simplicity:  create a browser that doesn’t distract from the content you’re looking at.  Also, updates happen automatically.
  • Security:  updates resolve holes ASAP

Dominic jumped into ChromeOS and showed some of the accessibility features available, including on-screen keyboard, screen magnifier, large mouse cursor, high contrast mode, sticky keys, tap-dragging, and ChromeVox itself.

 

Peter Lundblad demonstrated ChromeVox, a screen reader made especially for ChromeOS.  Support for voices in multiple languages has been recently added; Peter demonstrated this with both German and British female voices.  Refreshable braille device support has also been added to ChromeOS.  This particular demonstration was interesting to me because I’ve never actually seen one of these devices in action.  There is a “growl-like” on-screen display of the braille output so sighted users can see what the braille device itself is showing.  Peter added a bookmark using the braille device.

 

Dominic then took over and talked about synchronized bookmarks (and other settings) that “follow the user” to whatever device they may be using.  He demonstrated this using an Android phone.  The phone he showed the audience successfully displayed the bookmark Peter had set on the Chromebook a few minutes before.  Dominic then activated the local search control (a circular control with links to phone functions) by swiping up and to the right.

Dominic then demonstrated the Chromecast, which lets you “cast” content from any Chrome browser to a display the Chromecast is plugged into.  Laura Palmero shared her personal experience using the Chromecast.  Laura has a vision disability that makes it difficult for her to view things in the center of her field of view, so she relies on high-contrast displays that are close to her (like her phone).  The Chromecast has made it much easier for her to interact with her large-screen television at home…she now controls it using her phone, which she uses all the time.

 

Question:  what about the accessibility of Google Docs?  There is a Google Docs session tomorrow (Thursday) that goes into great detail about Google Docs.

Question:  what is the strategy with the Chromebook?  It seems like just an interesting toy.  Answer:  it’s not a general-purpose computing device that’s meant to replace all computers.  It’s a device that’s made to work with the web.

Question:  what tools are you providing so developers can have access to things like view source, that sort of thing?  Answer:  we know we have some work to do with this, but there are workarounds.  Please speak with us after the session.

Question:  how well does it support ARIA?  Answer:  we make extensive use of ARIA in our web apps, and we rely on open standards and participate in working groups.
