- W3C WCAG conformance guidance
- Accessibility support database
- Guidance for tool developers
- Other resources for evaluation of site conformance
ACHIEVING CONFORMANCE
- Awareness raising and buy-in up and down the chain are critical
- Accessibility must be “built-in” and occur throughout the development process
MAINTAINING CONFORMANCE
- Awareness raising and buy-in up and down the chain are critical
CONFORMANCE EVALUATION
- Evaluate to improve
- Verify conformance statements
- Monitor progress regularly from year to year (is your approach working?)
WCAG-EM (Website Accessibility Conformance Evaluation Methodology)
- Is currently a working draft
- This is process-oriented guidance.
- Web site definition: “coherent collection of pages that together provide common use or functionality”
5 EVALUATION STEPS OF WCAG-EM
- Define the evaluation scope (what target are we aiming at; what kind of report do we want – a simple “thumbs up / thumbs down,” a list, etc.). Need this to set common expectations. (A rough sketch of a scope record follows this list.)
- Explore the target website. Ideally, this happens with the help of the site owner and / or web developer. Do we do this “white box” style or “black box” style? Can the developer tell you where the templates are, etc.?
- Select representative sample
- Audit the selected sample
- Report evaluation findings
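To make the first and third steps more concrete, here is a minimal sketch in Python of what an evaluation scope and sample record might capture; the class and field names are my own invention and are not defined by WCAG-EM.

```python
# Hypothetical records for an evaluation's scope and sample.
# Class and field names are illustrative only, not defined by WCAG-EM.
from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationScope:
    site_name: str            # the "coherent collection of pages" under review
    base_urls: List[str]      # entry points that define the target website
    conformance_target: str   # e.g. "WCAG 2.0 Level AA"
    report_type: str          # "basic", "detailed", or "in-depth"

@dataclass
class EvaluationSample:
    targeted_pages: List[str] = field(default_factory=list)  # key templates, forms, etc.
    random_pages: List[str] = field(default_factory=list)    # randomly selected pages
    adjustments: List[str] = field(default_factory=list)     # pages added after exploration

scope = EvaluationScope(
    site_name="Example University",
    base_urls=["https://www.example.edu/"],
    conformance_target="WCAG 2.0 Level AA",
    report_type="detailed",
)
```

Writing the scope down this explicitly is mostly about the “common expectations” point: everyone agrees up front what is being evaluated and what kind of report comes out the other end.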
HIGHLIGHTS
Types of reports
- Basic (yes / no)
- Detailed (points out examples)
- In-depth (analyzes issues: details about the issue, how to fix it, etc.)
Sampling approaches
- Targeted sampling
- Random sampling
- Sample adjustment (a rough sampling sketch follows this list)
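A minimal sketch of how targeted and random sampling might be combined, assuming the candidate URLs come from exploring the site in step 2; the function and parameter names are hypothetical.

```python
# Sketch: combine hand-picked (targeted) pages with a random sample of the rest.
# discovered_urls would come from exploring the target website; names are illustrative.
import random

def select_sample(discovered_urls, targeted_urls, random_size=10, seed=None):
    rng = random.Random(seed)
    remaining = [u for u in discovered_urls if u not in targeted_urls]
    random_pick = rng.sample(remaining, min(random_size, len(remaining)))
    # Sample adjustment (adding or dropping pages) would happen later, once auditing starts.
    return list(targeted_urls) + random_pick

sample = select_sample(
    discovered_urls=[f"https://www.example.edu/page{i}" for i in range(1, 51)],
    targeted_urls=["https://www.example.edu/", "https://www.example.edu/contact"],
    random_size=8,
    seed=42,
)
```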
Performance score (so-called)
- Coarse and potentially misleading
- Misses context and can divert focus
- …but powerful for communication (a rough scoring sketch follows this list)
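To make the “coarse” point concrete, the simplest version of such a score is just the ratio of passed checks, which is what the sketch below computes. It is my own illustration, not a formula from WCAG-EM, and it shows exactly what gets lost: two sites with the same percentage can have very different problems.

```python
# Illustrative only: a coarse "performance score" as the ratio of passed checks.
# It carries no information about severity or context, which is why it can mislead.
def performance_score(results):
    """results: list of (success_criterion, passed) tuples."""
    if not results:
        return 0.0
    passed = sum(1 for _, ok in results if ok)
    return passed / len(results)

results = [("1.1.1", True), ("1.3.1", False), ("2.4.4", True), ("4.1.2", False)]
print(f"Score: {performance_score(results):.0%}")  # 50% -- but which failures matter most?
```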
COMMENTS FROM AUDIENCE
CSUSB has a scoring system devised by a Math student (does yes / no analysis and uses a Likert scale of severity). Reports are provided to management, which creates a kind of “naming and shaming” mechanism. This system has engendered competition among divisions and has increased accessibility across the board – at least according to their system.
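I did not catch the details of CSUSB’s formula, but a scheme that combines yes / no results with a Likert severity rating might look roughly like the sketch below; the weighting is invented for illustration.

```python
# Hypothetical reconstruction of a yes/no check weighted by Likert severity (1 = minor, 5 = blocker).
# CSUSB's actual formula was not shared; this weighting is invented for illustration.
def weighted_score(findings):
    """findings: list of (passed, severity) tuples."""
    max_penalty = sum(severity for _, severity in findings)
    if max_penalty == 0:
        return 1.0
    penalty = sum(severity for passed, severity in findings if not passed)
    return 1.0 - penalty / max_penalty

findings = [(True, 3), (False, 5), (True, 1), (False, 2)]
print(f"{weighted_score(findings):.2f}")  # 0.36: the severity-5 failure dominates the score
```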
Some agencies have a scoreboard…is this a good thing? It can be challenging due to the apples versus oranges nature of differences between sites. This approach gamifies the system, which can be good or bad. It’s probably best to compare yourself against yourself.
CROWD SOURCING DATA COLLECTION AND ANALYSIS
- How can you collect the sheer amount of info?
- Scope: not all info is really equally interesting
- Attempt to collect info that already partially exists
- Become a “hero” contributor!
- There are a lot of future research projects
- Browsers and AT vendors have a ton of information
VERY COOL: Shadi demonstrated a prototype testing tool (I think it’s called “Easy Checks”) that collects data into a central database (URL for the tool is in the slide deck linked at the top of this post)
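For a taste of what such a tool checks automatically, here is a tiny stand-alone example that flags img elements with no alt attribute and packages the result as a record a central database could collect; this is not the W3C prototype from the session, and the check id and record format are invented.

```python
# Tiny illustration of one automated check (images missing an alt attribute), with the
# result packaged as a record a central store could collect. This is NOT the prototype
# shown in the session; the check id and record format are invented.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.total = 0
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.total += 1
            if "alt" not in dict(attrs):   # no alt attribute at all
                self.missing += 1

def check_page(url, html):
    checker = MissingAltChecker()
    checker.feed(html)
    return {
        "url": url,
        "check": "images-have-alt-attribute",   # hypothetical check identifier
        "outcome": "fail" if checker.missing else "pass",
        "details": {"images": checker.total, "missing_alt": checker.missing},
    }

print(check_page("https://www.example.edu/", '<img src="logo.png"><img src="x.png" alt="X">'))
```

Automated checks like this only cover a slice of WCAG, which is why the methodology still centers on human auditing of the sample.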
GUIDANCE FOR EVALUATION TOOL DEVELOPERS
- Most claim to support WCAG 2.0
- Most also claim to be the best tool…
- …yet there is a lot of variance among them
- Support for WCAG 2.0 needs to be more consistent across tools
Current idea is to provide a framework that describes:
- Features that tools can provide
- Profiles of different types of tools
- Export of results into bug-tracking systems (great idea; a rough export sketch follows this list)
- Contribution to more consistent support for WCAG 2.0
- Selection of features to guide tool developers
- Selection of features to inform tool users
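As a sketch of the bug-tracking idea, a finding could be mapped onto a generic issue payload like the one below; the field names are placeholders, not the API of any particular tracker.

```python
# Sketch: map an evaluation finding onto a generic bug-tracker issue payload.
# Field names are placeholders; real trackers (Jira, GitHub, etc.) have their own schemas.
import json

def finding_to_issue(finding):
    return {
        "title": f"[a11y] {finding['check']} failed on {finding['url']}",
        "body": (
            f"Success criterion: {finding['criterion']}\n"
            f"Details: {finding['details']}\n"
            "Source: accessibility evaluation tool export"
        ),
        "labels": ["accessibility", f"wcag-{finding['criterion']}"],
    }

finding = {
    "url": "https://www.example.edu/",
    "check": "images-have-alt-attribute",
    "criterion": "1.1.1",
    "details": "2 images checked, 1 missing an alt attribute",
}
print(json.dumps(finding_to_issue(finding), indent=2))
```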
How does WCAG-EM work for you? The working group is asking the community for input regarding scoring; please send comments by 22 March 2013 to public-wcag-em-comments@w3.org
SCHEDULE
- Current draft is out for review
- Summer 2013: final draft
- December 2013: first Note