WCAG 2.0 Compliance Costing Model


1 WCAG 2.0 Compliance Costing Model
16th Annual Accessing Higher Ground Accessible Media, Web and Technology Conference, November 4–8, 2013, Westminster, CO. Presenters: Phill Jenkins – IBM Research, Human Ability & Accessibility Center; Dan Shire – IBM Interactive, IBM Canada. Wednesday, 6 November, 3:30–4:30 PM, Cotton Creek II.

2 Our agenda
Project Overview
I/T Web Project Life Cycle
Sample sites & pages
Testing Results
Costing Model
Policy Considerations
Opportunities & Challenges
Why this presentation? Our experience is relevant to organizations and web properties trying to estimate the costs of accessibility compliance.

3 Project Overview
Questions that need answers:
What does it take to make a website accessible?
What does it take to keep it that way?
What are the opportunities and challenges?
Project steps:
Identify candidate sites and sample pages
Assess (test) sample pages for accessibility
Estimate cost to test, repair, and maintain
Apply project experiences to sites and policy
7-week timeline for the initial project

4 I/T project life cycle (a simplified view!)
User experience, including accessibility and support for inclusive design, should be at the heart of your project, and needs to be integrated into every phase. Though we can use many models to describe a typical I/T project, this one provides an easy and generic set of phases: initiation and requirements, design, build, test, and finally deploy/support. At the center of any information technology solution we should have the users: we need to meet their fundamental business needs, or we cannot be successful. Within each project stage there are critical roles (participants), activities, and work products or deliverables, and each of these is affected by the accessibility requirements.

5 Initiation & Requirements
I/T project life cycle: user experience, including accessibility and support for inclusive design, should be at the heart of your project – this can be integrated into every phase. Phases: Initiation & Requirements, Design, Build, Test, Deploy/Support, with the users at the center.
Roles: Business Owner, Marketing, Project Manager, Solution Architect, Business Analyst, Designer, Developer, Tester/QA, Technical writer.
Activities: Manage procurement; Define requirements; Creative design; Macro and micro design; Develop application; Define & execute test plans; Document solution; Support users; Plan for next cycle.

6 Accessibility impacts by role
Business Owner – correct policy (accessibility standards) and funding in place
Marketing – requirements, communications, and awareness (e.g., launch)
Project Manager – requirements, checklist, project milestones and status
Architect – enabled and capable technology designed into the solution
Designer – design and implementation guidance that complies with standards
Developer – coding implementation to comply with applicable standards
Tester – QA and system testing to ensure compliance with applicable standards
Technical writer – documents accessibility features, tutorials, etc.
Policy and Funding

7 Web site refresh cycles
Two factors to consider:
Look and feel (branding and navigation) – expensive to update, requiring redesign and significant coding changes. If you get accessibility wrong, it can be broken everywhere.
Content – e.g., seasonal hours of operation, 'contact us', online restaurant menu, new products and services. This information can be relatively dynamic and changes frequently. Compliance is a problem if the CMS does not enforce the presentation style.
How often do organizations update their sites? Consensus from 5 web developer agency interviews: static sites seem to be updated every 3 years, with a refresh of look & feel, navigation, menus, branding, and certainly a refresh of content & technology.
Our test sample (10 sites): refresh periods varied over time (anecdotal, and a small sample). In the last 5 years, most sites appear to have been refreshed on a cycle of < 2 years.

8 Sites selected and description
Company Size / Site complexity: Small, Medium, Large. 99% of businesses¹ are small or medium.
Static web pages: A1 Restaurant; A12 Township; B1 Insurance; C1 Restaurant chain
Dynamic: A2 Online magazine; B2 University department; C2 National retailer; C22 Government department
E-commerce: B3 Computer retailer; C3 Automaker online store
A1: Small, private, static. A restaurant in a mid-sized town. It is a typical “pamphlet” style website, containing static pages with information on the menu and location.
A12: Small, public, static. The official website of a township. It offers static information about the town and council for residents and visitors.
A2: Small, private, dynamic. An online lifestyle “magazine” offering articles, blogs, and member-generated content.
B1: Medium, private, static. A multi-office insurance company’s site offering information on products and services. Sections of the site include role-based authentication. Content includes downloadable claim and administrative forms.
B2: Medium, public, dynamic. A university department offering department-specific content presented through the university’s website branding and style. Includes course, faculty, and department information, as well as a publications library.
B3: Medium, private, e-commerce. The e-commerce site of a computer retailer with multiple outlets. Site content includes a catalogue and full online transactions, as well as user-based product ratings and reviews.
C1: Large, private, static. A multi-national restaurant chain’s informational site for Canada, featuring menus, location information, job opportunities, and news.
C2: Large, private, dynamic. A Canadian retailer’s web presence for everything from investor information to careers. The site offers a different entry point for online shopping.
C22: Large, public, dynamic. A provincial government department offering online citizen services and applications.
C3: Large, private, e-commerce. A multi-national automobile manufacturer’s online store for ordering a new car.
¹ Source: Statistics Canada

9 Most businesses are small businesses
Canadian experience: 380,000 small and medium businesses
1 to 49 employees: 94.8%
50 to 200 employees: 4.2%
1 to 200 employees: 99.0%
Large (> 200 employees): 1.0%

10 Site pages tested (pages tested of total pages, by Company Size / Site complexity)
Static web pages:
A1 (Small): 5 of 8 – Homepage, Contact, About Us, Menu, Press
A12 (Small): 5 of 154 – Homepage, Contact, Map, Community Events, News & Announcements
B1 (Medium): 5 of 46 – Homepage, Forms, News, Contact, Sign In
C1 (Large): 6 of 50 – Homepage, Location/hours, Nutrition calculator, Community, Card, FAQ
Dynamic:
A2 (Small): 5 of 380 – Homepage, Registration, Search Results, Video Page, Blog Page
B2 (Medium): 6 of 126 – Homepage, Permission forms, Graduate studies list, Publications, Calendar
C2 (Large): 7 of 107 – Homepage, Search results, Sign up, Locations, E-flyer signup, Weekly flyer, Sports, Golf
C22 (Large): 5 of 14 – Homepage, Welcome, FAQ, Introduction, Step 1
E-commerce:
B3 (Medium): 4 of 1,434 – Create account, Product, Shopping Cart, Order Confirmation, (Third Party “Place Order”)
C3 (Large): 5 of 201 – Build and Price, Choose Model, Choose Options, Choose Accessories, Summary

11 Testing methodology
The assessment team performs the assessment tasks, collects, and summarizes the findings.
Overall process (summary-level view): Scope & Schedule; Validate Requirements; Perform Assessment; Deliver and Review Findings & Recommendations. Optional: Usability Assessment, Remediation, Impact Analysis & Sizing.
Perform Assessment phase:
Create detailed application test cases
Configure automated test tools, AT, and OS settings¹
Run tests using automated test tools¹
Conduct tests using Assistive Technologies¹
Conduct manual tests using expert techniques
Inventory and summarize findings
Assign severity ratings to issues
Review issues with technical stakeholders
¹ Comprehensive testing methodology
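As an illustration of the "run tests using automated test tools" step, here is a minimal sketch of one automatable Level A check – images with no text alternative (WCAG 2.0 SC 1.1.1). This is hypothetical code written for this transcript, not the commercial tooling used in the study; the class name and sample markup are inventions.

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collect <img> elements that lack an alt attribute (WCAG 2.0 SC 1.1.1).
    An empty alt="" is allowed: it marks an image as decorative."""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if "alt" not in a:
                self.violations.append(a.get("src", "<no src>"))

def audit_page(html: str) -> list:
    """Return the src of every image on the page with no text alternative."""
    auditor = AltTextAuditor()
    auditor.feed(html)
    return auditor.violations

sample = '<p><img src="logo.png" alt="Company logo"> <img src="menu.png"></p>'
print(audit_page(sample))  # → ['menu.png']
```

Real automated tools run dozens of such checks per page; the point is that each finding then still needs the manual inventory, severity rating, and stakeholder-review steps listed above.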

12 Testing results – number of issues
Number of issues by Company Size / Site complexity (Small, Medium, Large):
Static web pages: 63, 68, 87, 200 issues
Dynamic: 248, 85, 32, 160 issues
E-commerce: 48, 28 issues
1. The total number of issues does not reflect the relative effort to remediate them. Some issues, such as using appropriate headings in content, are quite easy to resolve, while others, such as appropriately communicating errors in a form, are complex.
2. The total does not always reflect the relative maturity of a site. The Medium E-commerce site is very difficult to use but has one of the lowest issue totals. The Large Static site shows the second-most issues, but only had one page that…
3. The total does not reflect the severity of the issues. Some issues will not prevent the site from being usable (failing to provide a means of skipping to the main content on a page makes the site harder, but not impossible, to use). Other issues, like keyboard control becoming trapped on a page so a user cannot do anything, make a site unusable. Level A and AA issues are not separated out in this table.
4. Conclusions about the nature of a category of site cannot be drawn from this sample size. While the two Small Static sites were quite closely aligned in total errors, the two Large Dynamic sites' issue counts varied by a factor of 5. A larger sample is required to make any accurate broad conclusions. The type of business (online versus bricks & mortar) can be a significant factor within the same quadrant. Similarly, there is not necessarily a correlation between the size of a company and the size of its website.
5. The number of pages inspected varies by site, so the totals are not strictly comparable. Most sites have 5 sample pages, but others range from 4 to 7. With such a small sample size, one or two pages make a significant difference to the total issues captured, which can be misleading.
Notes: Total number of issues does not reflect the relative effort to remediate issues. Total does not always reflect the relative maturity of a site. Total does not reflect the severity/impact of various issues. Total does not reflect the nature of a site’s category. Sample size may not be large enough to draw conclusions. Totals are not strictly comparable due to variance in number of pages assessed.

13 Testing results – types of issues & challenges
Company Size / Site complexity (Small, Medium, Large):
Static web pages: A1 – identical ALTs, no keyboard, video ownership. A12 – no keyboard, confusing; staff content. B1 – easy to understand, predictable, parsing errors, good attachments. C1 – pretty good, except Nutrition table; parsing errors.
Dynamic: A2 – advertising, “messy”, “accessible, just not usable”, parsing issues. B2 – good text-based navigation; unreadable attachments. C2 – 900 parsing errors, large but “pretty good”, minor structure issues. C22 – Step 1 focus issue, by-line navigation.
E-commerce: B3 – horrible code, “busy”, 3rd-party transaction. C3 – Flash front end (not tested), On Input error.
A1: A1’s failures as an accessible website stem from three main issues, which taken together create a “perfect storm” of inaccessibility:
Its dependency on poorly constructed mouse-overs to provide navigation
Its use of a table structure to control the visual presentation of its content
Its use of the same ALT description on every image, regardless of the image’s meaning or purpose
The mouse-overs are incorrectly coded, so they cannot be activated by a keyboard. Because they are the sole means of navigating the site (a failure, in itself, of 2.4.5), it is impossible for a user without a mouse to go to a different page. A1’s layout is done with tables. This is not necessarily a problem, but the technique is outdated, and done incorrectly (as here) it can aggravate accessibility problems. Much of A1’s table structure is used to position images on the screen. But because each of these images and image fragments is given the same ALT name (the site’s name), a user navigating with a screen reader hears the site name repeated almost a dozen times on every page before any other content is announced. If the ALTs were properly configured, the table would likely be treated as a layout table by assistive technologies, and not as a data table containing useful information.
A1 also raises an interesting question about who is responsible for accessible content. It contains three links to videos, but the videos, which lack captions and descriptions, are hosted on a separate site. Is A1 responsible for making this content accessible? Artifacts of old links remain (tab stops with nothing there). Principle 1: navigation controls that use text links instead of images will inevitably create fewer issues.
A12: Inaccessible navigation; confusing orientation; information architecture failure. ** Usable? Not keyboard accessible – the main navigation is a script-based imitation of a link. No Home on the home drop-down. Easy to forget how to navigate, even with a mouse. Content updatable by staff (Events) – an issue with training or the upload tool?
A2: Advertising. “Messy.” Accessible, just not usable. Design versus information architecture – the site has a “look” but no flow. Redesigned about 2 years ago; retrofitted content to the new look? Good information on the site, just not organized. Parsing issues. Breadcrumbs could maybe help.
B1: Easy-to-understand content; clear. Accessible downloadable content (out of scope). Predictable issues (e.g., a link to a doc in the left nav). Parsing errors. ALT issues, but not critical. Demonstrates how there are three components: HTML coding, accessibility, and usability.
B2: All attachments unreadable (out of scope). Used headers and content in usable tables. Minimized graphics, so text-based navigation and standard elements are good. Despite HTML errors, the site is relatively usable – a “matter of degree”.
B3: Parsing issues; horrible code; “extremely frustrating to use”. Dead-ended at shopping-cart verification – couldn’t confirm** (3.3.4). “So much stuff”; “I find it overwhelming”; visual info on the page not organized; “busy”. JAWS did not get focus – dynamic updates? Check.** Hard to handle keyboard focus.
C1: M-L without the nutrition calculator. Not very good markup, but the pages are not that big. Drop-downs for navigation (keyboard access?). Lots of parsing errors. The nutrition calculator is designed like a standalone app, but fails – drag & drop, Flash. Ghost PDFs (also on Oscars).
C2: Parsing errors – 889 on just one page, reduced to 10 after review. Has a link to an accessibility page. “Other than the pages being large, pretty good.” Some structure issues: multiple lists with no headings in between (likely a result of items). “I was quite impressed with it.”
C22: Main issue is the Step 1 focus error – 3.3.1? ** Check and report to Phill. Could have made more use of headers, but the layout is not bad. Had to navigate by line.
C3: Flash no good; otherwise quite good – could follow the steps in the process. Pictures of the modified car with no ALT (an opportunity for a long ALT?).** Had to arrow a lot because information was not separated with structure. Inclusive Design, Usable Access.
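A1's identical-ALT failure is the kind of issue that is cheap to detect mechanically before a screen-reader user ever hits it. A hypothetical sketch (not the study's tooling; the 12-image sample imitates A1's repeated site-name ALTs):

```python
from collections import Counter
from html.parser import HTMLParser

class AltCollector(HTMLParser):
    """Gather the alt text of every image on a page."""
    def __init__(self):
        super().__init__()
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if alt:
                self.alts.append(alt)

def repeated_alts(html: str, threshold: int = 3) -> dict:
    """Alt strings reused `threshold` or more times -- a hint that images
    carry boilerplate descriptions instead of meaningful ones."""
    collector = AltCollector()
    collector.feed(html)
    return {alt: n for alt, n in Counter(collector.alts).items() if n >= threshold}

page = '<img src="x.gif" alt="A1 Restaurant">' * 12 + '<img src="m.jpg" alt="Daily menu">'
print(repeated_alts(page))  # → {'A1 Restaurant': 12}
```

A check like this only flags a symptom; deciding whether the repeated images are decorative layout fragments (empty alt) or meaningful content (distinct alts) still needs the manual review described above.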

14 Interactive – gives views by priority and technology
WCAG 2.0 Success Criteria and Techniques
4 Principles / 12 Guidelines / 61 Success Criteria: 25 Level A, 13 Level AA, 23 Level AAA.
How to Meet: 4 Principles → Guidelines → Success Criteria (A + AA) → 100’s of Techniques.
Perceivable: 1.1 Text alternatives; 1.2 Time-based media; 1.3 Adaptable; 1.4 Distinguishable
Operable: 2.1 Keyboard accessible; 2.2 Enough time; 2.3 Seizures; 2.4 Navigable
Understandable: 3.1 Readable; 3.2 Predictable; 3.3 Input assistance
Robust: 4.1 Compatible
Understanding WCAG 2.0: rationale and benefits, examples, sufficient techniques.
WCAG 2.0 Techniques: General (G1–G199), HTML (H2–H91), CSS (C6–C63), Script (SCR1–37), Server (SVR1–4), SMIL (SM1–14), Text (T1–T3), ARIA (ARIA1–4), Common Failures (F1–F89).
The core of the guidelines is at the criteria and techniques level. WCAG 2.0 has 25 Level A criteria and an additional 13 at Level AA, for a total of 38 requirements. The 4 principles are for grouping; the 12 guidelines are similar to categories of functionality; the 61 Success Criteria are equivalent to required checkpoints.
Success Criteria = “What”; Techniques = “How”. Techniques are informative, general and technology-specific, with test procedures – 100s of techniques: 199 General, 91 HTML, 30 CSS, 37 client-side scripting, 4 server-side scripting, 89 common failures.
WCAG 2.0 challenges: technical language and structure; inconsistent granularity; seemingly inconsistent groupings of criteria. Addressed in the template by attempting to: create clear issue categories; create a one-to-one mapping of issues to criteria; use natural language; break down (or merge) criteria to arrive at usable data. An ongoing process.
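The level arithmetic above is worth making explicit, since "Level AA conformance" means satisfying Level A as well. A small sketch using the counts from the slide (the variable and function names are ours):

```python
# Success-criterion counts per conformance level, as given on the slide.
WCAG2_SC_COUNTS = {"A": 25, "AA": 13, "AAA": 23}
LEVELS = ["A", "AA", "AAA"]

def criteria_required(target_level: str) -> int:
    """Conformance at a level requires satisfying every success criterion
    at that level and at all levels below it."""
    cutoff = LEVELS.index(target_level) + 1
    return sum(WCAG2_SC_COUNTS[lvl] for lvl in LEVELS[:cutoff])

print(criteria_required("A"))          # 25
print(criteria_required("AA"))         # 38 -- the study's compliance target
print(sum(WCAG2_SC_COUNTS.values()))   # 61 criteria in WCAG 2.0 overall
```

This is why the cost model that follows sizes exactly 38 success criteria.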

15 Issues/topics by WCAG 2.0 Success Criteria

16 25 Success Criteria (Level A)
Costs of Web Accessibility remediation – hours of effort by role (Level A). Roles: Biz, Mkt, PM, Arch, Dev, QA, TW.
1.1 Text Alternatives: 1.1.1 Non-Text Content – 4, 1, *, 1 – All
1.2 Time-based Media: 1.2.1 Audio-only and Video-only (Prerecorded) – + $2.25/min; 1.2.2 Captions (Prerecorded) – + $4.00/min; 1.2.3 Audio Description or Media Alternative (Prerecorded)
1.3 Adaptable: 1.3.1 Info and Relationships – 2, 8; 1.3.2 Meaningful Sequence – none; 1.3.3 Sensory Characteristics
1.4 Distinguishable: 1.4.1 Use of Color – .5; 1.4.2 Audio Control
2.1 Keyboard Accessible: 2.1.1 Keyboard – 4, 8/10; 2.1.2 No Keyboard Trap – 24
2.2 Enough Time: 2.2.1 Timing Adjustable; 2.2.2 Pause, Stop, Hide – 1 site
2.3 Seizures: 2.3.1 Three Flashes or Below Threshold – .25
2.4 Navigable: 2.4.1 Bypass Blocks – 9/10; 2.4.2 Page Titled – .1, 6/10; 2.4.3 Focus Order – 12, 2/10; 2.4.4 Link Purpose
3.1 Readable: 3.1.1 Language of Page
3.2 Predictable: 3.2.1 On Focus; 3.2.2 On Input – 4/10
3.3 Input Assistance: 3.3.1 Error Identification – 5, Help, Tutorials; 3.3.2 Labels or Instructions – 7/10
4.1 Compatible: 4.1.1 Parsing; 4.1.2 Name, Role, Value
Legend: All = all sites tested failed this criterion; none = none of the pages tested exhibited this issue/failure; x/10 = number of the 10 sites tested that had this failure.
Notes: Example of variability by role – text alternatives to images vs. a business alternative to CAPTCHA (i.e., phone support). Example of variability of applicable success criteria – short text descriptions vs. long descriptions vs. a CAPTCHA alternative.
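The time-based-media rows are priced per minute rather than per issue, so their cost scales with footage length. A sketch using the per-minute rates quoted on the slide; the clip lengths are hypothetical examples, not measurements from the study:

```python
# Per-minute remediation rates from the slide; names are ours.
RATE_PER_MINUTE = {
    "media_alternative": 2.25,   # SC 1.2.1 audio-only / video-only (prerecorded)
    "captions": 4.00,            # SC 1.2.2 captions (prerecorded)
}

def media_cost(clip_minutes, kind: str) -> float:
    """Flat per-minute cost to remediate a set of prerecorded clips."""
    return round(sum(clip_minutes) * RATE_PER_MINUTE[kind], 2)

# Captioning three clips of 2, 5, and 10 minutes at $4.00/min:
print(media_cost([2, 5, 10], "captions"))  # 68.0
```

For a media-heavy site, this line item can dominate the per-issue hourly estimates elsewhere in the table.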

17 13 Success Criteria (Level AA)
Costs of Web Accessibility remediation – hours of effort by role (Level AA). Roles: Biz, Mkt, PM, Arch, Dev, QA, TW.
1.1 Text Alternatives: (no Level AA criteria)
1.2 Time-based Media: 1.2.4 Captions (Live) – 4, *, 1 – none; 1.2.5 Audio Description (Prerecorded) – 5/10
1.3 Adaptable: (no Level AA criteria)
1.4 Distinguishable: 1.4.3 Contrast (4.5:1 minimum) – 6, 3 – 4/10; 1.4.4 Resize Text – 8; 1.4.5 Images of Text – 2, 1 site
2.1 Keyboard Accessible / 2.2 Enough Time / 2.3 Seizures: (no Level AA criteria)
2.4 Navigable: 2.4.5 Multiple Ways – 3/10; 2.4.6 Headings and Labels; 2.4.7 Focus Visible – 2/10
3.1 Readable: 3.1.2 Language of Parts – .5
3.2 Predictable: 3.2.3 Consistent Navigation; 3.2.4 Consistent Identification
3.3 Input Assistance: 3.3.3 Error Suggestion; 3.3.4 Error Prevention (Legal, Financial)
4.1 Compatible: (no Level AA criteria)
Empty row or cell = no applicable Level AA Success Criteria.

18 Example rates by role
Large ICT businesses – rates per day (7.25 hrs):
Business Owner (Level 2): $
Marketing (Level 1): $72
Project Manager (Level 1): $83
Architect (Level 1): $
Designer (Level 1): $
Developer (Level 1): $83
Tester/QA (Level 1): $83
Technical writer (Level 1): $72
Experience: Level 1 = 2–4 years, Level 2 = 4–9 years, Level 3 = 10+ years of experience.
Small ICT businesses: rates approx. $40 to $100; 15-page sites range from $3k to $5k. Surveyed 5 small businesses; assessed 10 sites.
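The hours-of-effort tables on the previous slides combine with these rates into dollar estimates. A minimal sketch, assuming the Level 1 daily rates readable from the slide and its 7.25-hour billable day; the role keys and the example task are hypothetical, and rates missing from the slide are simply omitted rather than guessed:

```python
HOURS_PER_DAY = 7.25  # billable day, per the slide

# Level 1 daily rates as read off the slide (figures missing there are left out).
DAY_RATE = {"marketing": 72, "project_manager": 83, "developer": 83,
            "tester_qa": 83, "technical_writer": 72}

def remediation_cost(hours_by_role: dict) -> float:
    """Convert hours of effort per role into dollars at the daily rates."""
    return round(sum(DAY_RATE[role] / HOURS_PER_DAY * hours
                     for role, hours in hours_by_role.items()), 2)

# e.g. a hypothetical fix needing 8 developer hours and 2 QA hours:
print(remediation_cost({"developer": 8, "tester_qa": 2}))  # 114.48
```

The same mechanics, applied to every sized success criterion, produce the per-site cost ranges on the next slides.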

19 Preliminary survey – 5 web development teams
A very small sample: 5 web development businesses (2 sole proprietors; 3 teams of 6, 20, and 60).
Common themes:
Small sites (< 10 pages) are still reasonable candidates for a plain-HTML solution
Most sites are now implemented with a Content Management System (CMS)
A CMS allows the end client to maintain their own site content
Emerging support in CMSs for accessibility standards (Drupal, WordPress)
Some consultants have a proprietary CMS – this creates a dependency
New sites? 10 to 15 pages cost in the range of $3,000 to $5,000 with an open-source CMS.
Municipalities – publicly available information (newspaper article): refreshing the look and feel and the information architecture of 4 municipal web sites cost $57,000 with AODA (WCAG 2.0 AA) compliance, 8 years since the last refresh.
Knowledge of accessibility and standards: general awareness (5 of 5); some consultants seem to have a good understanding (4 of 5 in this sample).

20 Sample Remediation costs
Company Size / Site complexity (Small, Medium, Large):
Static web pages: A1: $4.5k–$18.5k (50 to 199 hours); A12: $4.8k–$16.4k (53 to 184 hours); B1: $4.8k–$16k (52 to 178 hours); C1: $12.4k–$32.3k (134 to 349 hours)
Dynamic: A2: $10.3k–$34k (112 to 382 hours); B2: $4.5k–$21.2k (50 to 247 hours); C2: $9.2k–$34.8k (100 to 385 hours); C22: $2k–$10k (24 to 115 hours)
E-commerce: B3: $3.7k–$19.9k (41 to 224 hours); C3: $2.4k–$13.5k (27 to 152 hours)
The “range” reflects the variability in the estimated effort to meet a particular success criterion. Due to dependencies on site design, site technology choices, and variance in the WCAG 2.0 requirements themselves, the estimated effort varies – some success criteria show little to no variance, while a few others vary by as much as 1 to 8 hours to remediate a given type of issue. The range is detailed in the cost-model assumptions, and an example is discussed in the presentation.
The costs reflect only the effort to fix the sample pages. The model includes sizings for all the types of issues covered by the 38 success criteria for WCAG 2.0 Level A and AA conformance; however, the occurrence of those issues is limited to those discovered in the selected pages. If the occurrence had been higher, and/or if a greater variety of issues had been discovered, the associated costs would have been reflected in the hours and costs depicted in the table. Since only about 5 pages per site were tested, the costs do not reflect the total cost to remediate an entire site. For example, other parts of the Large E-commerce site (C3) were explored (not tested) and determined to contain Level AA issues not reflected in these costs. The costs do not include QA/testing and other costs outside the model framework (e.g., training).
Only remediation costs involving the roles of Business Analyst, Architect, Designer, Developer, and some Project Management time were included in the sizings established for each success criterion. Other costs outside the model framework are discussed in the report. Conclusions about the costs to remediate other sites in the same category cannot be drawn from this sample size. While the two Small Static sites were quite closely aligned in total errors, the two Large Dynamic sites' issue counts varied by a factor of 5. A larger sample is required to draw any accurate broad conclusions about the correlation between the size of a site and the size of the company. The sample sites used in the study do not necessarily represent the region – a survey with a valid sample size is needed. Costs for conformance to WCAG 2.0 Level A and AA are presented later in this document.
Notes: Level A + AA costs cover Business Analyst, Architect, Design, Development, and minimal Project Management (remediation costs did not include QA/Test). Costs above cover only the remediation of the 5 sample pages. Some remediation costs quickly exceed site-redesign costs. Ranges reflect variability in site design, technology choices, and the applicable accessibility requirements themselves.
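The low/high ranges above come from applying a per-issue effort range to each item in a site's issue inventory. A sketch of the mechanics only: the inventory, the issue labels, and the blended hourly rate below are entirely hypothetical, not figures from the study.

```python
BLENDED_HOURLY_RATE = 92.0  # hypothetical blended rate, for illustration only

# Hypothetical issue inventory for one site: (count, low hrs each, high hrs each).
# The notes above observe that per-issue effort can vary by as much as 1 to 8 hours.
ISSUES = [
    (12, 0.25, 0.5),  # e.g. missing alt text
    (3, 1.0, 8.0),    # e.g. inaccessible navigation widget
    (1, 4.0, 24.0),   # e.g. keyboard trap
]

def remediation_range(issues, rate):
    """Return (low hours, high hours, low cost, high cost) for an inventory."""
    low = sum(n * lo for n, lo, _ in issues)
    high = sum(n * hi for n, _, hi in issues)
    return low, high, round(low * rate), round(high * rate)

print(remediation_range(ISSUES, BLENDED_HOURLY_RATE))  # (10.0, 54.0, 920, 4968)
```

Note how a single high-variance issue (the keyboard trap) widens the range far more than a dozen cheap ones, which is exactly the behavior described in the slide's notes.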

21 Example of variability – 3.3
Example of variability, Level A: Provide labels – labels and input instructions. A sighted user sees the ‘Search’ button; a blind user hears “Image Button 1”. The JAWS screen reader displays a list of fields.
The following are examples of:
1. How technology choices can have a great effect on the variability of accessibility remediation costs – technology choice has the greatest impact on variability
2. Maintenance considerations – when technology changes
3. When to re-design
4. Popularity of common user interface components – in-house built, open source, vendor proprietary
5. Trends: today’s problems will not be tomorrow’s problems – so remediation and maintenance costs may not be accurate tomorrow
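The "Image Button 1" announcement typically corresponds to an image-type input with no alt attribute, which is mechanically detectable. A hypothetical sketch (not the study's tooling; the markup samples are invented):

```python
from html.parser import HTMLParser

class ImageButtonChecker(HTMLParser):
    """Count image buttons with no alt text; screen readers then fall back to
    a generic announcement such as 'Image Button 1' instead of 'Search'."""
    def __init__(self):
        super().__init__()
        self.unlabeled = 0

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "input" and a.get("type") == "image" and not a.get("alt"):
            self.unlabeled += 1

def unlabeled_image_buttons(html: str) -> int:
    checker = ImageButtonChecker()
    checker.feed(html)
    return checker.unlabeled

print(unlabeled_image_buttons('<input type="image" src="search.png">'))               # 1
print(unlabeled_image_buttons('<input type="image" src="search.png" alt="Search">'))  # 0
```

The fix here is a one-word attribute, which is why this issue sits at the cheap end of the variability range; issues rooted in technology choice (the Flash example on the next slide) sit at the expensive end.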

22 Example of variability – 1.4
Example of variability, Level AA: Resize Text – non-HTML content may not resize. Flash video player: note the small blue icon at the bottom right of the video – it enlarges with zoom such that it is almost readable at 2X. The video content zooms (good), but the player controls and labels don’t zoom (poor). The choice of fix requires either major re-work or taking on a major dependency.

23 Site type and design impacts maintenance costs
Company Size / Site complexity (Small, Medium, Large):
Static web pages: poor page structure; poor navigation; a poor design/template affects the whole (but small) site; little CMS usage. / Average page structure; uses CSS well; easy navigation. / Poor control labels; poor keyboard support.
Dynamic: dependency on inaccessible, non-HTML, unique user interface components. / Document repository accessibility will be a large burden – maintaining the accessibility of all the documents. / Good navigation, good keyboard support, good form labels; needs nested headings.
E-commerce: transactions cannot be completed with the keyboard; difficult to understand; poor form labels. / Flash movies have no audio or text descriptions; needs long descriptions of pictures.
Web site summary:
1. A1: Low maturity; very poor page structure, poor navigation markup, poor text labeling and descriptions. This site cannot be navigated by keyboard due to inaccessible links and missing meaningful text on images and links.
2. A12: Low maturity; poor navigation markup, poor page structure and text labels. This site cannot be navigated by keyboard due to inaccessible links and missing meaningful text on images and links.
3. A2: Low maturity; poor layout, poor navigation markup. This site can be navigated with a keyboard, but requires page content structure with proper navigation markup, text descriptions on images, and alternate media formats (audio/caption/text) for the Flash movie clips. The forms also require proper text labels and HTML markup.
4. B1: Low maturity; average page structure and page navigation markup. Overall assessment: structured well using CSS as a layout tool, but otherwise has no real accessibility features. The site can easily be navigated with a keyboard and primarily contains downloadable content. Remediation requires meaningful text on images and links, and proper HTML coding.
5. B2: Low maturity; poor page structure, poor page markup, inaccessible documents. This site can be navigated with a keyboard and primarily contains inaccessible downloadable content. Remediation requires proper page structure with headers and navigation markup, and proper HTML coding.
6. B3: Low maturity; very poor page structure, very poor navigation markup, very difficult to understand. This site can be navigated with a keyboard, but the shopping transaction cannot be completed with a keyboard. Remediation requires accessible keyboard controls, meaningful text labels on forms, and proper page structure with nested headers to make large pages understandable and easier to navigate.
7. C1: Low maturity; poor page layout, poor navigation markup. This site can be navigated with a keyboard, but some pages have functions that lack accessible keyboard controls. Remediation requires proper page structure with nested headers, and keyboard controls for mouse-only actions.
8. C2: Medium maturity; lacks page structure and landmarks, good navigation header markup. This site can be navigated with the keyboard, but tends to have large pages that are difficult to navigate. Remediation requires meaningful text on forms and images, and proper page structure for content sections.
9. C22: Medium maturity; poor navigation markup and text labeling, good accessible forms and accessibility help. This site can be navigated with a keyboard, and contains meaningful text for the most part. Remediation requires proper page structure with nested headers.
10. C3: Medium maturity; poor page structure and navigation markup, good accessibility in forms. The Flash movies have no audio output, no text descriptions, and no keyboard controls. This site can be navigated with a keyboard, but lacks proper page structure with nested headers. Remediation will require long descriptions for pictures, and alternate media (audio/caption/text) for the Flash movie clips.
Note: example of maintenance impacts.

24 Costs assumed outside model framework
Technical and business IT costs; project-level overhead costs; organizational costs.
Software tool purchases and training: automated tools like IBM Rational Policy Tester, aDesigner, etc. Tool costs still impact the business and outsourcing groups. An opportunity for government to provide automated tools to small business?
Technical accessibility skills training: architects, designers, developers, and testers. The model assumes the skills and resources are available, but the costs still have an impact. An opportunity for government to provide some training?
Technology accessibility enablement: newer technologies and proprietary toolkits may not yet be enabled, or may be better enabled. Alternative or redundant solutions can drive costs up 50% and beyond. Examples of alternative cost drivers: early versions of Flash and PDF, current 3D Internet environments like Second Life, Microsoft SharePoint without the HiSoft plug-ins, etc.
Return on Investment (ROI) is not factored into the model framework: increased sales, increased employee productivity, etc. Studies and pilots are needed.

25 Costs assumed outside model framework – continued
Project-level overhead costs:
Schedule – the model framework assumes accessibility is part of an existing project, so there are no schedule delays/costs due to, for example, additional accessibility testing.
Project overhead – as part of an existing project, there are no additional costs for project management, regression testing, etc.
Organizational/enterprise costs:
User accommodations & support – costs for ensuring that employees, citizens, and end users have a reasonable level of supporting assistive technology, internet access, the latest enabled browsers, and a supporting operating system platform, plus the training required to configure and use the technology efficiently – not included in the model framework.
Periodic audits – quarterly or annual audits are recommended to better maintain compliance. Measurements (scanning & testing) and reporting (including executive dashboards) drive accessibility compliance for the organization/agency – not included in the model framework.
Application portfolio management – should include accessibility and other attributes such as quality, privacy, security, internationalization, etc. – assumed to be in place.
Document management – additional costs depending on the business/organization – assumed to be in place.
These costs are not zero!

26 Accessibility is less expensive when “built-in”
Pre-existing site (accessibility added after the site is built) vs. new/re-designed site (accessibility built in before design):
- Accessibility assessment (one-time cost) – avoided when built in
- Remediation efforts (one-time cost) – avoided when built in
- Training and education (periodic costs) – equal for both
- Tools and integration (one-time & maintenance) – equal for both
- Project and overhead (schedule and costs) – equal for both
- Governance integration (one-time & maintenance) – equal for both
- Quality assurance (periodic costs) plus remediation (less over time) – equal for both

27 Accessibility costs LESS than NOT doing accessibility
Building in accessibility:
- Training and education (periodic costs)
- Tools and integration (one-time & maintenance)
- Project and overhead (schedule and costs)
- Governance integration (one-time & maintenance)
- Quality assurance (periodic costs); remediation (less over time)

Not doing accessibility:
- Lawsuits (recurring costs*)
- Lost customer loyalty (recurring costs)
- Reaching fewer customers (lost revenue)
- Search optimization (recurring costs)
- Adaptation for mobile (recurring costs)
- Lost brand value (recurring costs)

*Lawyers are more expensive than web developers.

28 Do most businesses have the internal skills to maintain their own web sites?
In a word … no.
- 87% of businesses have fewer than 20 employees (Statistics Canada, 2009)
  - Most of these organizations will not have an IT person on staff
  - They typically outsource their web site development and maintenance (based on interviews with 5 web development companies)
  - If the site is implemented in an accessible CMS, content can be well maintained locally (consistent, accessible, and following the rules)
  - IT and related technology-based businesses may have internal skills available (see the report under "IT capacity & skills")
- 5.2% of companies have 50 or more employees
  - It is reasonable to assume these companies may have IT staff capable of developing and maintaining the accessibility of their web sites

29 Canadian Accessibility Regulations
Ontario: Accessibility for Ontarians with Disabilities Act (AODA)
- Covers BOTH the public and private sectors – 380,000 obligated organizations
- Customer service (like the U.S. ADA)
- Information and communication (more than the U.S. ADA & Section 508)
  - All government (public sector) – WCAG 2.0 Level A now, Level AA later
  - Businesses with more than 50 employees must provide WCAG 2.0 (Level A, later AA) web sites for customers
- Employment (like the U.S. ADA)
  - IT must support employees with disabilities, or the organization must provide reasonable accommodation

Government of Canada
- WCAG 2.0 Levels A and AA for all federal departments & agencies

30 Policy Considerations
Ownership of the problem/content – will require applicability guidance
- Content vs. hosting vs. links
  - Own the content but not the hosting environment (e.g., uncaptioned video on YouTube)
  - Own the hosting environment but not the content (documents & web services from others)
  - Own both (applicable) and/or links (not applicable)
- 3rd-party services and/or components/widgets (who is responsible, and when in the life cycle)

Evolving technologies – will require updates to policy
- Mobile apps (native & web), platforms, service providers, content owners, etc.
- HTML5

Conformance claims guidance
- Platform: which OS and browser to test? (usage, ubiquity; which browser to recommend?)
- Testing tools: which tools to use to make claims? (cost, ease of use, support for AA criteria)
- Assistive technology: which AT to use for claims? (JAWS, ZoomText, AT failures, down-level versions, etc.)

31 Identified challenges and recommendations
Our project challenges reflect the realities to be faced by web site properties:
- Insufficient testing tools, still-improving testing methods, uncertainty with the guidelines, challenges with issue summarization, coordination of resources
- Training – there's a significant learning curve; the WCAG standards are complex to learn and use
  - Universities: fundamental inclusive design curriculum
  - Universities: add accessibility to accreditation
  - Practical & relevant training for web developers
  - Professional associations and certifications
- Tools – for development, testing, and reporting
  - Different tools/approaches give different results
  - Recommendations & support for tools; sponsor W3C WAI
  - Work with vendors and the open-source community (e.g., Drupal CMS)
- Processes – templates
  - A library of resources, especially helpful to small organizations
  - Collaborate with international partners on reusable assets (e.g., G3ict.org)

I'm going to talk about challenges and opportunities arising out of the testing portion of the project. Dan will later talk about the same opportunities as they relate to IT firms. The report contains quite a bit of information about the challenges the project faced. It may seem like I was looking for sympathy, but in fact what I intended was to show how relevant our project experiences are to any organization facing accessibility for the first time. The project requirements were a little unusual, and they lifted three experienced accessibility subject matter experts out of our standard way of doing things. It forced us to consider testing procedures and these guidelines in a way we hadn't had to do in quite a while. We developed a testing template and checklist from scratch (appendix E). We compared a number of automated tools to see how each one broke (see …). We puzzled over whether we were talking about the same issue when more than one of us flagged the same component on a page.
We worked through how, and possibly why, the automated results failed to match our manual tests. We learnt from the experience, and we think the province as a whole, through the government, could benefit from it. Project challenges represent government opportunities. Our position is that the degree to which the province responds to the considerations we raise, and provides solutions to the issues we flagged, will determine how easily site owners can meet the accessibility guidelines. Here are some things that could be done [read slide: "standardize…"]:
1) Measuring tool – adopt or create a tool for testing compliance. This would be the same tool used by the government to audit compliance, so a company would have a clear understanding of whether it had achieved the minimum goals of the province.
   a) The tool must be adaptable (e.g., the government can modify the criteria it checks against – necessary for a stepped implementation)
   b) and must handle multiple streams (e.g., size of business determines the entry point).
The testing sequence: run the HTML validator first, then the automated tool, then manual review. We also want to talk about the concept of severity (David uses 4 levels) and how it ties into the need for focus. You can't just adopt the guidelines en masse; you have to decide which issues to tackle first – no different than offering priorities based on experience and observations.
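The layered testing sequence described in the notes (validator first, then an automated scan, then manual review) can be illustrated with a minimal automated check. The sketch below is our own illustration, not a tool from the project: it uses only Python's standard-library HTML parser to flag images missing alt text, one of the few WCAG 2.0 checks an automated tool can detect reliably. Names like `AltTextChecker` and `check_alt_text` are hypothetical.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> elements that lack an alt attribute (WCAG 2.0 SC 1.1.1)."""

    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs; a dict makes lookup easy
        if tag == "img" and "alt" not in dict(attrs):
            line, _col = self.getpos()
            self.issues.append(f"line {line}: <img> missing alt attribute")

def check_alt_text(html):
    """Return a list of accessibility issues found in an HTML fragment."""
    checker = AltTextChecker()
    checker.feed(html)
    return checker.issues

sample = "<p><img src='logo.png' alt='Company logo'><img src='chart.png'></p>"
print(check_alt_text(sample))  # flags only the second image
```

A real compliance tool (Policy Tester, aDesigner, WAVE, etc.) covers far more success criteria, but even this toy example shows why the automated layer is cheap to run repeatedly, while judging whether the alt text is actually *meaningful* still requires the manual-review step.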

32 Summary
- Relevant to organizations and web properties estimating the costs of accessibility compliance
- Considered all roles and activities in the IT project life cycle
- Reviewed company sizes, web site complexities, and anomalies
- Reviewed the accessibility testing methodology and results
- Explained the costing model framework, its costs, and their variability
- Skills availability and regulations
- Policy considerations
- Challenges and recommendations

33 Questions?
Phill Jenkins – Senior Engineer & Business Development, IBM Research, Human Ability & Accessibility Center
Dan Shire – IBM Interactive, Global Business Services, IBM Canada
IBM Accessibility · IBMAccess
