1
Canvas Accessibility: Toward an Accessible LMS AHG 2016
Hadi Rangin (University of Washington) Robert Fentress (Virginia Tech) Jacob Bates (University of Central Florida) Dana Grey (Instructure)
2
Preview Background and why we conducted CATE Methodology and process
About the results: Strength & weaknesses ATHEN Collaboration/CATE from Instructure view Conclusion Contacts and FAQ
3
Background LMSs have become an essential part of education
LMSs are feature-rich and complex applications Vendors/open-source communities are more aware of and investing in accessibility Canvas was proactive in addressing accessibility We didn't have a holistic view of Canvas
4
Acknowledgements ATHEN Canvas Accessibility Collaboration Group
Large number of member institutions CATE (Canvas Accessibility Testing & Evaluation) UW, UCF, VT, U. Michigan, Sean Keegan from CCC Tech Center Big thanks to all collaborators and my students
5
CATE Life Cycle (1/2) Brought interested parties together
Used the LMS Comparison projects as a base Identified the methodology and process for the evaluation Identified infrastructure & resources Time, expertise, meeting tool, test platform, testing tools, file sharing tool, ATS, other logistics
6
CATE Life Cycle (2/2) Performed test and recorded data Normalized data
Evaluated the data Compiled report
7
Technical vs. functional accessibility
We focused on functional accessibility Many accessibility issues are in fact usability issues Technical accessibility is required but not sufficient Our goal: Improve the usability of the LMS for everyone, including those with disabilities
8
Post CATE Agile development methodology
Canvas has a very proactive accessibility team Canvas provided the platform for us for testing Canvas approached us about the CATE results during the project Some issues were fixed by the time the report was published More issues have been extracted and fixed Canvas is now engaging the community in the design process
9
Accessibility/Usability Testing & Evaluation
Methodology Divided critical and major functional tasks into 3 categories: Global Features Main user interface (UI) Selected third-party LTIs
10
Global Features Concluded from overall observations
Basic usability/accessibility features that can enhance user experience for all users Login and Configuration Compatibility Personalization/Customization Navigation Forms, Help, and Documentation Rich Content Editor (RCE)
11
Canvas Main User Interface (UI)
Core tools/features: Canvas landing page (User Dashboard) Courses Calendar Inbox Account: Profile Settings Notifications Files ePortfolio
12
Common Modules/Tools Course landing page (Course Dashboard)
Announcements Assignments Discussions Grades People Pages Files Syllabus
13
Common Modules/Tools (2)
Outcomes Rubrics Quizzes Modules Settings
14
Learning Tools Interoperability (LTI)
Instructure does not control these tools, but they can affect accessibility at institutions. Collaborations Conferences Chat Panopto Recordings
15
Learning Tools Interoperability (LTI) (2)
Do we want to talk about the diffuse responsibilities here and things Instructure could reasonably be expected to do? For instance, Instructure doesn't write the LTIs; by nature, they adhere to a standard that allows interoperability of third-party tools. That being said, Instructure does perform some vetting of LTI tools, in the sense that it has various levels of "partnerships" with LTI tool vendors whose tools it has reviewed. Should accessibility reviews be included in this vetting? At what level? Who will perform them, and to what standard? Should this be an industry-wide initiative, rather than the responsibility of one vendor? These issues remain outstanding.
16
Functional Accessibility
Not focused on technical violations of WCAG or Section 508 Can individuals accomplish the same tasks using assistive technologies? So with the broad categories established, what exactly were we testing, and for what purpose? Our approach was not to look for technical violations of WCAG criteria or Section 508 standards, though these are a useful guide, but rather to see whether users of certain common assistive technologies would be able to accomplish the same tasks as individuals not using those technologies. Beyond providing guidance to the vendor, such information can assist institutions in knowing where their users might encounter problems and in coming up with ways of working around them until the issues are fixed.
17
Features and Criteria Based on
Use cases derived from Canvas documentation General functionality and specific tasks used in previous LMS evaluations Prior to beginning our collaboration, I had been working on a comprehensive functional evaluation of the product, and as part of this I had developed a list of use cases, largely based on the Canvas documentation. These included things such as "Create an announcement" or "Delete a Course Section," and each task also indicated the role of the user attempting to complete it, such as teacher or student, since the interaction can sometimes be handled differently depending on who is performing the action. In Hadi's previous evaluations of learning management systems, such as Blackboard, he had developed his own list of general functionality and specific tasks, so we combined our lists to come up with the list of functionality we would be testing. From my previous work, I had developed a course in Canvas populated with the elements required to complete each task. This same course was duplicated and used by each person reviewing the application, so that we could make our results as consistent as possible.
18
Testing & Evaluation Process
Each task/function was rated from 1–5 (5 = fully accessible, 1 = completely inaccessible) Each task/function was given a weight from 1–5 (1 = least impact, 5 = high impact) Testers were assigned an assistive technology, browser, and operating system, based on availability and expertise Held weekly meetings to address any issues encountered in testing & rating Normalized the data and retested to verify We evaluated each task on a five-point scale, where 5 indicated no accessibility problems were encountered and 1 indicated a showstopper, a problem that completely prevented the user from completing the task, and we assigned a weight reflecting how important we viewed that functionality. We did this for each task for each assistive technology tested, so a task might score a 5 for JAWS but a 1 for ZoomText.
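To make the rating scheme concrete, here is a minimal sketch of how per-task ratings could be rolled up into a summary score for one assistive technology. The field names and the simple weighted-average formula are our own illustration, not the exact arithmetic used in the CATE report.

```typescript
// Illustrative sketch only: the weighted-average roll-up shown here is an
// assumption; the CATE report may aggregate ratings differently.
interface TaskRating {
  task: string;   // e.g. "Create an announcement"
  weight: number; // 1 = least impact ... 5 = high impact
  rating: number; // 1 = completely inaccessible ... 5 = fully accessible
}

// Weighted average: high-impact tasks pull the summary score more strongly.
function weightedScore(ratings: TaskRating[]): number {
  const totalWeight = ratings.reduce((sum, r) => sum + r.weight, 0);
  const weightedSum = ratings.reduce((sum, r) => sum + r.weight * r.rating, 0);
  return totalWeight === 0 ? 0 : weightedSum / totalWeight;
}

// Example: the same tasks can score differently per assistive technology.
const jaws: TaskRating[] = [
  { task: "Create an announcement", weight: 5, rating: 5 },
  { task: "Reorder module items",   weight: 3, rating: 2 },
];
console.log(weightedScore(jaws).toFixed(2)); // "3.88"
```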
19
AT Tested
Assisted Reading: Read&Write with Chrome. High Contrast. Screen Magnification: ZoomText with IE on Windows. Screen Reader: JAWS/IE and NVDA/Firefox on Windows; VoiceOver/Safari on Mac. Speech Recognition and Control: Dragon 12.5 initially, then Windows Speech Recognition. Keyboard-only: Chrome, Firefox, and IE on Windows; Safari on Mac. The general classes of assistive technologies tested included "assisted reading", "high contrast", "screen magnification", "screen reader", and "speech recognition and control." We also tested whether tasks could be accomplished using only the keyboard. Where we couldn't accomplish a task, we tried to give some indication of where the problem occurred and, if obvious, where to look for a solution.
20
Compiling The Report Description and rationale Testing criteria
Result summary and feedback Show Excel spreadsheet
21
Procedures Screen Readers View Changes apparent?
Same options available? Focus managed appropriately? When dialogs opened and closed When content moved, added, or deleted When actions confirmed or cancelled Hidden things still accessible unintentionally? Instructions up-to-date and accurate? Alerts, tooltips, popovers, etc. voiced? Controls labelled? Appropriate roles used? In planning how we would test things with each assistive technology, we tried to come up with some standard procedures or things to look out for. For instance, for screen readers, we checked things like: Were users informed of major or unexpected changes to the page view? Could we access all options available to sighted users? Was focus managed appropriately, for instance, when dialogs opened or closed? When content was moved, added, or removed? Both when confirming and cancelling an action? Were things that were visually hidden still unintentionally readable by the screen reader? Were instructions presented to screen reader users with visually hidden text up-to-date and accurate? Were alerts, tooltips, popovers, and the like voiced? Were fields labelled, and did controls generally have accessible names? Was the proper role used for elements? For instance, was something that functioned as a button actually marked up as a link?
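As a concrete illustration of the focus-management checks above, the sketch below (our own example, not Canvas code) shows one conventional pattern: after an item is deleted, focus is moved somewhere sensible and the result is announced through a live region. The element ids are hypothetical.

```typescript
// Illustrative sketch, not Canvas source: after deleting an item, move focus
// somewhere sensible and announce the change so screen reader users are not
// left on a control that no longer exists.
function deleteItem(item: HTMLElement, liveRegion: HTMLElement): void {
  // Prefer the next sibling item; fall back to a list heading (hypothetical id).
  const next =
    (item.nextElementSibling as HTMLElement | null) ??
    document.querySelector<HTMLElement>("#announcements-heading");

  item.remove();

  if (next) {
    next.setAttribute("tabindex", "-1"); // make it programmatically focusable
    next.focus();
  }

  // An aria-live="polite" region lets the deletion be voiced without stealing focus.
  liveRegion.textContent = "Announcement deleted.";
}
```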
22
Procedures (continued)
Keyboard: focus indicators. Magnification: focus (such as dialogs). Speech recognition: activation by labels, by type, by showing numbers, by simulated keypress, or by mouse grid. Scrollable overflow containers. Invisible controls. Missing labels (RCE). Ambiguous icons. Visible label doesn't match actual label. Fixed-position links. Contrast: link text distinguishable from surrounding text.
23
Demo: Reordering Module items
With Mouse With Keyboard As regards modules, I thought I would perhaps show some of the complexities involved in accomplishing certain tasks. For instance, I could show how someone using a mouse might reorder items in the Modules tool, and then show how Canvas provides a different way of doing this for a keyboarder.
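For readers who cannot see the demo, here is a minimal sketch of the kind of keyboard alternative being contrasted with drag-and-drop: "move up/move down" actions that swap an item with its neighbour while keeping focus on it. The DOM hooks are hypothetical and not taken from Canvas.

```typescript
// Hypothetical sketch of a keyboard alternative to drag-and-drop reordering.
// direction: -1 moves the item up, +1 moves it down. The item is assumed to be
// focusable (e.g. it carries tabindex="-1" or contains the focused control).
function moveModuleItem(item: HTMLElement, direction: -1 | 1): void {
  const list = item.parentElement;
  if (!list) return;

  const sibling =
    direction === -1 ? item.previousElementSibling : item.nextElementSibling;
  if (!sibling) return; // already at the top or bottom

  // insertBefore covers both directions: moving up inserts before the previous
  // sibling; moving down inserts before the node that follows the next sibling.
  list.insertBefore(item, direction === -1 ? sibling : sibling.nextSibling);

  // Keep focus on the moved item so the keyboard user does not lose context.
  item.focus();
}
```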
24
Demo: Deleting Announcements
Complexities: Where does focus go? JAWS gets stuck in Forms mode. AT problem or improper technique/Canvas bug? Not addressed by our review, but relevant for institutions. Then I could go on to show how challenging it is to get things right, even if you are being thoughtful, if you do not actually perform integration tests with assistive technology, by going to the Announcements tool and demonstrating what happens when attempting to delete an announcement. I could then discuss the complexities of such an interaction in terms of where focus goes, that there is not always clear guidance in the standards about such things, and that what is useful for one class of users may be confusing to others. Show how deleting an announcement causes JAWS to get stuck in Forms mode. Given this does not appear to be a problem with other screen readers, it raises the question, "Is this a problem with the assistive technology, or is there some technique being used improperly that the other screen readers are working around?" It was beyond the scope of our review to make these determinations, but it does point to the challenges facing developers of modern rich internet applications. In either case, if necessary tasks cannot be performed with common assistive technologies used by a large percentage of a certain population of users, it is a problem that institutions should be aware of and that vendors should try to resolve, even if the fix is a hack designed to work around a bug in the AT.
25
Dialogs Used throughout Chose not to use dialog role
Points out challenges in accommodating user expectations, assistive technology support, and user familiarity with AT features Skip?
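For context on the trade-offs mentioned above, here is a minimal sketch of the conventional role="dialog" pattern: the trigger's focus is saved, focus moves into the dialog, and focus is restored when it closes. This is a generic example, not how Canvas implements its dialogs, and the labelling id is hypothetical.

```typescript
// Generic sketch of the conventional dialog pattern, not Canvas's implementation.
let previouslyFocused: HTMLElement | null = null;

function openDialog(dialog: HTMLElement): void {
  previouslyFocused = document.activeElement as HTMLElement | null;

  dialog.setAttribute("role", "dialog");
  dialog.setAttribute("aria-modal", "true");
  dialog.setAttribute("aria-labelledby", "dialog-title"); // hypothetical id
  dialog.hidden = false;

  // Move focus to the first focusable control inside the dialog.
  dialog
    .querySelector<HTMLElement>("button, [href], input, select, textarea")
    ?.focus();
}

function closeDialog(dialog: HTMLElement): void {
  dialog.hidden = true;
  // Restoring focus to the trigger is the behaviour screen reader users expect.
  previouslyFocused?.focus();
}
```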
26
Hidden controls Replying to messages in the Inbox
Controls missed by speech control users Tradeoffs or opportunities for customization? Editing and deleting answers in MC questions Skip MC questions
27
Personalization and Customization
Users have different needs and ways of viewing and interacting with applications Users can adapt the system to their needs instead of adapting to the application Personalization improves the user’s experience Moved this, since it is a natural segue from talking about the Inbox. Other possible opportunities for personalization might include the ability to choose the hotkeys, customize positioning of sections, and set session timeout.
28
Time Check
29
Layout Customization Personalization and Customization Criteria: What is the default layout? Frame- or non-frame-based? Is tool selection and arrangement keyboard accessible? Can users' settings be saved globally? Can users define desired alert types? What type of alert: JavaScript, ARIA, other? Can users globally set the desired contrast? Can users globally set the desired background/foreground colors, font size, and font type? Can users globally set the session timeout to a desired length?
30
Layout Customization (2)
Are users alerted before the session expires, and can it be extended if desired? Can users set the default page globally? Can users set a desired editor globally? Can the editor be changed on the fly? Rationale for Navigation Navigation is the most important element for accessibility. Often there is no visual cue or description of the layout. Users need to obtain the necessary information to make navigation decisions. Users must be able to navigate effectively and with certainty.
31
Time Check
32
Navigation Criteria Page title and breadcrumbs Page title
Does the page title convey current information about where the user is located? Are there breadcrumbs and are they easily located/navigated to? Can breadcrumbs be used for navigation (are they clickable)? Navigation bars and menus How are navigation bars and menus constructed? Lists? Table? ARIA tree? Are they collapsible/expandable, etc.? Is the construction consistent?
33
Global Navigation Criteria
Navigation Technique What technique is used for keyboard navigation? Headings? ARIA landmarks? Other? Sufficient Navigation Are there sufficient navigation mechanisms provided to allow users to move to the various sections of the page? Linearization Does the page linearize properly and logically? Tabbing order Is the tabbing order logical? Main content How do keyboard users navigate to the main content? Heading 1? ARIA main landmark? Skip navigation? Other?
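Several of the checks listed above lend themselves to a quick automated pre-check before manual AT testing. The helper below is our own sketch, not part of the CATE tooling, and its skip-link heuristic is deliberately crude.

```typescript
// Hypothetical helper, not part of the CATE tooling: a quick pre-check for the
// navigation mechanisms listed above. Manual AT testing is still required.
interface NavigationCheck {
  hasMainLandmark: boolean;
  hasLevelOneHeading: boolean;
  hasSkipLink: boolean;
}

function checkNavigation(doc: Document): NavigationCheck {
  const firstInPageLink = doc.querySelector<HTMLAnchorElement>("a[href^='#']");
  return {
    hasMainLandmark: doc.querySelector("main, [role='main']") !== null,
    hasLevelOneHeading: doc.querySelector("h1") !== null,
    // Crude heuristic: the first in-page link is usually the skip link.
    hasSkipLink:
      firstInPageLink !== null && /skip/i.test(firstInPageLink.textContent ?? ""),
  };
}

console.log(checkNavigation(document));
```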
34
Discussion Criteria Navigation features Information Hierarchy
How is the main content area labeled? Are there shortcut keys for moving between focusable areas? Information Hierarchy How are the different elements laid out? Are there different views such as threaded, un-threaded? Read/Unread Summary Does it show the number of new/unread messages? Message Read Status How are messages marked read/unread? How are new messages marked as read?
35
Discussion Criteria (2)
Posting New Messages/Replying Are the form controls accessible/labeled? Editor style? How does a user attach files? What verification/error notification technique is used? Support for Forwarding and RSS Is RSS supported? Is the feature to subscribe/unsubscribe accessible? Is forwarding to external systems (notification) supported? Is direct reply supported? Additional Features and Concerns Are there other features that may enhance accessibility? Are there issues that may negatively affect accessibility? Are form fields properly labeled, etc.?
36
Time Check
37
Summary & Feedback: Screen Readers
No top-level accessibility Help topic with AT-specific instructions, shortcut keys for the RCE, etc. (as Google has done) The "gear icon" button menu kills Virtual PC Cursor mode in JAWS Dismissed dialogs persist at the end of the DOM and are announced by the Virtual PC Cursor in JAWS No clear differentiation between links (go somewhere) and buttons (do something) Proper ARIA roles not used: buttons, menus, dialogs Redundant alert messages in JAWS
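To illustrate the link/button distinction raised above: a control that performs an action should expose the button role and respond to both Enter and Space; otherwise a screen reader user hears a "link" that does not navigate anywhere. This is a generic example, not Canvas code; the cleanest fix is simply a native button element.

```typescript
// Generic example of the link-vs-button distinction, not Canvas code.
// Best fix: use a native <button>. If a non-button element must act as one,
// it needs the button role, a tab stop, and Enter/Space activation.
function makeActAsButton(el: HTMLElement, action: () => void): void {
  el.setAttribute("role", "button"); // screen readers now announce "button"
  el.tabIndex = 0;                   // keep it keyboard focusable

  el.addEventListener("click", (event) => {
    event.preventDefault();
    action();
  });
  el.addEventListener("keydown", (event) => {
    if (event.key === "Enter" || event.key === " ") {
      event.preventDefault();        // prevent the page from scrolling on Space
      action();
    }
  });
}
```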
38
Summary & Feedback: Screen Reader (2)
No integrated spell checker; NVDA with Firefox sufficed (needed?) E-portfolio has major blockers (e.g., drag-and-drop, submissions) File upload/download difficult to learn, cumbersome to use Inbox proved difficult Auto-filtering of announcements interfered with JAWS speech NVDA performed best, JAWS second, and then Mac VoiceOver Take-home message: the screen reader experience is a bumpy ride Sighted assistance has been required at U. Michigan
39
Summary & Feedback: Windows Speech Recognition (WSR) & Keyboard-only
Good news: Most important tasks can be completed (though sometimes with difficulty) using only the keyboard, or using WSR with a combination of techniques. Disclosure widgets are used to limit tab stops and simplify the interface.
40
Summary & Feedback: Windows Speech Recognition & Keyboard-only (2)
Originally planned to use Dragon 12.5 with IE, but switched to WSR Problems: Ambiguous icons and invisible accessible names that were often not unique made activation of controls more difficult Some controls could not be accessed by keyboard Some controls only became visible on hover, and some only on hover or focus, presenting operability and discoverability problems
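The ambiguous-icon problem above matters most for speech recognition users, who activate controls by saying their visible label. Below is a minimal sketch (with hypothetical selectors and names) of giving icon-only controls unique accessible names that begin with the text a user would naturally say.

```typescript
// Hypothetical sketch: give icon-only controls accessible names that are
// unique and begin with what a speech recognition user would naturally say,
// so a command such as "click Delete" can be resolved unambiguously.
function labelIconButton(button: HTMLElement, action: string, itemTitle: string): void {
  // e.g. "Delete: Midterm schedule change"
  button.setAttribute("aria-label", `${action}: ${itemTitle}`);
}

// The ".delete-icon" class and surrounding markup are illustrative only.
document.querySelectorAll<HTMLElement>("button.delete-icon").forEach((btn) => {
  const title = btn.closest("li")?.querySelector("h3")?.textContent ?? "item";
  labelIconButton(btn, "Delete", title);
});
```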
41
Summary & Feedback: Windows Speech Recognition & Keyboard-only (3)
Not all drag behaviors have a keyboard alternative, and when one is supplied, the cues can be confusing Focus order sometimes did not match visible order Some controls could only be activated by simulated keypresses or mouse grid Keyboard shortcuts, when available, were not easily discoverable or customizable Focus management issues Focus indicator occasionally weak to non-existent Some elements that appear to be static content are only revealed to be editable on hover or focus Pagination of results would likely have assisted keyboarders
42
Summary & Feedback: ZoomText
ZoomText accessibility was good overall.
43
Summary & Feedback: Read&Write
Dashboard, in tile mode: The buttons for settings, announcements, assignments, etc., were not highlighted when announced. Dashboard, in recent activity mode: The Show More link was not highlighted when being explained. Some icons were not highlighted when announced. The course menu was not announced when the menu was opened. The reading order was corrupted when the menu bar was announced. Read&Write interferes with the CSS formatting/layout.
44
Chrome High-Contrast Extension
Overall feedback Chrome does not automatically adopt Windows High Contrast mode, unlike other browsers. High Contrast mode changes the CSS formatting. The tab order completely skips over the tool icons in the Rich Content Editor. Specific areas Navigation, comments area: The comment icon for an assignment grade is not discernible when hovering over the grade. Discussions: When replying, the content on the screen gets pushed out of view. ePortfolio, adding/removing content: The delete icon is only visible when hovering over the Rich Text Content area.
46
Canvas & Accessibility
Project Signoff Requires review w/ Product, UI, a11y team, & Lead ENG Complete review of mockups and workflows Final, comprehensive audit prior to release Engineering Onboarding: Accessibility Course A11y team increased in size Code Reviews & QA Defined a11y acceptance criteria
47
Canvas Support System Overall Support Stats Escalation Process
93% customer ticket satisfaction over the last 12 months; 94% first-response SLA compliance over the last 12 months Escalation process: reported to L1 support; testing & verification; passed up to the a11y team within 2 days (on average); a11y team triages; Product prioritizes issues into engineering sprints; three-week release cycle Issue sources (2016): Instructure 223, Third-party Audit 79, Canvas User Group 47, Other Institutions 60; Total 409
48
Canvas & User Feedback ATHEN Collaboration Group
2017 will be the 4th year of collaboration Canvas Community – Accessibility User Group
49
Canvas & Third-party Audits
October 17, 2016: WebAIM.org, a third-party authority in web accessibility, has evaluated the Canvas Learning Management System (LMS) by Instructure and certifies it to be substantially conformant with Level A and Level AA of the Web Content Accessibility Guidelines version 2.0. A representative sample of system views was evaluated for accessibility. This sample included calendars, quizzes, and communication tools. WebAIM cannot verify the conformance of content that is changed after October 17, 2016. However, based on our interactions with Instructure, WebAIM is confident in their ability and willingness to maintain a substantially conformant LMS.
50
Conclusion Accessibility is a process and not a switch
We need to get involved in vendor collaboration Vendors need to engage the accessibility community in their design, development, and QA processes
51
Resources CATE homepage: Removed Instructure's response, since it is in process and requires a login.
52
Contacts & Q&A