
Development of Accessible E-documents and Programs for the Visually Impaired


Presentation Transcript


  1. Development of Accessible E-documents and Programs for the Visually Impaired: Web accessibility testing (v2010)

  2. 1. Methods • Testing using automatic tools • Manual testing • Testing by a user from the target group

  3. 2. Automatic tools (A.T.) • There are different solutions from different vendors • Limited online versions • Fully functional commercial products • Approximately 50% of errors can be caught by these tools

  4. 3. A.T. How it works • Checking syntactic aspects • Are "alt" attributes defined? • Are all image maps client-side? • Are the row and column headers in tables defined? • Are the frames named? • ... • Heuristics • Can answer yes/no questions
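
  A minimal HTML sketch of markup that would pass these syntactic checks; all file names, coordinates and values are invented for illustration:

      <!-- "alt" attribute defined for the image -->
      <img src="logo.gif" alt="Company logo" />

      <!-- client-side image map (usemap), not a server-side one (ismap) -->
      <img src="nav.gif" alt="Site navigation" usemap="#navmap" />
      <map name="navmap">
        <area shape="rect" coords="0,0,80,30" href="home.html" alt="Home" />
      </map>

      <!-- row and column headers of the data table defined with th -->
      <table>
        <tr><th>Product</th><th>Price</th></tr>
        <tr><td>MP3 player</td><td>99 EUR</td></tr>
      </table>

      <!-- frame named and titled -->
      <frame src="menu.html" name="menu" title="Navigation menu" />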

  5. 4. A.T. demonstration • Most commonly used tool: Cynthia Says: http://www.contentquality.com/ • One page per minute per site can be tested

  6. 5. A.T.: limitations • False positives • Incomplete results • Not a single Priority 1 checkpoint could be fully checked by automated testing tools

  7. 6. A.T. failures (general) • Checkpoint 1.1 Provide a text equivalent for every non-text element • Checkpoint 2.1 Ensure that all information conveyed with color is also available without color, for example, from context or markup • Checkpoint 6.1 Organize documents so they may be read without style sheets. For example, when an HTML document is rendered without associated style sheets, it must still be possible to read the document. • Checkpoint 6.2 Ensure that equivalents for dynamic content are updated when the dynamic content changes.
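
  To make the first two checkpoints concrete, a small hypothetical example; the image, text and field names are invented:

      <!-- checkpoint 1.1: text equivalent for a non-text element -->
      <img src="sales-chart.gif" alt="Sales grew from 10 000 to 15 000 units in 2009" />

      <!-- checkpoint 2.1: required fields indicated in text, not only by a red colour -->
      <label for="name">Name (required):</label>
      <input type="text" id="name" name="name" />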

  8. 7. A.T. failures (images and image maps) • Checkpoint 1.2 Provide redundant text links for each active region of a server-side image map • Checkpoint 9.1 Provide client-side image maps instead of server-side image maps except where the regions cannot be defined with an available geometric shape.
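
  A sketch of how both checkpoints can be met with a client-side image map plus redundant text links; shapes, coordinates and targets are placeholders:

      <img src="sections.gif" alt="Site sections" usemap="#sections" />
      <map name="sections">
        <area shape="rect" coords="0,0,100,40" href="products.html" alt="Products" />
        <area shape="rect" coords="100,0,200,40" href="contact.html" alt="Contact" />
      </map>
      <!-- redundant text links for each active region (checkpoint 1.2) -->
      <p><a href="products.html">Products</a> | <a href="contact.html">Contact</a></p>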

  9. 8. A.T. failures (tables) • Checkpoint 5.1 For data tables, identify row and column headers. • Checkpoint 5.2 For data tables that have two or more logical levels of row or column headers, use markup to associate data cells and header cells.
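
  A possible illustration of both table checkpoints; the data are invented:

      <!-- checkpoint 5.1: row and column headers identified -->
      <table>
        <tr><th scope="col">Item</th><th scope="col">Price</th></tr>
        <tr><th scope="row">MP3 player</th><td>99 EUR</td></tr>
      </table>

      <!-- checkpoint 5.2: two levels of headers associated with data cells via headers/id -->
      <table>
        <tr><td></td><th id="q1">Q1</th><th id="q2">Q2</th></tr>
        <tr><th id="players">MP3 players</th>
            <td headers="players q1">120</td>
            <td headers="players q2">150</td></tr>
      </table>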

  10. 9. A.T. failures (miscellaneous) • Applets and scripts: Checkpoint 6.3 Ensure that pages are usable when scripts, applets, or other programmatic objects are turned off or not supported. If this is not possible, provide equivalent information on an alternative accessible page.
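
  One way to meet checkpoint 6.3, sketched below; the openSearch() function and file names are hypothetical:

      <!-- functionality added by script... -->
      <script type="text/javascript">
        document.write('<a href="search.html" onclick="openSearch(); return false;">Search</a>');
      </script>
      <!-- ...with an equivalent for users whose scripts are turned off or unsupported -->
      <noscript>
        <a href="search.html">Search</a>
      </noscript>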

  11. 10. A.T. failures (miscellaneous) • Multimedia: Checkpoint 1.3 Until user agents can automatically read aloud the text equivalent of a visual track, provide an auditory description of the important information of the visual track of a multimedia presentation. • Multimedia: Checkpoint 1.4 For any time-based multimedia presentation (e.g., a movie or animation), synchronize equivalent alternatives (e.g., captions or auditory descriptions of the visual track) with the presentation.
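
  The checkpoints above come from WCAG 1.0; as a present-day sketch, captions can be synchronized with HTML5 markup and an audio-described version linked separately (all file names are placeholders):

      <video controls>
        <source src="intro.mp4" type="video/mp4" />
        <!-- captions synchronized with the presentation (checkpoint 1.4) -->
        <track kind="captions" src="intro-captions.vtt" srclang="en" label="English captions" />
      </video>
      <!-- auditory description of the visual track (checkpoint 1.3) -->
      <p><a href="intro-described.mp4">Version with audio description</a></p>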

  12. 11. A.T. false positives • Automated tools frequently detect errors falsely. E.g. a link to www.qantas.com.au is often reported as an audio file without a text alternative (the address ends in ".au", which is also an audio file extension) • In some situations "123.jpg" can be a perfectly good value for the "alt" attribute, but tools that apply some kind of intelligence may report it as an error

  13. 12. A.T. tools • Cynthia Says: http://www.contentquality.com/ • WebThing markup validation: http://valet.webthing.com/page/ • WebThing Accessibility validation: http://valet.webthing.com/access/url.html • SSB Technologies Ask Alice: http://askalice.ssbtechnologies.com:8080/ssb/aa/anon/index.jsp • WebAIM WAVE: http://www.wave.webaim.org/index.jsp

  14. 13. Manual Testing • As seen on the previous slides, there are many situations where automatic testing fails • Manual testing is a necessary part of web accessibility testing • There are many offline tools (accessibility toolbars, ...) that may help • Mozilla Firefox Web Developer Toolbar: http://mozilla.sk/rozsirenia/web-developer/# • Internet Explorer Accessibility Toolbar: http://tinyurl.com/mhus7 • Colour Contrast Analyser: http://www.visionaustralia.org.au/info.aspx?page=628

  15. 14. Manual testing (2) • Use the toolbars from the previous slide to manually test the page • Find pages that are representative of groups of pages (mainly for sites using content management systems) • Disable CSS to test whether the document structure exists and is defined by the correct tags (headings, item lists, ...)
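
  What "correct tags" means in practice, as a small sketch: the structure below remains readable even with CSS disabled (page names are invented):

      <h1>Products</h1>
      <h2>MP3 players</h2>
      <ul>
        <li><a href="player-a.html">Player A</a></li>
        <li><a href="player-b.html">Player B</a></li>
      </ul>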

  16. 15. Manual testing (3) • Disable page colours to test that colours are not used incorrectly • Disable images and enable "display alt attributes" to test the relevance of your alt texts • Enable "display form details" and debug your forms
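
  A brief hypothetical example of what these checks look for: a relevant alt text and a form control with an explicit label:

      <img src="basket.gif" alt="Shopping basket: 3 items" />
      <form action="/search" method="get">
        <label for="q">Search for goods:</label>
        <input type="text" id="q" name="q" />
        <input type="submit" value="Search" />
      </form>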

  17. 16. Testing by users from the target group (TTG) • This is focused mainly on usability testing • Choose a group of testers with different disabilities • Think about the site that will be tested • Find the important things (which parts need to be tested?) • Prepare tasks for the target group • Ask them to do the prepared tasks and evaluate success rate, speed, level of comfort, ...

  18. 17. TTG: example • We have to test an internet shop site • Target group: visually impaired people • Registered users can search for goods by browsing the categories or via the search form • Payment is by credit card • Question: Ideas about the target group?

  19. 18. TTG: target group • Persons with different levels of vision (blindness, colour blindness, low vision, ...) • Persons with different levels of web browsing experience (programmers are able to read HTML, so results from them are not very representative) • Users of different assistive technologies (different screen readers, ...) • Question: What is important to test?

  20. 19. TTG: What to test • Registration process (new users must register) • Login process ... • Operations with goods (searching, reviewing, finding detailed information about a particular item, ...) • Basket operations (reviewing the contents, adding and removing goods, finding information about the final price, ...)

  21. 20. TTG: What to test • Payment process • Online help accessibility • Is the license and other important information accessible? • Question: Ideas about tasks for testers?

  22. 21. TTG: Tasks for testers • Create a new account • Find the price of a particular item • Is a particular item in stock? • Buy a portable MP3 player with a record button (the record function is activated by a button, not from a menu) • Copy and paste (to an external file) a topic from the online help (e.g. using the basket) • ... • Question: How to evaluate the test?

  23. 22. TTG: evaluation • Could all members of the group complete all the tasks? • Compare the time the testers needed with the time estimated by a specialist • Summarize the members' comments

  24. 23. Results of testing • The primary goal of testing is to inform the developer about problems on the site • The resulting document must be clear and relevant • It is very important to uniquely identify the tested document (URL and date) • The report must contain information about the testing environment • More info and examples here: http://www.informatizacia.sk/ext_dok-zaverecna-sprava-pristupnost-2008-1-polrok/5112c

  25. 24. Accessibility and modern technologies

  26. 25. JavaScript Some accessibility issues: • Navigation. Inability or difficulty navigating using a keyboard or assistive technology. • Hidden content. Presentation of content or functionality that is not accessible to assistive technologies. • User control. Lack of user control over automated content changes. • Confusion/Disorientation. Altering or disabling the normal functionality of the user agent (browser) or triggering events that the user may not be aware of.

  27. 26. JavaScript event handlers • Do not use device-dependent event handlers (onMouseOver, onMouseOut, onDblClick, onKeyDown, onKeyUp, ...) • Device-independent handlers are mostly OK (onFocus, onBlur, onChange, onSelect, onClick if not used on inactive text) • The actions of some device-independent event handlers must be analysed to determine whether they cause accessibility problems (onChange, onSelect) • It is possible to combine device-dependent event handlers to provide device-independent control of a site, but it is hard to test • More information: http://www.webaim.org/techniques/javascript/
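
  A minimal sketch of such a combination; showPreview(), hidePreview() and addToBasket() are hypothetical functions used only for illustration:

      <!-- mouse handlers paired with their keyboard equivalents -->
      <a href="products.html"
         onmouseover="showPreview()" onfocus="showPreview()"
         onmouseout="hidePreview()" onblur="hidePreview()">Products</a>

      <!-- onClick on a natively focusable element (a button), not on inactive text -->
      <input type="button" value="Add to basket" onclick="addToBasket()" />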

  28. 32. Flash • The majority of Flash content cannot be made natively accessible to screen readers • Flash content is time-based and often changes over time • Supported only marginally, and only by a few up-to-date screen readers • Hard for screen readers to track because of the dynamic content • MSAA (Microsoft Active Accessibility) is supported

  29. 33. Flash: accessibility • Make the Flash content natively accessible to screen readers • Make the Flash content self-voicing, eliminating the need for a screen reader • Provide an accessible alternative to the Flash content • More info: http://www.webaim.org/techniques/flash/
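
  A sketch of the third option, assuming the movie is embedded with <object>: the HTML inside the element is shown when Flash is unavailable, and the explicit link also helps users who have the plug-in but cannot use the movie (file names are placeholders):

      <object type="application/x-shockwave-flash" data="tour.swf" width="400" height="300">
        <p>The product tour is also available as an <a href="tour.html">accessible HTML page</a>.</p>
      </object>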

  30. 34. Image CAPTCHA alternatives (1) • The following techniques are mainly server-side • They may not work for enterprise-level applications whose forms spammers target specifically • These techniques primarily stop bots and automated spam submission programs

  31. 35. Image CAPTCHA alternatives (2) • Detect spam-like content within submitted form elements • Detect content within a hidden form element:

      <span style="display:none;visibility:hidden;">
        <label for="email">
          Ignore this text box. It is used to detect spammers. If you enter anything into this text box, your message will not be sent.
        </label>
        <input type="text" id="email" name="email" size="1" value="" />
      </span>

  32. 36. Image CAPTCHA alternatives (3) • Validate the submitted form values • Search for the same content in multiple form elements (e.g. are the first and last name equal?) • Generate dynamic content to ensure the form is submitted within a specific time window or by the same user
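
  The time-window idea can be sketched as a server-generated hidden field; the field name, the value format and the rejection rule are assumptions, and the actual check happens on the server:

      <!-- written into the page when it is served; on submission the server rejects the form
           if it comes back within a few seconds (typical of bots) or many hours later -->
      <input type="hidden" name="form_served_at" value="2010-05-20T14:32:00Z" />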

  33. 37. Image CAPTCHA alternatives (4) • Create a multi-stage form or a form verification page • Check the referrer • Detailed descriptions with examples are here: http://www.webaim.org/blog/spam_free_accessible_forms/
