
Comparison Study of PDF Accessibility Checking Tools


Presentation Transcript


  1. Comparison Study of PDF Accessibility Checking Tools Christy Blew, M.S., C.C.T. Information Technology Accessibility Specialist Disability Resources and Educational Services University of Illinois at Urbana-Champaign Jon Gunderson, Ph.D. Coordinator, Information Technology Accessibility Disability Resources and Educational Services University of Illinois at Urbana-Champaign

  2. In the fall of 2010, an accessibility audit of the PDF files in the admissions, financial aid, provost, and chancellor's offices was performed. The results showed that almost all of the files were not accessible. The individuals in charge of the PDF files had no knowledge of the techniques for making a PDF file accessible, and almost all asked whether there were tools available to help. After researching the available tools, it was decided to examine how well these tools perform and whether the tools alone give accurate reports of accessibility. Overview

  3. How effective and reliable is each tool in analyzing the accessibility of a PDF file? • Are the results of the tools' evaluations consistent with one another? • What are the strengths and weaknesses of each tool? • Does the use of these tools save the developer time over manually checking for PDF accessibility? Project Questions

  4. No knowledge of the accessibility of the PDF document • User has little or no knowledge of PDF accessibility • User has access to one or more of the testing tools • Recorded results represent the results a user would get Testing Scenario

  5. Adobe Acrobat Reader 9 and 10 (X) • Adobe Acrobat Pro 9 and 10 (X) • eGovMon web-based tool • PDF Accessibility Checker (PAC) • CommonLook by NetCentric Testing Tools

  6. Tagging Structure • Images • Heading Structure • Title • Lists • Tables • Forms Testing Categories

  7. 2003 PDF - Files created in Word 2003 and converted to PDF using the PDF creation add-in installed with Acrobat Pro 9 • 2010 PDF - Files created in Word 2010 and converted to PDF using the built-in PDF creation tool • Re-tagged PDF - PDF files re-tagged in Acrobat Pro X to match the originally intended structure of the document • No-tag PDF - PDF files created from Word 2010 documents using the “print to PDF” option Test File Categories

  8. A spreadsheet of results was created, with each check recorded as one of: • Pass • Fail • Pass with exceptions (warn) • Product does not test for the item Reporting Structure
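The four reporting outcomes map naturally onto a small data structure. The following is a minimal sketch, not part of the study, of how those spreadsheet entries could be modeled in Python; every name, including the example row, is illustrative only.

```python
from dataclasses import dataclass
from enum import Enum


class Outcome(Enum):
    """The four outcomes recorded in the results spreadsheet."""
    PASS = "pass"
    FAIL = "fail"
    WARN = "pass with exceptions"
    NOT_TESTED = "product does not test for"


@dataclass
class CheckResult:
    """One cell of the results spreadsheet: a tool's verdict on one check."""
    tool: str       # e.g. "Acrobat Pro X", "PAC", "eGovMon", "CommonLook"
    test_file: str  # e.g. "2003 PDF", "2010 PDF", "Re-tagged PDF", "No-tag PDF"
    category: str   # e.g. "Tagging", "Images", "Headings", "Tables", "Forms"
    check: str      # e.g. "Alt text found", "Heading Hierarchy (Correct)"
    outcome: Outcome


# Hypothetical example row; the real data lives in the study's spreadsheet.
example = CheckResult("PAC", "2010 PDF", "Images", "Alt text found", Outcome.WARN)
```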

  9. Contains Tags: Tagging structure is found in the document • Find No Tags: No tagging structure is found in the document • PDF as Image: Document contains a scanned image or no text is present. Tagging Structure Tests
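A rough automated version of these three checks is sketched below. It assumes the open-source pypdf library (not one of the tools under test) and a hypothetical file name; it inspects the catalog's /MarkInfo and /StructTreeRoot entries and uses extracted text as a crude stand-in for the scanned-image case.

```python
from pypdf import PdfReader


def tagging_status(path: str) -> str:
    """Rough classification into the study's three tagging outcomes."""
    reader = PdfReader(path)
    catalog = reader.trailer["/Root"]

    # A tagged PDF sets /MarkInfo /Marked to true and carries a structure tree.
    mark_info = catalog.get("/MarkInfo")
    marked = False
    if mark_info is not None:
        marked = mark_info.get_object().get("/Marked", False) == True  # PDF boolean
    has_struct_tree = "/StructTreeRoot" in catalog

    # If no page yields any text, treat the file as a scanned / image-only PDF.
    has_text = any(page.extract_text().strip() for page in reader.pages)

    if not has_text:
        return "PDF as Image"
    if marked and has_struct_tree:
        return "Contains Tags"
    return "Find No Tags"


print(tagging_status("example.pdf"))  # hypothetical file name
```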

  10. Tagging Result Summary

  11. Number of non-text elements (images) in document: Report the number of non-text elements found in the document. • Alt text found: Report whether the document contains non-text elements that do not have alt text. Non-Text Element (Images) Tests
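To illustrate what this test involves (not how any of the evaluated tools implements it), the sketch below, again assuming pypdf and a hypothetical file name, walks the structure tree, counts /Figure elements, and flags those without an /Alt entry.

```python
from pypdf import PdfReader


def iter_struct_elements(node):
    """Depth-first walk over structure elements (dictionaries carrying /S)."""
    node = node.get_object()
    if isinstance(node, dict):            # DictionaryObject subclasses dict
        if "/S" in node:
            yield node
        kids = node.get("/K")
        if kids is not None:
            yield from iter_struct_elements(kids)
    elif isinstance(node, list):          # ArrayObject subclasses list
        for kid in node:
            yield from iter_struct_elements(kid)


def figure_report(path: str):
    """Count tagged images and how many of them lack alternative text."""
    catalog = PdfReader(path).trailer["/Root"]
    if "/StructTreeRoot" not in catalog:
        return 0, 0                       # untagged document: nothing to count
    figures = [
        elem
        for elem in iter_struct_elements(catalog["/StructTreeRoot"])
        if str(elem["/S"]) == "/Figure"   # ignores /RoleMap remapping for brevity
    ]
    missing_alt = sum(1 for fig in figures if not str(fig.get("/Alt", "")).strip())
    return len(figures), missing_alt


total, missing = figure_report("example.pdf")  # hypothetical file name
print(f"{total} non-text elements, {missing} without alt text")
```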

  12. Images Result Summary

  13. Number of Headings in document (Correct): Report the number of headings in the document • Heading Hierarchy (Correct): Report the hierarchy of headings matches the logical reading order of the document. • Number of Headings in document (Incorrect): Report the number of headings in the document • Heading Hierarchy (Incorrect): Report the hierarchy of headings does not match the logical reading order of the document. • Fool test with blank heading: Report the presence of an H1 element • Heading as text: Report no issues Heading Structure Tests
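A hierarchy check of this kind can be approximated by collecting heading tags in document order and verifying that levels never skip. The sketch below assumes pypdf and a hypothetical file name; the "fool" cases above show why such a purely structural check can still be misled.

```python
import re

from pypdf import PdfReader


def iter_struct_elements(node):
    """Depth-first, document-order walk over structure elements (/S dictionaries)."""
    node = node.get_object()
    if isinstance(node, dict):
        if "/S" in node:
            yield node
        if node.get("/K") is not None:
            yield from iter_struct_elements(node["/K"])
    elif isinstance(node, list):
        for kid in node:
            yield from iter_struct_elements(kid)


def heading_report(path: str):
    """Return heading levels in reading order and whether the hierarchy skips levels."""
    catalog = PdfReader(path).trailer["/Root"]
    levels = []
    if "/StructTreeRoot" in catalog:
        for elem in iter_struct_elements(catalog["/StructTreeRoot"]):
            match = re.fullmatch(r"/H([1-6])", str(elem["/S"]))
            if match:
                levels.append(int(match.group(1)))
    # A hierarchy is suspect if it starts below H1 or jumps more than one level down.
    ok = bool(levels) and levels[0] == 1 and all(b - a <= 1 for a, b in zip(levels, levels[1:]))
    return levels, ok


levels, ok = heading_report("example.pdf")  # hypothetical file name
print(f"{len(levels)} headings; hierarchy {'matches' if ok else 'does not match'} expectations")
```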

  14. Headings Result Summary

  15. Shows as tag: Text marked as “title” in the native application is tagged as “title” in the PDF • Shows in properties: Text entered in the title area of the native document’s property feature will show in the properties feature of the PDF file. Title Element Tests
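The "shows in properties" half of this test can be automated by reading the /Title entry of the document information dictionary. The sketch below assumes pypdf and a hypothetical file name; the /DisplayDocTitle viewer-preference check is an extra illustration, not one of the study's tests.

```python
from pypdf import PdfReader


def title_report(path: str):
    """Report the document title from the info dictionary and the viewer preference."""
    reader = PdfReader(path)

    # "Shows in properties": the /Title entry of the document information dictionary.
    info = reader.metadata
    title = (info.title or "").strip() if info is not None else ""

    # Extra check (not one of the study's tests): with /DisplayDocTitle true,
    # viewers and assistive technology present the title instead of the file name.
    prefs = reader.trailer["/Root"].get("/ViewerPreferences")
    display_doc_title = False
    if prefs is not None:
        display_doc_title = prefs.get_object().get("/DisplayDocTitle", False) == True  # PDF boolean

    return title, display_doc_title


title, shown = title_report("example.pdf")  # hypothetical file name
print(f"Title: {title!r}, DisplayDocTitle set: {shown}")
```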

  16. Title Result Summary

  17. Number of Lists: Report the number of list elements in the document. • Tagged as list: Report list items are tagged as list items • Lists: UL vs. OL: Report the correct usage of UL and OL lists. • Children: Report correct tagging structure for children elements in a list of items. List Element Tests
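In tagged PDF, a list is an /L element whose children are /LI items, each normally containing an optional /Lbl label and an /LBody body; numbering style is carried by a ListNumbering attribute rather than by separate UL/OL element types as in HTML. The sketch below, assuming pypdf and a hypothetical file name, counts /L elements and flags list items whose children do not follow that structure; it is illustrative only.

```python
from pypdf import PdfReader


def walk(node):
    """Depth-first walk over structure elements (dictionaries carrying /S)."""
    node = node.get_object()
    if isinstance(node, dict):
        if "/S" in node:
            yield node
        if node.get("/K") is not None:
            yield from walk(node["/K"])
    elif isinstance(node, list):
        for kid in node:
            yield from walk(kid)


def direct_kids(elem):
    """Return the direct child structure elements of a structure element."""
    kids = elem.get("/K")
    if kids is None:
        return []
    kids = kids.get_object()
    if not isinstance(kids, list):
        kids = [kids]
    return [k.get_object() for k in kids if isinstance(k.get_object(), dict)]


def list_report(path: str):
    """Count /L elements and list items whose children are not the expected /LI structure."""
    catalog = PdfReader(path).trailer["/Root"]
    if "/StructTreeRoot" not in catalog:
        return 0, 0
    lists = [e for e in walk(catalog["/StructTreeRoot"]) if str(e["/S"]) == "/L"]
    bad_items = 0
    for lst in lists:
        for item in direct_kids(lst):
            if str(item.get("/S", "")) != "/LI":
                bad_items += 1          # child of /L that is not a list item
                continue
            roles = {str(k.get("/S", "")) for k in direct_kids(item)}
            if "/LBody" not in roles:   # /Lbl is optional, /LBody is expected
                bad_items += 1
    return len(lists), bad_items


n_lists, n_bad = list_report("example.pdf")  # hypothetical file name
print(f"{n_lists} lists, {n_bad} items with unexpected children")
```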

  18. Lists Result Summary

  19. Number of Tables: Report the number of table elements in the document. • TH usage: Report the absence of TH elements in the table. • Col-Row Spans: Report the absence of rowspan and colspan elements in complex tables. Table Element Tests
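Header-cell usage and span markup live in the structure tree as /TH elements and as /RowSpan and /ColSpan entries in a cell's attribute dictionary. The sketch below, assuming pypdf and a hypothetical file name, reports for each /Table whether any /TH is present and whether any cell declares a span; it is a simplified illustration, not the study's method.

```python
from pypdf import PdfReader


def walk(node):
    """Depth-first walk over structure elements (dictionaries carrying /S)."""
    node = node.get_object()
    if isinstance(node, dict):
        if "/S" in node:
            yield node
        if node.get("/K") is not None:
            yield from walk(node["/K"])
    elif isinstance(node, list):
        for kid in node:
            yield from walk(kid)


def attribute_dicts(elem):
    """Yield the attribute dictionaries attached to a structure element via /A."""
    attrs = elem.get("/A")
    if attrs is None:
        return
    attrs = attrs.get_object()
    for a in attrs if isinstance(attrs, list) else [attrs]:
        a = a.get_object()
        if isinstance(a, dict):
            yield a


def table_report(path: str):
    """For each /Table: does it contain /TH cells, and does any cell use spans?"""
    catalog = PdfReader(path).trailer["/Root"]
    report = []
    if "/StructTreeRoot" not in catalog:
        return report
    for table in (e for e in walk(catalog["/StructTreeRoot"]) if str(e["/S"]) == "/Table"):
        cells = [e for e in walk(table) if str(e["/S"]) in ("/TH", "/TD")]
        has_th = any(str(c["/S"]) == "/TH" for c in cells)
        has_spans = any(
            "/RowSpan" in a or "/ColSpan" in a
            for c in cells
            for a in attribute_dicts(c)
        )
        report.append({"has_th": has_th, "has_spans": has_spans})
    return report


for i, row in enumerate(table_report("example.pdf"), 1):  # hypothetical file name
    print(f"Table {i}: TH present={row['has_th']}, row/col spans={row['has_spans']}")
```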

  20. Tables Result Summary

  21. Number of Elements: Report the number of form elements found in the document. • Type of Element: Report the type of form elements present in the document. • Label (Tooltip): Report the presence of text in the tooltip option of the form properties. • Tab Order: Report the tab order of form elements matches the logical reading order of the document. Form Element Tests
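Field type and tooltip text live in the AcroForm field dictionaries as /FT and /TU, and a page whose /Tabs entry is /S asks viewers to follow the structure (reading) order when tabbing. The sketch below, assuming pypdf and a hypothetical file name, reports both; it illustrates what the tools look for rather than how any of them works.

```python
from pypdf import PdfReader


def form_report(path: str):
    """List form fields with their type and tooltip, and check page tab order."""
    reader = PdfReader(path)

    fields = reader.get_fields() or {}   # None when the document has no AcroForm
    rows = []
    for name, field in fields.items():
        rows.append(
            {
                "name": name,
                "type": str(field.get("/FT", "")),     # /Tx, /Btn, /Ch, /Sig
                "tooltip": str(field.get("/TU", "")),  # label read by screen readers
            }
        )

    # /Tabs /S asks viewers to follow the structure order when tabbing.
    tabs_follow_structure = all(
        str(page.get("/Tabs", "")) == "/S" for page in reader.pages
    )
    return rows, tabs_follow_structure


rows, structural_tabs = form_report("example.pdf")  # hypothetical file name
for row in rows:
    print(f"{row['name']}: type={row['type']}, tooltip={row['tooltip'] or 'MISSING'}")
print(f"Tab order follows structure on every page: {structural_tabs}")
```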

  22. Forms Result Summary

  23. Aside from reporting whether a document is tagged, not tagged, or a scanned image, no single automated test or tool can check all aspects necessary for a PDF to be completely accessible • All tools require manual follow-up to ensure a PDF is completely accessible • Each tool returned at least one “false positive” • There is very little consistency in reporting methods between tools • Each tool has a different learning curve to use properly • Each tool can report false positives for an element • Only Acrobat Pro offered a batch testing option Summary

  24. Reality – Tags and Images

  25. Reality – Headings, Title, Lists

  26. Reality – Tables and Forms

  27. These tools are effective in reporting whether a document is tagged, not tagged, or a scanned image. Beyond these tests, they are unpredictable in the accuracy of their findings. • Although automatic checking tools can give you a starting point when working toward PDF accessibility, they are not meant to be the only option. Manual checking is needed to verify correct structure and tagging markup. How effective and reliable is each tool in analyzing the accessibility of a PDF file?

  28. The tools did give consistent results in each test. Although the tools may have used different nomenclature in their reports, it was clear to the user what was being reported. Are the results of the tools' evaluations consistent with one another?

  29. It cannot be determined that one tool is better than another at reporting whether a document is tagged, not tagged, or a scanned image. Because each tool has limitations on the items it can test for, the use of just one tool would leave the user without a full representation of accessibility issues. • Tools such as Acrobat Pro and CommonLook may be considered better tools because they also offer environments for fixing the accessibility of a PDF document. Is one tool better than another?

  30. The answer depends on the developer's knowledge of PDF accessibility and personal preferences. • For the testing scenario of someone with little to no experience, the tools may save some time by giving the individual a quick idea of the accessibility of the document. • For a user who has experience with the manual checks and remediation of PDF files for accessibility, the tools may not save time, because that user would be manually testing the same items the tools check for. Does the use of these tools save the developer time over manually checking for PDF accessibility?

  31. http://presentations.cita.illinois.edu/2011-03-csun/pdf Presentation Information Test Files and Descriptions Summary of Findings Summary of Testing Tools Christy Blew: clblew@illinois.edu Jon Gunderson: jongund@illinois.edu More Information
