Accessibility Evaluation

Accessibility evaluation, one of the Usability Evaluation Criteria, investigates the limitations and degrees of error users encounter when accessing a digital library. The W3C Web Accessibility Initiative lists many tools that have been developed to evaluate accessibility automatically. Seven web accessibility evaluation tools were selected to evaluate many digital libraries in a short time. Accessibility was chosen mainly because it is widely accepted as a prime element of usability evaluation (Bishop, 1998).

Definition

Accessibility is defined as “the ability to receive, use, and manipulate data and operate controls included in electronic and information technology …” (IITAA); in this study, it denotes whether users can access the information of a digital library with no (or at least few) limitations and without errors. The IITAA is the law governing accessibility for State of Illinois agencies, including universities.

Methodology

Brajnik (2008) lists several possible methods for evaluating accessibility: expert review, user testing, subjective evaluations, and barrier walkthrough.

The W3C Web Accessibility Initiative lists many tools that have been developed to evaluate accessibility automatically. For the prototype evaluation, seven web accessibility evaluation tools were selected: using existing, freely available tools is a simple yet accurate and economical way to evaluate many digital libraries in a short time.

Seven Accessibility Evaluation Tools

The seven selected accessibility evaluation tools use different evaluation methods based on different standards. Each tool complies with one or more of the Illinois Information Technology Accessibility Act (IITAA), the Electronic and Information Technology Accessibility Standards (Section 508), and the W3C Web Content Accessibility Guidelines (WCAG). The table below summarizes the unique characteristics of the seven tools.

Accessibility Evaluation Tool | What is it, and which standards does it use? | Why was it chosen for accessibility evaluation?
Functional Accessibility Evaluator (FAE) | FAE tests web sites for the functional accessibility features defined in the iCITA HTML Best Practices, a statement of techniques for implementing the W3C Web Content Accessibility Guidelines (WCAG), the United States Federal Government Electronic and Information Technology Accessibility Standards (Section 508), and the Illinois Information Technology Accessibility Act (IITAA). | It evaluates the accessibility of a digital library against both the WCAG guidelines and the Section 508 standards.
The HiSoftware Cynthia Says portal | Cynthia Says identifies errors in a home page related to the Section 508 standards and/or the WCAG guidelines. | It checks a verification checklist according to Section 508, §1194.22, and/or the WCAG guidelines.
W3C Markup Validation Service | The service checks the markup validity of web documents in HTML, XHTML, SMIL, MathML, etc., and can also validate specific content such as RSS/Atom feeds, CSS style sheets, MobileOK content, and broken links. It finds errors and warnings when checking a web site as XHTML 1.0 Transitional. | It is the only selected tool that checks the markup validity of HTML and related languages together with CSS.
Web Accessibility Evaluation Tool (WAVE) | WAVE shows the accessibility evaluation results, including the number of detected errors, as icons and indicators embedded in the original digital library homepage. | It is a very simple way to evaluate accessibility (the standards it applies are not clearly stated).
The etre Accessibility Check | The etre Accessibility Check checks the accessibility of web sites against the WAI accessibility guidelines. | Its results list accessibility issues by WAI priority.
Web Accessibility Evaluator in a single XSLT file (WAEX) | WAEX evaluates web accessibility against WCAG Checkpoints of priority 1, 2, and 3 and W3C Recommendations. | Its report shows whether the digital library passes the WCAG Checkpoints of each priority and whether it conforms to WCAG.
Fujitsu Web Accessibility Inspector 5.11 | The inspector checks the accessibility of web sites, including their CSS files, against WCAG Checkpoints of priority 1, 2, and 3 and W3C Recommendations. | It checks web sites together with their CSS files.
General accessibility evaluation process

With the chosen tools, the web pages of each digital library are evaluated for accessibility errors and broken links. The general accessibility evaluation process is:

  • Enter the website address of each candidate digital library into each evaluation tool;
  • Each tool evaluates its accessibility and reports how many accessibility errors occur and whether the site passes the relevant guidelines (e.g. Section 508, IITAA, or WCAG);
  • Analyze the result and assign a score on a 5-point scale; how the scale is applied varies with the evaluation tool and the number of detected errors;
  • Calculate, for each candidate digital library, the average of the seven scores from the seven evaluation tools. This average expresses the library's degree of accessibility: the higher the average, the higher the accessibility.

Results and Analyses

The results of the accessibility evaluations differ somewhat across evaluation tools. Thus, a different analysis method is applied to each evaluation tool and its results.

Functional Accessibility Evaluator (FAE)

Functional Accessibility Evaluator 1.0.3 evaluates the accessibility of digital libraries based on the W3C Web Content Accessibility Guidelines (WCAG) (UIUC). The FAE results show whether web pages pass, fail, or raise warnings against the WCAG guidelines. The report details the accessibility features and problems of a website and how many items are ‘Complete,’ ‘Partially Implemented,’ or ‘Almost Complete.’ Based on the number of ‘Complete’ items, the digital library is scored on the 5-point scale.

In total, 4.8% of the sixty-two digital libraries gained 4.5 points. The overall average of the FAE results is 2.6123.

This may mean that many digital libraries fail to follow more than 95% of the W3C Web Content Accessibility Guidelines (WCAG) on including structural markup, using images properly for interoperability, using CSS styling, and supporting HTML standards. Moreover, four of the sixty-two digital libraries could not be evaluated because of broken links or access errors.

Of the seven chosen accessibility tools, FAE took the shortest time to execute an evaluation.

‘Cynthia Says’ Tool

The Cynthia Says tool checks a verification checklist according to Section 508, §1194.22. Two of the checklist items are ‘A text equivalent for every non-text element shall be provided’ and ‘Equivalent alternatives for any multimedia presentation shall be synchronized with the presentation’ (CynthiaSays). The results express whether the web sites of the digital library pass each checklist item. The number of ‘Yes’ answers (passed items) is counted for rating on the 5-point scale.
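A minimal sketch of this counting step, assuming one point per passed checklist item (consistent with three ‘Yes’ answers yielding 3 points; the full mapping used in the study is not stated, and the checklist items below are only examples):

```python
def cynthia_score(checklist):
    """checklist: dict mapping a §1194.22 item -> True if passed ('Yes').
    One point per passed item, capped at 5 (hypothetical mapping)."""
    yes_count = sum(1 for passed in checklist.values() if passed)
    return min(yes_count, 5)

# Hypothetical evaluation of one digital library home page
checklist = {
    "text equivalents for non-text elements": True,
    "synchronized multimedia alternatives": True,
    "information not conveyed by color alone": True,
}
print(cynthia_score(checklist))  # -> 3
```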

In total, only the following five digital libraries had three ‘Yes’ answers, because it is difficult to satisfy all the standards of §1194.22. That is, 7.9% of the candidate digital libraries gained 3 points:

  • The National Archives Online Exhibits in the History of the America Subject area,
  • SMETE digital library in the Education Subject area,
  • California Sheet Music in the Music Subject area,
  • U.S. Department of Health & Human Services in the Medicine Subject area, and
  • The National Archives Education Resources, UK, in the Social Science Subject area.

At 1.580357, the overall average is among the lowest of the seven accessibility tools.

It may mean that the home pages of many digital libraries could not satisfy many rules of the Section 508 standards and/or the WCAG guidelines.

On the other hand, the method seems very strict for evaluating accessibility, since it does not report degrees or percentages but counts a ‘Yes’ only when the digital library satisfies all rules of an item. Three digital libraries could not be evaluated because of HTTP transfer errors.

W3C Markup Validation Service Tool

The W3C Markup Validation Service finds errors and warnings by checking a web site as XHTML 1.0 Transitional (W3C, W3C Markup Validation Service). The tool reports what kinds of errors are detected and how many errors and warnings are found. Libraries are scored on the 5-point scale depending on the number of errors: a digital library with many errors gets a lower score, while the number of warnings is ignored.
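The rule that only errors, never warnings, drive the score can be sketched as follows; the error-count bands are hypothetical, since the study does not publish the exact thresholds.

```python
def markup_score(errors, warnings):
    """5-point score from a validator report.
    `warnings` is deliberately unused, matching the scoring rule above."""
    bands = [(0, 5), (5, 4), (10, 3), (20, 2), (40, 1)]  # (max errors, score)
    for limit, score in bands:
        if errors <= limit:
            return score
    return 0

print(markup_score(errors=0, warnings=12))   # clean markup despite warnings -> 5
print(markup_score(errors=55, warnings=0))   # many markup errors -> 0
```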

In total, 14.3% of the candidate digital libraries gained 5 points, but 50.8% of them gained 0 points with the W3C Markup Validation Service. Notably, more than half of the sixty-two digital libraries had more than twenty markup validation errors. The overall average is as low as 1.5864071.

Against expectations, the HTML code contains many errors. That is, considerable errors are detected in the markup validity of web documents in HTML, XHTML, SMIL, MathML, etc., even though the candidate digital libraries are very good digital libraries in their subject areas. This indicates that digital libraries should check the markup validity of their HTML or XHTML code carefully. The tool detects more errors than the other six accessibility tools and explains each error by line and column.

WAVE (Web Accessibility Evaluation) Tool

The Web Accessibility Evaluation (WAVE) tool shows the accessibility evaluation results of a digital library, including the number of detected errors, as icons and indicators embedded in the page (WAVE). Libraries are scored on the 5-point scale depending on the number of detected accessibility errors: the more errors detected, the lower the score.

In total, 49% of the candidate digital libraries gained 5 points, a considerable number, while 12.7% gained 0 points. The overall average of the WAVE results is as high as 3.618304. Two digital libraries could not be evaluated by the WAVE tool.

Relatively, WAVE detects fewer errors than the other tools across all candidate digital libraries. While it is simple to use, it does not seem able to detect many accessibility errors against the standards.

etre Accessibility Check Service Tool

The etre Accessibility Check service checks the accessibility of web sites against the WAI accessibility guidelines (Etre). Its results list accessibility issues by WAI priority, showing the number of places where the tested page does not adhere to the guidelines. The Priority 1 section reports the number of errors that must be fixed; the Priority 2 section reports the number of errors that should be fixed. The numbers of Priority 1 and Priority 2 problems are each scored on the 5-point scale, and the two scores are averaged. These averages become the final accessibility scores of the candidate digital libraries for the etre Accessibility Check tool.
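The two-priority averaging described above can be sketched like this. The linear error-to-score mapping and the caps of 10 and 15 errors are illustrative assumptions, not the study's published formula.

```python
def priority_score(errors, max_errors):
    """Linear 5-point score: 0 errors -> 5.0, max_errors or more -> 0.0."""
    return max(0.0, 5.0 * (1 - errors / max_errors))

def etre_score(p1_errors, p2_errors):
    """Score Priority 1 and Priority 2 separately, then average the two."""
    s1 = priority_score(p1_errors, max_errors=10)  # must-fix errors
    s2 = priority_score(p2_errors, max_errors=15)  # should-fix errors
    return (s1 + s2) / 2

print(etre_score(0, 0))   # no issues at either priority -> 5.0
print(etre_score(5, 0))   # half the Priority 1 cap -> 3.75
```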

In total, the etre Accessibility Check tool detects fewer Priority 1 (must fix) and Priority 2 (should fix) errors than the other six accessibility tools. Accordingly, it yields the highest average across all sixty-two digital libraries, 3.6781. 30% of the digital libraries gained 5 points, a somewhat high rate, and no digital library received 0 points, even though the tool evaluates by WAI Priority 1 and Priority 2. Moreover, no digital library was found to have more than 10 Priority 1 errors or more than 15 Priority 2 errors.

This may imply either that many digital libraries follow WAI Priority 1 and Priority 2, or that the tool could not detect as many accessibility errors as the other tools.

Web Accessibility Evaluator in a single XSLT file (WAEX)

The Web Accessibility Evaluator in a single XSLT file (WAEX) evaluates web accessibility against WCAG Checkpoints of priority 1, 2, and 3 and W3C Recommendations, as FAE does (WAEX). The result shows whether the digital library passes the WCAG Checkpoints of priority 1, 2, and 3 and the checkpoints in WCAG. The number of ‘passed’ checkpoints is counted and scored on the 5-point scale.

In total, the average of the sixty-two digital libraries under WAEX is the lowest, 1.4739, while that of the FAE tool, which evaluates against the same standards, is 2.6123. Only 19% of the digital libraries gained 3 points out of 5, and 30% received 0 points.

The results show that many digital libraries could not pass the WCAG checkpoints of priority 1, 2, and 3, nor those outside WCAG. The reasons might be that they do not follow the WCAG Checkpoints and W3C Recommendations in web pages with CSS files, or that the tool evaluates them very strictly.

Fujitsu Web Accessibility Inspector 5.11 Tool

Lastly, Fujitsu Web Accessibility Inspector 5.11 checks the accessibility of web sites, including their CSS files, against WCAG Checkpoints of priority 1, 2, and 3 and W3C Recommendations, like WAEX and FAE. It also checks the Fujitsu Web Accessibility Guidelines at priority level 1 (‘it checks very important items such as whether the alt attribute was set’) and level 2 (‘it checks for existence of <caption> of the table in addition to priority 1’) (Fujitsu). The numbers of priority 1 and priority 2 problems are each scored on the 5-point scale, and the two scores are averaged. This average becomes the final accessibility value of each digital library for the Fujitsu tool.

In total, although Fujitsu Web Accessibility Inspector 5.11 detects some errors, the overall average is as high as 2.9989. 14.3% of the digital libraries gained 5 points out of 5, only 0.8% received 0 points, and most gained over 2.5 points.

Since several digital libraries show many errors, Fujitsu seems to detect errors well against WCAG Checkpoints of priority 1, 2, and 3.

Total Analyses with Seven Accessibility Tools

Exceptions of Accessibility Evaluations

The accessibility of the sixty-two candidate digital libraries in fifteen subject areas was evaluated seven times, once by each of the selected accessibility evaluation tools. Two digital libraries, ‘Digital Past’ in the History of the America subject area and ‘Electronic Cultural Atlas Initiative’ in the Social Science subject area, could not be evaluated by any of the seven tools; they may have broken links or use /robots.txt. The ‘Database of Recorded American Music’ digital library in the Music subject area was evaluated by only two tools, since the other five encountered broken links or HTTP transfer errors or could not access it; it uses /robots.txt as well. ‘/robots.txt’ implements the Robots Exclusion Protocol, so the evaluation tools could not access those websites at all, and such libraries could be neither evaluated nor scored. In addition, ten digital libraries could not be evaluated by one or two of the tools. When a library cannot be evaluated by a tool, that tool's score becomes 0, and all seven scores are still used to calculate the average.
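The /robots.txt mechanism that blocked the tools can be demonstrated with Python's standard library. The robots.txt content and the crawler name "AccessibilityBot" below are hypothetical, and the rules are parsed locally rather than fetched over the network:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that forbids all automated agents site-wide,
# which is the situation that stopped the evaluation tools.
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An evaluation tool identifying itself as "AccessibilityBot" is refused:
print(parser.can_fetch("AccessibilityBot", "http://example.org/index.html"))
```

A tool that honors the Robots Exclusion Protocol performs exactly this check before crawling, which is why such libraries end up unscored.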

Analyses based on the Averages of each evaluation tool, and Subject Domains

Since the seven accessibility tools have different evaluation methods and standards, their results differ somewhat. The results show which guidelines or standards a digital library follows. For example, the United States Department of Defense site in the Military Science subject area gains five points from the etre Accessibility Check tool, which checks adherence to the WAI accessibility guidelines, but the same digital library gains 0 points from the W3C Markup Validation Service, which checks the web site as XHTML 1.0 Transitional. That is, the digital library follows the WAI standards but does not follow the W3C markup rules in HTML.

According to the averages,

  • many of the sixty-two digital libraries follow the Web Accessibility Initiative (WAI) accessibility guidelines that the etre Accessibility Check tool uses;
  • digital libraries usually follow the WCAG standard and the W3C Recommendations, as the averages of the Fujitsu and FAE tools show;
  • but many digital libraries follow less closely the WCAG Checkpoints of priority 1, 2, and 3, and the checkpoints outside WCAG, that WAEX uses.

According to the subject domains, although we cannot conclude that accessibility evaluation results generally differ among subject domains,

  • the World History and History of Europe (Asia, Africa, Australia, New Zealand, etc.) subject domain provides the highest accessibility under all seven accessibility tools;
  • the Music and Books on Music domain shows the lowest overall accessibility under all seven tools.

Analyses Based on Accessibility Evaluation Tools

FAE tests against the iCITA HTML Best Practices, which implement the W3C Web Content Accessibility Guidelines (WCAG), the United States Federal Government Electronic and Information Technology Accessibility Standards (Section 508), and the Illinois Information Technology Accessibility Act (IITAA). The HiSoftware Cynthia Says portal detects accessibility errors related to the Section 508 standards and/or the WCAG guidelines.

  • The same two standards (WCAG and Section 508) are used by both tools, but their averages differ considerably: FAE 2.6123 vs. Cynthia Says 1.58037.
  • The difference may be caused by their different evaluation methods; that is, the results vary with the standards used and the evaluation methods.
    • FAE reports percentages, with three kinds of status, across its categories: Navigation & Orientation, Text Equivalents, Scripting, Styling, and HTML Standards.
    • The HiSoftware Cynthia Says portal, however, reports only ‘Yes’ at 100% or ‘No’ below 100%; it does not distinguish degrees below 100%.

Moreover, WAEX and Fujitsu Web Accessibility Inspector 5.11 both evaluate web accessibility against WCAG Checkpoints of priority 1, 2, and 3 and W3C Recommendations.

  • Like the two tools above, these tools show the same pattern in their averages: WAEX is 1.4739, but Fujitsu is 2.998.
  • WAEX reports only ‘pass’ (100%) or ‘failed’ (below 100%).
  • Fujitsu Web Accessibility Inspector 5.11, however, reports the number of errors, which is then scored on the 5-point scale; thus Fujitsu's average is higher than WAEX's.
  • That is, using different methods to evaluate accessibility affects each digital library's accessibility results. It remains an open question whether the strict method that accepts only 100% is better for evaluating accessibility.

The W3C Markup Validation Service checks the markup validity of web documents in HTML, XHTML, SMIL, MathML, etc., and so evaluates accessibility from a somewhat different point of view. The results show that there are many errors in the markup; web developers should take care when writing markup code.

In conclusion,

  • it is better for accessibility evaluation tools to report the number of errors or a percentage than to report only pass (yes) or fail (no);
  • it is better still when accessibility tools give detailed explanations of the errors, which gives web developers the chance to fix them promptly. This fits the purpose of the evaluations: for digital libraries to improve their websites.

The Highest Score Digital libraries by the Accessibility evaluation in Each Subject Domain

The most important result of the accessibility evaluations is to find which digital libraries provide good accessibility, so that users can access and retrieve information effortlessly. Fifteen digital libraries, one in each of the fifteen subject areas, turned out to provide better accessibility than the other digital libraries in their subject areas. Each of them was evaluated by all seven selected accessibility tools; that is, if a digital library could not be evaluated by even one accessibility tool, it was excluded from the best digital libraries.

The table below lists, for each subject area, the digital library with the highest accessibility score, in descending order of score.

Subject Area | Highest-scoring digital library in the accessibility evaluation
World History and History of Europe (Asia, Africa, Australia, New Zealand, etc.) | EuroDocs
Medicine | Children’s Medical Center
Philosophy, Psychology, Religion | Online Medieval & Classical Library
Science | National Science Digital Library
History of the America | Library of Congress: American History & Culture
Political Science and Law | Harvard Law School Library Digital Collections
Geography | Census Atlas of the United States
Arts | William Blake Archive
Agriculture | Western Waters Digital Library
Social Science | The National Archives, Education Resources, UK
Military Science | Military History and Military Science of The Library of Congress
Education | The National Library of Education
Music and Books on Music | California Sheet Music
Technology | JSC Digital Image Collection
Language and Literature | Writers of the Purple Sage

*More details are given in the paper, Chapter V. Usability Evaluation, 1. Accessibility Evaluation. This website and the paper were developed by the same person.
