
Overview of Testing Methods for 508 Conformance

There are several ways to validate conformance to the Revised 508 Standards:

  • Automated - High volume 508 conformance testing tools automatically scan and test electronic content;
  • Manual - Manual testing uses a documented, consistent, repeatable process;
  • Hybrid - A combination of automated and manual testing.

Automated Testing

Take advantage of high volume (automated) 508 compliance scanning tools, but be aware of their limitations.

  • Automated scanning tools cannot apply human subjectivity, and therefore either produce excessive false positives or—when configured to eliminate false positives—test for only a small portion of the requirements.
    • To determine the best strategic mix of false-positive volume vs. requirements coverage, require the tool vendor to define and quantify the method and accuracy of its rule sets, and how they align with your agency’s standards and expectations.
  • Consider whether and how server-based automated scanning tools will be able to access content secured behind firewalls, passwords, or other protections.
  • Select tools that test using the document’s native format. Tools that scan documents often convert files into HTML before testing. This conversion process reduces the fidelity and accuracy of conformance testing.
  • Your agency may need to deploy multiple scanning tools to cover multiple content types (e.g., HTML, Word, Excel, and PDF). It can be a challenge to extract and aggregate results to identify trends and focus remediation efforts.
  • Plan and deliver reporting tailored to your stakeholders. You may want to provide output from scanning tools directly to developers. Additional work may be required to integrate results into dashboard reporting to tell your organizational story.
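The limitation noted above—that automated rules cannot apply human subjectivity—can be illustrated with a minimal sketch. The checker below (written with Python’s standard-library HTML parser; it is illustrative, not any vendor’s tool) flags images with no alt attribute, an objective check a scanner can automate. It cannot judge whether an alt text that is present actually describes the image; that still requires a human tester.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flag <img> elements with no alt attribute.

    This is an objective, automatable rule. A scanner cannot tell
    whether alt text that IS present is meaningful -- only a human can.
    """

    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            # getpos() gives (line, column) of the offending tag
            self.violations.append(self.getpos())

checker = MissingAltChecker()
checker.feed('<p><img src="chart.png"><img src="logo.png" alt="Agency logo"></p>')
print(checker.violations)  # one violation: the first <img> has no alt
```

Real scanning tools apply hundreds of such rules; the trade-off described above arises because only some requirements reduce to objective checks like this one.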

Key Success Factor: To provide value for the agency and support the highest level of accessibility improvement, the tool or tools you select must foster adoption and buy-in across multiple applicable roles (UX designers, developers, etc.) within the agency.

Technical Requirements

When reviewing automated tools for potential purchase, consider their ability to:

  • Scan the types and volume of electronic content your agency produces. Many tools focus on web pages, but some also scan PDF and Microsoft Office documents.
  • Customize scanning and test ruleset parameters.
  • Apply a centralized custom ruleset across all tool feature sets.
  • Assign and control the ruleset version available to users from a central administrative location.
  • Scan code on a local PC to support full compliance assessments in a designer/developer unit-test environment.
  • Control and synchronize error and remediation messages presented to users for customized rules.
  • Flag false positives and ensure the errors are not repeated in subsequent test results.
  • Categorize issues by type, frequency, and severity.
  • Configure, schedule, and suspend scans; change the rate of scans; and restart in-process scans.
  • Fully customize all evaluation rule sets to address inaccurate interpretation of requirements or reduce false positives.
  • Support exclusion of specific domains, URL trees, pages, or sets of lines.
  • Emulate multiple browsers during scans.
  • Customize summary and detailed reports to monitor current 508 conformance; analyze trends by website and by organizational component; and export summary and detailed results to external reporting tools.
  • Direct users to specific code location(s) that are generating errors, and provide contextually relevant remediation guidance.
  • Integrate test tools and conformance monitoring into test automation environments (Dev/Ops).
  • Produce accessible system and report outputs.
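The ability to categorize issues by type, frequency, and severity is straightforward to prototype, which can help when evaluating how a tool’s exported results will aggregate. The sketch below uses hypothetical scan records—the field names are illustrative, not from any particular tool.

```python
from collections import Counter

# Hypothetical records as a scanner might export them; the field
# names ("rule", "severity", "url") are illustrative assumptions.
issues = [
    {"rule": "img-alt", "severity": "error", "url": "/home"},
    {"rule": "img-alt", "severity": "error", "url": "/news"},
    {"rule": "color-contrast", "severity": "warning", "url": "/home"},
    {"rule": "label-missing", "severity": "error", "url": "/forms"},
]

# Categorize by type and by severity to identify trends and
# focus remediation efforts.
by_type = Counter(i["rule"] for i in issues)
by_severity = Counter(i["severity"] for i in issues)

print(by_type.most_common())  # img-alt is the most frequent issue
print(by_severity["error"])   # errors outnumber warnings here
```

A tool that exports results in a structured format makes this kind of aggregation—and the dashboard reporting discussed earlier—much easier across multiple content types.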

Support Services Requirements

  • Installation, configuration, validation, and customization of 508 test rulesets, scans, and reporting capabilities.
  • Integration of 508 test tools, reporting, and monitoring capabilities into test automation environments.
  • Online self-paced training for web content managers, developers, programmers, quality assurance testers, project and program managers, and tool administrators.
  • Operations & maintenance support, including ongoing configuration and customization.

Validate Rulesets

  • Determine whether separate rulesets exist for different types of web content (web pages, Microsoft Office documents, Adobe PDF documents, etc.).
  • Look for a setting such as “WCAG 2.0 Level AA Success Criteria,” which should test all the Level A and AA requirements of the Revised 508 Standards that are applicable to web content supported by the tool.
  • Assess each ruleset for reliability, accuracy, and degree of alignment with agency requirements in your environment. Suggested steps:
    1. Create a test bed of sample content. Include as many known ways to fail each checkpoint as possible, then uniquely identify each failure point so you can quantify alignment with agency guidelines as testing progresses.
    2. Configure the scan to evaluate the test bed.
    3. Run the rule set.
    4. Compare the results against manual test results to validate the script’s accuracy. Ensure this comparison is performed by senior subject matter experts who are trained to perform manual testing.
    5. After constructing a viable initial ruleset framework by “passing” the internal test bed tests, test the resulting rules “in the wild.” Scan multiple sites or applications built by technical staff not involved in the internal rule testing effort; this helps identify false positives and rules whose detection needs correction.
    6. Delete inaccurate scripts, or obtain developer assistance to customize the scripts to increase reliability in your environment.
    7. Continue testing until the rulesets provide an acceptable level of accuracy in your environment.
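Step 4 above—comparing scanner output against manual test results—can be quantified with simple set arithmetic once each failure point in the test bed has a unique ID (step 1). The sketch below uses invented IDs and numbers purely for illustration.

```python
# Each failure point in the test bed carries a unique ID (step 1).
# Manual testers record the true failures; the scanner reports what
# its ruleset detected. (IDs and counts here are hypothetical.)
manual_failures = {"F01", "F02", "F03", "F04", "F05"}
scanner_reports = {"F01", "F02", "F04", "F09"}  # F09 is not a real failure

true_positives = scanner_reports & manual_failures
false_positives = scanner_reports - manual_failures  # scanner noise
missed = manual_failures - scanner_reports           # coverage gaps

precision = len(true_positives) / len(scanner_reports)
coverage = len(true_positives) / len(manual_failures)
print(f"precision={precision:.0%} coverage={coverage:.0%}")
```

Tracking these two numbers per ruleset over successive tuning passes (steps 5–7) gives a concrete way to decide when accuracy is acceptable for your environment.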

Configure Scans

When configuring scans, consider:

  • Firewall restrictions.
  • Scan depth.
  • How the results should be aggregated.
  • Server capacity and length of time to run scans.
  • How to abort and restart scans.
  • The ability to eliminate rulesets that only generate warnings.
  • The ability to identify content subject to the safe harbor provision. Content that conformed to the Original 508 Standards and has not been altered on or after January 18, 2018 does not need to conform to the Revised 508 Standards (i.e., legacy content). See Section 9.2 below for tips on identifying legacy content.
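One way to apply the safe harbor provision at scan-configuration time is to filter a content inventory by last-modified date. The sketch below assumes hypothetical inventory records; in practice the modified date would come from a CMS or file metadata, and safe harbor also requires that the unaltered content conformed to the Original 508 Standards.

```python
from datetime import date

SAFE_HARBOR_CUTOFF = date(2018, 1, 18)

# Hypothetical inventory records; the field names are illustrative.
inventory = [
    {"path": "/reports/2015-summary.pdf", "modified": date(2016, 3, 2)},
    {"path": "/news/update.html", "modified": date(2019, 7, 9)},
]

def needs_revised_508_testing(item):
    """Content altered on or after 2018-01-18 must meet the Revised
    508 Standards. Unaltered legacy content may fall under safe
    harbor -- but it must still have conformed to the Original
    508 Standards, which this date check alone cannot verify."""
    return item["modified"] >= SAFE_HARBOR_CUTOFF

to_scan = [i["path"] for i in inventory if needs_revised_508_testing(i)]
print(to_scan)  # only the 2019 page is scanned against the Revised Standards
```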

Configure Reports

When configuring reports, consider:

  • The target audiences (web managers, program managers, executive managers).
  • Reporting scope (issue description, category, impact, priority, solution recommendation, and location in the code).
  • Reporting format (single scan view vs. comparison against previous scans, trend highlighting, and identification of major positive and negative changes).

Manual Testing

Follow the instructions outlined in Test for Accessibility, endorsed by the Federal CIO Council’s Accessibility Community of Practice.

Hybrid Testing

A hybrid testing approach is usually the best solution for handling a large volume of electronic content. Consider the following:

  • Ensure developers build accessibility into code during development.
  • Whenever possible, perform manual testing prior to publishing new content.
  • Use stand-alone automated testing tools to identify obvious errors and augment manual testing.
  • Integrate automated rulesets into developer operations to add scale to 508 validation efforts for applications prior to release.
  • Use automated scanning tools to scan as much electronic content as possible, and periodically conduct manual testing on high-priority published content. Focus on content that returns poor scan results and is frequently accessed.
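The last point—focusing manual testing on content that both scans poorly and is frequently accessed—can be turned into a simple prioritization queue. The sketch below combines hypothetical scan error counts with hypothetical page-view figures; both data sources and the weighting are assumptions, not a prescribed method.

```python
# Hypothetical per-page metrics: error counts from automated scans
# and page views from web analytics. Ranking by their product focuses
# scarce manual-testing effort on high-impact content.
pages = [
    {"url": "/home", "scan_errors": 2, "views": 50_000},
    {"url": "/apply", "scan_errors": 9, "views": 12_000},
    {"url": "/archive/old", "scan_errors": 14, "views": 40},
]

queue = sorted(pages, key=lambda p: p["scan_errors"] * p["views"],
               reverse=True)
print([p["url"] for p in queue])
# "/apply" outranks the rarely viewed archive despite fewer errors
```

Any weighting scheme will do; the point is that neither error count nor traffic alone identifies the content where manual testing delivers the most value.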

This guidance was developed by the U.S. Federal Government Revised 508 Standards Transition Workgroup. Members include the U.S. Federal CIO Council Accessibility Community of Practice, the U.S. Access Board, and the General Services Administration.

Reviewed/Updated: May 2018