

Tests are run using a Selenium/Mocha/Chai framework:

  • selenium-webdriver
  • mocha
  • chai


We assume the following are installed: ChromeDriver and Node.js.

Verify Prerequisites

To check if ChromeDriver and Node.js have already been installed, type the appropriate commands to print the version.

$ chromedriver --version
ChromeDriver 2.40.565498 (ea082db3280dd6843ebfb08a625e3eb905c4f5ab)

$ node --version

$ npm --version

Note: Versions may vary.

If a prerequisite has not been installed, first use the information below to install it, then verify the installation by running the appropriate version command above.
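As a quick programmatic check, a short Node.js script can compare the running Node.js version against a minimum. This is only an illustrative sketch; the minimum version shown is an assumption, not an official project requirement:

```javascript
// check-node-version.js -- illustrative prerequisite check.
// NOTE: the minimum major version below is an assumption, not a project requirement.
const MIN_MAJOR = 8;

// process.version looks like "v14.17.0"; extract the major version number.
function majorVersion(versionString) {
  return parseInt(versionString.replace(/^v/, '').split('.')[0], 10);
}

const major = majorVersion(process.version);
if (major >= MIN_MAJOR) {
  console.log(`Node.js ${process.version} satisfies the minimum (v${MIN_MAJOR}+)`);
} else {
  console.error(`Node.js ${process.version} is older than v${MIN_MAJOR}; please upgrade`);
  process.exit(1);
}
```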


Selenium is a tool that automates browsers. ChromeDriver is the Chrome implementation of WebDriver, an open source tool for automated testing of web apps across many browsers.

See Chrome Driver Help for more information about installation.

Note: Make sure the "$HOME/bin" directory exists, that it is on the system PATH, and that the ChromeDriver executable is placed within it.


Node.js is a JavaScript runtime built on Chrome's V8 JavaScript engine. Visit the Node.js Downloads page for various installers.

Note: Windows users are encouraged to run Node.js from the Git Bash console. An installer may be found on the Git Downloads page.

Node Version Manager (NVM) is a tool that allows users to switch between different versions of Node.js.

Running Verification Tests to Test the Custom Ruleset

The verification tests for the custom ruleset are based on five main categories as listed in the description of the Custom Ruleset.

Step 0: Download Code and Change the Directory

Run the following commands:

git clone
cd accessibility-ruleset-runner/rulesets/tests

Note: If you are working from a forked repository, you might use slightly different commands than those given above. Also, if you have already downloaded the code, you can skip this step.

Step 1: Install Package Dependencies

To install dependencies (from the Public NPM Registry), run the following command from the rulesets/tests directory:

npm install

Step 2: Run Tests

To run the tests, run the following command from the rulesets/tests directory:

npm run custom.ruleset.verification.tests
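For reference, `custom.ruleset.verification.tests` is an npm script defined in the package.json under rulesets/tests. A hypothetical entry might look like the following; the test file name and mocha invocation shown here are assumptions for illustration, not the project's actual script:

```
{
  "scripts": {
    "custom.ruleset.verification.tests": "mocha customRulesetVerificationTests.js --timeout 30000"
  }
}
```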

The output should match the Custom Ruleset Verification Tests Output.

Testing Methodology

The custom ruleset is vetted against a library of good/bad HTML code snippets as discussed in Creating a Ruleset.

Test Library

Creating and modifying rules requires careful thought about how variations of code should be treated. Variations of code can be added to a test library (ie the Custom Ruleset Test Library), providing several use cases to verify the rule is working as expected. This becomes a foundation to which additional use cases can be added as they are discovered.
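To illustrate the idea (this is not the project's actual test library format), a library of good/bad snippets can be paired with expected outcomes and checked against a toy rule, here "every img tag must have an alt attribute", implemented with a naive regex:

```javascript
// Illustrative only: a toy "images need alt text" rule checked against
// a small library of good/bad HTML snippets. The real custom ruleset
// evaluates live DOM nodes through Selenium, not regexes on strings.
function imgHasAltRule(html) {
  // Find every <img ...> tag and require an alt attribute on each.
  const imgs = html.match(/<img\b[^>]*>/gi) || [];
  return imgs.every((tag) => /\balt\s*=/i.test(tag));
}

// The test library: each entry records a snippet and the expected verdict.
const testLibrary = [
  { html: '<img src="logo.png" alt="Company logo">', expected: true },  // good
  { html: '<img src="logo.png">',                    expected: false }, // bad
  { html: '<p>No images at all</p>',                 expected: true },  // good
];

// Verify the rule classifies every snippet as expected.
for (const { html, expected } of testLibrary) {
  const actual = imgHasAltRule(html);
  console.log(`${actual === expected ? 'PASS' : 'FAIL'}: ${html}`);
}
```

As new variations are discovered, they are simply appended to the library, and the rule must keep classifying every existing entry correctly.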

Treatment of Incorrectly Classified Code Snippets

Sometimes the ruleset will not classify a specific code variation correctly. We have the following options:

  • Add a new use case and evolve the ruleset to handle the new use case along with the old use cases
  • Change our thinking about some old use cases and evolve the ruleset appropriately
  • Throw out the rule due to complexity and the discovery of false positives
  • Create an exemption to handle the use case

As the number of use cases increases, so does the complexity of the ruleset. At some point a developer simply cannot handle all the use cases without the help of a test bed.


In some cases, a ruleset may behave as expected but an organization may decide to make an exemption for certain types of code snippets. This is handled through exemptions.

For example, in the case of ambiguous links, a library website may have two book titles which are exactly the same but lead to different checkout pages, one for each book. In this case, the library may determine that links may remain ambiguous and require the user to navigate to discover the context of each link (ie author, price, year, etc). Solutions to append additional information (ie the author) may become too verbose or may not even work (ie appending the author will not work if the authors are also the same).

Another common use case is websites that display user-entered content.

The ruleset should flag exempted use cases as failures within the results. The results may then be post-processed to treat failures which match certain patterns (ie by adding a class like 'bookTitle' or 'userContent'). Here are a few options:

  • Remove the exemption from the failed set
  • Add the exemption to the passed set
  • Track the exemption in an exempted set
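The options above can be sketched as a small post-processing step. This is a minimal illustration; the result format, class names, and policy names are assumptions, not the ruleset runner's actual output format:

```javascript
// Illustrative post-processing of ruleset results. Failures whose HTML
// matches an exemption pattern (here, a class name) are moved out of the
// failed set; where they go is a policy choice.
function applyExemptions(results, exemptPatterns, policy = 'exempted') {
  const out = { passed: [...results.passed], failed: [], exempted: [] };
  for (const failure of results.failed) {
    const isExempt = exemptPatterns.some((p) => failure.html.includes(p));
    if (!isExempt) {
      out.failed.push(failure);            // a genuine failure
    } else if (policy === 'passed') {
      out.passed.push(failure);            // option: add to the passed set
    } else if (policy === 'exempted') {
      out.exempted.push(failure);          // option: track in an exempted set
    }                                      // policy 'remove': drop it entirely
  }
  return out;
}

// Example: one exempted ambiguous book-title link, one genuine failure.
const results = {
  passed: [],
  failed: [
    { rule: 'ambiguous-links', html: '<a href="/checkout/1" class="bookTitle">Dune</a>' },
    { rule: 'ambiguous-links', html: '<a href="#">Click here</a>' },
  ],
};
const processed = applyExemptions(results, ['bookTitle', 'userContent']);
console.log(processed.failed.length, processed.exempted.length); // prints: 1 1
```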