General goals and regular tasks
Execute automated tests regularly with various combinations of operating systems, browsers, fonts, and output modes (currently 40 combinations) to help ensure that MathJax is bug-free.
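A combination matrix like the one above can be enumerated mechanically. The sketch below illustrates the idea; the platform lists are illustrative assumptions, not the actual test matrix:

```python
from itertools import product

# Illustrative values only -- the real test matrix differs.
operating_systems = ["Linux", "Windows", "Mac"]
browsers = ["Firefox", "Internet Explorer", "Chrome", "Safari", "Opera"]
fonts = ["STIX", "TeX"]
output_modes = ["HTML-CSS", "NativeMML", "SVG"]

# Every (OS, browser, font, output mode) tuple is one test configuration.
configurations = list(product(operating_systems, browsers, fonts, output_modes))
print(len(configurations))  # 3 * 5 * 2 * 3 = 90 with these example lists
```

In practice many tuples are invalid (e.g. Safari on Linux), so the real framework restricts the product to the supported platform combinations.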
Improve the testing framework:
- Maintain the test suite:
  - Ensure that tests work on all platforms and that new releases of browsers or MathJax do not break them.
  - Add tests for MathJax issues.
  - Add tests for features that are not yet covered.
- Keep documentation up-to-date.
- Oversee the evolution of browser support:
Platforms Supported. This mainly depends on the evolution of Selenium and on the available infrastructure.
- Improve the selection mode for Internet Explorer (do not rely on the Selenium 1 API).
- Support for Chrome on Linux with WebDriver.
- Support for Safari with WebDriver.
- Support for Android/iOS with WebDriver.
- Support for Konqueror with WebDriver?
- Determine MathJax's workarounds for browser bugs and report those bugs upstream. (In Progress)
- Ensure that the tests work correctly on all platforms, or at least annotate known failures. (In Progress)
- Review the MathJax 2.0 documentation and add missing tests. (In Progress)
- Add tests for MathJax extensions.
- Make test comparison less strict (fuzzy reftests, avoiding rounding issues, etc.). First experiments with fuzzy reftests do not seem to show that they help much.
- Find a way to improve the simulation of user interaction via Selenium 2's "native events". Improve and complete the UI and configuration tests that require this feature.
- Interface to choose/update a browser version (stable, beta, nightly, etc.). For the moment this is possible with Selenium 1 by installing the programs somewhere on a test machine and setting the browserPath configuration option accordingly.
- Tests covering most of the AsciiMath support.
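The fuzzy-reftest idea mentioned in this list can be illustrated with a small pixel comparison. This is a sketch only, assuming screenshots are available as flat grayscale pixel arrays; the function name and thresholds are hypothetical, not part of the framework:

```python
def fuzzy_match(pixels_a, pixels_b, max_delta=2, max_differing=10):
    """Compare two equally-sized pixel arrays.

    Instead of requiring an exact match, tolerate small per-pixel
    differences (at most max_delta) on a limited number of pixels
    (at most max_differing) -- e.g. anti-aliasing or rounding artifacts.
    """
    if len(pixels_a) != len(pixels_b):
        return False
    differing = 0
    for a, b in zip(pixels_a, pixels_b):
        delta = abs(a - b)
        if delta > max_delta:
            return False        # a clearly visible difference
        if delta > 0:
            differing += 1      # a tolerated rounding difference
    return differing <= max_differing

# Two renderings that differ by one unit on three pixels still match:
print(fuzzy_match([10, 20, 30, 40], [10, 21, 29, 41]))  # True
```

The difficulty observed in the experiments is choosing thresholds that absorb harmless rendering noise without masking real regressions.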
Work done since January 2012
- Moved the MathJax testing framework into the cloud.
- Created wiki pages to present browser bugs or missing features that are problematic for MathJax.
- Final testing of MathJax 2.0.
- Made WebDriver work with Internet Explorer.
- Set up the DSI test machine to run automated tests.
- Finished the documentation describing the testing framework, including installation instructions.
- Documented how to use the DSI machine.
- Allowed annotation of reftest failures.
From May 2011 to the end of December 2011
- Tests covering most of the LaTeX support.
- Tests covering most of the MathML support.
- Tests covering most of the public API.
- Tests covering most of the Configuration options.
- Tests covering most of the UI.
- Web user interface to control the testing instances (see the attached taskViewer.png).
- Allow scheduling testing instances.
- Upgrade Selenium and support more platforms. (see Migration from Selenium 1 to Selenium 2 and Platforms Supported)
- Support for SVG output Jax.
- Interface to choose/update a version of MathJax (stable, development, developers' branches, etc.).
- Set up the WebFaction machine to store the test suite, the development branches, the test runner, and the QA web interface.
From the end of January 2011 to the end of April 2011
- Analyze the areas of functionality to be tested, the strategy for testing them, and the test cases to be created.
- See TestSuite
- Review automated test frameworks and select one.
- Create a test plan. After preliminary discussions with Frédéric, our plan is:
- Create Selenium ref tests for LaTeX and MathML processing, rendering, and UI testing that can be automated.
- Create "scenario tests" for installation, configuration, API, and UI testing that can't easily be covered with Selenium ref tests. The scripts in the current ./test directory can be thought of as an initial step in this direction.
- Define testing procedures.
- Create a test suite of ref tests to be executed in Selenium. The plan here is:
- Robert will set up a github repository dedicated to MathJax testing.
- Robert will find people to try Selenium and the automated testing from the MathJax-test github repository.
- Frédéric will start writing reftests for LaTeXToMathML and MathMLToDisplay covering the main features.
- Frédéric will check the issue trackers for sensitive parts of the implementation that should be tested in priority (marked with QA tags).
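Reftests of the kind planned above are typically driven by manifest files pairing a test page with a reference page. The sketch below assumes a Mozilla-style manifest syntax (`== test.html ref.html` for pages that must render identically, `!=` for pages that must differ); the exact format used by MathJax-test may differ:

```python
def parse_reftest_manifest(text):
    """Parse a simple reftest manifest into (relation, test, reference) tuples.

    Lines starting with '#' are comments; blank lines are skipped.
    Assumed format (Mozilla-style): '== test.html ref.html' means the
    two pages must render identically; '!=' means they must differ.
    """
    entries = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        relation, test_page, ref_page = line.split()
        if relation not in ("==", "!="):
            raise ValueError("unknown relation: " + relation)
        entries.append((relation, test_page, ref_page))
    return entries

manifest = """
# LaTeXToMathML tests
== frac-1.html frac-1-ref.html
!= sqrt-1.html sqrt-1-wrong-ref.html
"""
print(parse_reftest_manifest(manifest))
```

A test runner then loads each pair in the browser via Selenium, takes screenshots, and compares them according to the stated relation.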