Open-source benchmark suite for visual testing of web-based GUIs, used for the evaluation of our abstract GUI state (AGS) framework implementations recheck and recheck-web.
The benchmark suite consists of offline versions of 20 of the most popular websites. The currently available pages are (extracted on January 21, 2020):
- 360.cn
- alipay.com
- apple.com
- baidu.com
- bbc.com
- blogspot.com
- csdn.net
- ebay.com
- facebook.com
- github.com
- google.com
- jd.com
- live.com
- linkedin.com
- soso.com
- stackoverflow.com
- twitter.com
- vk.com
- wikipedia.org
- youtube.com
Based on the GUI change taxonomy introduced by Moran et al., each page was modified to simulate real-world change scenarios. The expected (original state) and actual (modified state) versions can be used to evaluate visual testing tools. To do so, set the system property `de.retest.visualtesting.pagesState` to `expected` or `actual`.
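For example, assuming the property is picked up by the JVM that runs the tests, it can be passed on the Maven command line (a minimal sketch, not the authoritative invocation):

```sh
# Run the benchmark against the modified pages
# (assumes -D properties are forwarded to the test JVM).
mvn test -Dde.retest.visualtesting.pagesState=actual
```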
You can reproduce the experiments with the provided `.travis.yml` build configuration. If you want to execute the experiments locally, first follow the Applitools Selenium/Java tutorial to set up your environment. In addition, make sure to install Firefox and GeckoDriver.
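As an illustration, on a 64-bit Linux machine GeckoDriver can be installed by downloading a release and putting it on the `PATH` (the version and paths below are only examples; pick a release that matches your Firefox):

```sh
# Download an example GeckoDriver release and make it available on the PATH.
wget https://github.com/mozilla/geckodriver/releases/download/v0.26.0/geckodriver-v0.26.0-linux64.tar.gz
tar -xzf geckodriver-v0.26.0-linux64.tar.gz
sudo mv geckodriver /usr/local/bin/
```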
Please note that we used customized (and by now possibly outdated) versions of recheck, recheck-web and recheck.cli. In all of these projects, you will find a branch named `visual-testing` that you can build from source:
```sh
mvn install
```
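Put together, building one of these projects from source might look like the following sketch (the GitHub URL is an assumption about where the repository lives; adjust it to the actual location):

```sh
# Hypothetical example: build the customized recheck from its visual-testing branch.
git clone https://github.com/retest/recheck.git
cd recheck
git checkout visual-testing
mvn install
```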
If you want to use our GUI, please go to https://retest.de/review/ and contact us for an evaluation license.
For Applitools, you have to create an account at https://applitools.com/ and set the environment variable `APPLITOOLS_API_KEY` to your personal API key.
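On Linux or macOS, for instance, this can be done in the shell before running the tests (the value shown is a placeholder):

```sh
# Placeholder value; use the API key from your Applitools account.
export APPLITOOLS_API_KEY="your-api-key"
```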
Then, clone this repo and run the evaluation:
```sh
mvn test --activate-profiles eval
```
Please note:
- While it is possible to declare a specific Firefox version on Travis CI, one cannot select a Chrome version. This means that reproducing the experiments might yield different results if the Travis CI environment changes.
- Also, if you execute the experiments locally, the results may differ from those in the paper due to platform, browser, or other environment differences.
- While we downloaded the aforementioned web pages to minimize external influences, some pages still load assets from the Internet. This can also lead to different results.
- We do not execute the actual evaluation on Travis CI by default. To do so, activate the corresponding Maven profile via `--activate-profiles eval`. If you want to do this permanently, adapt `ci/script.sh` accordingly.
Please refer to our paper "Visual Testing of GUIs by Abstraction" for an overview of the results and a discussion. Further details such as screenshots and test reports can be found in the `eval` folder.