Large scale syntax regression tests #1124
Comments
See #1126 for the first version of this. I could use some help in collecting the necessary files for all languages.
Does syntect have the ability to run the syntax tests that come with Packages? That may also help, since if you hardcode the results, things will break on improvements. For instance, when sublimehq/Packages#2305 lands, you will probably detect that as a break, even though it is not broken, just different.
Yes, syntect has a syntax test runner, which is part of syntect's CI: trishume/syntect@0a74d87. However, until trishume/syntect#185 is merged, there is no comfortable way to run the tests outside of syntect, as currently all the code is in an example file.

Asserting that the colors are the same between versions can be advantageous for detecting when syntax definition changes affect the highlighting of common constructs that users are used to, to avoid nasty surprises. Fwiw, someone wrote a package for ST to do something similar: https://packagecontrol.io/packages/ColorSchemeUnit
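For reference, a Sublime Text syntax test file is just an ordinary source file whose first line names the syntax under test, with comment lines asserting which scopes apply at which columns. A minimal sketch for Rust (the scope names here are illustrative, not taken verbatim from the actual Rust syntax definition):

```
// SYNTAX TEST "Packages/Rust/Rust.sublime-syntax"

fn main() {}
// <- storage.type.function.rust
// ^^^^ entity.name.function.rust
```

The `<-` assertion tests the scope of the character in the comment's first column on the line above; each `^` tests the scope of the character directly above it.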
I see, so the intention is to have this test the actual colors, as opposed to catching whether syntect "broke", or whether the features used in the syntax are not supported by syntect.
@wbond and @keith-hall: Thank you for your feedback! And for your work on Sublime Text!

The syntax tests in Packages check the assigned scopes; this, in contrast, is intended to be a very high-level integration test that makes sure that the output of `bat` does not change. We currently support around 150 different languages in `bat`.
Absolutely. That is the big disadvantage. If there are actual intended changes to the syntax highlighting, we will have to manually review the changes (compare before and after) and overwrite the stored versions.
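The manual review-and-overwrite step could look something like the following sketch (the file names, snapshot layout, and the fixed `--theme` choice are assumptions for illustration, not an existing setup):

```shell
#!/usr/bin/env bash
# Review an intended highlighting change before accepting it.
# Hypothetical layout: example.rs next to its stored example.rs.highlighted
# snapshot; a fixed theme keeps snapshots stable across theme changes.
set -eu

review_snapshot() {
    src="$1"
    snapshot="$2"
    fresh="$(mktemp)"
    # Regenerate the highlighted output with the current bat version.
    bat --color=always --plain --theme="Monokai Extended" "$src" > "$fresh"
    if diff -u "$snapshot" "$fresh"; then
        echo "unchanged"
    else
        # After manually reviewing the diff printed above,
        # accept the new version by overwriting the stored snapshot.
        cp "$fresh" "$snapshot"
        echo "snapshot updated"
    fi
    rm -f "$fresh"
}
```

`diff -u` shows the before/after comparison; only if the change is an intended improvement would the snapshot be overwritten.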
Ideally, I would like this test to be theme-agnostic. Testing with a hardcoded default theme is certainly not ideal, but that is something that could be improved in the future.

To summarize: for integration tests, I think there is a certain value in keeping the tests as simple as possible (here: actually run `bat` on an example file and compare its output against the stored version).
Mostly I just wanted to see if the tests coming with ST's Packages were run, since I think they would have detected all of the issues referenced. Certainly they won't help for the third-party syntaxes, but I have a feeling those won't be bulk imported like Packages, and are less likely to be using the new syntax engine features.

It sounds like your method of testing should work out well considering your constraints. Wrangling syntax definitions is definitely not a simple or quick task. 🙂
Unfortunately, no. They are executed inside the `syntect` CI, not as part of `bat`'s test suite.
Yeah, unfortunately I have a feeling that if there were failing tests before, their number will probably just keep growing, as we are starting to more widely use some significant new features in the engine: branch points, inheritance, and "version 2" scoping rules.
This has now been implemented. See #1213 for a follow-up with the list of syntaxes. |
I would like to set up a test suite that makes sure we don't run into regressions like #963, #914 or #1099.

The way this could work would be to collect a large set of source files for all the different languages that `bat` can handle (we could start with the ones that are inside "Packages" and require this for new syntaxes in the future). We would then generate the syntax-highlighted versions of these source files via `bat --color=always --plain example.rs > example.rs.highlighted` and store the `*.highlighted` versions in the repository as well. Finally, we would set up a test (similar to the "snapshot tests") that would compare the output of the current version of `bat` with these stored versions.

We could use a specialized fixed highlighting theme for these tests, so that theme changes would not require any updates of these files.

We certainly wouldn't catch all bugs with this method, as it relies on the content of the example files (more content is better, I guess). But we certainly wouldn't be surprised by things like #963, where the syntax highlighting for Markdown was essentially completely broken.

This collection of example files would certainly also be valuable for other sorts of tests. The biggest effort would be to collect these files. A possible source could be http://rosettacode.org/wiki/Rosetta_Code, but licensing would need to be checked.
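The generate-and-compare workflow described above could be sketched as a small script. The `tests/syntax-tests/` directory layout and the fixed `Monokai Extended` theme are assumptions for illustration, not an existing setup:

```shell
#!/usr/bin/env bash
# Sketch of the proposed snapshot workflow: regenerate stored
# *.highlighted files, and compare current bat output against them.
set -eu

SOURCES="tests/syntax-tests/source"        # hypothetical example files
SNAPSHOTS="tests/syntax-tests/highlighted" # hypothetical stored snapshots
THEME="Monokai Extended"                   # fixed theme keeps snapshots stable

generate() {
    # (Re)generate the stored *.highlighted snapshots from the sources.
    for src in "$SOURCES"/*; do
        bat --color=always --plain --theme="$THEME" "$src" \
            > "$SNAPSHOTS/$(basename "$src").highlighted"
    done
}

compare() {
    # Fail if the current bat output differs from any stored snapshot.
    for snap in "$SNAPSHOTS"/*.highlighted; do
        src="$SOURCES/$(basename "${snap%.highlighted}")"
        fresh="$(mktemp)"
        bat --color=always --plain --theme="$THEME" "$src" > "$fresh"
        if ! diff -u "$snap" "$fresh"; then
            echo "highlighting regression in $src" >&2
            rm -f "$fresh"
            return 1
        fi
        rm -f "$fresh"
    done
    echo "all snapshots match"
}
```

`compare` would run in CI; `generate` would only be run (followed by a manual review of the diff) when a highlighting change is intentional.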