add CI test with full realistic workflow on sample data #112
Comments
Some questions:
As outlined above: for every new commit on a PR branch (via CircleCI rule …).
Good question. I have a feeling they are almost always the same, but you never know. So let's do the "harder" test, running all the above within a …
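For illustration, a minimal sketch of what running the whole test inside a container could look like, assuming the published `ocrd/all:maximum` image and a hypothetical `test-workflow.sh` that holds the actual test steps:

```sh
# Run the complete workflow test inside the maximum Docker image
# (image tag, mount point and script name are assumptions for illustration).
docker run --rm -v "$PWD/data:/data" ocrd/all:maximum \
    bash -c "cd /data/kant_aufklaerung_1784 && /data/test-workflow.sh"
```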
I am still in favour of CircleCI, as everything has been set up there already, and one could easily add more platforms in the future.
Personally, I don't have access to the CircleCI results. I tried several times to set up an account, but always decided against it because I did not want to grant them the required permissions.
Since this is designed as a system test for the OCR-D container, the workflow should focus on module integration, especially on known incompatibilities, and, to provide some sort of regression testing, on previously solved problems.
We could probably do more in this regard, but #362 contains a first-order approximation.
Most submodules are CI-tested individually. So as long as we don't negligently integrate failed versions when making a new release here, we can be more confident with each update.
However, there are still two classes of errors which will evade this scheme:
For the latter, we could reduce our risk by introducing workflow tests that run many different modules on a small set of sample data. Since the CI job for PRs is already set to use `make docker-maximum`, one could run any processor(s) after that. And we already have enough data in `core/repo/assets/` and workflows in `workflow-configuration/`. So the test would run such a workflow on (say) `data/kant_aufklaerung_1784`, check that it did not crash, check that the target file group exists for all pages, and perhaps validate the workspace.

(We don't include any models in the standard distribution, though. So one would either have to use a very simple workflow, or add an extra step to install segmentation and recognition models.)
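A rough sketch of what such a test could look like as a shell script (the processor, file-group name and exact checks are only placeholders, not a final design; the `ocrd workspace` subcommands are from the core CLI):

```sh
#!/usr/bin/env bash
# Sketch of the proposed workflow test; names are illustrative only.
set -e

WORKSPACE=data/kant_aufklaerung_1784   # sample workspace from core/repo/assets/
OUT_GRP=OCR-D-BIN                      # file group the workflow is expected to produce

cd "$WORKSPACE"

# Run a very simple workflow that needs no models (binarization only);
# a real test would rather call a workflow from workflow-configuration/.
ocrd-skimage-binarize -I OCR-D-IMG -O "$OUT_GRP"

# 1. the workflow must not have crashed (covered by `set -e`)
# 2. the target file group must contain a result for every page
pages=$(ocrd workspace list-page | wc -l)
results=$(ocrd workspace find --file-grp "$OUT_GRP" | wc -l)
test "$pages" -eq "$results"

# 3. optionally, validate the whole workspace
ocrd workspace validate
```

With an extra step to install models, the same pattern would extend to segmentation and recognition processors.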