fixed the assignment of the token count in several integration tests #280
I recently noticed that the multi_output_file_test occasionally generates "trigger inhibit" warnings when running the SWTPG part of its testing. While investigating this, I found that the token count requested in the integtest script was not being taken into account. The problem was that the location of the token_count parameter in the dataflow daqconf schema tree had changed, but the integtests were not correspondingly updated. I fixed that problem and tweaked a couple of the token_count calculations. With these changes, the multi_output_file_test integtest runs more reliably, without trigger inhibit warnings.

To test these changes, I ran each of the 5 modified integtests from the develop branch and then from the kbiery/integtest_token_count_fix branch, and compared the generated configurations under /tmp/pytest_of_/pytest_N/jsonX and /tmp/pytest_of_/pytest_N-1/jsonX to confirm that the values for the DFO busy and free parameters differed from the default values (which were being used before the changes).
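To illustrate the kind of mismatch described above, here is a minimal, hypothetical sketch of a parameter that silently stops being honored when its location in a nested configuration tree moves. The key names (`dataflow`, `token_count`) and the helper function are illustrative assumptions only; they are not the actual daqconf schema or the code changed in this PR.

```python
# Hypothetical sketch: a test helper writes token_count at its old,
# top-level location, but the config generator now reads it from a
# (assumed) "dataflow" subtree, so the requested value is ignored
# and a default is used instead.

DEFAULT_TOKEN_COUNT = 10  # illustrative default, not from daqconf


def stale_set_token_count(conf: dict, token_count: int) -> None:
    """What the out-of-date integtests effectively did (old location)."""
    conf["token_count"] = token_count


def fixed_set_token_count(conf: dict, token_count: int) -> None:
    """Write the value where the (assumed) updated schema expects it."""
    conf.setdefault("dataflow", {})["token_count"] = token_count


def effective_token_count(conf: dict) -> int:
    """The generator only consults the new location, falling back to a default."""
    return conf.get("dataflow", {}).get("token_count", DEFAULT_TOKEN_COUNT)


stale = {}
stale_set_token_count(stale, 3)
# The requested value 3 is ignored; the default wins.
assert effective_token_count(stale) == DEFAULT_TOKEN_COUNT

fixed = {}
fixed_set_token_count(fixed, 3)
# After the fix, the requested value is actually taken into account.
assert effective_token_count(fixed) == 3
```

Comparing generated configurations before and after the fix (as described above for the DFO busy and free parameters) is exactly the check that distinguishes these two cases: unchanged defaults indicate the requested value never reached the generator.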