Add guard against exponential time pathology #1082
Merged
Bad news: Hypothesis's shrinking is sometimes O(16^n).

Good news: Here's a fix for that!
Bad news: This problem is intrinsic to test-case reduction and there's no escape. All is lost. We are doomed to live in exponential land forever.
(This isn't really a problem in practice because `max_shrinks` saves us, but I thought I'd spend some time figuring out how to deal with the common case, and it turns out that the common case is easy to deal with.)

Anyway, we can detect and fix some of the common cases as follows:
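As a purely illustrative sketch of what such a guard can look like (this is not the code in this PR; `shrink`, `check`, and `max_calls` are hypothetical names, with `max_calls` playing roughly the same role in spirit as `max_shrinks`):

```python
# Toy greedy block-deletion shrinker with a hard cap on how many times the
# test predicate may be called.  Once the budget is spent, every further
# shrink attempt is rejected, so a pathological input can no longer force
# unbounded amounts of work.

def shrink(data: bytes, is_interesting, max_calls: int = 500) -> bytes:
    calls = 0

    def check(candidate: bytes) -> bool:
        nonlocal calls
        if calls >= max_calls:   # the guard: stop paying for further shrinks
            return False
        calls += 1
        return is_interesting(candidate)

    block = len(data) // 2 or 1
    while block >= 1:
        i = 0
        while i < len(data):
            candidate = data[:i] + data[i + block:]   # try deleting a block
            if candidate != data and check(candidate):
                data = candidate                       # keep the smaller input
            else:
                i += block
        block //= 2
    return data
```

For example, `shrink(b"\x01" * 64, lambda d: len(d) >= 10)` trims the input down towards ten bytes, while an adversarial `is_interesting` can never cost more than `max_calls` evaluations.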
Important note: The lexical minimizer now has quadratic complexity in the size of its input. This mostly doesn't matter because of the two cases where it gets used:
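To show where quadratic behaviour can come from in a lexical minimizer (an illustrative sketch only, not Hypothesis's actual minimizer; `lex_minimize` and `is_interesting` are hypothetical names):

```python
# A lexicographic byte minimizer that restarts its scan after every
# successful change.  There can be up to O(n) successful changes (each byte
# being zeroed at most once), and each one triggers a fresh scan of up to
# O(n) predicate calls, so the worst case is quadratic in the input size.

def lex_minimize(data: bytes, is_interesting) -> bytes:
    restart = True
    while restart:
        restart = False
        for i in range(len(data)):
            if data[i] == 0:
                continue
            # Try the lexicographically smallest value for this byte.
            candidate = data[:i] + b"\x00" + data[i + 1:]
            if is_interesting(candidate):
                data = candidate
                restart = True   # a change happened: rescan from the start
                break
    return data
```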
I do want to emphasise though that we are intrinsically doomed here: there is no non-exponential shrink algorithm that also produces good results. Something like the adversarially slow shrinker example we have in our test suite will, for all intents and purposes, always allow us to force exponential slowdowns.
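To give a flavour of what "adversarially slow" means here (a made-up illustration, not the example from the test suite; `adversarial_predicate` is a hypothetical name):

```python
import hashlib

# Acceptance depends on a hash of the whole input, so it is uncorrelated with
# size or lexicographic order.  Nothing about a rejected candidate tells the
# shrinker which direction is "smaller but still interesting", so any fixed
# sequence of shrink passes can be made to try huge numbers of candidates.

def adversarial_predicate(data: bytes) -> bool:
    digest = hashlib.sha256(data).digest()
    return len(data) >= 16 and digest[0] == 0
```

A cap like `max_shrinks` doesn't make such a predicate shrink well; it just stops us from spending unbounded time on it.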
So, given that, this PR is of somewhat questionable validity. My main arguments in favour of it are: