
Calm down fuzzing #296

Merged · 1 commit merged into master on Nov 12, 2019
Conversation

UniversalSuperBox (Contributor)

The fuzzing configuration introduced by #191 did find a novel issue in its first run. However, it is unable to run on Travis CI because it takes too long without producing output. I tried running it locally to make sure it wasn't Travis being Travis, and sure enough, on my i9-9880H the operation took well over 20 minutes before I stopped it.

Because of this, I've reduced the number of examples that Hypothesis will run. This will catch fewer errors but help us get PRs back on track.

Ping @untitaker, who introduced this change originally
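A minimal sketch of the kind of change being described, assuming a standalone Hypothesis property test; the test body, strategy, and the cap of 100 are illustrative and not the actual icalendar fuzz test or the value chosen in this PR:

```python
# Illustrative only: caps Hypothesis at a smaller number of generated
# examples so the fuzz test finishes before Travis's no-output timeout.
from hypothesis import given, settings
from hypothesis import strategies as st


@settings(max_examples=100)  # smaller cap; catches fewer errors but runs faster
@given(st.text())
def test_fuzz_roundtrip(s):
    # Stand-in property for the real icalendar fuzz test.
    assert s.encode("utf-8").decode("utf-8") == s
```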

@mister-roboto

@UniversalSuperBox, thanks for creating this pull request and helping improve Plone!

To ensure that these changes do not break other parts of Plone, the Plone test suite matrix needs to pass.

Whenever you feel that the pull request is ready to be tested, either start all Jenkins pull request jobs yourself, or simply add a comment in this pull request stating:

@jenkins-plone-org please run jobs

With this simple comment all the jobs will be started automatically.

Happy hacking!

@untitaker (Contributor) left a comment


@UniversalSuperBox you might want to look into the deadline parameter instead of max_examples, but either way this seems fine.

@UniversalSuperBox (Contributor, Author)

It appears that deadline is for individual examples rather than the entire test case.
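For reference, a hedged sketch of the deadline option being discussed (the test name and the 500 ms value are illustrative): deadline bounds how long each individual generated example may take, which is why it would not cap the runtime of the whole test.

```python
from datetime import timedelta

from hypothesis import given, settings
from hypothesis import strategies as st


@settings(deadline=timedelta(milliseconds=500))  # per-example budget, not a total
@given(st.text())
def test_with_per_example_deadline(s):
    # Each generated input must be handled within the deadline above.
    assert isinstance(s, str)
```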

@UniversalSuperBox UniversalSuperBox merged commit 165b4e4 into master Nov 12, 2019
@UniversalSuperBox UniversalSuperBox deleted the less-fuzzing branch November 12, 2019 18:35
@wbob commented Nov 17, 2019

On a tangent: Hypothesis supports settings profiles, as @untitaker uses in vdirsyncer. Travis can then be instructed to use a smaller number of examples, while by default, outside CI, more examples are generated.
When I stumbled on this issue, I even went down to max_examples=10**3 and set suppress_health_check to avert a failing test[1] caused by slow data generation (see HypothesisWorks/hypothesis#2009 for a similar case).

[1]: src/icalendar/tests/hypothesis/test_fuzzing.py:24: FailedHealthCheck: Data generation is extremely slow: Only produced 7 valid examples in 1.25 seconds (0 invalid ones and 0 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters).
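A sketch of the profile approach described above, assuming a conftest.py and an environment variable named HYPOTHESIS_PROFILE; the profile names and example counts are illustrative, apart from the 10**3 mentioned in the comment:

```python
# conftest.py (sketch): register a lighter profile for CI and a heavier
# default, then pick one via an environment variable.
import os

from hypothesis import HealthCheck, settings

settings.register_profile(
    "ci",
    max_examples=10**3,  # the reduced count mentioned above
    suppress_health_check=[HealthCheck.too_slow],  # avert the slow-generation failure
)
settings.register_profile("dev", max_examples=10**4)  # illustrative local default

# e.g. export HYPOTHESIS_PROFILE=ci in the Travis configuration
settings.load_profile(os.getenv("HYPOTHESIS_PROFILE", "dev"))
```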
