
Heisenbug: tests sometimes fail audio buffer validation #951

Closed
bmcfee opened this issue Aug 14, 2019 · 8 comments · Fixed by #1179
Labels
bug Something doesn't work like it should testing Issues with our test design and continuous integration services
Comments

bmcfee (Member) commented Aug 14, 2019

Description

I've noticed that some of our CI tests seem to randomly fail with audio buffer validation errors, only to pass correctly when restarted. I don't have any explanation for this, and I doubt it's due to anything in librosa.

It seems to have started after the switch to soundfile as the primary decoder, and it may be indicative of a bug deep in the guts of libsndfile, if not in the python bindings. Any help diagnosing this would be much appreciated!

bmcfee added the bug, testing, and Upstream/dependency bug labels on Aug 14, 2019
lostanlen (Contributor) commented:
Is there a brute-force procedure to trigger the Heisenbug with probability (1 − ε)?

bmcfee (Member) commented Aug 15, 2019

> Is there a brute-force procedure to trigger the Heisenbug with probability (1 − ε)?

I haven't found any reproducible trigger for it, no. I can't get it to happen on my local installation either.
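Absent a known trigger, a common brute-force tactic is simply to re-run the suspect test in a loop until it trips. This is a minimal sketch (not from the thread); the test command shown in the comment is hypothetical:

```python
import subprocess
import sys

def run_until_failure(cmd, max_tries=1000):
    """Re-run a test command until it fails.

    Returns the 1-based attempt number of the first failure,
    or None if the command never fails within max_tries runs.
    """
    for attempt in range(1, max_tries + 1):
        result = subprocess.run(cmd, capture_output=True)
        if result.returncode != 0:
            return attempt
    return None

# Hypothetical usage (test path is illustrative):
# first_fail = run_until_failure(
#     [sys.executable, "-m", "pytest", "tests/test_audio.py"]
# )
```

For a bug caused by uninitialized memory, though, even thousands of repeats on a warm machine may never hit the bad bytes, which matches the "no reproducible trigger" experience here.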

lostanlen (Contributor) commented:
(7 weeks later)
Is this still happening?

bmcfee (Member) commented Oct 10, 2019

I haven't seen it lately, but I also haven't been running tests very often.

lostanlen (Contributor) commented:
It would be nice to keep an eye on this in the wake of #1064.

bmcfee (Member) commented Feb 23, 2020

> Would be nice to keep an eye on this in the wake of #1064

Indeed. FWIW, I haven't seen it happen lately, and certainly not since rewriting the test fixtures. I think it's probably clear, but let's hold off on closing this out until we're ready to push 0.8 (i.e., after having merged a bunch of PRs and run lots of tests).

bmcfee (Member) commented May 23, 2020

This came back at us in #1120 -- a reboot of the build image fixed it. So it's definitely still active.

bmcfee (Member) commented Jun 23, 2020

Aha! I think I've figured this one out.

Some of our safety checks (e.g., length checks or error cases) that use dummy input audio (because the content doesn't matter) construct the buffer with np.empty. Since np.empty returns uninitialized memory, if we get unlucky the leftover bytes can parse as nan/inf and fail the valid_audio checks.

I'll create a PR to remove np.empty from all test cases shortly.
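A minimal sketch of the failure mode. Here valid_audio_check is a hypothetical stand-in for the finiteness test in librosa.util.valid_audio, not the real implementation:

```python
import numpy as np

def valid_audio_check(y):
    # Stand-in for the finiteness part of librosa.util.valid_audio:
    # rejects buffers containing nan or +/-inf.
    return bool(np.all(np.isfinite(y)))

# Risky: np.empty returns uninitialized memory, so the contents are
# whatever bytes happened to be in the allocation -- occasionally
# those bytes decode as nan or inf.
y_unsafe = np.empty(22050, dtype=np.float32)

# Deterministic replacement for dummy test audio:
y_zeros = np.zeros(22050, dtype=np.float32)

assert valid_audio_check(y_zeros)
# valid_audio_check(y_unsafe) usually passes, but can fail at random --
# which is exactly the intermittent CI failure described above.
```

Because the failure depends entirely on what the allocator hands back, it appears and disappears with the state of the build machine, which also explains why rebooting the build image in #1120 made it go away.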
