Resolve ImportError due to mutual imports #713
Conversation
…mpirical_models to avoid ImportError
…nt shape attribute of a python float
…INE only to avoid testing timeouts on Travis
…ing them on APH_MACHINE
…esting to avoid Travis CI build problems
…mposite_model.rst and tinker13_composite_model.rst documentation tutorials
After resolving the ImportError problem that was the original intent of this PR, I made some minor changes throughout the test suite in an attempt to resolve the Travis and AppVeyor build problems. All now appears to be good with AppVeyor, although Travis still has a few problems for certain configurations. See #710 for details.
Thanks for the tip, @bsipocz. I think that's a fine workaround since it accomplishes the needed testing and maintains the documentation. I just checked my current testing suite and verified that the exact syntax I commented out of the doctests is already covered in a unit-testing module, so in the end, commenting out these few doctest lines actually has no consequence (though I didn't know that at the time).

The most recently merged PR #713 actually passes all CI builds, so in principle Halotools is back on track. But I had to manually rebuild several different configurations in order to get 100% passes. In other words, for a given test, sometimes it passes and sometimes it fails, even with no changes to the source code. I expect this will continue to be the case, which will make it difficult to manage future development.

All this time, I have never had a local failure. That probably means these are CI-environment-specific failures and nothing to worry about. But all the same, this is uncomfortable since I'm on the verge of release; in fact, there are no other outstanding issues.

Probably the thing to do now is to notify my colleagues and collaborators that I would not mind delaying the release for the sake of ensuring code stability. So if either @eteq or @bsipocz have suggestions for other things to try before notifying the core user base and then releasing, please let me know.
@aphearin - Have you thought about doing a release candidate? That would give you an easier way of getting testing from a group of collaborators, installing from pip, while the wider user base remains unaffected.
@bsipocz - I have not done a release candidate before, but I like that idea. Is that release procedure any different from what I normally do? I'd rather not manage bug-fix branches if I can help it; that's just more work than I'd like to take on if possible.
It's more or less the same; the trick is in the version numbering: you want to make sure pip recognizes that it's a pre-release.
Ha! Obviously I've just passed over that section too many times to notice it anymore :-\ Thanks for the tip, this is probably how I'll handle the situation. |
That's in another doc, and since you're not planning on bugfix branching, it's unlikely you'd click on the link in the note at the bottom of the page :)
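To make the version-numbering trick mentioned above concrete: under PEP 440, suffixes such as `a1`, `b2`, `rc1`, or `.dev3` mark a version as a pre-release, and pip skips such versions unless `pip install --pre` is given. A rough sketch of that recognition (a hypothetical simplification, not pip's actual parser):

```python
import re

# PEP 440 pre-release segments: a (alpha), b (beta), rc (release candidate),
# plus .devN developmental releases.  pip treats any of these as a pre-release
# and will not install them unless `--pre` is passed.
PRE_RELEASE = re.compile(r"(a|b|rc)\d+|\.dev\d+", re.IGNORECASE)

def is_prerelease(version):
    """Rough check of whether a version string marks a pre-release."""
    return bool(PRE_RELEASE.search(version))

print(is_prerelease("0.5"))       # a final release
print(is_prerelease("0.5rc1"))    # a release candidate
print(is_prerelease("0.5.dev2"))  # a developmental release
```

So tagging the release candidate with an `rcN`-style version lets collaborators opt in via `pip install --pre halotools` while ordinary `pip install halotools` still resolves to the last final release.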
empirical_models and mock_observables were previously trying to import each other, causing an ImportError that was not caught by CI because Travis is currently failing for unrelated reasons.
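A minimal, self-contained sketch of this failure mode and the usual fix, using two hypothetical modules (`alpha` and `beta`, not the actual Halotools sub-packages): each does a top-level `from ... import ...` of the other, so whichever is imported first triggers the cycle while it is only partially initialized. Deferring one import into the function body that needs it breaks the cycle.

```python
import importlib
import os
import sys
import tempfile
import textwrap

sys.dont_write_bytecode = True  # keep the demo free of stale .pyc files
tmpdir = tempfile.mkdtemp()
sys.path.insert(0, tmpdir)

def write(name, source):
    with open(os.path.join(tmpdir, name), "w") as fh:
        fh.write(textwrap.dedent(source))
    importlib.invalidate_caches()

# Broken layout: each module imports a name from the other at import time.
write("alpha.py", """\
    from beta import g
    def f():
        return "f"
""")
write("beta.py", """\
    from alpha import f
    def g():
        return f()
""")

try:
    import alpha  # alpha -> beta -> alpha (partially initialized): boom
except ImportError as err:
    print("broken:", err)

# Fix: defer one of the two imports into the function that needs it,
# so neither module requires the other at import time.
write("beta.py", """\
    def g():
        from alpha import f  # deferred import breaks the cycle
        return f()
""")
for name in ("alpha", "beta"):
    sys.modules.pop(name, None)  # discard any partially initialized leftovers

import beta
print("fixed:", beta.g())
```

Another common remedy, when the shared names are few, is to move them into a third module that both packages can import without a cycle.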