Conda Installation #1041
Conversation
Codecov Report
@@ Coverage Diff @@
## main #1041 +/- ##
=======================================
Coverage 99.91% 99.91%
=======================================
Files 183 183
Lines 10147 10147
=======================================
Hits 10138 10138
Misses 9 9
Continue to review full report at Codecov.
@@ -15,7 +15,104 @@ EvalML uses [semantic versioning](https://semver.org/). Every release has a major
If you'd like to create a development release, which won't be deployed to pypi or conda or marked as a generally-available production release, please add a "dev" prefix to the patch version, i.e. `X.X.devX`. Note this claims the patch number--if the previous release was `0.12.0`, a subsequent dev release would be `0.12.dev1`, and the following release would be `0.12.2`, *not* `0.12.1`. Development releases deploy to [test.pypi.org](https://test.pypi.org/project/evalml/) instead of to [pypi.org](https://pypi.org/project/evalml).
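To illustrate the versioning scheme above, here is a tiny helper (hypothetical, not part of the repo) that recognizes dev-style version strings of the `X.X.devX` form:

```python
import re

def is_dev_release(version: str) -> bool:
    """Return True for dev-style versions such as '0.12.dev1',
    which deploy to test.pypi.org rather than pypi.org."""
    # The dev segment claims the patch position: major.minor.devN
    return re.fullmatch(r"\d+\.\d+\.dev\d+", version) is not None
```

Note that a dev release drops the normal patch number entirely, which is why `0.12.dev1` is followed by `0.12.2` rather than `0.12.1`.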
## 1. Create release PR to update version and release notes
## 1. Test conda version before releasing on PyPI
This workflow was taken from the featuretools repo. They mention that there have been a few cases where conda could not build their release branch, so they do a draft release before the real release.
I see three possible options:
- Follow the featuretools steps listed in this PR
- Follow the featuretools steps but, rather than doing a draft release, manually upload to test PyPI. This may be desirable to avoid clogging up our release history with a lot of draft releases.
- Write a CI job that tests that conda can build our packages. This may take some thinking, but I think it's possible given some conversations I had with Roy.
In the short term, I think we should do 1 or 2 while we file an issue for 3 and work on it.
Sounds good. I'm fine with option 1 for now. Do we have something filed for option 3 yet?
Nice and thorough, this is great! The docs header looks incorrect, but otherwise everything else LGTM 👍
@@ -4,9 +4,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Install\n",
Thanks for catching that! I'll change it back to "Install" and follow your suggestion below to add a short intro saying that evalml can be installed with either pip or conda.
docs/source/install.ipynb
Outdated
@@ -30,6 +30,25 @@
"```"
]
},
{
I think it'd be nice to have a short intro on this page that says it can be installed with pip or conda, and then have these two subsections. Just from a UI/UX standpoint, I can't see the install-with-conda section when I land on the page, and unless I scroll down I don't even know there's a conda option!
@angela97lin what do you think of this? It still doesn't all show up on one screen but at least conda is mentioned in the first line.
@freddyaboulton I'm a fan of how this screenshot looks, although "Conda with all dependencies" shows up on the "On this page" sidebar twice
@eccabay Thanks for pointing that out. It was a typo!
Yah, I like this better!
docs/source/release_notes.rst
Outdated
@@ -6,6 +6,7 @@ Release Notes
* Split `fill_value` into `categorical_fill_value` and `numeric_fill_value` for Imputer :pr:`1019`
* Added `explain_predictions` and `explain_predictions_best_worst` for explaining multiple predictions with SHAP :pr:`1016`
* Added new LSA component for text featurization :pr:`1022`
* Added support for installing with conda :pr:`1041`
🥳
@freddyaboulton can you change this to "added guide on installing with conda"? Strictly speaking, this PR doesn't add any code changes, and the work to support conda was done elsewhere
1. Make a new development release branch on evalml (in this example we'll be testing the 0.12.2 release)
```bash
git checkout -b release_v0.12.2.dev
```
"to v0.12.2.dev0 and push branch to repo." dangling here :o
release.md
Outdated
3. Publish a new release of evalml on Github.
   1. Go to the [releases page](https://github.com/FeatureLabs/evalml/releases) on Github
   2. Click "Draft a new release"
   3. For the target, choose the new branch (v0.12.2.dev)
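The draft-release clicks above could equivalently be scripted with the GitHub CLI (a sketch, assuming `gh` is installed and authenticated; the tag, branch, and description are this example's, not fixed project conventions):

```shell
# Create a pre-release pointed at the dev branch (names are hypothetical)
gh release create v0.12.2.dev0 \
  --target release_v0.12.2.dev \
  --title "v0.12.2.dev0" \
  --prerelease \
  --notes "Development release for testing purposes"
```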
I notice both `v0.12.2.dev` and `v0.12.2.dev0` being used here. Is this intentional? :o
`v0.12.2.dev` is the branch name and `v0.12.2.dev0` is the name of the release, but I'll standardize to make it less confusing!
@@ -6,18 +6,18 @@
"source": [
"# Install\n",
"\n",
"EvalML is available for Python 3.6+. It can be installed by running the following command:\n",
"EvalML is available for Python 3.6+. It can be installed with pip or conda.\n",
You're reminding me that once we open the github, we should add ", or directly from source." But this is fine for now.
@angela97lin @dsherry I've addressed your feedback! See the latest doc changes here.
release.md
Outdated
6. For the release description, write "Development release for testing purposes"
7. Check the "This is a pre-release" box
8. Publish the release
4. The new release will be uploaded to TestPyPI automatically
Perhaps for clarity you could link here: https://test.pypi.org/project/evalml/
release.md
Outdated
@@ -15,7 +15,104 @@ EvalML uses [semantic versioning](https://semver.org/). Every release has a major
If you'd like to create a development release, which won't be deployed to pypi or conda or marked as a generally-available production release, please add a "dev" prefix to the patch version, i.e. `X.X.devX`. Note this claims the patch number--if the previous release was `0.12.0`, a subsequent dev release would be `0.12.dev1`, and the following release would be `0.12.2`, *not* `0.12.1`. Development releases deploy to [test.pypi.org](https://test.pypi.org/project/evalml/) instead of to [pypi.org](https://pypi.org/project/evalml).
## 1. Create release PR to update version and release notes
## 1. Test conda version before releasing on PyPI
Conda releases of evalml rely on PyPI's hosted evalml packages. Once a version is uploaded to PyPI we cannot update it, so it is important that the version we upload to PyPI will work for conda. We can test if an evalml release will run on conda by uploading a test release to PyPI's test server and building a conda version of evalml using the test release.
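Concretely, the test release can be pulled from PyPI's test server like so (a sketch, using this example's dev version number; `--extra-index-url` lets dependencies resolve from the real PyPI since most of them are not mirrored on the test server):

```shell
# Install the dev release from TestPyPI; dependencies come from regular PyPI
pip install --index-url https://test.pypi.org/simple/ \
    --extra-index-url https://pypi.org/simple/ \
    evalml==0.12.2.dev0
```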
@freddyaboulton could you add a couple sentences here explaining how the conda packages are set up? With a link to the conda-forge repo and perhaps also a reference to the meta.yaml recipe?
Good point!
release.md
Outdated
```
4. Push updated branch to the forked feedstock repo
3. Make a PR on conda-forge/evalml-core-feedstock from the forked repo and let CI tests run - add "[DO NOT MERGE]" to the PR name to indicate this PR should not be merged
4. After the tests pass, close the PR without merging
Wow, this is a lengthy process... It's going to be super helpful to add automation for this. @rwedge you do this on every featuretools release right now?
sha256: <fill-this-in>
```
* Update if dependencies have changed.
* Update the run requirements section for evalml-core if core dependencies have changed.
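The `sha256:` field above is the checksum of the sdist tarball uploaded to (Test)PyPI. One generic way to compute it locally after downloading the tarball (a sketch, not project tooling):

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Hex sha256 digest of a file, e.g. a downloaded sdist tarball,
    suitable for a conda feedstock's meta.yaml `sha256:` field."""
    h = hashlib.sha256()
    # Read in chunks so large tarballs don't need to fit in memory
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()
```

PyPI also displays per-file sha256 hashes on each release's "Download files" page, which can be used to cross-check the value.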
@freddyaboulton this is kinda a nit-pick, but can you add "Example:" at the end of this line, to make it clear the code snippet below is an example? Same for the test deps example
Good idea!
release.md
Outdated
conda-smithy rerender --commit auto
```
4. Push updated branch to the forked feedstock repo
3. Make a PR on conda-forge/evalml-core-feedstock from the forked repo and let CI tests run - add "[DO NOT MERGE]" to the PR name to indicate this PR should not be merged
I think the numbering is off here, should be 5 and 6, not 3 and 4
release.md
Outdated
conda-smithy rerender --commit auto
```
4. Push updated branch to the forked feedstock repo
3. Make a PR on conda-forge/evalml-core-feedstock from the forked repo and let CI tests run - add "[DO NOT MERGE]" to the PR name to indicate this PR should not be merged
@freddyaboulton could you add something to highlight that this is the point of the whole process, that the tests on the PR pass?
Good catch! How about the following:
Congratulations! If the tests on the PR pass, then we can build a conda package from our latest release, and we can now upload our package to PyPI instead of TestPyPI. The next steps will cover how to do this.
@freddyaboulton the install updates look great. Your release process update is quite thorough! Thank you for that.
There's a lot going into that conda testing... it definitely crosses my threshold for wanting to automate it, to prevent confusion and human error if nothing else. We can do it for Aug but let's make sure something is filed to track adding an automation job to handle this for us. We should discuss further with @rwedge as I'm sure featuretools would benefit from that work too.
I left a few minor suggestions on the release notes.
LGTM! :D Thanks for addressing everything, it looks great!
Pull Request Description
Fixes #965 by letting users install `evalml-core` and `evalml` packages with conda.

After creating the pull request: in order to pass the release_notes_updated check you will need to update the "Future Release" section of `docs/source/release_notes.rst` to include this pull request by adding `:pr:123`.