
Conda Installation #1041

Merged: 8 commits, merged into main on Aug 12, 2020

Conversation

freddyaboulton (Contributor)

Pull Request Description

Fixes #965 by letting users install evalml-core and evalml packages with conda.


After creating the pull request: in order to pass the release_notes_updated check you will need to update the "Future Release" section of docs/source/release_notes.rst to include this pull request by adding :pr:`123`.
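For reference, a minimal sketch of the installs this PR documents (the conda-forge channel is inferred from the evalml-core-feedstock discussed below):

```bash
# Install evalml with all dependencies, or evalml-core with only the
# core dependencies, from the conda-forge channel.
conda install -c conda-forge evalml
conda install -c conda-forge evalml-core
```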

@freddyaboulton changed the title from "Adding section on how to install with conda and updating release notes." to "Conda Installation" on Aug 11, 2020
codecov bot commented Aug 11, 2020

Codecov Report

Merging #1041 into main will not change coverage.
The diff coverage is n/a.


@@           Coverage Diff           @@
##             main    #1041   +/-   ##
=======================================
  Coverage   99.91%   99.91%           
=======================================
  Files         183      183           
  Lines       10147    10147           
=======================================
  Hits        10138    10138           
  Misses          9        9           

Last update b27b18b...9611a8e.

@@ -15,7 +15,104 @@ EvalML uses [semantic versioning](https://semver.org/). Every release has a majo

If you'd like to create a development release, which won't be deployed to pypi and conda and marked as a generally-available production release, please add a "dev" prefix to the patch version, i.e. `X.X.devX`. Note this claims the patch number--if the previous release was `0.12.0`, a subsequent dev release would be `0.12.dev1`, and the following release would be `0.12.2`, *not* `0.12.1`. Development releases deploy to [test.pypi.org](https://test.pypi.org/project/evalml/) instead of to [pypi.org](https://pypi.org/project/evalml).

## 1. Create release PR to update version and release notes
## 1. Test conda version before releasing on PyPI
freddyaboulton (Contributor, Author):

This workflow was taken from the featuretools repo. They mention that there have been a few cases where conda could not build their release branch, so they do a draft release before the real release.

I see three possible options:

  1. Follow the featuretools steps listed in this PR
  2. Follow the featuretools steps, but rather than doing a draft release, manually upload to test PyPI. This may be desirable so we don't clog up our release history with a lot of draft releases.
  3. Write a CI job that tests that conda can build our packages. This may take some thinking, but I think it's possible given some conversations I had with Roy.

In the short term, I think we should do 1 or 2 while we file an issue for 3 and work on it.

Contributor:

Sounds good. I'm fine with option 1 for now. Do we have something filed for option 3 yet?
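For context on option 2 above, a manual TestPyPI smoke test might look like this (version number hypothetical, following the dev-release scheme quoted earlier):

```bash
# Hypothetical check of a dev release uploaded to TestPyPI.
# --extra-index-url lets pip resolve evalml's dependencies from the
# regular PyPI index, since TestPyPI rarely hosts all of them.
pip install --index-url https://test.pypi.org/simple/ \
    --extra-index-url https://pypi.org/simple/ evalml==0.12.dev1
```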

@freddyaboulton freddyaboulton marked this pull request as ready for review August 11, 2020 20:42
@freddyaboulton freddyaboulton requested review from dsherry and jeremyliweishih and removed request for dsherry August 11, 2020 20:42
angela97lin (Contributor) left a comment:

Nice and thorough, this is great! The docs header looks incorrect, but otherwise everything LGTM 👍

@@ -4,9 +4,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Install\n",
Contributor:

Not sure if this is why but the headers have been updated to "Install with pip":

[screenshot: docs page header reading "Install with pip"]

freddyaboulton (Contributor, Author):

Thanks for catching that! I'll change it back to "Install" and follow your suggestion below to add a short intro saying that evalml can be installed with either pip or conda.

@@ -30,6 +30,25 @@
"```"
]
},
{
Contributor:

I think it'd be nice to have a short intro on this page saying it can be installed with pip or conda, and then have these two subsections. Just from a UI/UX standpoint, I can't see the conda install when I land on the page; unless I scroll down, I don't even know there's a conda option!

freddyaboulton (Contributor, Author):

@angela97lin what do you think of this? It still doesn't all show up on one screen but at least conda is mentioned in the first line.

[screenshot: updated Install page mentioning conda in the first line]

eccabay (Contributor):

@freddyaboulton I'm a fan of how this screenshot looks, although "Conda with all dependencies" shows up on the "On this page" sidebar twice

freddyaboulton (Contributor, Author):

@eccabay Thanks for pointing that out. It was a typo!

Contributor:

Yah, I like this better!

@@ -6,6 +6,7 @@ Release Notes
* Split `fill_value` into `categorical_fill_value` and `numeric_fill_value` for Imputer :pr:`1019`
* Added `explain_predictions` and `explain_predictions_best_worst` for explaining multiple predictions with SHAP :pr:`1016`
* Added new LSA component for text featurization :pr:`1022`
* Added support for installing with conda :pr:`1041`
Contributor:

🥳

Contributor:

@freddyaboulton can you change this to "added guide on installing with conda"? Strictly speaking, this PR doesn't add any code changes, and the work to support conda was done elsewhere

1. Make a new development release branch on evalml (in this example we'll be testing the 0.12.2 release)
```bash
git checkout -b release_v0.12.2.dev
```
Contributor:

"to v0.12.2.dev0 and push branch to repo." dangling here :o

release.md Outdated
3. Publish a new release of evalml on Github.
1. Go to the [releases page](https://github.com/FeatureLabs/evalml/releases) on Github
2. Click "Draft a new release"
3. For the target, choose the new branch (v0.12.2.dev)
Contributor:

I notice both v0.12.2.dev and v0.12.2.dev0 being used here. Is this intentional? :o

freddyaboulton (Contributor, Author):

v0.12.2.dev is the branch name and v0.12.2.dev0 is the name of the release, but I'll standardize to make it less confusing!
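As an aside, a hypothetical equivalent of those draft-release steps using the GitHub CLI (`gh`), assuming it is installed and authenticated; the web UI steps are what the document actually prescribes:

```bash
# Create and publish a pre-release targeting the dev branch.
gh release create v0.12.2.dev0 \
    --target release_v0.12.2.dev \
    --title "v0.12.2.dev0" \
    --notes "Development release for testing purposes" \
    --prerelease
```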

@@ -6,18 +6,18 @@
"source": [
"# Install\n",
"\n",
"EvalML is available for Python 3.6+. It can be installed by running the following command:\n",
"EvalML is available for Python 3.6+. It can be installed with pip or conda.\n",
Contributor:

You're reminding me that once we open up the GitHub repo, we should add ", or directly from source." But this is fine for now.

freddyaboulton (Contributor, Author):

@angela97lin @dsherry I've addressed your feedback! See the latest doc changes here.

release.md Outdated
6. For the release description, write "Development release for testing purposes"
7. Check the "This is a pre-release" box
8. Publish the release
4. The new release will be uploaded to TestPyPI automatically
Contributor:

Perhaps for clarity you could link here: https://test.pypi.org/project/evalml/

release.md Outdated
@@ -15,7 +15,104 @@ EvalML uses [semantic versioning](https://semver.org/). Every release has a majo

## 1. Create release PR to update version and release notes
## 1. Test conda version before releasing on PyPI
Conda releases of evalml rely on PyPI's hosted evalml packages. Once a version is uploaded to PyPI we cannot update it, so it is important that the version we upload to PyPI will work for conda. We can test if an evalml release will run on conda by uploading a test release to PyPI's test server and building a conda version of evalml using the test release.
Contributor:

@freddyaboulton could you add a couple sentences here explaining how the conda packages are set up? With a link to the conda-forge repo and perhaps also a reference to the meta.yaml recipe?

freddyaboulton (Contributor, Author):

Good point!
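For orientation, a rough sketch of that TestPyPI-based conda test (fork URL and paths are assumptions; `<username>` is a placeholder):

```bash
# Clone a fork of the feedstock; requires conda-build installed.
git clone https://github.com/<username>/evalml-core-feedstock
cd evalml-core-feedstock
# Edit recipe/meta.yaml so source.url points at the TestPyPI sdist
# (and update the sha256 to match), then build against conda-forge.
conda build recipe/ -c conda-forge
```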

release.md Outdated
```
4. Push updated branch to the forked feedstock repo
3. Make a PR on conda-forge/evalml-core-feedstock from the forked repo and let CI tests run - add "[DO NOT MERGE]" to the PR name to indicate this PR should not be merged in
4. After the tests pass, close the PR without merging
Contributor:

Wow, this is a lengthy process... It's going to be super helpful to add automation for this. @rwedge you do this on every featuretools release right now?

sha256: <fill-this-in>
```
* Update if dependencies have changed.
* Update the run requirements section for evalml-core if core dependencies have changed.
Contributor:

@freddyaboulton this is kind of a nit-pick, but can you add "Example:" at the end of this line to make it clear the code snippet below is an example? Same for the test deps example.

freddyaboulton (Contributor, Author):

Good idea!
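In the spirit of that suggestion, a purely illustrative example of a run requirements section (package names and pins are hypothetical, not the actual recipe):

```yaml
# Illustrative only -- not the actual evalml-core-feedstock recipe.
requirements:
  run:
    - python >=3.6
    - numpy >=1.16.4
    - pandas >=0.25.0
    - scikit-learn >=0.21.3
```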

release.md Outdated
conda-smithy rerender --commit auto
```
4. Push updated branch to the forked feedstock repo
3. Make a PR on conda-forge/evalml-core-feedstock from the forked repo and let CI tests run - add "[DO NOT MERGE]" to the PR name to indicate this PR should not be merged in
Contributor:

I think the numbering is off here, should be 5 and 6, not 3 and 4

release.md Outdated
conda-smithy rerender --commit auto
```
4. Push updated branch to the forked feedstock repo
3. Make a PR on conda-forge/evalml-core-feedstock from the forked repo and let CI tests run - add "[DO NOT MERGE]" to the PR name to indicate this PR should not be merged in
Contributor:

@freddyaboulton could you add something to highlight that this is the point of the whole process, that the tests on the PR pass?

freddyaboulton (Contributor, Author):

Good catch! How about the following:

Congratulations! If the tests on the PR pass, then we can build a conda package from our latest release, and we can now upload our package to PyPI instead of TestPyPI. The next steps will cover how to do this.

dsherry (Contributor) left a comment:

@freddyaboulton the install updates look great. Your release process update is quite thorough! Thank you for that.

There's a lot going into that conda testing... it definitely crosses my threshold for wanting to automate it, to prevent confusion and human error if nothing else. We can do it for Aug but let's make sure something is filed to track adding an automation job to handle this for us. We should discuss further with @rwedge as I'm sure featuretools would benefit from that work too.

I left a few minor suggestions on the release notes.

freddyaboulton (Contributor, Author):

Thanks for the feedback @dsherry ! I agree that we should automate our conda testing as much as possible. I created #1046 to track that.

@angela97lin angela97lin self-requested a review August 12, 2020 19:00
angela97lin (Contributor) left a comment:

LGTM! :D Thanks for addressing everything, it looks great!

@freddyaboulton freddyaboulton merged commit 4b8c71a into main Aug 12, 2020
@dsherry dsherry mentioned this pull request Aug 25, 2020
@freddyaboulton freddyaboulton deleted the 965-install-with-conda branch October 22, 2020 18:28
Successfully merging this pull request may close: Support installation through conda (#965)
4 participants