Cluster auto-discovery and management across environments #82

Open
benbovy opened this issue Aug 23, 2019 · 3 comments

Comments


benbovy commented Aug 23, 2019

As a follow-up on #18 and #31, it would be nice if the "search button" and the cluster management section on the side panel could work across multiple (conda) environments.

Those features work very well when I'm using a single environment where everything (jupyterlab, dask, extensions, etc.) is installed.

However, my configuration (possibly a common one?) consists of running JupyterLab from its own dedicated, lightweight conda environment and using nb_conda_kernels to run kernels installed in other environments (one per project). In this case the search button is unresponsive, and the cluster management section in the side panel only manages clusters in the JupyterLab environment. I can still manually copy dashboard addresses into the text field, though (and I'm happy doing this!).
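For context, the setup described above can be sketched as follows; the environment names here are purely illustrative, not from the original report:

```shell
# Lightweight env that only runs the Jupyter server and its extensions.
conda create -n jlab jupyterlab nb_conda_kernels dask-labextension

# One env per project; ipykernel makes it visible via nb_conda_kernels.
conda create -n project-a ipykernel dask distributed

# JupyterLab runs from the lightweight env, but the launcher also lists
# kernels from project-a thanks to nb_conda_kernels.
conda activate jlab
jupyter lab
```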

Unfortunately, I have no idea how much effort this would require to implement.


benbovy commented Sep 19, 2019

> I could still manually copy dashboard addresses in the text field, though (and I'm happy doing this!)

This works locally, but unfortunately not on jupyterlab running on a remote server (related to #41 I guess).
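The manual workaround being discussed can be sketched like this: start a cluster from inside a project-environment kernel and print its dashboard address, which is then pasted into the extension's text field by hand. This is an illustration, not part of the extension itself; the `processes=False` setting just keeps the example lightweight.

```python
from dask.distributed import Client, LocalCluster

# Start a cluster inside the kernel's own environment, not the
# environment that runs the Jupyter server.
cluster = LocalCluster(n_workers=1, processes=False)
client = Client(cluster)

# Copy this URL into the dask-labextension search field by hand.
print(cluster.dashboard_link)  # e.g. http://127.0.0.1:8787/status
```

On a remote server this address points at the remote machine, which is why pasting it into a local browser fails without port forwarding (the #41 connection mentioned above).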


hadim commented Nov 23, 2019

I have the same workflow as @benbovy. My base conda env only contains a minimum set of libraries to run JLab and I have one conda env per project.

So it would be nice to be able to create a new cluster "inside" an existing kernel.

Alternatively, I'm also fine creating the cluster from within the notebook, if the extension can then "discover" it.

@mangecoeur

Similar issue, also related to #41. I have JLab + JupyterHub running for a set of users; there is a lightweight env hosting the Hub and Lab interface that users never interact with. Instead there is a shared conda env, and each user can also create their own.

It seems the extension can at the moment only start a cluster in the same environment as the Jupyter server, is that correct? It would be great if the "new" button followed a logic similar to the new-notebook page in JLab, showing all the available kernels and letting you start a cluster in whichever one you choose.
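For what it's worth, dask-labextension reads its cluster factory from the dask configuration, and that factory module is imported by the Jupyter server process, which is consistent with the behavior described above. A sketch of such a config, assuming the standard dask config location:

```yaml
# ~/.config/dask/labextension.yaml
labextension:
  factory:
    module: "dask.distributed"   # imported in the *server's* environment,
    class: "LocalCluster"        # so it must be installed there
    args: []
    kwargs: {}
```

This explains why clusters started via the "new" button always live in the server's environment: the factory class is resolved there, not in the kernel's environment.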
