What does nb_conda_kernels do? #45
After some more hacking, my colleague tried … Having run the … There's also a note here https://docs.continuum.io/anaconda/jupyter-notebook-extensions on uninstalling the extensions and disabling them. If there's a really good reason not to disable the …
[note: most of this information is based on my usage of environment_kernel_manager, which has more or less the same idea as nb_conda_kernels] The kernel isn't specific to the machine; it uses a name defined by the conda environment name. nb_conda_kernels creates a (virtual) kernel spec for each conda environment (which has a kernel installed?). So if you both have nb_conda_kernels installed and use the same name for the project's conda environment, it should work.
Just to comment on the idea for a newish workflow: the "default name" actually depends on having a jupyter kernel in the same environment as the jupyter notebook, which does not need to be the case (at least I tried to remove it, but conda didn't let me; AFAIK there shouldn't be a technical reason for it other than not confusing the user).
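To make the "virtual kernel spec per environment" idea concrete, here is a minimal Python sketch. The naming scheme, kernelspec fields, and directory layout are assumptions for illustration only, not nb_conda_kernels' actual implementation:

```python
import os

def discover_env_kernels(envs_dir):
    """Sketch: build one virtual kernelspec per conda environment
    found under envs_dir. The real nb_conda_kernels inspects each
    env properly; the name scheme here is an assumption."""
    specs = {}
    for name in sorted(os.listdir(envs_dir)):
        env = os.path.join(envs_dir, name)
        if not os.path.isdir(env):
            continue
        # One kernelspec per env; the display name carries the env name,
        # which is why collaborators sharing the env name get a match.
        specs["conda-env-%s-py" % name] = {
            "argv": [os.path.join(env, "bin", "python"),
                     "-m", "ipykernel_launcher", "-f", "{connection_file}"],
            "display_name": "Python [conda env:%s]" % name,
            "language": "python",
        }
    return specs
```

Because the spec name is derived from the environment name rather than any machine-specific path, two machines with an env of the same name produce the same kernel name.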
BTW: IMO the notebook (or a new package) should become a meta package which installs all the packages the notebook now depends on, in addition to a new "only the notebook" package. That way you could remove the meta package and everything you don't like or need (as in my use case of not installing a kernel next to the notebook server).
Maybe my confusion came from having older Notebooks (prior to …). I agree that the kernel names are not machine-specific (I've edited the original post). It felt like they might be, since with the kernel name "fds" set in the metadata by me, my colleague, with an environment also called "fds", still had to manually choose his "fds" environment (and conda had already been used to activate his "fds" environment). I'm putting that down to random weirdness as it doesn't seem to happen now. What I don't understand is why:
Does it really make sense to have me activate an environment, open a Notebook (using …), and then pick the same environment again? It isn't the biggest hurdle by any means, but it seems like an extra step of effort for zero gain. I'm still left wondering "what am I missing?".
IMO this assumption never holds, no matter what you use: in your way, the implicit assumption is that the "default kernel" has all the packages which the notebook needs, but that is probably also never the case (at least it wouldn't be the case if you sent me a notebook...). And then add different Python versions into the mix (e.g. in your case starting with a py27 environment and then with a py3x env). Either you mandate conventions or you will have these problems.
You don't need to activate the notebook env, you just need to call the jupyter script (…). This is my workflow, and for that workflow a kernel manager which adds a kernel for each env is a great help:
-> from then on, I just have to open the notebook and it works. If I were to collaborate with someone, I would add an environment.yml into the project folder (= repo) which would hold the name of the env and all packages, and then I would just ensure that each project member has an easy command to create the project environment (…). Only if someone does not want to use such an env-based setup do you need to use the default kernel (e.g. activate the project env, call …).
My gain was that I didn't need to manage different kernels manually (the …).
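A sketch of what such an environment.yml might contain; the env name and package list are examples, assuming a project env named "fds":

```yaml
# hypothetical project environment file, committed to the repo
name: fds            # every collaborator shares this env name
dependencies:
  - python=3.5
  - notebook
  - ipykernel        # kernel that nb_conda_kernels can expose
```

Each project member would then run `conda env create -f environment.yml` once, after which the shared kernel name resolves the same way on every machine.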
@JanSchulz Hey, bro, may I ask you something? I built a separate environment for running TensorFlow using python=3.5, and like you said, by using nb_conda_kernels, no matter in which environment I launch jupyter, I will always see a kernel for each environment. And in every environment I will have a default kernel, which should be the Python (or other language) I installed in that environment, right? So in my root environment, if I launch jupyter notebook, I will see kernel options like: default, root, other_environment_kernel.
My question is: in my root environment, python is 2.7, but why can I still type print(2) in that kernel? This drives me crazy... And another problem is that I have the R kernel installed in my root environment, but I cannot trigger this kernel from another environment. The whole kernel stuff is driving me to death, I have not even had lunch... Thank you so much if you can help.
What's the output of …?
You need to have R and the R kernel installed in that environment so nb_conda_kernels can detect it and create the proper kernelspec...
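On the earlier print(2) question: print(2) is valid in both Python 2 and Python 3, so it can't tell you which interpreter a kernel runs. A quick generic check (not specific to nb_conda_kernels) you can run in a notebook cell:

```python
import sys

def interpreter_info():
    """Report which interpreter the current kernel is using;
    run this in a cell of the kernel you are unsure about."""
    return {
        "executable": sys.executable,               # path into the env's python
        "version": "%d.%d" % sys.version_info[:2],  # e.g. "2.7" or "3.5"
    }
```

If `executable` points into an env directory other than the one you expected, the notebook is bound to a different kernelspec than the env you activated.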
@jhprinz can you please open another issue about this? Just so we don't derail the previous discussion.
@damianavila Sure, I'll just copy this to a new issue. It is kind of a question about the docs, though. I just wasn't sure if an issue is the best way to ask for help at all. Would Stack Overflow be an alternative?
@jhprinz I think an issue is a good beginning; you should also feel free to post on Stack Overflow, but I am not sure you will find people there to answer these questions.
Closing this one since the original discussion stopped a long time ago. Please reopen (or open a new issue) if you have further questions.
nb_conda_kernels got added to my Anaconda installation on an update a few months back. Since then I've been confused as to what it does - it seems to make sharing Notebooks harder. What am I missing?
Here's my workflow: conda …
I don't understand why this just doesn't use the default environment that jupyter notebook was run from. What's the advantage to me of activating an environment, starting a Notebook, then having to choose the same environment (and/or having a collaborator mess with the metadata when we're sharing Notebooks)?
What's the thinking behind fixing a Notebook to a (edit - removed "machine") specific environment name?
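For context, a notebook records its kernel binding in the notebook file's JSON metadata; a typical entry looks roughly like the following (the env name "fds" and the exact field values are hypothetical examples, not this user's actual metadata):

```json
{
  "kernelspec": {
    "display_name": "Python [conda env:fds]",
    "language": "python",
    "name": "conda-env-fds-py"
  }
}
```

This is the metadata a collaborator would have to edit (or re-select a kernel for) if their environment name differs.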
Update - I've now upgraded to 2.0 (which still required me to choose a kernel when I started a Notebook). I see on the installation page that I could also run
python -m nb_conda_kernels.install --enable --prefix="%CONDA_PREFIX%"
which I've done because it looks like magic, but not because I know what it'll fix.
For reference, the Notebook metadata in this case contains: