# ResponseInfo GET ignored for some Kernels #159
Comments
Thanks for the report, @gavrand. Can you confirm which mode you're running the kernel gateway in, jupyter-websocket or notebook-http? Basically, what's your run command for starting the kernel gateway?
Hi Peter,
Thank you for the prompt reply. It is indeed notebook-http.
Thank you
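For reference, a typical launch command for notebook-http mode looks something like the sketch below. The flag values are a hedged example, not taken from this thread: the exact spelling of the `api` option has varied across kernel gateway releases, and the notebook path is illustrative.

```shell
# Sketch of starting the kernel gateway in notebook-http mode.
# The notebook path is a placeholder; flag values may differ by release.
jupyter kernelgateway \
    --KernelGatewayApp.api='kernel_gateway.notebook_http' \
    --KernelGatewayApp.seed_uri='/path/to/api_notebook.ipynb'
```

Running `jupyter kernelgateway --help-all` lists the `KernelGatewayApp` options supported by the installed version.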
@Lull3rSkat3r, mind taking a look at this?
I'll check it out.
@gavrand, are you running Apache Toree with the Scala or Python interpreter?
Hi @Lull3rSkat3r. It is the Python interpreter. Again, # GET is working perfectly; it is only ResponseInfo where I am struggling, and hence I could not dump application/json :(. Thank you
@gavrand, would it be possible to attach your notebook?
I am afraid I am unable to send it from work, for compliance reasons. Andrey
@gavrand, no worries. I will use the example notebook. Will post my results soon.
So investigating this in more detail exposed a bug in the Toree kernel. Essentially, the PySpark interpreter in Toree is returning … This might bring up the point of reading the …
Thank you very much for the investigation, Corey! Andrey
Tracing through the issue @Lull3rSkat3r noted above, it appears this problem has been fixed by apache/incubator-toree#47. Closing here, since the problem was in the dependency.
Hi
I am using Apache Toree as a Jupyter kernel, and ResponseInfo is being ignored for it. Everything works perfectly with the standard Python 2 kernel, and # GET does work perfectly!
I guess this is because a "shortcut" is used to produce response_code in /services/notebooks/handlers.py (line 199) instead of the full packing used for request_code (line 180), and hence a remote Apache kernel (interacting via the Jupyter protocol) might not be happy with it.
Again, this is from my limited knowledge and the time I have spent. However, Apache PySpark does not work.
Thank you
Andrey
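For context, notebook-http mode maps annotated notebook cells to HTTP endpoints: a cell annotated `# GET /path` produces the response body, and a companion cell annotated `# ResponseInfo GET /path` prints JSON that sets the status and headers. Below is a minimal sketch of that pairing, the part being ignored for the Toree kernel. The `/hello` endpoint is hypothetical, and the `REQUEST` variable (normally injected into the kernel by the gateway) is stubbed out so the sketch runs standalone; its field names follow the kernel gateway docs but should be treated as an assumption.

```python
import json

# Stand-in for the REQUEST JSON string the gateway injects into the
# kernel before executing a handler cell (field names assumed).
REQUEST = json.dumps({"path": {}, "args": {}, "body": ""})

# --- cell annotated "# GET /hello" in the seed notebook ---
# Whatever this cell prints becomes the response body.
req = json.loads(REQUEST)
print(json.dumps({"message": "hello"}))

# --- cell annotated "# ResponseInfo GET /hello" ---
# The gateway parses this cell's stdout as JSON and uses it to set the
# HTTP status code and headers (e.g. Content-Type) of the response.
response_info = {"headers": {"Content-Type": "application/json"}, "status": 200}
print(json.dumps(response_info))
```

This is the mechanism the report is about: with Toree's PySpark interpreter, the metadata printed by the ResponseInfo cell never made it back, so the Content-Type could not be set to application/json.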