R kernel slow/freeze then dies with large R object - using R alone works fine #15923
Comments
This is likely a problem with the specific R kernel that you are using, not JupyterLab. Do you use IRkernel, xeus-r, juniper, or yet another R kernel?
I have only
I would suggest trying out https://github.com/jupyter-xeus/xeus-r
@kklot Any update on the performance when using xeus-r?
To summarise: changing the kernel did not help; it froze after running a few cells, just like with IRkernel. Other problems came with it: a bunch of installation issues, since xeus-r is not in the conda channels, and then xeus-r in VS Code could not render figures (just showed …). I ended up downloading the files locally (which I had been avoiding because of their large size) and worked on them with R in the terminal, which ran smoothly.
@kklot Thanks for the update! To help people reproduce and fix this, could you provide an example very large dataset, or code that would generate a synthetic example similar to the one with which you observed this bug?
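In case it helps reproduction, here is a minimal sketch of what such a synthetic stand-in might look like. The object structure, sizes, and file name are illustrative guesses only, not taken from the report; the idea is just to produce an RDS file in the hundreds-of-MB range and then run the `str(x, 1)` call that was reported to freeze the kernel:

```r
# Hypothetical synthetic stand-in for a large model-output object.
# Increase `n` until the saved RDS approaches the ~600 MB reported size.
set.seed(1)
n <- 5e6                                        # number of draws (illustrative)
x <- list(
  draws  = matrix(rnorm(n), ncol = 100),        # large numeric matrix
  labels = sprintf("param_%d", seq_len(100)),   # parameter names
  meta   = list(model = "synthetic", iterations = n / 100)
)
saveRDS(x, "big_object.rds")

# What the issue describes: reading the object back and inspecting it.
x <- readRDS("big_object.rds")
str(x, 1)   # fast in a plain R session; reported to freeze in a notebook
```

Running this in both a terminal R session and a notebook kernel should make the performance gap observable, if the bug is reproducible with synthetic data.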
Description
Using R with a large object (model output, RDS file ~600 MB). Loading it into R directly works fine: no lag, no freezing.
Running the same code in an ipynb, even a simple `str(x, 1)`, causes it to freeze. This happened on both macOS (M1, 16 GB RAM) and an AlmaLinux server with 128 GB RAM.
Reproduce
I don't know if I should share my large object for this report.
Expected behavior
Same performance as in R's terminal.