
Improve performance on large text files #9972

Open
dclong opened this issue Mar 20, 2021 · 2 comments

@dclong commented Mar 20, 2021

Problem

I'm frustrated when opening large text files (around 300 MB) in JupyterLab. It causes JupyterLab to freeze, even with 32 GB of memory available.

Proposed Solution

Load data from the text file only when it is needed, similar to how virtual notebooks are rendered (see the sketch below).
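
A minimal sketch of the idea, assuming the file contents are served one chunk at a time rather than all at once (the function name, chunk size, and file path below are illustrative, not an existing JupyterLab API):

```python
CHUNK_SIZE = 1024 * 1024  # serve ~1 MiB per request instead of the whole 300 MB file


def read_chunk(path: str, offset: int, size: int = CHUNK_SIZE) -> str:
    """Return only the requested slice of the file, leaving the rest on disk."""
    with open(path, "rb") as f:
        f.seek(offset)
        # A chunk boundary may split a multi-byte character; replace rather than fail.
        return f.read(size).decode("utf-8", errors="replace")


# The editor would request the first chunk when the file is opened
# and fetch further chunks only as the user scrolls.
first_chunk = read_chunk("large_file.txt", offset=0)
```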


@apcamargo commented Mar 21, 2021

JupyterLab uses CodeMirror as its text editor, so this issue should probably be reported in CodeMirror's repository.

@ellisonbg (Contributor)

Yeah, I am guessing this is an issue with CodeMirror.
