
Rust: cargo build produces huge load on file watcher (Linux) #82427

Closed
u8983478934 opened this issue Oct 12, 2019 · 30 comments
Labels
bug Issue identified by VS Code Team member as probable bug file-watcher File watcher freeze-slow-crash-leak VS Code crashing, performance, freeze and memory leak issues linux Issues with VS Code on Linux upstream Issue identified as 'upstream' component related (exists outside of VS Code) upstream-issue-linked This is an upstream issue that has been reported upstream verified Verification succeeded

Comments

@u8983478934

u8983478934 commented Oct 12, 2019

RLS on Linux is already very buggy and causes a memory leak, but it stabilizes after eating around 1 GB. Today, however, after I updated VSCode to 1.39.1, VSCode itself, namely code /BIN_HOME/visual-studio-code/resources/app/out/bootstrap-fork --type=watcherService, eats more RAM on each file save until the system totally freezes. I am not sure whether something changed in this component in the 1.39 release. Downgrading solves the problem. Also note that the rls-vscode extension hasn't been updated for a while, so it's not the extension's problem.

@bpasero bpasero added file-watcher File watcher freeze-slow-crash-leak VS Code crashing, performance, freeze and memory leak issues linux Issues with VS Code on Linux info-needed Issue requires more information from poster labels Oct 14, 2019
@bpasero
Member

bpasero commented Oct 14, 2019

@gophobic any chance you maybe have a cyclic symbolic link?

@musikid

musikid commented Oct 14, 2019

I got the same problem with even worse performance...
[screenshot: memory usage]
I tested for circular and broken symbolic links but didn't find anything.
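For anyone who wants to repeat that check, one way to do it on Linux (a minimal sketch, assuming GNU find): with `-L`, find follows symlinks while walking, so any entry still reported as type `l` is a link it could not resolve, i.e. broken or part of a cycle.

```shell
# List symlinks that cannot be resolved (broken or cyclic).
# With -L, find follows links; a link that still shows up as
# type 'l' points at something unreachable.
find -L . -type l
```

A dangling link such as `ln -s /does-not-exist broken` and a self-cycle such as `ln -s loop loop` both show up in the output.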

@bpasero
Member

bpasero commented Oct 15, 2019

@gophobic @musikid any chance the folder you open is something that is shareable (open source) so that I could try to reproduce?

@musikid

musikid commented Oct 15, 2019

I can reproduce it with all Rust repos, but you can try with ripgrep-all.

@bpasero
Member

bpasero commented Oct 15, 2019

@musikid link?

@bpasero bpasero added this to the October 2019 milestone Oct 15, 2019
@musikid

musikid commented Oct 15, 2019

@bpasero
Member

bpasero commented Oct 15, 2019

@musikid does it reproduce if you open that folder as part of a multi-root setup? We use a different file watcher in that case. Just use File > Save Workspace As and add another folder.

@musikid

musikid commented Oct 15, 2019

It consumes much less RAM, so I think it's not reproducing this bug.
[screenshot: memory usage]

@bpasero
Member

bpasero commented Oct 16, 2019

@musikid I cannot reproduce with that folder. It also does not seem to contain a lot of files; can you share how many files you see in that repository?

[screenshot]

If you have a single large folder with lots of files, try setting the files.watcherExclude setting to ignore it.
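As a quick way to answer the file-count question from a terminal (a sketch; the `.git` exclusion is only there to keep repository metadata out of the count):

```shell
# Count regular files in the current folder, skipping .git metadata.
find . -path ./.git -prune -o -type f -print | wc -l
```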

@bpasero bpasero removed this from the October 2019 milestone Oct 16, 2019
@musikid

musikid commented Oct 16, 2019

That's weird, I have many more files in my folder.
[screenshot: file count]
Maybe you can try removing the target folder and restarting VSCode.

@bpasero
Member

bpasero commented Oct 16, 2019

@musikid did you maybe run some kind of installation tool in that repo that added lots of files into it? If so, can you share it or find another sample?

@musikid

musikid commented Oct 16, 2019

Nothing special, I just removed the target folder because RLS was building indefinitely.

@bpasero
Member

bpasero commented Oct 17, 2019

@musikid without a reproducible case (= a folder I can checkout locally), there is not much I can do, sorry.

@musikid

musikid commented Oct 17, 2019

I uploaded my repo so you can take a look at this: https://send.firefox.com/download/b3e7d4f9973b2402/#vnfmN78g-TJlydhApd7i-g
You can check the VirusTotal scan if you don't trust it (just 2 false positives): https://www.virustotal.com/gui/file/f861d64440e13668845c16f3dfcf284d33e58e1cf1126a18f1d8ac2024cd9ce3/detection

@bpasero
Member

bpasero commented Oct 17, 2019

Thank you, but I would prefer a GitHub OSS repository over downloading something from elsewhere.

@musikid

musikid commented Oct 17, 2019

I created a debug branch on my fork with all files included: https://github.com/MusiKid/ripgrep-all

@bpasero
Member

bpasero commented Oct 17, 2019

@musikid great, thanks. However, I do not see a spike on my Ubuntu VM:

[screenshot]

Can you share your htop?

@musikid

musikid commented Oct 19, 2019

Sure.
[screenshot: htop, 2019-10-19]
Also, thanks for taking time to resolve this problem!

@elomatreb

I am also seeing this behavior, and I am also experiencing crashes (SIGABRT) of the affected process. It stops happening when I switch the project to multiple root directories as well.

@bpasero
Member

bpasero commented Oct 20, 2019

@musikid @elomatreb can we try an experiment to ensure no extension or setting is involved:

  • download VSCode Insiders (can be run alongside stable): https://code.visualstudio.com/insiders/
  • open a command line prompt
  • git clone https://github.com/MusiKid/ripgrep-all.git
  • cd ripgrep-all
  • git checkout debug
  • code-insiders --disable-extensions --user-data-dir <some empty directory>
  • open the checked out folder
  • watch htop on the watcher service process of Code Insiders (!)

What is the result?
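The last step can be sketched as follows (an assumption, not an official procedure: the `watcherService` pattern is taken from the process command line quoted at the top of this issue):

```shell
# Look up the watcher service process(es) by command line.
pids=$(pgrep -d, -f watcherService)

# Print a one-shot CPU/RSS snapshot for those PIDs, if any were found.
if [ -n "$pids" ]; then
  top -b -n 1 -p "$pids"
else
  echo "no watcherService process found"
fi
```

Interactive htop works too (`htop -p "$pids"`); the batch-mode top call is just easier to paste into a report.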

@elomatreb

elomatreb commented Oct 20, 2019

Following the above steps, the process does not start to grow in memory by itself. When I start a cargo build, however, it consumes multiple cores' worth of CPU and its memory use starts to grow rapidly. When I stop the build, or it finishes, the memory use does not go back down, and when I repeat this it sometimes results in the SIGABRT I mentioned above (and writing the resulting coredump file slows the entire PC almost to a freeze).

I can provide one of these coredump files if there's nothing sensitive in them, but apparently they were too large and got "truncated", so I don't know if they're still useful.

@bpasero
Member

bpasero commented Oct 20, 2019

@elomatreb I assume cargo build is causing tons of files to be created in some build directory? Typically it is wise to exclude such folders via the files.watcherExclude setting. By default we exclude node_modules, but we can tweak this setting's default to exclude more if you can find that folder.
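As a sketch of that workaround for Cargo projects (the `files.watcherExclude` setting is the one named above; the `**/target/**` glob assumes the build output lives in Cargo's default `target` directory):

```jsonc
// settings.json (User or Workspace settings)
{
  // Keep the file watcher away from Cargo's build output.
  "files.watcherExclude": {
    "**/target/**": true
  }
}
```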

@elomatreb

Cargo puts lots of build artifacts into target/, adding a pattern for it to the watcher ignore list does fix the behavior. I'm not sure why this wasn't a problem before, but this is an easy fix, thanks for the suggestion.

@bpasero
Member

bpasero commented Oct 20, 2019

@musikid can you confirm you are doing the same?

@elomatreb would you think it makes sense to ignore this folder by default? I am wondering if people would ever configure this or always assume "target" as folder.

@bpasero bpasero added this to the October 2019 milestone Oct 20, 2019
@bpasero bpasero removed the info-needed Issue requires more information from poster label Oct 20, 2019
@bpasero bpasero changed the title Huge memory leak after update to 1.39.1 from 1.38.0 with Rust RLS on Linux Rust: cargo build produces huge load on file watcher (Linux) Oct 20, 2019
@elomatreb

Ignoring the folder in Rust projects would probably make sense in 99.9% of cases, but since it's such a generic name I'm not sure.

@bpasero
Member

bpasero commented Oct 20, 2019

Yeah I am worried that it would cause issues for people using this folder for a different purpose.

@bpasero
Member

bpasero commented Oct 21, 2019

I reported this as paulmillr/chokidar#922 and we have to wait for an official fix. Until then, excluding this directory is the correct workaround.

@bpasero bpasero added upstream Issue identified as 'upstream' component related (exists outside of VS Code) upstream-issue-linked This is an upstream issue that has been reported upstream labels Oct 21, 2019
@bpasero bpasero modified the milestones: October 2019, Backlog Oct 21, 2019
@escape0707

escape0707 commented Oct 26, 2019

Hi there. I think I just got caught by the same issue, but this time it's during a g++ compile. I first encountered this problem on my Arch laptop, and I reproduced it just a minute ago on Ubuntu 19.10 inside a Hyper-V VM on the latest stable Windows 10.

[screenshot]

You can pull my repo on GitHub and give it a test. The repo contains only some exercise code for C++ Primer (please ignore the logic errors in it for now) and doesn't contain a lot of files, even during the build process. So I don't think the problem is related to the number of files inside the workspace.

# on latest Ubuntu
git clone https://github.com/escape0707/cplusplus-primer-exercises.git
cd cplusplus-primer-exercises
git checkout vscode-debug
code .
g++ -std=c++17 StrBlobPtr.h -o a.out

g++ should emit a compile error and finish quickly. But the watcherService keeps eating CPU and RAM, and when this happens on my Arch laptop, the machine eventually freezes.

Note that compiling some other files like 1.1.cpp, 1.11.cpp, and Date.h doesn't seem to trigger this issue, but Sales_item.h and StrBlobPtr.h will. If it doesn't happen, compiling StrBlobPtr.h a second time may do it.

And when I try to reproduce this on the latest Manjaro distro, which comes with VSCode version 1.28.1, the problem does not show.

I hope the info above helps locate the issue.

@bpasero bpasero modified the milestones: Backlog, October 2019 Oct 26, 2019
@bpasero bpasero added the bug Issue identified by VS Code Team member as probable bug label Oct 28, 2019
@joaomoreno joaomoreno added the verified Verification succeeded label Oct 30, 2019
@joaomoreno
Member

Verified by following the steps from paulmillr/chokidar#922

The file watcher stayed well under 700MB, despite consuming high CPU during the whole build.

[screenshot]

@bpasero
Member

bpasero commented Oct 30, 2019

Thanks!

@vscodebot bot locked and limited conversation to collaborators Dec 16, 2019
6 participants