
System hangs and crashes after consuming all 32GB of RAM #286

Closed
codygman opened this issue Aug 4, 2020 · 17 comments
Labels
performance Issues about memory consumption, responsiveness, etc.

Comments

@codygman

codygman commented Aug 4, 2020

Problem

My system information:

haskell-language-server 0.2.2.0, compiled from source with Nix
AMD Ryzen 7 3800X 8-Core @3.5472 GHz
Linux 5.7.2, NixOS, 20.09pre229574.0a146054bdf (Nightingale)

My teammate's system information:

haskell-language-server 0.2.2.0, compiled from source with Nix (same pinned nix channel)
AMD Ryzen 7 3800X 8-Core @3.5472 GHz
32GB RAM
Ubuntu 20.04 LTS

I get system hangs 4-6 times per month, and many require hard-rebooting the machine. A teammate I finally convinced to try haskell-language-server had his machine crash twice this past Friday.

These are pretty beefy Ubuntu and NixOS machines, with 32GB of RAM and an AMD Ryzen 7 3800X 8-Core @3.5472 GHz.

Project GHC Version and size:

ghc-8.8.3 with stack/stackage lts-16.6

$ cloc .
    1140 text files.
    1140 unique files.                                          
       1 file ignored.

github.com/AlDanial/cloc v 1.86  T=0.23 s (4893.6 files/s, 397089.8 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Haskell                       1138          11632           3153          76751
Markdown                         1            282              0            605
-------------------------------------------------------------------------------
SUM:                          1139          11914           3153          77356
-------------------------------------------------------------------------------

This project also has 77 Template Haskell splices, the majority of which I think are aesonQQ splices.

Some profiling information

I don't have any memory usage information, but I can see that the 4 files with mkPersistent blocks take about 3.8 seconds to typecheck. I'm guessing that probably corresponds to higher memory usage too.

I know some work to improve Template Haskell memory usage was done at some point; perhaps my revision isn't new enough to include it?
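In the meantime, one cheap way to get actual memory data without a profiled build is the RTS heap profile by closure type (`-hT`), which works on an ordinary binary. A sketch, assuming the binary name and `--lsp` flag match your install:

```shell
# Sketch: wrap haskell-language-server so it emits a coarse heap profile.
# Assumptions: the server is on $PATH and accepts --lsp; adjust as needed.
cat > run-hls-heapprof.sh <<'EOF'
#!/usr/bin/env sh
# -hT profiles the heap by closure type (no profiling build required)
# and writes haskell-language-server.hp; -i5 samples every 5 seconds.
# Render the .hp file afterwards with hp2ps.
exec haskell-language-server --lsp +RTS -hT -i5 -RTS "$@"
EOF
chmod +x run-hls-heapprof.sh
```

Pointing the editor at this wrapper instead of the real binary should show which closure types dominate the heap over a session.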

Nix install info

Uses this fork of hls-nix to build haskell-language-server. When I get time, I might be able to create a test project that lets you reproduce the same build with just nix-shell --pure --run "haskell-language-server".

I attempted to use static binaries but hit separate issues with those, and I'd prefer to focus on the performance problem before spending time debugging the static-binary issue. So the nix-based HLS install above is the only one I've gotten working for this project.

@lukel97
Collaborator

lukel97 commented Aug 4, 2020

That's a lot of memory. How large is the project you're opening?

@codygman
Author

codygman commented Aug 4, 2020

Hi @bubba, I've added the project size to the description above.

@lukel97 lukel97 added the performance Issues about memory consumption, responsiveness, etc. label Aug 4, 2020
@codygman
Author

codygman commented Aug 5, 2020

It looks like once this effort is finished:

https://www.reddit.com/r/haskell/comments/i3kv7t/ide_2020_measuring_memory_usage_of_haskell_values/

I'd be able to see, for instance, which large maps are in memory and narrow this down better.

I'll also have to get a profiling build of haskell-language-server, though I don't know offhand how to do that with Nix. It looks like I was investigating how to do it at input-output-hk/haskell.nix#793.

@codygman
Author

codygman commented Aug 30, 2020

I compiled hls with --profiling enabled and then ran with -p, but the profile looks empty after I pkill haskell-language-server. Do I need to send it a particular signal, maybe?

Edit: it turns out the answer is to not hit Ctrl-C more than once 😆
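For reference, here's roughly how I'm running the profiled build (binary name and flag placement assumed). The key point is that the .prof file is only dumped on a clean RTS shutdown, so stop the process with a single SIGINT, never a hard kill:

```shell
# Sketch: run a profiling-enabled haskell-language-server build so the
# .prof report survives shutdown. Assumes the binary accepts --lsp.
cat > run-hls-profiled.sh <<'EOF'
#!/usr/bin/env sh
# +RTS -p writes haskell-language-server.prof on clean exit only;
# a hard kill (or a second Ctrl-C) skips the RTS code that dumps it.
exec haskell-language-server --lsp +RTS -p -RTS "$@"
EOF
chmod +x run-hls-profiled.sh
# Later, shut it down gracefully with a single SIGINT:
# kill -INT "$(pgrep -f haskell-language-server)"
```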

@unhammer

Should the readme perhaps mention minimum memory requirements for now? I know it says "very early stage", but a note about memory requirements might save potential users some frustration. (cloc says I have 11331 lines of Haskell code, so not that much, but I still keep getting mmap: failed or libpthread.so.0: failed to map segment from shared object on my 16GB machine.) I was told in #haskell that 32GB is the minimum.

@wz1000
Collaborator

wz1000 commented Sep 18, 2020

@unhammer do you have a hie.yaml file? Otherwise, it shouldn't really use that much. I hack on GHC just fine on my 16GB machine, and that's hundreds of thousands of lines of code.
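For reference, an explicit hie.yaml is only a few lines. A sketch of one for a stack project — the project and component names here are placeholders, not from this thread; a cradle that loads overlapping or unnecessary components can multiply what gets held in memory:

```shell
# Hypothetical minimal hie.yaml mapping each source tree to exactly one
# component, so hie-bios only loads what's needed. Names are made up.
cat > hie.yaml <<'EOF'
cradle:
  stack:
    - path: "./src"
      component: "myproject:lib"
    - path: "./app"
      component: "myproject:exe:myproject-exe"
EOF
```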

@unhammer

unhammer commented Sep 18, 2020

I do have a hie.yaml file. Hopefully I'll find time to make a minimal reproduction some day :)

(though I've been running haskell/ghcide@20ce9d3 just fine, after wrapping it in ulimit -Sv unlimited; htop says that uses 2G RES, 1T VIRT)

@georgefst
Collaborator

georgefst commented Sep 28, 2020

Should the readme perhaps mention minimum memory usage for now?

IME, the minimum requirement in practice for serious work is likely to be somewhere around 16GB. I've been finding that I need to restart my editor every half hour or so when working on any reasonably large project (Ghcide, Ormolu...) on my 8GB laptop, whereas my work machine with 32GB runs for weeks on a project of similar size.

Those numbers probably don't sound like that much to a lot of us, but it rules out a fair number of hobbyists, students etc.

I've been wondering whether HLS has much room to improve here, or whether I should just buy more RAM... Is there any obvious low-hanging fruit for space optimisations?

@ndmitchell
Collaborator

The belief is that the memory-hungry values are GHC things (e.g. typecheck results) stored in the IDE's Shake graph. But it's probably worth proving that. We could clear the whole Shake graph, perform a GC, and see what memory remains. We could clear selective nodes to get an approximate profile of each node in turn. That would likely surface some surprising memory usage.

@pepeiborra
Collaborator

pepeiborra commented Sep 29, 2020

Take a look at haskell/ghcide#835 and haskell/ghcide#826

@mpickering
Contributor

Probably will be helped by haskell/ghcide#836

@codygman
Author

codygman commented Oct 5, 2020

Thanks for the help! I'm trying to get our work project building with nix again so I can try out the newest version and see whether these issues are resolved.

@jneira jneira added the status: needs info Not actionable, because there's missing information label Oct 5, 2020
@codygman
Author

codygman commented Oct 13, 2020

Got HLS working yesterday for our project again. Initial impressions are that memory usage is better, but before confirming I'll use it normally for a week.

@unhammer

I still have to run this in a wrapper that sets ulimit -Sv unlimited – is that something the language server (or ghcide) could do itself? (I mean, it's fine for me, but I'd rather not have to write in our tech docs "to use language server first make a wrapper script with this magical incantation".)
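For reference, the wrapper is just a couple of lines (server binary name assumed here):

```shell
# Sketch of the wrapper script: lift the soft virtual-address-space
# limit before exec'ing the server, since ghcide reserves a huge chunk
# of address space up front (~1T VIRT, per htop above).
cat > hls-wrapper.sh <<'EOF'
#!/usr/bin/env sh
ulimit -Sv unlimited
exec haskell-language-server-wrapper "$@"
EOF
chmod +x hls-wrapper.sh
```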

@jneira
Member

jneira commented Nov 11, 2020

@unhammer so HLS continues eating memory in your case. Just in case: what version are you using? The change @mpickering mentioned that may alleviate the situation is only in master (and in the upcoming 0.6.0).

@unhammer

0.6.0 does fix the resident memory usage for me too :-)
I see that on most systems, ulimit -Sv does default to unlimited – not sure why my computer has it lower, so perhaps that's not something for the language server to worry about.

@jneira
Member

jneira commented Dec 2, 2020

OK, thanks for confirming. We're going to close this. Feel free to reopen if you hit it again.

@jneira jneira closed this as completed Dec 2, 2020
@jneira jneira removed the status: needs info Not actionable, because there's missing information label Dec 2, 2020
pepeiborra pushed a commit that referenced this issue Dec 27, 2020
* Require hie-bios 0.3.2 or above

* Update stack.yaml files

* Use newer parser-combinators on GHC 8.4

* Bump parser combinators on 8.6

Co-authored-by: Moritz Kiefer <moritz.kiefer@purelyfunctional.org>
9 participants