loo crashes R when dealing with large log-likelihood matrices #35

Closed
daltonhance opened this Issue Oct 26, 2016 · 4 comments

@daltonhance

daltonhance commented Oct 26, 2016

loo keeps crashing R when I call loo() on a log likelihood matrix of around 100 MB. Size is 5250 iterations by 2552 observations of log-likelihood. I've played around with reducing the size by limiting the number of rows and I am able to get it to run without crashing. Also seems to happen more often in RStudio than stand alone R.

I suspect an issue with running out of memory?

Session info:
R version 3.3.1 (2016-06-21)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 7 x64 (build 7601) Service Pack 1

8 core laptop with 64 MB of RAM.
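For reference, the reported object size is consistent with a dense matrix of doubles at those dimensions. A minimal back-of-envelope check in R (no assumptions beyond 8 bytes per double):

```r
# A 5250 x 2552 log-likelihood matrix of doubles (8 bytes each)
n_iter <- 5250
n_obs  <- 2552
size_mb <- n_iter * n_obs * 8 / 1024^2
size_mb  # about 102 MB, matching the "around 100 MB" reported above
```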

@jgabry

jgabry (Member) commented Oct 27, 2016

Yeah it's probably a memory issue, but with as much memory as you have (I assume you meant 64 GB) that's a bit surprising. Can you try manually setting cores=1 when you call loo? It might have something to do with extra copies being unnecessarily made when parallelizing.

I think that we can make big improvements to the loo internals to better deal with memory consumption. The loo.function method is supposed to be better than loo.matrix, so you could try that maybe, but even the function method can be improved in that regard. This is on the short term to-do list.

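The single-core suggestion above can be sketched as follows; `log_lik` is a placeholder for a log-likelihood matrix already in memory (iterations in rows, observations in columns), and `cores` is the argument named in the comment:

```r
library(loo)

# Disable parallelization explicitly to avoid extra copies of the
# (large) matrix being made for worker processes.
loo_result <- loo(log_lik, cores = 1)
```

The `loo.function` method mentioned above takes a log-likelihood function plus its data and draws instead of the full matrix; see `?loo` in your installed version for its exact signature.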


@daltonhance

daltonhance commented Oct 28, 2016

Hey Jonah,

Sorry for the delay in getting back to you. Yep. I meant 64 GB of RAM.

I was able to find a moment to restore my workspace and set cores to 1 using a global options statement. R (in RStudio) crashed on the second call to loo. (I'm working with 16 log-likelihood matrices, each 102.2 MB in size, so it crashed on the second one of these.)

I was able to get loo to run successfully using the standard R console by making sure I ran each call to loo one at a time (i.e. pressing Ctrl-R on one line and waiting for loo to finish before providing the next command).

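The one-at-a-time workaround described above can be sketched as a sequential loop; `ll_list` is a hypothetical list holding the 16 log-likelihood matrices, and the explicit `gc()` between calls is an assumption about why serial execution helped (memory is reclaimed before the next large allocation):

```r
library(loo)

results <- vector("list", length(ll_list))
for (i in seq_along(ll_list)) {
  # One loo call at a time, single core, as in the workaround above
  results[[i]] <- loo(ll_list[[i]], cores = 1)
  gc()  # reclaim memory before the next large matrix is processed
}
```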

@cfhammill

cfhammill (Contributor) commented Dec 12, 2017

The issue here is probably related to #53: your 8-core machine was making 8 forks of your R session, and depending on what was in the session to begin with, 8 forks could be a lot of memory. Setting mc.cores would not have fixed this; psislw looks at loo.cores instead. Switching from RStudio to R in the terminal probably fixed it by reducing overhead, possibly by not restoring the full environment.


@jgabry jgabry closed this Apr 22, 2018
