
Running out of RAM with random effects for detection #14

Closed
patchcervan opened this issue Oct 18, 2022 · 4 comments
patchcervan commented Oct 18, 2022

Hi @doserjef and thanks so much for the great package!

The other day I was running a fairly big spatial model with random effects for the detection process; I ran out of RAM and the whole R session crashed. I have about 17,000 visits to a total of ca. 5,000 sites, with a maximum of 160 visits per site and 2,300 observers.

I believe the problem is with the random effects for site and observer that I included in the detection model. More precisely, I've tracked the culprit down to the preparation of lambda.p when building the output object: there are some array/matrix manipulations there that use a lot of memory.

I've managed to bypass those manipulations and get the code to work for my particular case, but I am not sure that this can be generalized to any case...
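For context, here is a back-of-the-envelope calculation of why this blows up. It's a sketch in plain Python (not spOccupancy code), and it assumes the detection random effects get expanded into a dense sites × max-visits × levels array of doubles somewhere in the lambda.p preparation, which is my reading of the situation rather than something I've verified line by line:

```python
# Rough memory estimate for a dense detection random-effects array.
# The dense sites x max-visits x levels expansion is an assumption
# about what the lambda.p preparation does, not verified code.
n_sites = 5000       # ca. number of sites
max_visits = 160     # maximum visits per site
n_levels = 2300      # observer random-effect levels
bytes_per_double = 8

n_elements = n_sites * max_visits * n_levels
gb = n_elements * bytes_per_double / 1e9
print(f"{gb:.1f} GB")  # ~14.7 GB for a single dense array
```

Even one copy of such an array is around 14.7 GB, and intermediate copies during the array/matrix manipulations would multiply that, so running out of RAM on a typical machine is unsurprising.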

I've created a pull request, so that you can review the fix. #15

I don't know why the commit shows so many changes... I think I have only modified lines 278-282 and 1653-1662.

Sorry, I am not very familiar with GitHub procedures, so please let me know if you need anything else or different!

doserjef (Owner) commented

Hi @patchcervan, and thanks for reaching out about this!

Ironically, I ran into the same problem the other day when fitting a model with a random observer effect with about 1,500 observers, and I similarly saw that lambda.p was very memory intensive. I'm planning to update the package within the next month or so with some new functionality, and I've added removing the dependence on lambda.p in all functions to my todo list for that update. What you did in the pull request sounds great; I'll take a look. I'm also by no means an expert in GitHub procedures, so I'll let you know if I need anything different from you.

Thanks again for this, I appreciate it!

patchcervan commented

Cool, thank you! Let me know if you have any questions. Looking forward to seeing the new changes!

doserjef commented Nov 9, 2022

Okay, I finally got around to taking a look at this. Thanks again for bringing it up! I ended up doing something slightly different to get rid of lambda.p entirely, but the end result is the same (less memory is used for the detection random effects), and I've implemented this for all spOccupancy functions.

Also, for what it's worth, in the most recent version on GitHub I made some changes to the underlying C++ code for computing the random effects in the MCMC sampler. It's a decent amount faster than the previous version, in particular for cases like yours where the random effects have a large number of levels. That should be up on CRAN in the next couple of weeks.

I'll go ahead and close this issue. Thanks again for the help!

doserjef closed this as completed Nov 9, 2022
patchcervan commented

Sounds great, thanks!
