
Error: cannot allocate vector of size 18.6 Gb #444

Closed
erigdon opened this issue Aug 6, 2021 · 10 comments
Labels
enhancement New feature or request Resolved Review existing code Issues related to reviewing existing code

Comments

@erigdon

erigdon commented Aug 6, 2021

Estimating a model with n = 50,000 fabricated observations.
Using lavaan syntax:
model2<-'f1=~y1+y2+y3
f2=~y4+y5+y6
f3=~y7+y8+y9
f1~~f2
f2~~f3
f1~~f3'

I can run csem with "GSCA" approach weights and disattenuation off:

csemout.datacc <- csem(.data = datacc, .model = model2, .approach_weights = "GSCA", .disattenuate = FALSE)

but if I try that with disattenuation on:

csemout.datacc.dis <- csem(.data = datacc.df, .model = model2, .approach_weights = "GSCA")

I get the error message:

Error: cannot allocate vector of size 18.6 Gb

The fabricated dataset is attached (I think).

@erigdon
Author

erigdon commented Aug 6, 2021

f1~~f2
f1~~f3
f2~~f3

@FloSchuberth
Owner

Hey @erigdon,

unfortunately, the dataset is not attached, so I cannot replicate the error. Could you send me your file via email: f.schuberth@utwente.nl

It sounds like this is an R problem, or a problem of your machine, rather than a problem of cSEM. The error message means that an object produced during estimation is too large for R to handle. In general, R keeps all objects in RAM; once the RAM is exhausted, you receive such an error. Of course, there might be cleverer ways of implementing GSCA, e.g., avoiding that such big objects occur in the first place, which would overcome this issue. If you send me your dataset, I will have a closer look.

Best regards,
Florian

@FloSchuberth
Owner

The problem is identified: it is caused by the qr.Q function used to obtain the Q matrix of the QR decomposition in GSCAm (line 671 in estimators_weights.R). To solve this issue, we need to find another way to perform the QR decomposition.
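The reported size is consistent with this diagnosis: a full orthogonal Q factor for an n x n problem with n = 50,000 observations is a dense 50,000 x 50,000 matrix of doubles, which comes to exactly the 18.6 Gb (GiB) in the error message. A back-of-the-envelope check (my illustration, not from the thread):

```python
# Size of a dense double-precision n x n matrix, n = 50,000:
# n * n entries, 8 bytes per double. R reports sizes in GiB ("Gb").
n = 50_000
bytes_needed = n * n * 8
gib = bytes_needed / 2**30
print(round(gib, 1))  # -> 18.6
```

This matches the allocation R refuses to make, which is why the error only appears for large n.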

@FloSchuberth
Owner

Perhaps the bigalgebra package can be used.

@erigdon
Author

erigdon commented Aug 13, 2021 via email

@FloSchuberth
Owner

FloSchuberth commented Aug 13, 2021 via email

@erigdon
Author

erigdon commented Aug 13, 2021 via email

@erigdon
Author

erigdon commented Aug 13, 2021 via email

@FloSchuberth
Owner

FloSchuberth commented Aug 15, 2021 via email

@M-E-Rademaker M-E-Rademaker added Review existing code Issues related to reviewing existing code Resolved enhancement New feature or request labels Aug 27, 2021
@FloSchuberth
Owner

@erigdon: I have now implemented the new GSCAm version using singular value decomposition. Your example should work now. Note that you have to download the new version from the master branch.

Best regards,
Flo
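The memory advantage of the SVD route can be seen from the factor shapes alone. A thin SVD of a tall n x p matrix only materialises an n x p orthonormal factor, whereas a complete QR produces a full n x n Q. The numpy sketch below illustrates the shapes (it is not cSEM's actual code, and uses a small n so it runs quickly; the shapes generalise to n = 50,000):

```python
import numpy as np

n, p = 2000, 9  # small n for the demo; with n = 50,000 the n x n Q is ~18.6 GiB
X = np.random.default_rng(0).normal(size=(n, p))

# Thin SVD: U is n x p, so memory grows linearly in n.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
print(U.shape)  # (2000, 9)

# Complete QR: Q is n x n, so memory grows quadratically in n.
Q, R = np.linalg.qr(X, mode="complete")
print(Q.shape)  # (2000, 2000)
```

For n = 50,000 and p = 9 indicators, the thin factor needs about 3.4 MB while the full Q needs about 18.6 GiB, which is exactly the allocation that failed.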
