Can combine multiple covariates and load as single covariates file #70
Comments
Hello,
When you have calculated the PCs and put them into the
Hello @hyacz,

Converting the PLINK file to rMVP format:

```r
library(rMVP)
```

Running FarmCPU GWAS:

```r
genotype <- attach.big.matrix("mvp.199sample_HF.geno.desc")
for(i in 2:ncol(phenotype)){
```
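The loop above is truncated in the quote. For context, the usual rMVP pattern runs one trait per iteration and passes the covariate matrix through `CV.FarmCPU`; the following is only a hedged sketch of that pattern (the phenotype/map file names and the `Covariates` object are assumptions, not taken from this thread; only the genotype descriptor name comes from the quote):

```r
library(rMVP)

# genotype descriptor as quoted in the thread
genotype  <- attach.big.matrix("mvp.199sample_HF.geno.desc")
# hypothetical phenotype/map files for illustration
phenotype <- read.table("mvp.phe", header = TRUE)
map       <- read.table("mvp.geno.map", header = TRUE)

for (i in 2:ncol(phenotype)) {
  imMVP <- MVP(
    phe        = phenotype[, c(1, i)],  # taxa column plus one trait
    geno       = genotype,
    map        = map,
    CV.FarmCPU = Covariates,            # individuals x covariates matrix
    method     = "FarmCPU"
  )
}
```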
@hyacz, by running the above code, the log file shows 'Number of provided covariates of FarmCPU: 540'. The file 179s_PC5_scf_for_mvp.csv has 179 samples (5 PCs + 3 scaling-factor values, 179 × 8 = 1432 values). I would like to know why the 'Number of provided covariates of FarmCPU' is 540.
The number of covariates mentioned in the log depends on the number of columns of the covariate variable (cv). I'm not sure whether I understand your data correctly: if SCF is a categorical variable, this is OK; if SCF is a quantitative variable, it may not be. In addition, note that the order of individuals in cv needs to be consistent with the phenotype and genotype.
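One way a categorical covariate can inflate the reported covariate count: in R, a factor is typically expanded into one indicator column per non-reference level before model fitting. A minimal sketch of that expansion (the column names and data here are hypothetical, not from this thread):

```r
# Hypothetical 3-level categorical scaling factor for 5 individuals
scf <- factor(c("A", "B", "C", "A", "B"))
# Hypothetical matrix of 5 principal components
pcs <- matrix(rnorm(5 * 5), nrow = 5,
              dimnames = list(NULL, paste0("PC", 1:5)))

# model.matrix() expands the factor into indicator columns;
# dropping column 1 removes the intercept, leaving 2 dummies
cv <- cbind(pcs, model.matrix(~ scf)[, -1])
ncol(cv)  # 5 PCs + 2 dummy columns = 7
```

So the number of columns actually fed to the model can be larger than the number of raw covariates, which is worth checking when the log reports an unexpected count.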
Hello Team rMVP,
First of all thank you so much for your wonderful software.
I would like to clarify a doubt regarding multiple covariates. I have five PCs in a PC.txt file (pc1, pc2, pc3, pc4, pc5) and three scaling-factor values in a scf.txt file (scf1, scf2, scf3). Can I combine these two files into a single file (pc_scf.txt) and load it as a covariates file using the command below? If not, how can I use the PC.txt and scf.txt files as covariates files?
Note: the pc_scf.txt file has 8 columns (pc1, pc2, pc3, pc4, pc5, scf1, scf2, scf3).
```r
MVP.Data.PC("pc_scf.txt", out="mvp.pc_scf", sep='\t')
# attach the descriptor file written by MVP.Data.PC
Covariates_PC <- bigmemory::as.matrix(attach.big.matrix("mvp.pc_scf.pc.desc"))
```
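For the combining step itself, a plain column-bind is one way to build pc_scf.txt from the two files. This is only a sketch under the assumption that PC.txt and scf.txt are tab-separated, have headers, and list the same individuals in the same order:

```r
# Read the two covariate files (assumed tab-separated with headers)
pc  <- read.table("PC.txt",  header = TRUE, sep = "\t")
scf <- read.table("scf.txt", header = TRUE, sep = "\t")
stopifnot(nrow(pc) == nrow(scf))   # same individuals, same order

# 5 PC columns + 3 SCF columns = 8 columns
pc_scf <- cbind(pc, scf)
write.table(pc_scf, "pc_scf.txt", sep = "\t",
            row.names = FALSE, quote = FALSE)
```

The resulting 8-column matrix can also be passed directly to `MVP()` via its `CV.FarmCPU` argument, provided the row order matches the phenotype and genotype.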