Regarding the questions about normalization and integration of gene expression matrices #14
Comments
Thank you for your attention to our work. The normalization step is performed when one creates the PRECASTObject, i.e., when calling CreatePRECASTObject(), so the raw count matrices can be passed in directly.
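For reference, here is a minimal sketch in R of that step, following the linked breast-cancer tutorial. The objects seu1 and seu2 are placeholders for per-sample Seurat objects holding raw counts and spatial coordinates in their meta.data, and the argument values are tutorial defaults that may differ across PRECAST versions.

```r
library(PRECAST)
library(Seurat)

## Raw-count Seurat objects, one per sample/section (placeholders here).
seuList <- list(seu1, seu2)

## Create the PRECASTObject: highly variable genes are selected at this step
## (gene.number controls how many), and normalization is handled internally.
PRECASTObj <- CreatePRECASTObject(
  seuList,
  project = "demo",
  gene.number = 2000,            # number of genes kept for model fitting
  selectGenesMethod = "SPARK-X", # spatially-aware gene selection
  premin.spots = 20, premin.features = 20,
  postmin.spots = 1, postmin.features = 10
)
```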
Thank you for your response! After reviewing the ProFAST package, I found that it is quite similar to the PRECAST package, except that the two use different methods for dimensionality reduction and clustering. Which package performs better in practice? Do you have any suggestions?
ProFAST runs faster than PRECAST because it focuses solely on dimension reduction, whereas PRECAST performs dimension reduction, clustering, and embedding alignment simultaneously. For large datasets, particularly those with more than 500,000 spots, I recommend ProFAST; for smaller datasets, PRECAST is the better choice.
Thank you for your response!
Hello! I read your article and think you did a great job.
I have two questions. The first concerns the tutorial at https://feiyoung.github.io/PRECAST/articles/PRECAST.BreastCancer.html: do I need to normalize the input gene expression matrix before running the analysis, or does your model already include a normalization step? The second concerns the IntegrateSpaData step that integrates expression matrices from multiple samples: does it only integrate the top n selected highly variable genes, or can it integrate all genes from all samples?
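For context, the integration step I am asking about sits at the end of the workflow in that tutorial. A rough sketch, continuing from a PRECASTObject created as above; K, platform, and species are placeholders, and argument names follow the tutorial and may differ across versions:

```r
## Downstream PRECAST workflow from the linked tutorial (sketch).
PRECASTObj <- AddAdjList(PRECASTObj, platform = "Visium")       # build spatial adjacency list
PRECASTObj <- AddParSetting(PRECASTObj, Sigma_equal = FALSE, verbose = TRUE)
PRECASTObj <- PRECAST(PRECASTObj, K = 14)                       # fit the model with K clusters
PRECASTObj <- SelectModel(PRECASTObj)                           # keep the selected fitted model
seuInt <- IntegrateSpaData(PRECASTObj, species = "Human")       # integrated Seurat object
```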