@WangHong007 it would be good to implement additional methods for feature normalization apart from quantile. The quantile method is quite strong, removing most of the variability across peptides. It would be good to enable other normalization methods (e.g. median) which have less impact on the data. Here is a good research paper about peptide normalization.
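As a sketch of what a milder, median-based method could look like (a minimal illustration with pandas; `median_normalize` is a hypothetical name, not an existing API in the codebase):

```python
import pandas as pd

def median_normalize(df: pd.DataFrame) -> pd.DataFrame:
    """Shift every sample (column) so its median matches the global
    median. Unlike quantile normalization, this only moves each
    sample's centre and leaves the shape of its intensity
    distribution untouched."""
    sample_medians = df.median(axis=0)
    global_median = sample_medians.median()
    return df - sample_medians + global_median

# toy intensity matrix: rows = peptidoforms, columns = samples
intensities = pd.DataFrame({"sample_a": [10.0, 12.0, 14.0],
                            "sample_b": [11.0, 13.0, 15.0]})
normalized = median_normalize(intensities)
print(normalized.median(axis=0))  # both sample medians are now equal
```

Because only the per-sample medians are shifted, between-peptide variability within a sample is preserved, which is the point of preferring it over quantile normalization.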
We tried several normalization methods: msstats, qnorm (fast quantile), and now additionally MedScale. Here are some points we need to focus on:
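For reference, plain quantile normalization (what qnorm computes, minus its performance optimizations) can be sketched in a few lines; this is an illustrative re-implementation for a complete matrix (no NaNs), not the qnorm code:

```python
import numpy as np
import pandas as pd

def quantile_normalize(df: pd.DataFrame) -> pd.DataFrame:
    """Force every sample (column) onto the same reference
    distribution (the mean of the sorted columns). This is why the
    method is 'strong': it equalizes the full distribution across
    samples, not just the centre. Assumes no missing values."""
    # reference value at each rank = mean across samples at that rank
    reference = np.sort(df.to_numpy(), axis=0).mean(axis=1)
    ranks = df.rank(method="first").astype(int) - 1  # 0-based ranks
    out = df.copy()
    for col in df.columns:
        out[col] = reference[ranks[col].to_numpy()]
    return out

df = pd.DataFrame({"s1": [5.0, 2.0, 3.0], "s2": [4.0, 1.0, 6.0]})
print(quantile_normalize(df))
```

After the call, both columns contain exactly the same set of values (1.5, 3.5, 5.5), just in their original rank order, which shows how much per-sample structure the method removes.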
For now, the object we normalize is the peptidoform (feature); peptide-level normalization is optional.
In peptidoform normalization, should we run remove_low_frequency_peptides first, followed by peptidoform selection and aggregation, and finally normalization? When comparing several normalization methods, the global standard deviation increased after these steps.
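To make the "global standard deviation" comparison concrete, one way to measure it is a pooled SD over all intensity cells (`global_sd` is an assumed helper name, and the median-centring line below stands in for whichever normalization method is being compared):

```python
import numpy as np
import pandas as pd

def global_sd(df: pd.DataFrame) -> float:
    """Pooled standard deviation over all intensity cells,
    ignoring missing values."""
    return float(np.nanstd(df.to_numpy()))

# toy wide matrix (peptidoforms x samples) with one missing value
before = pd.DataFrame({"s1": [10.0, 12.0, np.nan],
                       "s2": [20.0, 24.0, 22.0]})
# simple median-centring as a stand-in for the normalization step
after = before - before.median(axis=0) + before.median(axis=0).median()
print(global_sd(before), global_sd(after))
```

Computing this metric before and after each candidate step makes it easy to pin down which step in the pipeline ordering actually inflates the spread.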
We call dropna many times in this process, but when we build the pivot table it produces many null values, which affects the result of the aggregation function.
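This pivot/NaN interaction can be reproduced on a toy long-format table (illustrative column names, not the real schema):

```python
import numpy as np
import pandas as pd

# long-format feature table; peptide PEP2 was dropped for sample S2
long_df = pd.DataFrame({
    "peptide":   ["PEP1", "PEP1", "PEP2"],
    "sample":    ["S1",   "S2",   "S1"],
    "intensity": [100.0,  110.0,  50.0],
})

wide = long_df.pivot_table(index="peptide", columns="sample",
                           values="intensity", aggfunc="mean")
print(wide)
# the (PEP2, S2) cell comes out as NaN, and column aggregates
# silently skip it: the S2 mean is computed from a single value
print(wide.mean(axis=0))
```

So every dropna before the pivot reappears as a NaN cell after it, and pandas aggregations (mean, median, std) skip those cells rather than failing, which silently changes the sample-level statistics.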
What are the effects of fraction, biological replicate, condition, and Run within a sample? For example, biological replicates could map one-to-one to samples; the pivot will then generate impossible combinations for every other index value that should never appear with that biological replicate. That is how pandas pivot_table works. Here is the data before normalization (except for msstats normalization):
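The combination blow-up described above can be seen with pivot_table over multiple index keys and dropna=False (toy data and column names; in real runs the unobserved condition/replicate pairs show up as all-NaN rows):

```python
import pandas as pd

# biological replicate maps one-to-one to condition here:
# R1 only ever occurs with condition A, R2 only with condition B
df = pd.DataFrame({
    "condition": ["A", "A", "B", "B"],
    "biorep":    ["R1", "R1", "R2", "R2"],
    "peptide":   ["PEP1", "PEP2", "PEP1", "PEP2"],
    "intensity": [100.0, 50.0, 110.0, 55.0],
})

wide = df.pivot_table(index=["condition", "biorep"], columns="peptide",
                      values="intensity", aggfunc="mean", dropna=False)
print(wide)
# with dropna=False, pivot_table builds the Cartesian product of the
# index levels, so the impossible pairs (A, R2) and (B, R1) appear
# as all-NaN rows
```

With more index columns (fraction, condition, biorep, Run) the Cartesian product grows multiplicatively, which is why one-to-one mappings between index columns inflate the table with empty rows.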