Currently, we rely on the integrity of unique indexing in `data.variants_df` for `binarymap`, as well as for the `Model.get_variants_df()` and `Model.add_phenotypes_to_df()` methods. This is not a problem when collapsing barcodes, because the index is naturally reset during the aggregation step. However, we have no checks in place, and this caused a great deal of pain when debugging a recent CYP analysis for the Fowler lab. To fix this:
- If we are not aggregating barcodes, we should explicitly reset the index - rather than taking a simple copy - to ensure index integrity.
- Make clear in the docstrings for both `Data.variants_df` and the `Model.get_variants_df` method that the index of the passed-in functional score dataframe is not maintained in the returned variants dataframe.
- For `Model.add_phenotypes_to_df`, we need to ensure that the passed dataframe has a unique index - but there is no need to reset it, since maintaining the passed index seems like the desired behavior here. We can do this with a simple check.
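A minimal sketch of the two fixes in pandas. The function and column names here (`prep_variants_df`, `check_unique_index`, `aa_substitutions`, `func_score`) are illustrative assumptions, not the library's actual API:

```python
import pandas as pd


def prep_variants_df(func_score_df, collapse_barcodes=False):
    """Sketch: return a variants dataframe with a guaranteed-unique index.

    When collapsing barcodes, the groupby/agg step resets the index as a
    side effect. When not collapsing, reset the index explicitly instead
    of taking a plain copy, so the returned dataframe never carries over
    a duplicated index from the caller.
    """
    if collapse_barcodes:
        # aggregation produces a fresh 0..n-1 index on its own
        return (
            func_score_df.groupby("aa_substitutions", as_index=False)
            .agg(func_score=("func_score", "mean"))
        )
    # explicit reset guarantees a clean 0..n-1 index (note: the passed
    # index is NOT maintained in the returned dataframe)
    return func_score_df.reset_index(drop=True)


def check_unique_index(df):
    """Sketch of the check for add_phenotypes_to_df: require a unique
    index, but keep the passed index rather than resetting it."""
    if not df.index.is_unique:
        raise ValueError("The passed dataframe must have a unique index.")
    return df
```

The key distinction is that the first path discards the caller's index entirely, while the second only validates it - which is why the docstring change in the second point above matters.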