Merged
Conversation
- documentation on how to downsample
- documentation on how to provide the input
- not tested
Contributor
Pull Request Overview
This PR enables the pipeline to use a precomputed depths table, allowing users to bypass depth computation from BAM files. This feature is particularly useful for applying custom downsampling or re-running analyses with modified depth data without re-computing depths from scratch.
Key Changes
- Added configuration parameters
- Added configuration parameters `use_custom_depths` and `custom_depths_table` to control custom depths table usage
- Updated the depth analysis subworkflow to conditionally use the custom depths table or compute depths from BAMs
- Included documentation and an example Jupyter notebook demonstrating how to prepare a custom depths table
Reviewed Changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| nextflow.config | Added two new parameters to enable and specify custom depths table |
| nextflow_schema.json | Added schema definition for the custom depths table configuration options |
| subworkflows/local/depthanalysis/main.nf | Implemented conditional logic to use custom depths table when enabled |
| docs/usage.md | Documented the new feature with setup instructions and requirements |
| assets/useful_scripts/downsample_depths.ipynb | Added example notebook showing how to downsample depths and prepare custom table |
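The downsampling itself can be pictured with a short pandas/NumPy sketch. This is not the notebook's code: the column names (`CHROM`, `POS`, `DEPTH`) and the binomial-thinning approach are assumptions for illustration only; the required input format is the one described in `docs/usage.md`.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Illustrative depths table; real column names and format are defined by
# the pipeline's docs/usage.md, these are assumptions for the sketch.
depths = pd.DataFrame({
    "CHROM": ["chr1", "chr1", "chr2"],
    "POS":   [1000, 1001, 5000],
    "DEPTH": [900, 1200, 300],
})

FRACTION = 0.5  # keep roughly 50% of the reads at every position

# Binomial thinning: each read survives independently with probability
# FRACTION, mimicking a downsampling of the underlying BAM.
depths["DOWNSAMPLED_DEPTH"] = rng.binomial(depths["DEPTH"], FRACTION)

# Write the table so it can be passed back to the pipeline as a custom
# depths table (hypothetical output filename).
depths.to_csv("custom_depths.tsv", sep="\t", index=False)
```

Any thinning scheme works as long as the resulting table matches the input format the pipeline expects.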
Comments suppressed due to low confidence (1)
assets/useful_scripts/downsample_depths.ipynb:1
- The VAF calculation uses the original `filtered_df` index but calculates from the unfiltered `df`, which may cause index misalignment. The calculation should use `filtered_df['ALT_DEPTH'] / filtered_df['DEPTH']` instead.
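The failure mode flagged here can be reproduced with a toy table. The exact bug in the notebook may differ; this sketch only shows how mixing the unfiltered `df` with `filtered_df` can pair VAF values with the wrong rows, and why computing everything on `filtered_df` (as the review suggests) avoids it. The data values are made up.

```python
import pandas as pd

# Toy depths table; column names follow the review comment, data is illustrative.
df = pd.DataFrame({
    "DEPTH":     [100, 200, 50, 400],
    "ALT_DEPTH": [10,  40,  5,  100],
})

# Keep well-covered positions and renumber the rows, as notebooks often do.
filtered_df = df[df["DEPTH"] >= 100].reset_index(drop=True)

# Buggy pattern: the ratio is computed from the unfiltered df, so the third
# value comes from the dropped low-depth row, not from filtered_df's third row.
buggy_vaf = (df["ALT_DEPTH"] / df["DEPTH"]).iloc[: len(filtered_df)]

# Pattern suggested by the review: compute the VAF entirely on filtered_df.
filtered_df["VAF"] = filtered_df["ALT_DEPTH"] / filtered_df["DEPTH"]

print(buggy_vaf.tolist())           # [0.1, 0.2, 0.1] <- 0.1 is from the dropped row
print(filtered_df["VAF"].tolist())  # [0.1, 0.2, 0.25]
```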
Add one new option to run deepCSA with a custom depths table. This is particularly useful for applying tailored downsampling outside the pipeline and then re-running the analysis with the updated sets of mutations and depths.
AI summary
This pull request adds support for using a precomputed depths table in the pipeline, allowing users to skip the depth computation step when such data is already available. This enhancement makes the workflow more flexible and can save significant compute time for large cohorts. The changes include new configuration parameters, updates to the documentation, schema additions, and logic in the depth analysis subworkflow.
Documentation and configuration updates:
- `docs/usage.md` describing how to use a precomputed depths table, including setup instructions and requirements for the input file format.
- New parameters `use_custom_depths` and `custom_depths_table` in `nextflow.config` to control the use of a custom depths table.

Pipeline and schema enhancements:
- `nextflow_schema.json` extended with a new `custom_depths_table_usage` object, detailing configuration options for using a precomputed depths table.
- Depth analysis subworkflow (`subworkflows/local/depthanalysis/main.nf`) updated to use the custom depths table when enabled, bypassing depth computation from BAM files.
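The branching added to the subworkflow can be pictured as follows. The actual pipeline code is Nextflow; this Python stand-in only illustrates the control flow, with the two parameter names taken from this PR and everything else (function names, stub behavior) hypothetical.

```python
def resolve_depths(params, compute_depths_from_bams, load_table):
    """Return a depths table, preferring a user-supplied one when enabled.

    Illustrates the branch added in subworkflows/local/depthanalysis/main.nf;
    the real implementation is Nextflow, not Python.
    """
    if params.get("use_custom_depths") and params.get("custom_depths_table"):
        # Custom table provided and enabled: skip depth computation entirely.
        return load_table(params["custom_depths_table"])
    # Default behavior: compute depths from the BAM files.
    return compute_depths_from_bams()


# Toy usage with stub callables standing in for pipeline processes.
params = {"use_custom_depths": True, "custom_depths_table": "depths.tsv"}
result = resolve_depths(params, lambda: "computed", lambda p: f"loaded:{p}")
print(result)  # loaded:depths.tsv
```

With `use_custom_depths` unset (or no table given), the same call falls through to the BAM-based computation.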