Workflow for single-cell PacBio long-read data.
This pipeline demultiplexes and aligns PacBio CCS reads from single cells using the IsoSeq3 pipeline, following the guidelines developed by Liz Tseng. It is currently configured for cDNA with dual barcodes, one on either end of the read, and no UMIs. It produces a `.csv` file in the `collate/` results folder containing the isoform ID, the CCS read, and the identity of the top-matching barcode at each end.
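The collate output described above can be inspected with a short script. The sketch below is illustrative only: the column names (`id`, `ccs`, `barcode_5p`, `barcode_3p`) are assumptions for demonstration, so check the actual header of the `.csv` in your results folder before adapting it.

```python
import csv
import io

# Stand-in for the collate/ CSV described above; column names here are
# hypothetical and may not match the real pipeline output exactly.
example = io.StringIO(
    "id,ccs,barcode_5p,barcode_3p\n"
    "PB.1.1,m64012_1/100/ccs,BC01,BC01\n"
    "PB.1.2,m64012_1/101/ccs,BC01,BC02\n"
)

reader = csv.DictReader(example)
# Keep only reads whose two barcode calls agree -- a simple sanity
# filter for dual-barcoded cDNA, since both ends should carry the
# same cell barcode.
concordant = [row for row in reader if row["barcode_5p"] == row["barcode_3p"]]
for row in concordant:
    print(row["id"], row["barcode_5p"])
```

In practice you would open the real file from the `collate/` results folder instead of the inline `io.StringIO` example.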
The pipeline is built using Nextflow, a workflow tool that runs tasks across multiple compute infrastructures in a portable manner. It comes with Docker containers, making installation trivial and results highly reproducible.
i. Install Nextflow
ii. Install one of Docker, Singularity, or Conda
iii. Download the pipeline and test it on a minimal dataset with a single command:

```bash
nextflow run nf-core/scisoseq -profile test,<docker/singularity/conda>
```
iv. Start running your own analysis!

```bash
nextflow run nf-core/scisoseq -profile <docker/singularity/conda> --input 'data/*.bam' --genome GRCm38
```
See the usage docs for all available options when running the pipeline.
The nf-core/scisoseq pipeline comes with documentation, found in the `docs/` directory:
- Installation
- Pipeline configuration
- Running the pipeline
- Output and how to interpret the results
- Troubleshooting
nf-core/scisoseq was originally written by Geoff Stanley.
If you would like to contribute to this pipeline, please see the contributing guidelines.
For further information or help, don't hesitate to get in touch on Slack (you can join with this invite).
You can cite the nf-core pre-print as follows:
Ewels PA, Peltzer A, Fillinger S, Alneberg JA, Patel H, Wilm A, Garcia MU, Di Tommaso P, Nahnsen S. nf-core: Community curated bioinformatics pipelines. bioRxiv. 2019. p. 610741. doi: 10.1101/610741.
The packages used herein were largely developed by Pacific Biosciences (IsoSeq), Elizabeth Tseng (SQANTI2), and the Conesa lab (SQANTI).
I have packaged them into a convenient Nextflow pipeline with Docker so that they can be easily run on any cloud service, private server cluster, or local machine.