Custom scaling and smoothening enhancements #5
Comments
Hi, I am guessing that by custom y-scale you mean having manual limits on the y-axis for every track?
Sure thing! Yes, I meant setting a manual y-scale limit for each track (or group of tracks). For window averaging, it would be similar to an option in deeptools/bamCompare, e.g. `--smoothLength`: "The smooth length defines a window, larger than the binSize, to average the number of reads. For example, if the `--binSize` is set to 20 and the `--smoothLength` is set to 60, then, for each bin, the average of the bin and its left and right neighbors is considered. Any value smaller than `--binSize` will be ignored and no smoothing will be applied."
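As an illustration of the `--smoothLength` behaviour quoted above, here is a minimal Python sketch. The function name and the use of a simple running mean over numpy arrays are my own assumptions; deeptools' actual implementation may differ in edge handling:

```python
import numpy as np

def smooth_bins(values, bin_size, smooth_length):
    """Average each bin with its neighbours, --smoothLength style (sketch).

    The window spans smooth_length bases, i.e. smooth_length // bin_size
    bins, rounded down to an odd count so it stays centred on each bin.
    """
    values = np.asarray(values, dtype=float)
    if smooth_length <= bin_size:
        return values  # smaller than binSize: ignored, no smoothing
    n_bins = smooth_length // bin_size
    if n_bins % 2 == 0:
        n_bins -= 1  # keep the window symmetric around the centre bin
    kernel = np.ones(n_bins) / n_bins
    # mode="same" keeps the output the same length as the input; bins at
    # the edges are averaged against zero-padding, as in a plain running mean.
    return np.convolve(values, kernel, mode="same")

# binSize 20, smoothLength 60: each bin averaged with its two neighbours
smoothed = smooth_bins([0, 3, 6, 3, 0], bin_size=20, smooth_length=60)
```

With these inputs the central bin (6) becomes the mean of (3, 6, 3) = 4, and so on across the track.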
I will see what I can do.
I would also very much appreciate being able to provide a vector of y-max values. Sometimes with ChIP-seq data I want to show that there is no signal across a particular genomic window, and I would like to set the scale based on the signal at a known target where my protein binds. Right now, if I do that, the scale is set based on the background levels. Thanks!
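The use case above, anchoring a no-signal track to the scale of a known target, can be sketched with matplotlib. The coverage vectors and file name here are purely illustrative:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for scripting
import matplotlib.pyplot as plt

# Hypothetical coverage vectors for two genomic windows (illustrative only).
target_signal = [0, 2, 8, 12, 8, 2, 0]                # known binding site
background = [0.1, 0.3, 0.2, 0.4, 0.3, 0.2, 0.1]      # window with no signal

# Anchor the y-axis to the known target's peak, so the background track is
# drawn on the same scale and appears visibly flat instead of auto-scaled.
ymax = max(target_signal)

fig, axes = plt.subplots(2, 1, sharex=True)
for ax, signal, label in zip(axes, [target_signal, background],
                             ["known target", "no-signal window"]):
    ax.fill_between(range(len(signal)), signal, step="mid")
    ax.set_ylim(0, ymax)  # shared manual limit instead of per-track autoscale
    ax.set_ylabel(label)
fig.savefig("tracks.png")
```

Auto-scaling each track independently would stretch the background noise to fill the panel; the shared `set_ylim` is what makes the absence of signal readable.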
I have added
I think that instead of writing:
it should be:
Thanks a lot for the update!
Indeed, that was the case. I have corrected it. Thanks for checking.
Hi! Has the smoothening window enhancement been applied yet? P.S. I find this a very simple and easy-to-use pipeline. Thanks for the package!
Hello,
I have been looking for a package like this for a long time so I am a big fan. Any chance there may eventually be options for choosing our own y-scaling and track smoothening/window averaging?
All the best,
Ryan