split really long songs #26
Comments
Can we add a catch for really long songs? Not a complete fix for how to handle them, but at least let the user skip the file before it crashes Chipper.
Added a catch for long songs that lets the user skip loading a file when it is flagged as too long. #39
@JamesPino So, I merged the changes and closed this issue today because it seemed like we had everything working. However, as I worked on other issues I realized that checking the number of columns requires creating the sonogram first, which means Chipper still crashes if someone loads a super long song they have not split up yet. Would it be difficult to change the catch to look at the size of the WAV file rather than the column count of the sonogram?
The easy answer is to just hard-code a file size limit (basically what we do now, but checked right before the load). I can do this today.
I agree, I think the easy answer is enough. I will send you an even longer file so you can see how someone might try to load a full recording rather than a small clip, which makes it crash. When I started using your first fix, I realized I was controlling what looked "good" in Chipper rather than what was technically manageable for the computer to load. I think we should go with the second approach (by file size) and let the user use their own discretion about what looks adequate to parse.
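A minimal sketch of the file-size catch discussed above. The threshold and function name here are hypothetical, not Chipper's actual implementation; the point is that `os.path.getsize` runs before any sonogram is built, so an oversized recording can be skipped before it crashes the app:

```python
import os

# Hypothetical threshold; the actual hard-coded limit in Chipper may differ.
MAX_WAV_BYTES = 50 * 1024 * 1024  # 50 MB

def check_wav_size(path, max_bytes=MAX_WAV_BYTES):
    """Return True if the WAV file is small enough to load safely.

    Checking the file size on disk avoids creating the sonogram first,
    so very long recordings can be caught before loading begins.
    """
    return os.path.getsize(path) <= max_bytes
```

The caller would warn the user and offer to skip the file whenever this returns `False`.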
The change to check file size is complete and merged.
Flag for too large of a file; split really long songs, then stitch them back together before writing (@asearfos can integrate a Kivy popup to warn users not to change the sliders between songs).
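The split-then-stitch idea in the last comment could be sketched as follows. These helpers are hypothetical illustrations of the approach, not Chipper's code: a long sample sequence is cut into fixed-size chunks for processing, then concatenated back in order before writing out the result:

```python
def split_into_chunks(samples, chunk_size):
    """Split a long sample sequence into fixed-size chunks.

    The last chunk may be shorter than chunk_size.
    """
    return [samples[i:i + chunk_size] for i in range(0, len(samples), chunk_size)]

def stitch_chunks(chunks):
    """Concatenate processed chunks back into one sequence before writing."""
    return [sample for chunk in chunks for sample in chunk]
```

Because stitching simply concatenates the chunks in order, the round trip is lossless as long as slider settings are not changed between songs, which is exactly what the proposed popup would warn against.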