
Feature implementation #8

Open
jeremydeanw opened this issue Jan 21, 2021 · 3 comments

@jeremydeanw

Hi, I was wondering if it would be possible to add the ability to play the MIDI file that's created, in real time as a loop on the page, so you could hear what is being changed on the fly while messing around with the numbers (or knobs?) before exporting.

@atamocius (Owner) commented Jan 24, 2021

It is possible. What makes it a bit "involved" is deciding what sounds to make available.

The app uses Tone.js's MIDI parsing module to manipulate the data, so bringing in Tone.js itself as a sound module is a natural step.

However, I believe Tone.js is a synth (i.e. oscillators, envelopes, etc.). I think what you are looking for is a General MIDI implementation (something like midi-js-soundfonts) so a user can choose drums, piano, or whatever sounds are available to play the notes.

Unfortunately, I do not have a lot of bandwidth right now to make changes. Feel free to fork and see if you can integrate a General MIDI module and feed it the notes (you might need to detect what MIDI channel the original notes are on and alter it so you can play the notes on screen, e.g. channel 10 to play drums). A rough sketch of the idea is below.
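
As a starting point, a preview loop might look something like this sketch, which pairs Tone.js with its `@tonejs/midi` parser. The function name and the `ArrayBuffer` input are hypothetical (not the app's actual API), and a proper version would replace the basic synth with a General MIDI soundfont:

```js
import * as Tone from 'tone';
import { Midi } from '@tonejs/midi';

// Hypothetical sketch: preview a MIDI file as a loop in the browser.
// `midiData` is an ArrayBuffer of the file (not the app's real API).
async function previewLoop(midiData) {
  const midi = new Midi(midiData);
  const synth = new Tone.PolySynth(Tone.Synth).toDestination();

  midi.tracks.forEach(track => {
    // Channel 10 (index 9) carries drums in General MIDI; a full
    // version would route those notes to a drum soundfont instead
    // of skipping them.
    if (track.channel === 9) return;

    // @tonejs/midi notes already carry time, duration, and velocity,
    // so they can be fed straight into a Tone.Part.
    new Tone.Part((time, note) => {
      synth.triggerAttackRelease(note.name, note.duration, time, note.velocity);
    }, track.notes).start(0);
  });

  Tone.Transport.loop = true;
  Tone.Transport.loopStart = 0;
  Tone.Transport.loopEnd = midi.duration;
  await Tone.start(); // must be called from a user gesture
  Tone.Transport.start();
}
```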

Take note, though, that the app is really meant as a quick and dirty tool to add some sort of humanization. The effect is barely noticeable unless you exaggerate the parameters. It is really intended for softening repeated notes (e.g. humanizing snare fills in 16th notes, or simulating a laid-back drummer).

The intended workflow is to export a MIDI track from your DAW, upload it to the app to humanize it, and then import it back into your DAW to hear the results.

@jeremydeanw (Author)

Makes sense, thanks for the response. I'm not the best coder, hah. A question about the timing: are the milliseconds a range, or is the amount exact? For instance, if I had timing set to 70ms, is it a range from 0ms to 70ms, or always exactly 70ms?
I'm trying it out with MIDI files from my drum machine; it works really well, actually.

@atamocius (Owner) commented Jan 26, 2021

It is a range. As per your example, if you set timing to 70ms, it is actually a range of random offset values from -70ms to +70ms applied to every note (I actually had to take a peek at the code to confirm this 😅).

This is also true for the velocity parameter.

You can see the meat and potatoes here: https://github.com/atamocius/midi-humanizer-app/blob/master/src/js/logic/Humanizer.js
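
In code terms, the idea is roughly this. It's an illustrative sketch only; the function and parameter names here are made up, not the ones in Humanizer.js:

```js
// Illustrative sketch of the range behavior described above; the
// function and parameter names are hypothetical, not Humanizer.js's.
function humanizeNote(note, timingMs, velocityAmount) {
  // Uniform random offset in [-timingMs, +timingMs], in seconds.
  const timeOffset = (Math.random() * 2 - 1) * (timingMs / 1000);
  // Same range idea for velocity (normalized 0..1).
  const velocityOffset = (Math.random() * 2 - 1) * velocityAmount;

  return {
    ...note,
    time: Math.max(0, note.time + timeOffset),
    velocity: Math.min(1, Math.max(0, note.velocity + velocityOffset)),
  };
}
```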

I am glad it is of use to people other than myself 😁.
