
add ability to access raw wave file #16

Closed
wants to merge 1 commit into from

Conversation

@mtmckenna (Contributor)

Hello! I'd like to be able to get the raw wave data from jsfx so I can pass it through an AudioContext to play later. Something like this:

var context = new AudioContext();
var source = context.createBufferSource();
var wave = jsfx.Wave(params);

// decodeAudioData is asynchronous, so connect and start
// playback in the callback once the buffer is ready
context.decodeAudioData(wave, function(decodedBuffer) {
  source.buffer = decodedBuffer;
  source.connect(context.destination);
  source.start(0);
});

You already had the ability to return a Uint8Array, so in this commit, I exposed that function. I also moved some common logic into its own function. Do you think exposing this method makes sense?

Thank you!

McKenna

@egonelbre (Member)

Wasn't it possible to use jsfx.Sound directly? Also, since it's a plain PCM buffer, you should be able to use the float PCM directly, e.g.:

var processor = new jsfx.Processor(params);
var block = new Float32Array(processor.getSamplesLeft());
processor.generate(block);

source.buffer = block;

I think something like this would make more sense:

jsfx.SoundData = function(params, modules){
	var processor = new Processor(params, modules);
	var block = new Float32Array(processor.getSamplesLeft());
	processor.generate(block);
	return block;
};

So the above usage could be simplified to:

source.buffer = jsfx.SoundData(params);

@egonelbre (Member)

Although, naming it jsfx.SoundBuffer would make it more consistent with AudioContext.

@egonelbre (Member)

BTW, you can already use AudioContext directly:

var node = jsfx.Node(context, params, jsfx.DefaultModules, 2048);
node.connect(context.destination);

@mtmckenna (Contributor, Author)

Hello,

I did try to use jsfx.Sound directly with createMediaElementSource, but I found it harder to use than createBufferSource because (I think) I still had to play and pause using the <audio> element's play and pause functions. I'm not 100% positive, but I think that means that even though I was using the Web Audio API, I was still bogged down by the limitations of the <audio> tag on mobile.

Thank you for your SoundData solution--it definitely looks cleaner. I tried it in my app and received this error: Uncaught TypeError: Failed to set the 'buffer' property on 'AudioBufferSourceNode': The provided value is not of type 'AudioBuffer'.

I'll try to create a little demo app that reproduces the problem this weekend and get back to you. I'd love SoundData to be something I can use!

Thank you for your help!

McKenna

@mtmckenna (Contributor, Author)

Thanks for your help--I was able to use jsfx.Node directly with AudioContext, so that was good. The problem, though, is that ScriptProcessorNode doesn't have start() and stop() methods, so I wasn't able to find a way to replay the generated effect without creating a new Node.

My use case is that I'm building a game that uses sound effects generated by jsfx. I'd like to generate the sound effects once when the app loads and be able to replay them on demand. I could use the jsfx.Node way of doing it if I regenerate the Node each time the effect needs to be played, but that seems inefficient.

Also, I tried adding a SoundBuffer method to jsfx as you described. However, I wasn't able to get that to work either. I created a branch on my fork of jsfx that adds a "Play w/ Sound Buffer" button to index.html that reproduces the error, in case you're interested in looking at that.

Let me know if I can provide any other info. If it doesn't make sense to you to add a Wave type method to jsfx, that's okay--I can keep using my fork.

Thank you!

McKenna

@egonelbre (Member)

I'm guessing that instead of a Float32Array it would need to use an AudioBuffer.

If we can't get the SoundBuffer version working then I'm fine with Wave.
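A hedged sketch of that conversion (the `floatBlockToAudioBuffer` helper name is hypothetical, not part of jsfx): copy the generated Float32Array PCM into an AudioBuffer created via `AudioContext.createBuffer`, which is the type `AudioBufferSourceNode.buffer` requires.

```javascript
// Hypothetical helper (not part of jsfx): wrap a Float32Array PCM block
// in an AudioBuffer so it can be assigned to AudioBufferSourceNode.buffer.
function floatBlockToAudioBuffer(context, block) {
  // One mono channel, block.length sample frames, at the context's sample rate.
  var buffer = context.createBuffer(1, block.length, context.sampleRate);
  buffer.getChannelData(0).set(block); // copy the PCM samples into channel 0
  return buffer;
}
```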

@egonelbre (Member)

Hi, I've finally had some time to look at this.

This should work now:

var source = context.createBufferSource();
source.buffer = jsfx.AudioBuffer(context, params);
source.connect(context.destination);
source.start();
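Since an AudioBufferSourceNode can only be started once, the generate-once-replay-on-demand use case discussed above can be covered by caching the AudioBuffer and creating a fresh source node per playback; a sketch under that assumption (the `makePlayer` helper is hypothetical, not part of jsfx):

```javascript
// Hypothetical helper: cache one AudioBuffer, return a function that
// creates a fresh one-shot source node on every call.
function makePlayer(context, buffer) {
  return function play() {
    var source = context.createBufferSource(); // source nodes cannot be restarted
    source.buffer = buffer;
    source.connect(context.destination);
    source.start(0);
    return source;
  };
}
```

So a game could build `var playEffect = makePlayer(context, jsfx.AudioBuffer(context, params));` once at load and call `playEffect()` whenever the sound should fire.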

@egonelbre closed this Feb 13, 2017
@mtmckenna (Contributor, Author)

Thank you! I was just able to try this out today, and it worked perfectly. Thanks for taking the time to add this feature.

Would you also be willing to update the package on NPM?

Thank you!
