
Streaming audio file #14

Open
mhuusko5 opened this issue Dec 22, 2012 · 15 comments
Comments

@mhuusko5

On my computer I have a Node.js server that serves up audio files (aac or mp3).
If I enter a URL like "http://localhost:5346/?song=testaac" into my browser, the aac file starts to download. However, if in the browser I try to play the aac file with Aurora, it doesn't work.
I do this:
var player = Player.fromUrl('http://localhost:5346/?song=testaac');
player.play();

The issue seems to be that the first thing Aurora does is make a HEAD request, instead of just GETing the audio content?

What am I missing here/what might be a solution so I can get audio files playing?

Thanks

@comster

comster commented Dec 22, 2012

I've spent a good amount of time using Aurora and browser media requests against my Node.js server (and MongoDB GridFS files).

Straightening out my HTTP 206 and HEAD responses helped me get it working. I forget exactly how it reacted to each of my responses, but I eventually ended up with this, which might help you:
https://github.com/comster/house/blob/master/lib/endPoints/files/index.js#L105

Let me know if that helps.
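For reference, the core of the 206 handling in a handler like the one linked above comes down to parsing the Range header and answering with Content-Range/Content-Length. A minimal sketch, assuming byte-range requests against a file of known size (`parseRange` is an illustrative helper, not from that repo):

```javascript
// Parse a "Range: bytes=start-end" header and compute the 206 response
// values. Returns null when the range is absent or unsatisfiable, in which
// case the server should fall back to a plain 200 with the whole file.
function parseRange(rangeHeader, fileSize) {
  const match = /^bytes=(\d*)-(\d*)$/.exec(rangeHeader || '');
  if (!match) return null;
  const start = match[1] ? parseInt(match[1], 10) : 0;
  const end = match[2]
    ? Math.min(parseInt(match[2], 10), fileSize - 1)
    : fileSize - 1;
  if (start > end || start >= fileSize) return null; // unsatisfiable
  return {
    status: 206,
    start,
    end,
    headers: {
      'Content-Range': `bytes ${start}-${end}/${fileSize}`,
      'Content-Length': end - start + 1,
      'Accept-Ranges': 'bytes',
    },
  };
}
```

The server would then stream only bytes `start` through `end` of the file as the response body.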

@devongovett
Member

Make sure you're serving Aurora from the same server you're serving media from, or provide CORS headers for the media files; otherwise it won't be able to make cross-domain requests. The HEAD request is necessary to get the file size headers (make sure you're serving those too) so that it can download the file in chunks rather than waiting until the entire thing is downloaded before playback begins. Let me know if you figure it out!

@mhuusko5
Author

comster, that's exactly what I'm looking for.

It kind of brings up a new issue though... what my Node.js server actually does is grab video, strip out the audio, and serve up that audio, all on the fly. So when that HEAD request comes in, it is impossible for the Node.js server to know what the size of the audio file will be.

Before, when I was just using an HTML5 audio element, I left it as chunked encoding, gave no Content-Length, and everything was fine. Can Aurora work this way too, or might there be a workaround?

thanks

@mpilone

mpilone commented May 29, 2013

I'm having the same issue here. My server side converts some audio files on the fly and streams them back using chunked encoding (http://en.wikipedia.org/wiki/Chunked_transfer_encoding). It seems like I would need to convert the file on the HEAD request and cache it so I can return a valid file size, but that kills end-user performance because I can't stream the audio as it is converted.

@devongovett
Member

The problem is that browsers don't offer a streaming XMLHttpRequest API, so we're forced to make a single HEAD request to get the total file size, followed by many small range requests in order to simulate streaming. If you're serving up files on the fly, your server probably won't accept range requests and/or will have to regenerate the audio data for each one. So, unfortunately, until browsers add a streaming API that we can make use of (basically support for chunked binary XHR), we can't do much if anything about it. The only solution as far as I can see is to cache the converted audio file on your server when the HEAD request comes in, and serve that file with range-request support from then on...
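The pattern described here (one HEAD for the total size, then a sequence of small byte-range GETs) comes down to simple range arithmetic. A sketch of that math, with `rangesFor` as an illustrative helper rather than anything from Aurora's source:

```javascript
// Given the total size learned from the HEAD response, produce the Range
// header values for the sequence of small GETs that simulate streaming.
function rangesFor(totalSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < totalSize; start += chunkSize) {
    const end = Math.min(start + chunkSize - 1, totalSize - 1);
    ranges.push(`bytes=${start}-${end}`);
  }
  return ranges;
}

// Each value would be sent as a request header on its own XHR, e.g.:
// xhr.setRequestHeader('Range', 'bytes=0-65535');
```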

@fabslab
Contributor

fabslab commented Jul 26, 2013

This points to a better future: http://www.w3.org/TR/streams-api/ :)

@mscdex

mscdex commented Aug 16, 2013

What about streaming via websockets?

@fabslab
Contributor

fabslab commented Aug 16, 2013

Aurora.js doesn't include that currently, so I've just started working on a solution for it. The documentation for Aurora.js could use a lot of improvement though; it's been a bit slow to get an understanding of all the code.

@fabslab
Contributor

fabslab commented Aug 20, 2013

Voila, WebSocket streaming in a branch on my fork ->
https://github.com/fsbdev/aurora.js/blob/WebSocket-Source/src/sources/browser/websocket.coffee

Let me know if this could be accepted into the main repo. Feedback is welcome.

The corresponding Node.js server implementation you'll need to run it is here ->
https://gist.github.com/fsbdev/6277584
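The server side of WebSocket streaming boils down to reading the file and sending it over the socket in fixed-size binary frames. A minimal sketch of the chunking; the `ws` usage in the comments is an assumption for illustration, see the gist above for the actual implementation:

```javascript
// Split a Buffer into fixed-size chunks, yielding views into the original
// buffer (no copying).
function* chunks(buffer, chunkSize) {
  for (let offset = 0; offset < buffer.length; offset += chunkSize) {
    yield buffer.subarray(offset, offset + chunkSize);
  }
}

// With the `ws` package, each chunk would go out as a binary frame:
// const WebSocket = require('ws');
// const wss = new WebSocket.Server({ port: 8080 });
// wss.on('connection', (socket) => {
//   const file = require('fs').readFileSync('song.mp3');
//   for (const chunk of chunks(file, 64 * 1024)) socket.send(chunk);
// });
```

On the client, the socket's `binaryType` would be set to `'arraybuffer'` so each message arrives as raw bytes to feed the decoder.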

@fabslab
Contributor

fabslab commented Oct 6, 2013

I've now turned the code into a plugin:
https://github.com/fsbdev/aurora-websocket

@rodhoward

Hi fsbdev,

I have got your websocket plugin to work!! Thanks! I can now stream audio from mp3 files over the websocket and play them using the Aurora mp3.js codec.

However, what I can't seem to do is get it to work when streaming directly from ffmpeg's stdout using a spawned child process. The first issue was the file size (for live streaming this is unknown), but that seems like it's just for the progress bar, so not a big deal; I just returned a large number. So, bottom line: I'm getting data into your plugin and it is emitting the data events, no errors, but also no sound.

Any ideas would be more than welcome! Thanks again for a great contribution.

@fabslab
Contributor

fabslab commented Jan 31, 2014

Yes, the file size is optional. The issue is most likely with the mp3.js code. I've looked at an issue like this before where that was the case. You could try your luck with one of the other codecs, like flac.js.

@pafnat

pafnat commented Sep 12, 2014

Any progress?

@MidnightJava

fsbdev, I'm using your plugin and the vorbis codec to stream vorbis-encoded data from a software defined radio implementation into my client. It works for a while and then stops. I have to reload the web app page to get it working again. It cuts out after a variable amount of time, anywhere from a few seconds to 45 or 50 seconds.

The same WebSocketServer doesn't cut out when used with a different client implementation, which pulls the data from each call to the listener and dumps it into an audio buffer in a Web Audio player implementation. That one doesn't stop playing, but it has drop-outs every few seconds, which is why I'm trying to use your WebSocket plugin.

We probably have something wrong in our WebSocketServer. Can you think of anything that we might be doing that would cause the WebSocket plugin to drop the connection and stop playing the audio?

@revolunet

Here's a cool related demo that uses the new fetch browser API to stream a remote file: https://jakearchibald.com/2015/thats-so-fetch
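The idea in that demo is reading the response body incrementally through a ReadableStream reader instead of waiting for the whole download. A sketch of the read loop; the decoder hookup in the usage comment is hypothetical, and the helper works on any ReadableStream (browser or Node 18+):

```javascript
// Pull chunks off a ReadableStream as they arrive, handing each to a
// callback; resolves with the total byte count once the stream ends.
async function consume(stream, onChunk) {
  const reader = stream.getReader();
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) return total;
    total += value.length;
    onChunk(value); // e.g. feed bytes into the decoder as they arrive
  }
}

// Usage with fetch:
// const res = await fetch('http://localhost:5346/?song=testaac');
// await consume(res.body, (bytes) => decoder.push(bytes));
```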
