Is it possible to add an endpoint that maps to the live audio stream being played by a Core?
I'm using the Web Audio API to graph audio channels in my web player. This works fine provided I have a path to the remote stream.
So, something similar to how Roon connects to Sonos via a "hash.flac" stream. If Roon could return that URL in the transport payload, I'd be able to connect to it and finish the visualizer.
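For context, the client-side wiring is small once a stream URL exists. A minimal sketch, assuming a hypothetical `streamUrl` (the path shown is invented, not a real Roon endpoint); the browser-only part is guarded so the helper also runs outside a browser:

```javascript
// Map an FFT bin index to its center frequency in Hz -- handy for
// labelling visualizer bars. Pure math, no browser APIs needed.
function binToFrequency(bin, fftSize, sampleRate) {
  return (bin * sampleRate) / fftSize;
}

// Browser-only wiring: point an <audio> element at the stream and tap
// it with an AnalyserNode. Guarded so this file also loads under Node.
if (typeof AudioContext !== "undefined") {
  const streamUrl = "https://core.local/stream/hash.flac"; // assumption, not a real URL
  const audio = new Audio(streamUrl);
  audio.crossOrigin = "anonymous";

  const ctx = new AudioContext();
  const source = ctx.createMediaElementSource(audio);
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;

  source.connect(analyser);
  analyser.connect(ctx.destination);

  const bins = new Uint8Array(analyser.frequencyBinCount);
  function draw() {
    analyser.getByteFrequencyData(bins); // 0-255 magnitude per bin
    // ...render `bins` to a canvas here...
    requestAnimationFrame(draw);
  }
  audio.play().then(draw);
}
```

With `fftSize = 2048` at a 44.1 kHz sample rate, each of the 1024 bins covers about 21.5 Hz.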
Or, if that's not possible, perhaps the Roon Core could analyze the channel frequencies and return an array of the frequencies with the "changed" event that fires every second.
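To make the alternative concrete: the analysis the Core would have to do amounts to a magnitude spectrum over a short PCM window. A hedged sketch using a naive DFT (a real implementation would use an FFT); the event name comes from this thread, but the payload shape is invented:

```javascript
// Naive DFT magnitude spectrum of a PCM window -- O(n^2), fine for a
// sketch; a production implementation would use an FFT library.
function magnitudeSpectrum(samples) {
  const n = samples.length;
  const out = [];
  for (let k = 0; k < n / 2; k++) {
    let re = 0;
    let im = 0;
    for (let t = 0; t < n; t++) {
      const angle = (-2 * Math.PI * k * t) / n;
      re += samples[t] * Math.cos(angle);
      im += samples[t] * Math.sin(angle);
    }
    out.push(Math.sqrt(re * re + im * im));
  }
  return out;
}

// Hypothetical shape for a frequency array attached to the once-per-second
// "changed" event -- an assumption for illustration, not a real Roon API.
function makeChangedEvent(samples, sampleRate) {
  return {
    event: "changed",
    sample_rate: sampleRate,
    frequencies: magnitudeSpectrum(samples),
  };
}
```

A pure tone landing exactly on bin k shows up as a single spike of magnitude n/2 at that bin, which is the kind of array a bar-graph visualizer could consume directly.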
We are not intending to expose streams directly like this. We might consider adding support for visualization plugins down the road, but there is nothing on the roadmap right now.