bitecs audio-zone-system & audio-debug-system #5944
Conversation
The audio zone system seems mostly correct. I left some suggestions, mostly around naming things.
I don't think we need to do this now, but I suspect that as we have more interactions triggered by collision volumes ("zones"), we will pretty quickly want to standardize how we check collisions (with masks, etc). In other words, `AudioEmitter`s should be `Collidable`s (on the "audio emitter" layer). Audio zones are just `CollisionVolume`s that collide with the "audio emitter" layer. We can probably also avoid using `Set`s of entity ids by reversing the relationship: `Collidable`s would have a `zoneCollisions` bitmask which indicates which zones they are colliding with. Each `CollisionVolume` would be given an id and we'd set the `zoneCollisions` with `1 << zoneId`. This would limit the number of `CollisionVolume`s we had (e.g. 64 if `zoneCollisions` was a `ui64`), but this also seems like an easy limitation to remove if we needed more zones.
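A rough sketch of the bitmask idea described above (all names here are hypothetical, not existing Hubs APIs; a 32-bit mask is used since JS bitwise operators work on 32 bits, a `ui64` would need `BigInt`):

```typescript
// Hypothetical sketch: each zone gets a small integer id; a collidable
// stores the set of zones it currently overlaps as bits in one number.
const MAX_ZONES = 32;

let nextZoneId = 0;
function allocateZoneId(): number {
  if (nextZoneId >= MAX_ZONES) throw new Error("Out of zone ids");
  return nextZoneId++;
}

// Per-entity mask, indexed by entity id (stand-in for a bitecs field).
const zoneCollisions: number[] = [];

function enterZone(eid: number, zoneId: number) {
  zoneCollisions[eid] = (zoneCollisions[eid] | 0) | (1 << zoneId);
}

function exitZone(eid: number, zoneId: number) {
  zoneCollisions[eid] = (zoneCollisions[eid] | 0) & ~(1 << zoneId);
}

function isInZone(eid: number, zoneId: number): boolean {
  return ((zoneCollisions[eid] | 0) & (1 << zoneId)) !== 0;
}
```

The enter/exit checks replace `Set` membership tests with single bitwise ops, at the cost of a fixed zone budget.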
It seems like an `AudioParams` bitecs component was introduced, but doesn't seem to be used (and would duplicate data we have elsewhere, if it is). We should remove it if that's the case.
There's some fishy business happening with the maps where we're storing extra component data, and with the (singleton) `AudioListener`, that we should try to sort out.
I haven't had a chance to read the audio debug system yet.
const currZones = new Map<number, Set<number>>();
const prevZones = new Map<number, Set<number>>();
const aabbs = new Map<number, Box3>();
It seems like this is (non-numeric) component data that we can store using the standard "escape hatch".
I would like to know more about the benefit of following the escape hatch pattern vs. having module-level maps. If I were an external client developer, what would following this pattern buy me in this case? At first glance the benefit is having an entry point for all component-bound data structures, and I think that's a good thing. In this case these maps are storing transient data that isn't of much interest to external systems. Is this assumption right, or am I missing the point?
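For context, whether the map is module-level or centralized behind an "escape hatch", the shape of the data is the same; the question is only where it lives and who clears it. A minimal sketch (names hypothetical, `Box3` replaced by a plain object to stay self-contained):

```typescript
// Hypothetical sketch: bitecs components only hold numeric data, so
// non-numeric per-entity data (a Box3, a Set, etc.) lives in a side map
// keyed by entity id. The real question in this thread is where this map
// lives and which system is responsible for cleanup.
type EntityId = number;
type Aabb = { min: [number, number, number]; max: [number, number, number] };

const aabbs = new Map<EntityId, Aabb>();

function setAabb(eid: EntityId, min: Aabb["min"], max: Aabb["max"]) {
  aabbs.set(eid, { min, max });
}

// Must run when the entity exits the query, or the map leaks entries.
function cleanupAabb(eid: EntityId) {
  aabbs.delete(eid);
}
```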
src/bit-systems/audio-zone-system.ts
Outdated
getScene().then(() => {
  (APP.store as any).addEventListener("statechanged", () => {
    const isDebugEnabled = APP.store.state.preferences.showAudioDebugPanel;
    if (isDebugEnabled) createMaterial();
    defineQuery([AudioZone])(APP.world).forEach(zone => {
      if (isDebugEnabled) {
        !debugObjects.has(zone) && addZoneDebugObject(APP.world, zone);
      } else {
        debugObjects.has(zone) && releaseZoneDebugObject(APP.world, zone);
      }
    });
    if (!isDebugEnabled && debugMaterial) {
      disposeMaterial(debugMaterial);
      debugMaterial = null;
    }
  });
  const isDebugEnabled = APP.store.state.preferences.showAudioDebugPanel;
  if (isDebugEnabled) createMaterial();
});
I think this block can be moved to a regular system. It doesn't need to use `getScene` or listen to `statechanged` events from the store:
let isShowing = false;
function audioDebugPanelToggleSystem(world: HubsWorld) {
const shouldShow = APP.store.state.preferences.showAudioDebugPanel;
if (isShowing && !shouldShow) {
// teardown
  } else if (!isShowing && shouldShow) {
// setup
}
isShowing = shouldShow;
}
IIUC this would make us repeat this same code in any system where we want to read this preferences parameter (i.e. we would also need to do this in the audio debug system), or centralize it in a system that calls functions on other systems explicitly.
An alternative would be to move preferences to bitecs, where we'd have `[Xyz]Preference` components; then any interested system could just query enter/exit on the preference it is monitoring and access it through the store, so you only query when it has actually changed.
src/bit-systems/video-system.ts
Outdated
if (videoEl.paused) {
  APP.isAudioPaused.add(audioEid);
} else {
  APP.isAudioPaused.delete(audioEid);
Whenever we change any of the audio settings maps on APP, we have to call updateAudioSettings, or else the changes won't be flushed to the audios.
`isAudioPaused` doesn't affect the audio node settings, so `updateAudioSettings` isn't necessary here, but this is confusing, and it might not be easy to know when calling `updateAudioSettings` is required. This raises a good point. We have a few state stores (`isAudioPaused`, `clippingState`, `mutedState`, etc) that we might be able to replace with flags in the `AudioEmitter` component. That would move that state to the right place and remove a bit of complexity from the maps. So I'm going to do that for `isAudioPaused` and `mutedState`, which are the ones we are touching now, and move the rest as I migrate other parts.
The problem you encountered with having `AudioParams` on an entity that is not the `LoadedEntity` is a tricky one. We should keep track of any times where we are running into this problem. I don't know yet if it's something we should just accept or if we will eventually want to revisit the "load this as a child entity" pattern we have for various media types.
Left some comments about things I recommend changing, but don't want to block your progress so leaving an "approve" review.
  }
};

getScene().then(() => {
I don't think it's correct to use `getScene` as a hook like this. Strong evidence (to me) is that you aren't using the scene returned by `getScene`. (`getScene` is already a hack that we should find a way to remove -- we know when the scene exists, so if there's work that needs to happen after the scene exists, then we should schedule the work then.)
If you need to do some async work before rendering begins, you can use `preload`. But this doesn't seem to be async. It seems like it can happen as soon as `world` and `store` exist. If that's the case, you can wrap this in a function:
const audioZoneQuery = defineQuery([AudioZone]);
export function maybeToggleDebugPanel(world: HubsWorld, store: Store) {
const isDebugEnabled = store.state.preferences.showAudioDebugPanel;
if (isDebugEnabled) createMaterial();
  audioZoneQuery(world).forEach(zoneEid => {
    if (isDebugEnabled) {
      !debugObjects.has(zoneEid) && addZoneDebugObject(world, zoneEid);
    } else {
      debugObjects.has(zoneEid) && releaseZoneDebugObject(world, zoneEid);
}
});
if (!isDebugEnabled && debugMaterial) {
disposeMaterial(debugMaterial);
debugMaterial = null;
}
}
Then we can call this function whenever we need to (e.g. in hub.js or app.tsx, where you have non-global references to both store and world)
// As soon as we have store and world...
store.addEventListener("statechanged", () => {
maybeToggleDebugPanel(world, store);
});
maybeToggleDebugPanel(world, store);
I agree that this is not ideal, and what you mention seems reasonable. The only thing I don't like about it is that it forces the system developer to spread system code around the codebase. I'd like to discuss further the idea I shared in my previous comment here: #5944 (comment)
I think I'm going to leave it as is for now and add a TODO to revisit this when we have discussed a better path for preferences updates.
const isDebugEnabled = APP.store.state.preferences.showAudioDebugPanel;
isDebugEnabled && debugObjects.has(zoneEid) && releaseZoneDebugObject(APP.world, zoneEid);
It seems like these lines belong in the audio-debug-system
These are in charge of showing/hiding the audio zone debug objects, based on the debug panel preference status, when an audio zone is added/removed. In this case we can't rely on a preference status update event; we need to check explicitly.
Regarding next steps, which you listed here:
These do not all need to be rewritten. For example (if I understand correctly), the
This PR migrates the audio zones and audio debug systems (mainly because having the debugger is handy to test the audio zones).

- Only migrates the `media-video` emitters for now (will handle all the audio emitter sources at a later stage).
- Doesn't consolidate everything into the `media-video` emitters yet. I prefer to migrate the rest of the audio related components first (`audio-target`, `zone-audio-source`) in separate PRs before consolidating them in the audio emitters system.
- Adds a `MediaVideoPlaybackChanged` component to handle `media-video` playback state changes.
- Adds an `AudioListenerTag` component; that makes the `audio-zone-system` handle the audio zone entities in two different groups: emitters and listeners. Another option would have been a specific `AudioZoneEntity` tag, but in most cases that would require double tagging most of the entities, so I opted for reusing existing tags.
- Uses `getScene` as the init point for the service; not sure if this is what we want, or if we prefer a direct init method call in `hubs-systems` init as we currently do for the previous systems.

Next steps on audio migration:

- Migrate `audio-target` and `audio-zone-source`.
- Migrate `audio-gain-system`.
- Migrate `audio-settings-system`.
- Consolidate avatar audio (`avatar-audio-source` and `avatar-volume-controls`) in the new bitecs audio systems.
- Consolidate `linked-media` in the new bitecs audio systems.
- `AudioEmitter`: `AudioEmitter.sourceType`, `AudioEmitter.audioOverrides`, etc. to keep both AFrame and BitECS systems separated.