
Fix cloud #447 (Draft)

wants to merge 1 commit into master
Conversation

@MatteoGheza (Collaborator) commented May 13, 2022

I started looking into the cloud connection.
I added support for HTTPS destinations by adding a new parameter (protocol) to the settings.

I tried the cloud connection with a local instance and a Gitpod one. It works quite well, but the tally status isn't displayed on web listeners on the cloud instance. I think I found the bug: SourceClients seems to be empty, but I don't know that part of the code very well.
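For context, a minimal sketch of what a cloud destination entry with the new protocol setting could look like; the field names and values here are illustrative, not the actual TallyArbiter settings schema:

```typescript
// Hypothetical shape of a cloud destination entry with the new setting;
// field names are illustrative, not the actual TallyArbiter schema.
interface CloudDestination {
  host: string;
  port: number;
  key: string;
  protocol: 'http' | 'https'; // the new parameter proposed here
}

// Example: an instance exposed over HTTPS (e.g. a Gitpod workspace).
const destination: CloudDestination = {
  host: 'remote-ta.example.com',
  port: 443,
  key: 'my-cloud-key',
  protocol: 'https',
};
```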

@MatteoGheza requested a review from hrueger on May 13, 2022 at 16:44
@hrueger (Collaborator) commented May 14, 2022

To be honest, I don't fully understand the cloud workflow either. Maybe @josephdadams can clarify?

If I remember correctly, most of the global variables were always synced to the cloud with socket.io, and I think we just don't sync SourceClients. However, if we want to keep this logic, I suggest the following:
Instead of manually syncing the variables every time we change them (because that could cause issues if we forget the sync command), we could just wrap the variable in a Subject or BehaviorSubject and subscribe to changes.
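A rough sketch of that idea, assuming RxJS and a socket.io-client connection; `cloudSocket`, the event name, and the list shape are placeholders rather than the real TallyArbiter code:

```typescript
import { BehaviorSubject } from 'rxjs';
import { io } from 'socket.io-client';

// Placeholder connection to the remote TA instance; the URL and event name
// are illustrative, not the actual TallyArbiter cloud API.
const cloudSocket = io('https://remote-ta.example.com');

// Wrap the shared list (e.g. SourceClients) in a BehaviorSubject so every
// update triggers the cloud sync automatically.
const sourceClients$ = new BehaviorSubject<object[]>([]);

// One subscription replaces the manual sync calls scattered through the code.
sourceClients$.subscribe((clients) => {
  cloudSocket.emit('cloud_sourceclients', clients);
});

// Anywhere the list changes, push a new value instead of mutating in place.
sourceClients$.next([...sourceClients$.value, { id: 'client-1' }]);
```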

@josephdadams (Owner) commented

So the original concept behind the cloud workflow is this:

  • You have TA running locally somewhere
  • You have another TA instance somewhere else, ideally exposed to the internet but doesn't have to be, as long as the local TA can send data to the remote
  • Remote TA is configured with a "cloud key", like a stream key or API key
  • Local TA is configured with the remote TA server, port, and cloud key
  • Each TA instance keeps an array of:
    • cloud_destinations: ip, port, key
    • cloud_keys: cloud keys that can be used for accepting connections (remote)
    • cloud_clients: list of clients that have connected successfully with a key
  • If authorized (keys match), the local TA sends JSON objects of the following, any time they change:
    • sources
    • devices
    • device_sources (what maps the source address to the device)
    • listener_clients (local TA clients that are connected locally... so if you have a listener only on the local side, the remote TA server can still flash it - just can't reassign it)
  • Anytime tally data is received on the local TA, it sends that tally object along with the source ID to all cloud_destinations, and then that remote TA evaluates this tally data like it would any normal tally connection.

I designed it originally so that you can have your closed-off production network but still relay the data to a cloud/internet server, which users' personal devices can then reach to get tally data. It's also useful because multiple local productions can all send their tally data to a central server, and then people can be remote/off-site and receive tally for whatever device they need to monitor.
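To make the relay step above concrete, here is a rough sketch of that flow; the socket setup, event names, ports, and field names are assumptions for illustration, not the exact TallyArbiter implementation:

```typescript
import { io, Socket } from 'socket.io-client';

// Rough shape of the relay; field names, event names, and ports are
// assumptions for illustration, not the exact TallyArbiter implementation.
interface CloudDestination {
  host: string;
  port: number;
  key: string;
}

const cloud_destinations: CloudDestination[] = [
  { host: 'remote-ta.example.com', port: 4455, key: 'my-cloud-key' },
];

// Open one socket per destination and present the cloud key so the remote
// TA can check it against its cloud_keys list.
const cloudSockets: Socket[] = cloud_destinations.map((dest) => {
  const socket = io(`http://${dest.host}:${dest.port}`);
  socket.emit('cloud_client', dest.key);
  return socket;
});

// Whenever tally data is received locally, forward it together with the
// source ID to every destination; the remote TA then evaluates it like any
// normal tally connection.
function relayTallyData(sourceId: string, tallyObj: object): void {
  for (const socket of cloudSockets) {
    socket.emit('cloud_data', sourceId, tallyObj);
  }
}

relayTallyData('source-1', { address: '1', tally1: 0, tally2: 1 });
```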
