
Sharing device between applications still possible? #31

Closed
chriskilding opened this issue Aug 15, 2013 · 7 comments

Comments

@chriskilding

I've done a cursory scan of the repo and docs for anything similar to the 'Sharing Devices Between Applications and Locking Nodes' functionality from OpenNI 1.5, but can't find anything. Does OpenNI2 still have a mode for sharing a single device between multiple applications?

More specifically, I am looking to run an OpenNI2-based application and a NiTE2-based application simultaneously, both fed by the same Kinect.

@eddiecohen
Collaborator

Hi,

It is no longer a feature of OpenNI / NiTE.
Having multiple applications use the same sensor imposes a lot of limitations on those applications. Developers would have to take many considerations into account when building such applications (for example, deciding which application is the "master" that gets to choose the resolution). This means that one can't develop an application without knowing which other applications might be running at the same time.
For the same reason, webcams can only be accessed by a single application at a time.

If, in your specific configuration, you want two applications running together, your options are:

  • Merge those two applications into a single one, or
  • Develop a third application that is the one accessing the sensor and running NiTE, and have the two original applications read data from it using IPC (sockets/shared memory/etc.).

@chriskilding
Author

Ah well, time to find a workaround then.

So if both apps are merged into one, somewhere in the NiTE-based module there will be a usertracker.create() call, and somewhere in the ONI-based module there will be something that opens a VideoStream from the device. Will this be allowed to work because they are both running in the same process (albeit in different threads)? Or will it fail?

@tomoto
Collaborator

tomoto commented Aug 15, 2013

I think you can achieve your goal with the following structure:
(1) Create a depth stream and register a listener to achieve your first goal (in the ONI-based module)
(2) Create a user tracker "by passing the depth stream above" to achieve your second goal (in the NiTE-based module)

I suppose this configuration allows both modules to consume the same data from one depth stream.
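
For step (1), here is a minimal sketch of what the ONI-based module might look like, assuming the OpenNI2 C++ API and omitting most error handling (note that step (2) is corrected in the next comment - the user tracker is created from the device, not the stream):

```cpp
#include <OpenNI.h>
#include <cstdio>

// Listener that the ONI-based module registers on its depth stream.
class DepthPrinter : public openni::VideoStream::NewFrameListener
{
public:
    // Called by OpenNI whenever a new depth frame is available on the stream.
    virtual void onNewFrame(openni::VideoStream& stream)
    {
        openni::VideoFrameRef frame;
        stream.readFrame(&frame);
        printf("depth frame %d: %dx%d\n",
               frame.getFrameIndex(), frame.getWidth(), frame.getHeight());
    }
};

int main()
{
    openni::OpenNI::initialize();

    openni::Device device;
    device.open(openni::ANY_DEVICE);

    // The ONI-based module owns this depth stream and its listener.
    openni::VideoStream depth;
    depth.create(device, openni::SENSOR_DEPTH);

    DepthPrinter listener;
    depth.addNewFrameListener(&listener);
    depth.start();

    // ... run the rest of the application; the listener fires asynchronously ...

    depth.removeNewFrameListener(&listener);
    depth.stop();
    depth.destroy();
    device.close();
    openni::OpenNI::shutdown();
    return 0;
}
```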

@eddiecohen
Collaborator

Actually, User Tracker takes a device, not a depth stream.
And yes, the OpenNI architecture allows creating several streams in the same process - the User Tracker object will read depth frames from its own stream, and your module will read depth frames from its own stream, without interrupting each other.

One thing to note, though, in such a configuration: if you need a video mode other than the default one, you should set it on the depth stream before creating the User Tracker.
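
For concreteness, here is a minimal sketch of that configuration, assuming the OpenNI2 and NiTE2 C++ APIs; error checks are omitted, and the 320x240 mode is only an example of a non-default video mode:

```cpp
#include <OpenNI.h>
#include <NiTE.h>

int main()
{
    openni::OpenNI::initialize();
    nite::NiTE::initialize();

    openni::Device device;
    device.open(openni::ANY_DEVICE);

    // ONI-based module: its own depth stream, with an explicit video mode.
    openni::VideoStream depth;
    depth.create(device, openni::SENSOR_DEPTH);

    openni::VideoMode mode;
    mode.setResolution(320, 240);
    mode.setFps(30);
    mode.setPixelFormat(openni::PIXEL_FORMAT_DEPTH_1_MM);
    depth.setVideoMode(mode);   // set before creating the User Tracker
    depth.start();

    // NiTE-based module: the User Tracker takes the device, not the stream.
    nite::UserTracker userTracker;
    userTracker.create(&device);

    // Both modules can now read frames independently.
    openni::VideoFrameRef depthFrame;
    depth.readFrame(&depthFrame);

    nite::UserTrackerFrameRef userFrame;
    userTracker.readFrame(&userFrame);

    userTracker.destroy();
    depth.stop();
    depth.destroy();
    device.close();
    nite::NiTE::shutdown();
    openni::OpenNI::shutdown();
    return 0;
}
```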

@tomoto
Collaborator

tomoto commented Aug 18, 2013

Oops, my bad. Thank you for the correction.

you should set it to the depth stream before creating the User Tracker.

So the general programming guideline might be something like: "The application needs to know that the video mode and other sensor properties may be shared between multiple streams of the same type spawned from the same device. The application is responsible for setting them up in the right order, without conflicts."
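
As a small illustration of that guideline (a sketch only, assuming the OpenNI2 C++ API and an already-opened openni::Device; whether a given property is shared in this way depends on the sensor and its driver):

```cpp
#include <OpenNI.h>
#include <cstdio>

// Two depth streams created from the same device may share sensor properties
// such as the video mode, so the application should decide on the mode once,
// up front, before any consumer starts reading.
void configureDepthStreams(openni::Device& device)
{
    openni::VideoStream moduleA, moduleB;
    moduleA.create(device, openni::SENSOR_DEPTH);
    moduleB.create(device, openni::SENSOR_DEPTH);

    openni::VideoMode mode = moduleA.getVideoMode();
    mode.setResolution(320, 240);  // example resolution, not a requirement
    moduleA.setVideoMode(mode);

    // If the driver shares the sensor configuration, moduleB now reports the
    // same mode; either way, both modules must agree on it.
    openni::VideoMode other = moduleB.getVideoMode();
    printf("module B sees %dx%d @ %d fps\n",
           other.getResolutionX(), other.getResolutionY(), other.getFps());
}
```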

@chriskilding
Author

Okay, I think I got how to do it now.

In the past few days I have wondered whether sharing within the same application could be made easier with, say, a ReadOnlyDevice class. It would publicly inherit from Device and behave much the same as a Device, but would prevent any setter operations (after initialization) that could cause conflicts; for compatibility it could perhaps still implement them but throw an exception. There would also need to be a way for a user of the class to determine the ReadOnlyDevice's current configuration at runtime before registering itself: if it's happy with the config it can proceed, and if not it can refuse to connect.

@eddiecohen
Collaborator

A couple of comments:

  • From OpenNI's point of view, there is no problem with creating multiple streams in the same process. It is the application's responsibility to choose a configuration that suits all the modules that require the stream.
  • NiTE has a limitation today: it does not support changing the resolution after it has been initialized, which is why it is important to set the resolution before initializing NiTE. This limitation has nothing to do with OpenNI itself; other middleware libraries, or even future versions of NiTE, might not have it.
  • Having a ReadOnlyDevice class would just complicate things. You don't want your application to fail to start on a customer's machine, telling them it couldn't get the configuration it needed, right? Selecting the proper configuration, and making sure all the 3rd-party libraries you use can work with it, is part of developing the application. The application developer is the integrator of all the libraries and must make sure a specific configuration works for them. Once such a configuration has been found, it is highly recommended that the application set the sensor to that configuration explicitly.
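
In code, that last suggestion might look something like this minimal sketch (OpenNI2 C++ API assumed; the 640x480 @ 30 fps depth mode is just a placeholder for whatever configuration you validated during development):

```cpp
#include <OpenNI.h>
#include <cstdio>

// Apply the configuration the application was developed against, and fail
// fast with a clear message if the sensor rejects it.
bool applyRequiredDepthMode(openni::VideoStream& depth)
{
    openni::VideoMode wanted;
    wanted.setResolution(640, 480);
    wanted.setFps(30);
    wanted.setPixelFormat(openni::PIXEL_FORMAT_DEPTH_1_MM);

    if (depth.setVideoMode(wanted) != openni::STATUS_OK)
    {
        printf("Required depth mode not supported: %s\n",
               openni::OpenNI::getExtendedError());
        return false;
    }
    return true;
}
```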
