
UVC + remainder of kickstarter Unity roadmap updated ETA? #36

Open
peteTater opened this issue Jun 27, 2023 · 3 comments

@peteTater

Hi Gespona, hope you're well, and thanks for plugging away at the Unity integrations. We picked up some OAK-D Lites back in the 2021 Kickstarter and have been patiently waiting on UVC, as well as the other features the team were promising during the campaign. For almost all of our intended use cases, the devices we bought are effectively duds until most of the remaining features are implemented.

Work on the Unity support side has taken a lot longer than we had been hoping. After RealSense development was killed off, we thought the OAK-D Lites were aiming to fill that gap for Unity-native, full-body, suitless mocap by this stage, but it's increasingly feeling like that may just have been some blue-sky Kickstarter marketing if the team has only allocated one person to the task. So we're wondering: would it be at all possible to get an updated timeline for completing the remaining example implementations that are still in the works?

Also, 2020 sucks! Would it be at all possible to extend support, either back to the final 2019 LTS release (it's so much faster to load in a lot of situations), or just to bump the completed examples to the current 2021 LTS at some point in the near future?

Cheers!

@gespona
Collaborator

gespona commented Jul 5, 2023

Hi @peteTater!

I think UVC support was recently added to depthai-core (https://github.com/luxonis/depthai-core/releases/tag/v2.22.0), so let me check the current status and I'll add an example of how to use it inside Unity.

Regarding the rest of the examples, what's missing is head pose (that's pretty much done) and hand tracking (that still needs a bit of work). But for body mocap, body pose is already in the examples.

Regarding Unity version support, you can see in the repository that we currently support 2021, and it should also work fine on the current 2021 LTS. Are you facing any specific issue?

@peteTater
Author

Hey @gespona :) Thanks for the reply.

That's great news re UVC, and glad to know head and hands are on the way. We were looking forward to getting past the days of stitching together facial feature tracking from Kinect, body from Kinect 2, and hand articulation from Leap Motion (if you've ever tried it, you'll know just how painful that is for real-time applications when all those beams interfere with each other). A single device (or multiples of a single device) with all of these systems integrated is what we'd been anticipating having by last year, based on our conversations with the team during the Kickstarter.

To keep it short, a bit like https://github.com/creativeIKEP/HolisticBarracuda / https://github.com/creativeIKEP/HolisticMotionCapture, but with the many additional advantages the OAK-D Lites have over a monocular cam for driving avatars in real time (depth and all the funky onboard AI stuff!). Ideal scenarios would be using the D-Lite to scan the actor/performer and auto-rig them head to toe, or something in a volumetric-video style but with rigging, akin to https://github.com/keijiro/4DViewsTest, though perhaps that's too fanciful an idea... We can dream :)

Obviously, at the time of the campaign the open-source AI animation wave hadn't hit, so amongst other things we're now hoping to use the devices for OpenPose/ControlNet, e.g. https://github.com/nonnonstop/sd-webui-3d-open-pose-editor, allowing for real-time pose captures of depth/face/body/hands/feet. A fully integrated mocap and depth-sensing solution could offer a significant workflow improvement in that domain too (especially now that there are several ongoing SD integrations into Unity, both by the Unity dev team and other kind folk here in GitHub land)!

I'm teaching an AI masterclass in a few hours but will test out the current commit later today with the latest 2021 LTS as a refresher and get back to you shortly on the specifics. Re the 2019 LTS backport: would Keijiro's VFX example be the only one that couldn't work in 2019 LTS? Asking because, as I recall, the port he did from the RealSense D415 to the OAK-D Lite was of course very cool to see, but it only really served to highlight the OAK-D Lite's relative shortcomings in that area. Perhaps things have improved since we last checked, but there were comparatively large and frequent distortions and random noise that the VFX drew a lot of attention to. As Keijiro himself said, that's not why we're interested in the devices anyway (not that it isn't neat as a bonus if the noise factor can somehow be addressed).

We still have quite a few projects on the final 2019 LTS release that we can't upgrade, and which would benefit greatly from the OAK-D Lites, but if supporting the final 2019 LTS would be non-trivial for any reason beyond Keijiro's VFX example not working, then that's fair enough.

Many thanks gespona, will get back to you soon. Cheers!

@gespona
Collaborator

gespona commented Jul 28, 2023

Hi @peteTater! Just a quick follow-up here.
Re UVC, you can check this example: https://github.com/luxonis/depthai-python/blob/main/examples/ColorCamera/rgb_uvc.py. I've been testing it and it works as expected, so I will try to port the example to Unity as soon as I have a bit of time (the sketch below shows the gist of it).
Re 2019 LTS, I tested with 2019.4.33f1 and, as expected, the only example not working is Keijiro's VFX one because of some minimum dependencies, but updating the URP version is enough to run all the other examples.
In general, the depthai-unity core is pretty agnostic in terms of render pipeline.
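
For reference, that Python example boils down to roughly the following. This is a simplified sketch paraphrased from the linked rgb_uvc.py rather than the verbatim file; exact node and enum names may differ between depthai versions, so treat it as illustrative:

```python
import time
import depthai as dai

# Build a pipeline: colour camera -> UVC output node
pipeline = dai.Pipeline()

cam_rgb = pipeline.create(dai.node.ColorCamera)
cam_rgb.setBoardSocket(dai.CameraBoardSocket.RGB)
cam_rgb.setResolution(dai.ColorCameraProperties.SensorResolution.THE_1080_P)
cam_rgb.setInterleaved(False)

# UVC node (added around depthai-core v2.22.0): the device then enumerates
# on the host as a standard webcam
uvc = pipeline.create(dai.node.UVC)
cam_rgb.video.link(uvc.input)

# Keep the pipeline running; while it runs, the OAK appears as a UVC camera
with dai.Device(pipeline) as device:
    print("Device exposed as a UVC webcam; press Ctrl+C to stop")
    while True:
        time.sleep(1)
```

Once the pipeline is running, the OAK should show up to the host like any other webcam, so the Unity side should be able to read the feed the same way it reads a regular camera (e.g. via WebCamTexture), independently of the depthai-unity plugin.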
