Feature Request: libcamera support #519
Comments
|
@aleasto @erfanoabdi I'm probably going to try to knock these out over the weekend. Does the scope I suggested under "Describe the solution you'd like" sound right to you? Anything I'm missing that I may want to consider? |
|
libcamera already implements the android HAL interface? |
|
Yes (tree link), although it's based on camera3.h, which is only supported on Android < 8 according to the docs. So part of the scope might also have to include porting libcamera's HAL linkage from standard C++ to AIDL with XML metadata. Not a weekend project anymore. |
|
No it's ok. The default hidl implementations already wrap around the legacy hal format. Everybody uses that. |
|
However, it doesn't look like libcamera builds inline with the AOSP build system |
|
No, it doesn't. Think that will be an issue? I was thinking I'd build a .so file, find the right place to drop it (e.g. /vendor), and hopefully it would be picked up correctly. |
|
The right place would be … There's a way to include a prebuilt blob in AOSP, but it's less than ideal. We don't do this for any other native component. You also need to figure out how to build against the Android NDK rather than your host's glibc & co. Since the Android part seems to be supported upstream, they might already have tools for this. |
|
kbingham/libcamera#26 Here's an effort to build inline with aosp following the same approach as freedesktop/mesa |
|
@aleasto I'm stumped... I've got libcamera building inline now and have updated Waydroid to use libcamera (see the WIP section of this guide). The files are there:
And yet, according to Android, they aren't being picked up. EDIT: I was able to get a little farther, to where the .so seems to load but there are still no cameras. I'm working with the libcamera team to support a full inline build, but I need to be able to validate that this works first. Any ideas or guesses as to what I might be missing? I'm really stumped and would appreciate your help. libcamera_debug.log Thanks in advance! |
|
It looks correctly hooked up. BTW, I'm not sure anybody has verified that the camera stack still works in lineage-18.1 builds. Have you verified, on a compatible camera, that the current V4L2 HAL works prior to this work? |
|
Can you try to set LIBCAMERA_LOG_LEVELS=:0 as an environment variable for whatever process space is loading the libcamera HAL module please? (and share the logs) |
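On a stock-ish Android image, the easiest place to set that is the camera provider's init .rc, since that service is what dlopen()s the HAL module. A rough sketch (the service name and paths below follow plain AOSP and may differ in Waydroid, so treat them as assumptions):

```
# Sketch only: assuming the libcamera HAL is loaded by the stock camera
# provider service, its init .rc, e.g.
# /vendor/etc/init/android.hardware.camera.provider@2.4-service.rc,
# can export the variables via init's `setenv` service option.
service vendor.camera-provider-2-4 /vendor/bin/hw/android.hardware.camera.provider@2.4-service
    # Enable verbose libcamera logging and write it somewhere readable.
    setenv LIBCAMERA_LOG_LEVELS :0
    setenv LIBCAMERA_LOG_FILE /data/local/tmp/libcamera_debug.log
    class hal
    user cameraserver
    group audio camera input drmrpc
```

Then restart the service (or reboot) and pull the log file.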
|
@aleasto The V4L2 HAL doesn't work, which isn't surprising. The RK3399 SoC my phone is based on is kind of weird in that it needs a userland control loop to really work. libcamera has a special pipeline handler (rkisp1) that implements this. Though the camera technically implements V4L2, a quick read-through of Google's V4L2 implementation and its TODO(b/*) comments makes it pretty clear that Google's V4L2 HAL will reject cameras from that SoC. |
|
"on a compatible camera" was the key there |
|
@kbingham I believe that the libcamera logger is broken, and have filed a ticket with more details: https://bugs.libcamera.org/show_bug.cgi?id=161 It looks like this expects (but does not find) /dev/media[0-3], which looks like a pretty good clue. Investigating... Key output: |
|
Add the devices to |
|
Is waydroid containerised? Does it need to pass through access to the video and media nodes? Or all of /dev? |
It is! We bind mount only the useful dev nodes |
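Conceptually (Waydroid generates its LXC config itself, so this is only an illustration and the node names are examples for an rkisp1 platform), the pass-through amounts to bind-mount entries like:

```
# Illustration only: libcamera needs the media controller and subdevice nodes,
# not just /dev/video*. Matching device-cgroup allow rules are needed as well.
lxc.mount.entry = /dev/media0 dev/media0 none bind,create=file,optional 0 0
lxc.mount.entry = /dev/media1 dev/media1 none bind,create=file,optional 0 0
lxc.mount.entry = /dev/video0 dev/video0 none bind,create=file,optional 0 0
lxc.mount.entry = /dev/v4l-subdev0 dev/v4l-subdev0 none bind,create=file,optional 0 0
```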
|
@aleasto New errors, any ideas what I need to set to fix the permissions on /dev/media[0-3]? |
|
Looks like for other nodes we chmod 777. |
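So something along these lines, matching the existing handling (node names are examples; 0666 would really be enough for the HAL):

```sh
# Blunt, but mirrors how the other passed-through nodes are treated today.
chmod 777 /dev/media0 /dev/media1 /dev/media2 /dev/media3
chmod 777 /dev/video0 /dev/v4l-subdev0
```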
|
Sounds good. I've got an active waydroid fork too now :). Looks like I'll need to do just a bit more work to test this, likely creating some sort of config file for libcamera. EDIT: This seems like a subset of the errors from kbingham/libcamera#28, and matches. EDIT: This was a red herring. My kernel: 5.19.8-1-MANJARO-ARM |
|
So, it turns out the IPAs rely on IPC, which in turn relies on building a functioning executable, and that's turning out to be tricky in this situation. @kbingham Long version: Would you be open to disabling IPA isolation on Android if we can't get this to work? Another option might be to use their build system. My best guess is that Android has some kind of special sauce in their CRT, kind of like Microsoft's mainCRTStartup() business on Windows. They go to an awful lot of trouble to explicitly specify their CRT over the LLVM default (Reference: Android code search). Even if I do find a hack, I'm not sure it will be worth the added maintenance burden over disabling IPA isolation on Android, which would also solve the problem of having to build the executable in Android.mk too. 64-bit rkisp1_ipa_proxy_worker 32-bit rkisp1_ipa_proxy_worker |
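For anyone reproducing this, a quick way to see why a host-built proxy worker can't run inside the Android userspace is to look at its dynamic linkage (the path below is hypothetical; adjust to wherever your build drops the worker):

```sh
# Hypothetical location of the built worker.
WORKER=$OUT/vendor/libexec/libcamera/rkisp1_ipa_proxy_worker

# A host-toolchain build will show a glibc interpreter (e.g. /lib/ld-linux-*)
# and glibc NEEDED entries; Android's bionic runtime expects
# /system/bin/linker64 and bionic's libc.so instead.
readelf -l "$WORKER" | grep -i interpreter
readelf -d "$WORKER" | grep NEEDED
```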
|
It's easy to disable IPA isolation in your build to get this working. I think it was rsglobal who had a patch doing that. If you can't find it, let me know and I'll dig out the pointers. Isolation can be disabled, but it's always up to the distribution to decide whether isolation is used, and how. Right now, you're the distribution, so you can just hack it out. The IPA still has to be loaded via dlopen though |
|
@kbingham, I posted a patch file and logs to the development list. Looking forward to reading what folks have to say. Looks like the message is being held for moderation since the logs are long. @aleasto and others, I need your help with validation (trying out the HAL)! I've documented my setup here, with libcamera-specific changes under the heading "[WIP] Camera Support for PinePhone Pro". I'll be transparent: on the PinePhone Pro, this only somewhat works. I see the camera devices correctly in an Android diagnostic data app, but the Java camera2 implementation raises an exception when apps try to use it. I'm not sure whether this is a problem with the rkisp1 ISP (Rockchip, as in the PinePhone Pro's SoC) or with libcamera's Android HAL. It will help my debugging to learn whether this works for other devices, is broken in the same way for all devices, or is broken in different ways for different devices. |
|
@rothn I remember megi working on camera support for Pinephone Pro and I am not sure how much is missing on mainstream kernel / libcamera side. Here is his latest blog post regarding this topic: https://xnux.eu/log/#070 |
|
Thanks, OK, I see it on the list. I'll reply later or tomorrow as I'm away from home currently. It looks like you've squashed everything into a single patch, which is against our code development practices, so there's a fair bit of work for cleanup. But let's focus on getting it working first. |
|
@kbingham Fundamentally, it does seem to work -- based on the logs it just seems like camera2 doesn't like the video modes it gets from the combination of the rkisp1 ISP, my image sensors, and how the HAL reports them. I would not be surprised if Android camera2 works usefully on top of this for other devices with different ISPs and sensors. I'm going to continue trying to get my hardware configuration working, but I'd like to keep submitting small incremental improvements rather than have a gigantic review once the whole thing works just how I want it to, although of course that's ultimately up to you. Also, see the email on your list for how I suggest splitting the patch, and LMK if that looks good to you. @darkdragon-001 This builds on top of megi@'s mainline kernel driver, so in a way he made this possible. I shot him an email asking for feedback. |
|
@darkdragon-001 To paraphrase, he's aware of libcamera, though he notes that the rkisp1 backend could use auto white-balance, per-light source color profiles, and light source detection. He's done some work on calibration tools (which I would welcome since my hardware defaults to "uncalibrated.yaml" in libcamera), and though that's on hold for now he expects to pick it back up early next year. |
You should also try |
|
Smaller commits posted incrementally as soon as they are ready are certainly more appreciated. Your latest series is working in that direction, so that's much more helpful, thanks. Regarding the RKISP1:
The RKISP1 is used in Chrome OS devices and is expected to function through the Android HAL, though it's not been tested enough in libcamera. But you should certainly expect it to be possible to operate it successfully. |
|
@rmader Writing an Android camera HAL is a lot of work, and libcamera already has a presumably full-featured one that just needs some minor fixes to build inline with AOSP. I've looked into writing one, and it's quite a lot of effort. I'd be happy to provide help where I can if someone would be interested in doing this, but I don't have the time unfortunately. Do you? |
Same here, unfortunately not, I just got side-tracked into camera stuff. But I'm currently trying to look into some gaps in Pipewire which will hopefully help in case you'll need to switch approaches at some point (most importantly the 90 deg. camera rotations in many phones). In any case, much appreciate your work, thanks a lot! |
|
Indeed, I don't quite see pipewire fitting here exactly yet. Android expects a specific HAL API to be implemented, with direct control over the camera, which pipewire itself is a long way from supporting. I won't say never though |
|
I'm not up to speed with Android internals, but as far as I get it there's no role for anything like Pipewire in Android. The Android frameworks for audio/video (as well as all others I presume) are designed as vertical components that span from high level application-facing APIs to the "platform" adaption code, and their implementation is core Android stuff which seems rather hard to replace or plumb into. Unless there are already plans to move Android to use pipewire, which I'm not aware of at the moment |
|
@jmondi the goal would be to interface pipewire with the camera HAL, which, to be clear, I think is the right solution in the future. As far as I am aware, libcamera works exclusively and doesn't offer multiplexing, right? One client gets access at a time? And the goal of pipewire here is to mediate control. I think this means that the host would not have access to the camera while waydroid is running, which poses a set of issues that get solved via pipewire. I would argue that pipewire is the "best solution" in this regard; however, until pipewire offers all the direct controls that would need to be exposed, it's pretty much a non-starter. The only other solution that I can see as viable would be to somehow get v4l2loopback working and use that as a bridge to feed libcamera to Android, however this would present a whole slew of drawbacks. So while I think pipewire is the route forward, until it gets the functionality it needs, libcamera passthrough seems like the best route. |
|
@rothn: regarding your kernel work on ov8858/imx258 for the Pinephone Pro, one open issue I see is that the 90 degree rotation is not yet detected/reported. Context here is that I just managed to implement support for rotations in Pipewire but needed to override the value in libcamera. Edit: I assume this is something needed for Waydroid as well. |
|
Are you referring to CameraConfiguration in libcamera generally? If so, it would be helpful to file a ticket in the bug tracker: https://bugs.libcamera.org/index.cgi. Feel free to assign it to me if that's possible or just link it back here. Please include specifics like whether the rotation is specified in the device tree. This might be a libcamera-wide issue. In Android, for example, libcamera picks up rotation from a HAL configuration file.
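For reference, that HAL configuration file is a small YAML mapping from camera id to location and rotation. Something roughly like the following (I believe the file is named camera_hal.yaml, but double-check; the camera id strings below are made-up device-tree-style ids for an rkisp1 setup, so treat the details as assumptions):

```yaml
# camera_hal.yaml (sketch): per-camera properties the libcamera Android HAL
# falls back to when the kernel/driver does not report them.
# The camera ids here are hypothetical.
cameras:
  "/base/i2c@ff110000/camera@36":
    location: back
    rotation: 270
  "/base/i2c@ff110000/camera@1a":
    location: front
    rotation: 90
```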
|
|
I think I just realized your concern is arbitration between the container and the host, not within the container itself? Then I have to ask: how does it work for other subsystems, for example audio? If an application running in the container plays music, do you expect a native application running on the Linux host to be able to access the same audio device at the same time? Why are cameras any different? If an application running in the container accesses the camera, how could another application, running in a different context space, access the same hw device at the same time? What if they want to stream different formats/resolutions concurrently? Pipewire handles multiplexing at a higher level, by sharing references to the video buffers produced by the camera between different applications, and possibly applying transformations by piping the buffers to other components (encoders, display, scalers etc), but it doesn't multiplex access to the camera HW configuration, which is unique and constant for each streaming session. The "media session daemon" role in Android is played by the Camera Service, probably not in the same way as Pipewire does it, as Android defines APIs for applications to handle post-processing/encoding/displaying by themselves; applications, by definition, realize a specific use case by bundling together a set of system resources they then manage and combine. As I understand it, if you expose your media devices to the container and the container uses them, the host won't be able to access the same devices concurrently, but this doesn't seem different from any other subsystem like audio (graphics, I assume, is a bit more complicated as it certainly has to be shared between the container and the host, but I presume this is handled by lxc maybe?) |
|
@rothn @rmader If Robert is asking "does your ov8858 driver register the V4L2_CID_CAMERA_SENSOR_ROTATION and V4L2_CID_CAMERA_ORIENTATION controls" (from which libcamera parses properties::Rotation and properties::Location), I guess the answer is that the first submission by Nicholas doesn't, but it takes very little for a sensor driver to get the controls parsed and registered (rough sketch below). Also, if there are any comments on the driver, send them to the linux-media mailing list pretty please, so they don't get left behind in the GitHub walled garden |
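For completeness, here is a rough sketch of the usual pattern (this is not the exact snippet originally attached to this comment, and the function and names are illustrative):

```c
/*
 * Rough sketch: the usual way for a sensor driver to expose the
 * rotation/orientation read from firmware (the DT "rotation" and
 * "orientation" properties) as V4L2 controls, which libcamera then maps
 * to properties::Rotation and properties::Location.
 */
#include <media/v4l2-ctrls.h>
#include <media/v4l2-fwnode.h>

static int sensor_register_fwnode_props(struct device *dev,
					 struct v4l2_ctrl_handler *hdl,
					 const struct v4l2_ctrl_ops *ops)
{
	struct v4l2_fwnode_device_properties props;
	int ret;

	ret = v4l2_fwnode_device_parse(dev, &props);
	if (ret)
		return ret;

	/*
	 * Registers V4L2_CID_CAMERA_ORIENTATION and
	 * V4L2_CID_CAMERA_SENSOR_ROTATION on the control handler.
	 */
	return v4l2_ctrl_new_fwnode_properties(hdl, ops, &props);
}
```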
That's exactly what I was trying to say and what I'd love to see added :) Edit: from the libcamera side https://patchwork.libcamera.org/patch/17894/ is also needed IIUC. |
Yes? If the host couldn't play audio at the same time, that would be a major flaw. I currently use waydroid to play some android games while watching video on the host, and audio works there too. I know plenty of people who would like to use the cameras simultaneously; in fact, this is already possible on the Linux side. I personally do this with pw-v4l2, as it lets me record and stream on discord (through firefox) at the same time. Graphics, audio, input etc. can all be used by the guest and the host at the same time (input SHOULDN'T be, but that's a known issue). The reason why lxc is used is explicitly because it gives that level of cooperation between the host and waydroid. There are plenty of people who use waydroid for the "full android environment" and people who only use one or two apps in waydroid, so if at all possible, if waydroid and the host could use the camera at the same time, that would be greatly preferred. I can't say exactly whose use case it would be to use the camera on the host and in the guest at the same time, but I can think of a couple theoretical ones. In the end, I think ultimately more flexibility is best |
I see, I was not expecting this, but I certainly can't disagree with "the more flexibility, the better". On your Linux side it's fine that you can multiplex access to your camera, as pipewire does the muxing for you, but in the end pipewire itself is the sole user of the camera hw. Arbitrating that between the host and the container is something I can't currently picture in my mind, but I understand it would be desirable... |
|
Ok, I had a chat with the other libcamera devs (@kbingham) and this is my current understanding. Basically, right now we could "easily" get here: [first diagram]. While what you would like is this: [second diagram]. In this last case it means you have to write a camera HAL on top of pipewire, and even if that happens, the camera configuration between the host and the container would have to be the same. Hope this clarifies my understanding, and sorry for having got you wrong in my first reply |
|
Thanks for the diagrams @jmondi! Yes, I agree. Pipewire could certainly implement an Android HAL which would let the container talk to a pipewire service on the host, which would then talk to libcamera. This would be a fair amount of work I expect, and my worry would be: what use case does it solve? Any camera application within Waydroid would probably expect direct control of the camera, which means the stream can't be shared... It would not be "reasonable", for instance, to have the host in a video call and then for waydroid to want to take a still image capture with the same camera. That would require reconfiguration of the camera to support the still image, and would be prohibited if the pipewire daemon was actively streaming to a video call. The only use case I could imagine currently being feasible for multiple camera access is a simultaneous video call on the host and within a camera app in the waydroid environment. That would only support a single video stream, and it would not be possible to reconfigure. So, to support that, you'll need a Pipewire camera HAL as above... but that's a lot of work for one use case that I don't currently see being widely used. Now, me not thinking up other use cases isn't a reason not to implement an Android HAL on top of pipewire, but it will require some effort to do so. So I understand that "flexibility" is best, but someone will have to put in the time and effort to do the work here if that's what you really want. Otherwise, using the libcamera Android HAL should already work. |
|
@jmondi that's correct, that would be the "ideal" situation. @kbingham there are a few other potential use cases, possibly streaming + augmented reality stuff, etc., but even with them, I do agree that a pipewire implementation is more work than it is currently worth. However, I do think this could benefit users other than waydroid down the line (I myself was briefly looking into AGL + Linux, which could potentially reap benefits); it's out of scope for waydroid to worry about now anyways. |
@wtay
|
|
FYI: I opened https://github.com/megous/linux/pull/17 and https://github.com/megous/linux/pull/18 with kernel driver and dts changes for rotation discovery (also needs https://patchwork.libcamera.org/patch/17891/). I'm not sure if that's needed for the Waydroid work, but I guess in the long run it would be nice if rotation values are picked up from the kernel device tree instead of profile files. |
|
Original issue: #170 |
|
I was able to get a black-and-white image to show up by hacking the API, but a lot of things (e.g. taking a picture in Snapchat) would just crash the whole camera stack. This is because instead of properly converting from the NV12 stream libcamera produces in its HAL, I was directly copying that into Waydroid's buffer. Waydroid's Android image uses Mesa for rendering, which doesn't support YUV or NV12. So the result was undefined behavior, and it kind of (barely) worked. The code for all of this is still on my GitHub and I'd happily walk anyone interested through it. Next steps would be to figure out how to tell the libcamera Android HAL to produce an RGB stream, probably by adding a flag into libcamera that defines what "IMPLEMENTATION_DEFINED" means, since that's the format libcamera generates and it currently just assumes that means NV12. For Mesa, this should be an RGB format, and while building this inline in AOSP we should then pass the specific RGB format we intend to use to the HAL's BUILD file (Bazel) or Makefile (legacy make). Additional Format Considerations Specific to Pinephone/RKISP1 Let me know if you'd like to discuss more. I pushed this forward for a while but got burnt out on it; perhaps I bit off more than I could chew, so to speak. I'd love to collaborate on this with someone who has the time this work deserves. |
|
@rothn Thanks for the update. I guess your fork https://github.com/rothn/waydroid also fails when trying to scan a QR code with WhatsApp to access the web version? Fortunately WhatsApp now added a way to access the web version without scanning a QR code. |
|
Yes, it gets close though. Very close. The Uber Scooter app shows the preview window for capturing a barcode, but the camera stack crashes before recognizing the barcode. I suspect Uber and Snapchat both ask for a higher-quality stream for capture, and switching streams is just too many hacks on top of hacks for this stack. |
|
@rothn Do you think it would be possible to have camera support using the code from QEMU, like was done in the fork of Anbox, or would it be more difficult than libcamera? See this comment (anbox/anbox#727 (comment)):
|
|
With a fake camera? Probably much easier than writing the Android HAL that libcamera wrote, but you'd have to write a HAL, which is probably more challenging than fixing the configuration issues that making libcamera work would require. |
|
A virtual camera using v4l2loopback is better than nothing but someone said the Anbox fork even worked with a real webcam: |
|
I think what I'm trying to say is that it might be easier to reuse the existing real HAL than create a virtual camera HAL from scratch. But it looks like I might also be missing something important. |
|
Dear @rothn, have you had time to work further on supporting libcamera on Waydroid? I switched to a PinePhone Pro recently and am looking for a way to handle the applications which require a QR code for login.
Currently, libcamera isolates any IPAs whose signatures cannot be verified. Shared objects are created at build time and then signed. The public signing key is embedded in a .cpp file, and libcamera verifies IPA signatures at runtime. When libcamera cannot authenticate an IPA, it runs it out-of-process. This is problematic on three levels:
* IPA signing fundamentally does not work on Android for vendor modules like HALs (discussed below)
* Executables built to run out-of-process are not ABI-compatible with Android, making isolation infeasible [1]
* Linux phone hardware tends to be low-end because of the FOSS requirement, so the performance hit from out-of-process IPA isolation is significant
IPA signing fundamentally does not work for Android vendor modules: after we "meson install" the built .so files to a known location, Android explicitly accesses them via PREBUILT_SHARED_LIBRARY or BUILD_PREBUILT to transform the .so files, stripping symbols among other things [2]. By modifying prebuilt libraries after we have already signed them, the build system renders our signatures useless on Android. Android distribution maintainers can use this flag to disable signature verification, which will allow them to use libcamera.
[1] waydroid/waydroid#519
[2] https://cs.android.com/android/platform/superproject/+/master:build/make/core/cc_prebuilt_internal.mk?q=cc_prebuilt_internal
Signed-off-by: Nicholas Roth <nicholas@rothemail.net>
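For illustration, pulling the meson-built HAL into the image with BUILD_PREBUILT looks roughly like this (the module name and source path are hypothetical); it is exactly this re-processing of the prebuilt that invalidates the signatures computed at build time:

```make
# Sketch of an Android.mk fragment that installs a meson-built HAL module
# as a vendor prebuilt. BUILD_PREBUILT re-processes (e.g. strips) the .so,
# which is why signatures generated at meson build time no longer match.
include $(CLEAR_VARS)
LOCAL_MODULE := camera.libcamera
LOCAL_MODULE_CLASS := SHARED_LIBRARIES
LOCAL_MODULE_SUFFIX := .so
LOCAL_MODULE_RELATIVE_PATH := hw
LOCAL_VENDOR_MODULE := true
LOCAL_SRC_FILES := prebuilt/camera.libcamera.so
include $(BUILD_PREBUILT)
```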


Is your feature request related to a problem? Please describe.
Waydroid currently supports mainline Linux cameras by passing low-level V4L2 camera devices through to Android, which sets them up as extremely limited external webcams. This is not ideal, e.g. for Android apps that use the front-facing camera to capture QR codes.
Describe the solution you'd like
Describe alternatives you've considered
Additional context