
Refactor examples etc. to use android camera2 #163

Open
lfdversluis opened this issue Jun 18, 2015 · 21 comments

Comments

@lfdversluis
Contributor

android.hardware.Camera is deprecated.
The examples should be updated to use the camera2 API.

If I find some time I will take a look at it.

@saudet
Member

saudet commented Jun 18, 2015

Great, thanks!

@lfdversluis
Contributor Author

I noticed that HttpBuilder has also been removed, and as of SDK 23 you should use URLConnection; see this post.

I was thinking, once I toy around with a sample of my own, I may want to include http://square.github.io/okhttp/. @saudet, are you OK with using this library? It's well known and handles common error cases for you.

@saudet
Member

saudet commented Oct 29, 2015

Where are we using HttpBuilder? I'm not seeing it in the samples.

Anyway, the samples directory in JavaCV is meant to contain small self-contained samples as a sort of reference for the API. More complex samples are very welcome as well, but for those it's probably best to provide project files too, and they would belong in this repository:
https://github.com/bytedeco/sample-projects
What do you think? Sounds good?

@lfdversluis
Contributor Author

Ah sorry, my bad. It's been a while since I last worked on my experiments with the (awesome) library. I had some HTTP call and assumed it belonged to the sample, which I should have checked first. Ignore my previous post 😄

Once I toy around a bit again with the current version, I hope to find some time to look into the camera2 API.

@kmlx

kmlx commented Mar 10, 2016

I've written the following in #298:

Use RGBA_8888: 4 channels, 8 bits per channel.

yuvImage = new Frame(width, height, Frame.DEPTH_UBYTE, 4);

Initialise the ImageReader with PixelFormat.RGBA_8888, like so:

imageReader = ImageReader.newInstance(VIDEO_WIDTH, VIDEO_HEIGHT, PixelFormat.RGBA_8888, 1);
imageReader.setOnImageAvailableListener(onImageAvailableListener, null);

then, in the ImageReader.OnImageAvailableListener, get the bytes and process the image:

ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(final ImageReader reader) {
            mBackgroundHandler.post(new Runnable() {
                @Override
                public void run() {
                    // Grab the latest frame and copy its single RGBA plane out.
                    Image img = reader.acquireNextImage();
                    final ByteBuffer buffer = img.getPlanes()[0].getBuffer();
                    byte[] bytes = new byte[buffer.remaining()];
                    buffer.get(bytes, 0, bytes.length);
                    img.close();
...
                    // Rewind the Frame's buffer, fill it, and hand it to the recorder.
                    ((ByteBuffer) yuvImage.image[0].position(0)).put(bytes);
                    ffmpegRecorder.record(yuvImage);
...
                }
            });
        }
    };
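One caveat: the plane's getRowStride() can be larger than width * 4, in which case buffer.remaining() is bigger than a Frame allocated as width x height x 4 channels and the bulk put() will overflow. A rough, untested sketch of a stride-aware copy (read the strides from the plane before img.close(); width and height are the values the ImageReader was created with):

// Copy an RGBA_8888 plane into the Frame's buffer, honoring row padding.
static void copyRgbaPlane(byte[] bytes, int width, int height,
                          int rowStride, int pixelStride, ByteBuffer dst) {
    if (rowStride == width * pixelStride) {
        dst.put(bytes); // no padding: a bulk copy is safe
    } else {
        for (int row = 0; row < height; row++) {
            // copy only the visible pixels of each row, skipping the padding
            dst.put(bytes, row * rowStride, width * pixelStride);
        }
    }
}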

@lfdversluis
Contributor Author

@kmlx Hey, thanks for sharing that piece of knowledge. I actually did some work on capturing the frames, but ran into the issue that the ImageReader cannot be initialized with NV21. I am unfortunately not that familiar with imaging (but I have now learned what image strides and planes are!), so your answer is one of the pieces I was still looking for.

@lfdversluis
Contributor Author

@kmlx I recalled having tried what you suggested. I have reimplemented it and confirmed that I get what I saw before: java.nio.BufferOverflowException. I printed both buffer sizes and noticed that the buffer from the Image is much bigger than the yuvImage buffer.

@kmlx

kmlx commented Mar 11, 2016

@lfdversluis Not a problem. Thank you @saudet and @lfdversluis for providing support and keeping these forums alive!

Regarding your issue, I'm assuming you're adding the imageReader as a target to the previewBuilder, in which case you'll need to make sure that the preview images aren't too big to fit in the yuvImage. For example, you'll need to choose an optimal image size:

mVideoSize = chooseVideoSize(map.getOutputSizes(SurfaceTexture.class));
mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
                            width, height, mVideoSize);

In my case, this outputs an mPreviewSize of 768x432, and the yuvImage width and height are set at 320x240. Of course, I'm getting all of these from chooseVideoSize and chooseOptimalSize, both available in the Camera2Video sample from Google (sketched below). This produces a decent quality stream with minimal CPU usage.
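Roughly what those two helpers do, from memory (treat this as a sketch, not the exact sample code; uses android.util.Size, java.util.ArrayList, java.util.Collections, java.util.List):

// Pick a 4:3 size no wider than 1080 px for recording.
static Size chooseVideoSize(Size[] choices) {
    for (Size size : choices) {
        if (size.getWidth() == size.getHeight() * 4 / 3 && size.getWidth() <= 1080) {
            return size;
        }
    }
    return choices[choices.length - 1];
}

// Pick the smallest size that is at least as big as the target surface
// and matches the aspect ratio of the chosen video size.
static Size chooseOptimalSize(Size[] choices, int width, int height, Size aspectRatio) {
    List<Size> bigEnough = new ArrayList<>();
    int w = aspectRatio.getWidth(), h = aspectRatio.getHeight();
    for (Size option : choices) {
        if (option.getHeight() == option.getWidth() * h / w
                && option.getWidth() >= width && option.getHeight() >= height) {
            bigEnough.add(option);
        }
    }
    if (!bigEnough.isEmpty()) {
        return Collections.min(bigEnough, (a, b) -> Long.signum(
                (long) a.getWidth() * a.getHeight() - (long) b.getWidth() * b.getHeight()));
    }
    return choices[0];
}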

Also, RGBA_8888 has 4 channels at 8 bits/channel, so yuvImage will need to be instantiated as:

yuvImage = new Frame(width, height, Frame.DEPTH_UBYTE, 4);

where:
width - image width
height - image height
Frame.DEPTH_UBYTE - 8 bits per channel (unsigned byte)
4 - 4 channels

Or you could make yuvImage bigger in order to fit your images.
But once they fit, one way or the other, you should see an RGB stream on the other end.
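A minimal sketch of that wiring (assuming mPreviewSize comes from chooseOptimalSize above):

// Create the ImageReader and the Frame from the same Size so the
// buffers line up (modulo row padding).
imageReader = ImageReader.newInstance(mPreviewSize.getWidth(), mPreviewSize.getHeight(),
        PixelFormat.RGBA_8888, 1);
yuvImage = new Frame(mPreviewSize.getWidth(), mPreviewSize.getHeight(),
        Frame.DEPTH_UBYTE, 4);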

@lfdversluis
Contributor Author

@kmlx Thanks for your response. If I initialize the ImageReader as mImageReader = ImageReader.newInstance(DISPLAY_WIDTH, DISPLAY_HEIGHT, PixelFormat.RGBA_8888, 2); and my Frame as yuvImage = new Frame(DISPLAY_WIDTH, DISPLAY_HEIGHT, Frame.DEPTH_UBYTE, 4);, then I still get the buffer overflow exception. Interesting that it works for you.
To be clear, I am targeting the record example.

@kmlx

kmlx commented Mar 17, 2016

@lfdversluis

  1. I wouldn't recommend targeting the record example. The best-case scenario would be to convert the record example into an ffmpeg-only class, then do all the other processing in a separate camera class.
     Opening a camera with camera2 is completely different from camera, so the record example doesn't apply.
  2. Image formats are device dependent. You will need to find out which formats the device supports using isOutputSupportedFor (docs). If the device does not support RGB, then I would recommend converting the images from the imageReader to RGB using RenderScript (e.g. ScriptIntrinsicYuvToRGB link); a sketch follows this list.
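For point 2, a rough, untested sketch of the RenderScript route (context, nv21, width, and height are assumed to come from the surrounding code):

// One-time setup: an intrinsic that converts NV21/YUV bytes to RGBA.
RenderScript rs = RenderScript.create(context);
ScriptIntrinsicYuvToRGB yuvToRgb = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));

// Allocations sized for the raw YUV input and the RGBA output.
Type yuvType = new Type.Builder(rs, Element.U8(rs)).setX(nv21.length).create();
Allocation in = Allocation.createTyped(rs, yuvType, Allocation.USAGE_SCRIPT);
Type rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs)).setX(width).setY(height).create();
Allocation out = Allocation.createTyped(rs, rgbaType, Allocation.USAGE_SCRIPT);

// Per frame: feed the YUV bytes in, run the intrinsic, read the RGBA bytes out.
in.copyFrom(nv21);
yuvToRgb.setInput(in);
yuvToRgb.forEach(out);
byte[] rgba = new byte[width * height * 4];
out.copyTo(rgba);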

@saudet
Member

saudet commented May 19, 2016

Hi guys, I've released version 1.2 :) Any updates on this?

@lfdversluis
Contributor Author

Hi @kmlx, it has been a while, but I haven't forgotten this issue yet :)

Do you have that code sample with camera2 using the mPreviewSize like you mentioned?
I have not yet been able to make it work, so I am curious what I am missing. I am not an imaging expert or anything close, so if you have a snippet, maybe I can spot my mistake by comparing the approaches.

@rahulsnitd1014

Hi @kmlx
I am having the same problem. I can't record using camera2; I'm getting a green screen on the recorded video. Any solutions would be highly appreciated.

@kmlx

kmlx commented Sep 8, 2016

@lfdversluis @rahulsnitd1014
Start here.
Then follow up here.

You should be able to get javacv to work with the Camera2Video sample from Google.
That should be your objective.
Then you'll be hitting this.

@rahulsnitd1014

Thanks.

I have already seen these and implemented them, but it's still giving a green frame.

@vishalghor

Hi @lfdversluis @saudet,

Were you able to implement FFmpegFrameRecorder or the sample RecordActivity.java for the camera2 API? If so, could you please share the git repo for it? It would be really helpful.
Thanks

@lfdversluis
Contributor Author

Hi @vishalghor, I managed to create some code that converts an Image object from an ImageReader to an NV21 byte array that can be used with the RecordActivity as it is now. I have not (yet) created an Activity that uses such an ImageReader in combination with the camera2 API.
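The conversion looks roughly like this (a reconstructed sketch, not my exact code; it assumes the ImageReader delivers ImageFormat.YUV_420_888 with even dimensions):

// Sketch: convert a YUV_420_888 Image into an NV21 byte array (Y plane
// followed by interleaved V/U), honoring row and pixel strides.
static byte[] yuv420ToNv21(Image image) {
    int width = image.getWidth(), height = image.getHeight();
    byte[] nv21 = new byte[width * height * 3 / 2];

    // Y plane: copy row by row to skip any padding at the end of each row.
    ByteBuffer y = image.getPlanes()[0].getBuffer();
    int yRowStride = image.getPlanes()[0].getRowStride();
    for (int row = 0; row < height; row++) {
        y.position(row * yRowStride);
        y.get(nv21, row * width, width);
    }

    // Chroma planes: NV21 wants V then U, so read them pixel by pixel.
    ByteBuffer u = image.getPlanes()[1].getBuffer();
    ByteBuffer v = image.getPlanes()[2].getBuffer();
    int rowStride = image.getPlanes()[1].getRowStride();   // same for U and V
    int pixelStride = image.getPlanes()[1].getPixelStride();
    int pos = width * height;
    for (int row = 0; row < height / 2; row++) {
        for (int col = 0; col < width / 2; col++) {
            int i = row * rowStride + col * pixelStride;
            nv21[pos++] = v.get(i);
            nv21[pos++] = u.get(i);
        }
    }
    return nv21;
}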

@kmlx

kmlx commented Jan 8, 2017

@vishalghor
check the previous messages in this thread.
There's a blocker to adding a camera2 example: not all devices support PixelFormat.RGBA_8888.
The ffmpeg frame recorder relies on RGB byte arrays, and this format is not available on all cameras when using camera2; FYI, you can fully access RGB byte arrays on any device using the camera1 API.

more detail (camera2basic demo Camera2BasicFragment.java#L515):

imageReader = ImageReader.newInstance(VIDEO_WIDTH, VIDEO_HEIGHT, PixelFormat.RGBA_8888, 1);

PixelFormat.RGBA_8888 is not supported on all devices.
JPEG is standard, but it doesn't help since we need an RGB byte array; JPEG just means we'd have to burn CPU transforming it into an RGB byte array, which doesn't work when you have to produce at least 20 fps.

Image formats are device dependent. You will need to find out what formats are supported by the device using isOutputSupportedFor (docs). If the device does not support rgb then I would recommend converting the images from the imageReader to rgb using renderscript (e.g. ScriptIntrinsicYuvToRGB link)

So yeah, not an easy task if you're not hitting a device that works with RGB.

Sure, I could work on a demo that works with just a couple of devices (the ones that support RGBA_8888), but then we'd get a lot of issues on GitHub. And producing a RenderScript demo is currently too time consuming for me.

Also, getting an RGB byte array out of camera2 isn't really an issue for these forums; it's more of a Stack Overflow question.

So that's where we are right now:

  • You could easily adapt camera2basic from Google to work with RGBA_8888, but this would only work for a couple of devices.
  • Or you could get the supported output formats of camera2, check whether RGBA is supported and use it, and otherwise run a RenderScript kernel that produces RGB byte arrays from whatever format the device does support (a sketch of the check follows this list).
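A sketch of that check (cameraManager and cameraId are assumed from the surrounding camera2 setup code):

// Ask the camera what it can actually output before picking a format.
CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
StreamConfigurationMap map =
        characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
if (map.isOutputSupportedFor(PixelFormat.RGBA_8888)) {
    // RGBA frames can go straight into FFmpegFrameRecorder.
} else {
    // Fall back to YUV_420_888 and convert with ScriptIntrinsicYuvToRGB.
}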

And I haven't even mentioned performance :)

@xdeop

xdeop commented Apr 13, 2017

Hi @kmlx, you've stated that you can fully access RGB byte arrays on any device using the camera1 API... How is that done? I'm using FFmpegFrameRecorder with the camera1 API (onPreviewFrame) and with a Nexus 4 I'm getting green frames...

Thanks.

@kmlx

kmlx commented Apr 18, 2017

@xdeop green frames mean the resolution or the camera settings are not supported.
Choose the right resolution/settings and it will work; a rough sketch is below.
You can find more on how to choose the right resolution on Stack Overflow.
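A minimal, untested sketch with the camera1 API (the point being to pick a preview size the device actually reports instead of forcing one):

// camera1: query the supported preview sizes instead of hard-coding one.
Camera camera = Camera.open();
Camera.Parameters params = camera.getParameters();
List<Camera.Size> supported = params.getSupportedPreviewSizes();
Camera.Size chosen = supported.get(0); // or search the list for the closest match
params.setPreviewSize(chosen.width, chosen.height);
camera.setParameters(params);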

@xdeop

xdeop commented Apr 18, 2017 via email
