This repository has been archived by the owner on Feb 22, 2023. It is now read-only.

Add byte streaming capability for the camera #965

Merged
39 commits merged on Jan 3, 2019

Conversation

bparrishMines
Contributor

No description provided.

@@ -258,7 +278,8 @@ public boolean onRequestPermissionsResult(int id, String[] permissions, int[] gr
private CameraDevice cameraDevice;
private CameraCaptureSession cameraCaptureSession;
private EventChannel.EventSink eventSink;
- private ImageReader imageReader;
+ private ImageReader pictureImageReader;
+ private ImageReader byteImageReader; // Used to pass bytes to dart side.
Contributor Author

We need to separate readers because they use different formats.

Contributor (sigurdm)

This should be explained in a comment in the code.

Contributor Author

done

- (void)captureToFile:(NSString *)filename result:(FlutterResult)result;
@end

@implementation FLTCam
FourCharCode const videoFormat = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange;
Contributor Author

This is the format that I believe is recommended for iOS 10+, which is the minimum requirement for this plugin.

Contributor (sigurdm)

Add this explanation as a comment for future readers to understand the code.

Contributor Author

done

@@ -269,7 +337,66 @@ - (CVPixelBufferRef)copyPixelBuffer {
while (!OSAtomicCompareAndSwapPtrBarrier(pixelBuffer, nil, (void **)&_latestPixelBuffer)) {
pixelBuffer = _latestPixelBuffer;
}
- return pixelBuffer;
+ return [self convertYUVImageTOBGRA:pixelBuffer];
Contributor Author

Since the video format was changed to kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, we have to convert the image to a format usable by Flutter textures, which is kCVPixelFormatType_32BGRA.

@dwach414

dwach414 commented Dec 14, 2018

@bparrishMines
Yeah, I think the iPhone XS issue is either a bug with the CameraPreview specifically or something unrelated that I caused. The OCR business logic still works on preset high even when the image doesn't show up, and it was still broken when I reverted to 0.2.6. I'll do some more digging on that end.

Contributor

@sigurdm sigurdm left a comment


LGTM.

I would like for some of the other current plugin-maintainers to also review this.

I worry about the efficiency of transferring the data across the messageChannel.
I guess we do not have a lot of choice currently - but it seems very heavy, and not the intended use.

A future development would be to have a platform-image plugin (or image-stream plugin) that allows the images to stay on the native side, and be referenced from dart. It could then provide a number of compositions of images, and the ability to project an image into a Texture, or - if really needed - transfer the bytes to dart.

The camera-plugin would know about the image-plugin and be able to send the images there...
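To make the idea concrete, here is a purely hypothetical sketch of what such a platform-image handle might look like on the Dart side. None of these names exist in any plugin; they only illustrate the proposal above.

import 'dart:async';
import 'dart:typed_data';

// Hypothetical handle to an image that stays on the native side.
abstract class PlatformImageHandle {
  // Render the native image into an existing Flutter texture.
  Future<void> projectToTexture(int textureId);

  // Only if really needed: copy the raw bytes across to Dart.
  Future<Uint8List> toBytes();
}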

@@ -183,7 +182,7 @@
TargetAttributes = {
97C146ED1CF9000F007C117D = {
CreatedOnToolsVersion = 7.3.1;
- DevelopmentTeam = EQHXZ8M8AV;
+ DevelopmentTeam = S8QB4VV633;
Contributor (sigurdm)

I don't think changes to this file should be checked in.

///
/// Although not all image formats are planar on iOS, we treat 1-dimensional
/// images as single planar images.
class CameraImage {
Contributor

Perhaps add a helper to make it easy to turn this into an ImageProvider (https://docs.flutter.io/flutter/painting/ImageProvider-class.html).

Contributor Author

The current implementation doesn't use an image format compatible with 'ImageProvider'; it uses yuv420.

I also added comments to CameraImage noting that [CameraImage] is not directly usable as a UI resource.
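As a rough illustration of that point, here is a small sketch against the Dart API added in this PR (the inspectImage helper itself is hypothetical): each streamed frame arrives as raw planes rather than an encoded picture, which is why it cannot be handed straight to an ImageProvider without a conversion step.

import 'package:camera/camera.dart';

// A streamed yuv420 CameraImage exposes raw planes, not encoded bytes.
void inspectImage(CameraImage image) {
  for (Plane plane in image.planes) {
    print('plane with ${plane.bytes.length} bytes');
  }
}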

///
/// Throws a [CameraException] if byte streaming or video recording has
/// already started.
Future<void> startByteStream(OnLatestImageAvailable onAvailable) async {
Contributor

Should this also take a resolution and an fps setting?
It seems that in many cases you would want a lower resolution for the byte stream.

Contributor Author

Due to a limitation I ran into on iOS, I wouldn't be able to use a different resolution for image streaming than for the [CameraPreview] widget.

We could, however, scale the image on the platform side before sending it over the platform channel. I think that additional code would be best in a separate PR, though.

Contributor Author

An FPS setting could be added as well. I added a TODO to add these options.
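For readers following this thread, a minimal usage sketch of the API under discussion, assuming an already initialized CameraController and that the matching stopByteStream call from this PR ends the stream; processFrame is a hypothetical handler:

import 'package:camera/camera.dart';

// Hypothetical per-frame handler (e.g. barcode or ML detection).
void processFrame(CameraImage image) {}

Future<void> streamFrames(CameraController controller) async {
  // Frames are delivered continuously over the platform channel,
  // so heavy work in the callback should be throttled.
  await controller.startByteStream(processFrame);

  // ... later, stop streaming before disposing the controller.
  await controller.stopByteStream();
}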

@mikuhl-dev

Need this for barcode scanning!

@slightfoot
Member

@bparrishMines Do you know when this is going to be merged?

@mikuhl-dev

mikuhl-dev commented Dec 21, 2018

@bparrishMines I am getting java.lang.IllegalStateException: Session has been closed; further changes are illegal. whenever I open the camera the second time. If I DON'T dispose the controller, this does not happen.

@lifenautjoe

Need this for mustache filters!

@bparrishMines
Contributor Author

@MichaelPriebe I wasn't running into any issues with the example camera app. What do you mean by 'open the camera the second time'? Are you creating a new CameraController and calling initialize, or are you creating a new CameraPreview? I would also check that you are using await with the dispose() and initialize() methods.
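A rough sketch of the lifecycle described above, assuming a CameraDescription obtained from availableCameras() is at hand; recreateController and the chosen preset are illustrative only:

import 'package:camera/camera.dart';

// Fully tear down the old controller, then create and initialize a new
// one; both dispose() and initialize() need to be awaited.
Future<CameraController> recreateController(
    CameraController old, CameraDescription description) async {
  await old.dispose();
  final CameraController fresh =
      CameraController(description, ResolutionPreset.medium);
  await fresh.initialize();
  return fresh;
}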

@Alby-o

Alby-o commented Jan 17, 2019

@bparrishMines Thanks for this! I was wondering if there's a way to convert the CameraImage for an API I wanted to use. The requirements:

  • Blob of image bytes up to 5 MBs.
  • Type: Base64-encoded binary data object

Alternatively if I could save the image to a file that would also be great! Thanks

andreidiaconu pushed a commit to andreidiaconu/plugins that referenced this pull request Feb 17, 2019
andreidiaconu added a commit to andreidiaconu/plugins that referenced this pull request Feb 17, 2019
Akachu pushed a commit to Akachu/flutter_camera that referenced this pull request Apr 27, 2020
@ollyde

ollyde commented Apr 29, 2020

Hey guys, the image stream is not sufficient for many use cases and doesn't include audio sources by default (as needed for live RTMP/HLS streams). Is there any possibility of getting startByteStream? At the moment we save to a file, but as soon as something like FFmpeg accesses that file, the camera stops streaming to it :-(

@PanikPlunder

> @bparrishMines Thanks for this! I was wondering if there's a way to convert the CameraImage for an API I wanted to use. The requirements:
>
>   • Blob of image bytes up to 5 MBs.
>   • Type: Base64-encoded binary data object
>
> Alternatively if I could save the image to a file that would also be great! Thanks

You could encode the camera image to an image format like JPG and send the bytes as a blob, or encode it to base64 and send that.

Package: https://pub.dev/packages/image

Example code:

import 'package:image/image.dart' as imglib;

// `image` is a CameraImage from the stream; this uses only its first plane.
imglib.Image img = imglib.Image.fromBytes(
    image.planes[0].width, image.planes[0].height, image.planes[0].bytes);
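To also meet the base64 requirement mentioned above, a possible follow-up step (a sketch assuming the same image package plus dart:convert; not part of the original reply):

import 'dart:convert';
import 'package:image/image.dart' as imglib;

// Encode the decoded image as JPEG, then base64-encode the bytes.
String toBase64Jpg(imglib.Image img) {
  List<int> jpgBytes = imglib.encodeJpg(img, quality: 80);
  return base64Encode(jpgBytes);
}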
