takePicture() Method does not work #62

Closed
Loopex2019 opened this issue Jul 2, 2019 · 31 comments · Fixed by #97

Comments

@Loopex2019

Hi there, I want to add an image-capture feature on the camera using the CameraMlVisionState via a global key:

onTap: () async {
  final path = join(
    (await getTemporaryDirectory()).path,
    '${DateTime.now()}.png',
  );
  await _scanKey.currentState.takePicture(path);
  Navigator.push(
    context,
    MaterialPageRoute(
      builder: (context) => DisplayPicture(imagePath: path),
    ),
  );
},

The problem is that it doesn't take a picture, and it doesn't even navigate to the DisplayPicture page. When I remove the takePicture line, it navigates normally, but of course it shows a white screen since no photo was captured. Is there a solution, or did I just write the wrong code? The code is essentially the same as the Flutter cookbook example for capturing an image from the camera: https://flutter.dev/docs/cookbook/plugins/picture-using-camera.

@Kleak
Contributor

Kleak commented Jul 5, 2019

Hi, can you wrap the code in a try/catch and post the error message here?
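
For reference, the wrapping being asked for could look like the sketch below, reusing the _scanKey, join, getTemporaryDirectory, and DisplayPicture names from the original snippet; the error-handling details are illustrative, not part of the package:

onTap: () async {
  final path = join(
    (await getTemporaryDirectory()).path,
    '${DateTime.now()}.png',
  );
  try {
    // Surface the underlying CameraException instead of failing silently.
    await _scanKey.currentState.takePicture(path);
  } catch (e, stack) {
    debugPrint('takePicture failed: $e\n$stack');
    return; // don't navigate when no picture was written
  }
  Navigator.push(
    context,
    MaterialPageRoute(
      builder: (context) => DisplayPicture(imagePath: path),
    ),
  );
},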

@bmabir17

I also tried to do the same and found the following error
Exception has occurred. CameraException (CameraException(error, CaptureRequest contains unconfigured Input/Output Surface!))

@imasif

imasif commented Jul 16, 2019

also having the same issue

I also tried to do the same and found the following error
Exception has occurred. CameraException (CameraException(error, CaptureRequest contains unconfigured Input/Output Surface!))

@bradd-pktlzyer

bradd-pktlzyer commented Oct 29, 2019

I also have this issue; this is my stack trace from Android:

E/MethodChannel#plugins.flutter.io/camera(10244): java.lang.IllegalArgumentException: CaptureRequest contains unconfigured Input/Output Surface!
E/MethodChannel#plugins.flutter.io/camera(10244): 	at android.hardware.camera2.CaptureRequest.convertSurfaceToStreamId(CaptureRequest.java:674)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at android.hardware.camera2.impl.CameraDeviceImpl.submitCaptureRequest(CameraDeviceImpl.java:1044)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at android.hardware.camera2.impl.CameraDeviceImpl.capture(CameraDeviceImpl.java:914)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at android.hardware.camera2.impl.CameraCaptureSessionImpl.capture(CameraCaptureSessionImpl.java:173)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at io.flutter.plugins.camera.Camera.takePicture(Camera.java:253)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at io.flutter.plugins.camera.MethodCallHandlerImpl.onMethodCall(MethodCallHandlerImpl.java:77)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at io.flutter.plugin.common.MethodChannel$IncomingMethodCallHandler.onMessage(MethodChannel.java:222)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at io.flutter.embedding.engine.dart.DartMessenger.handleMessageFromDart(DartMessenger.java:96)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at io.flutter.embedding.engine.FlutterJNI.handlePlatformMessage(FlutterJNI.java:656)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at android.os.MessageQueue.nativePollOnce(Native Method)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at android.os.MessageQueue.next(MessageQueue.java:326)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at android.os.Looper.loop(Looper.java:160)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at android.app.ActivityThread.main(ActivityThread.java:6718)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at java.lang.reflect.Method.invoke(Native Method)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:493)
E/MethodChannel#plugins.flutter.io/camera(10244): 	at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:858)
I/flutter (10244): CameraException(error, CaptureRequest contains unconfigured Input/Output Surface!)

I get this calling takePicture.

Flutter 1.9.1+hotfix.6, 
firebase_ml_vision: ^0.9.2+2
flutter_camera_ml_vision: ^2.2.3

Let me know if there is anything else that might help.

@Dancovich

I have this same issue.

I found this issue on the camera plugin: flutter/flutter#46082

It's not directly related, but according to this comment you have to stop the stream before taking a picture.

There is no way of doing that through CameraMlVisionState, as it doesn't expose CameraController.stopImageStream() and takePicture doesn't stop the stream automatically.

Also, for some reason CameraMlVisionState.stop doesn't stop the streaming either and throws the same exception: CaptureRequest contains unconfigured Input/Output Surface!

I can do further testing if needed.
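
As a rough illustration of the stop-the-stream-first sequence described above, for the case where you manage your own CameraController from the camera plugin directly (a sketch of the plugin-level pattern, not something CameraMlVisionState exposes):

// Assumes `controller` is an already-initialized CameraController from the
// camera plugin and that an image stream may currently be running.
Future<void> captureStill(CameraController controller, String path) async {
  if (controller.value.isStreamingImages) {
    // The active capture session is configured for the streaming surface,
    // so stop the stream before requesting a still capture.
    await controller.stopImageStream();
  }
  await controller.takePicture(path);
  // Restart the image stream afterwards if detection should continue.
}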

@eliaweiss

eliaweiss commented Jan 11, 2020

Here is how I solved it.

This is a clone of CameraMlVision that exposes the controller:

import 'dart:async';
import 'dart:io';
import 'dart:typed_data';
import 'dart:ui';

import 'package:camera/camera.dart';
import 'package:device_info/device_info.dart';
import 'package:firebase_ml_vision/firebase_ml_vision.dart';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:flutter_widgets/flutter_widgets.dart';
import 'package:path_provider/path_provider.dart';

part 'utils.dart';

typedef HandleDetection<T> = Future<T> Function(FirebaseVisionImage image);
typedef Widget ErrorWidgetBuilder(BuildContext context, CameraError error);

enum CameraError {
  unknown,
  cantInitializeCamera,
  androidVersionNotSupported,
  noCameraAvailable,
}

enum _CameraState {
  loading,
  error,
  ready,
}

class ZCameraMlVision<T> extends StatefulWidget {
  final HandleDetection<T> detector;
  final Function(T) onResult;
  final WidgetBuilder loadingBuilder;
  final ErrorWidgetBuilder errorBuilder;
  final WidgetBuilder overlayBuilder;
  final CameraLensDirection cameraLensDirection;
  final ResolutionPreset resolution;
  final Function onDispose;
  CameraController cameraController;
  ZCameraMlVision({
    Key key,
    @required this.onResult,
    @required this.detector,
    this.cameraController,
    this.loadingBuilder,
    this.errorBuilder,
    this.overlayBuilder,
    this.cameraLensDirection = CameraLensDirection.back,
    this.resolution,
    this.onDispose,
  }) : super(key: key);

  @override
  ZCameraMlVisionState createState() => ZCameraMlVisionState<T>();
}

class ZCameraMlVisionState<T> extends State<ZCameraMlVision<T>> {
  String _lastImage;
  Key _visibilityKey = UniqueKey();

  ImageRotation _rotation;
  _CameraState _cameraMlVisionState = _CameraState.loading;
  CameraError _cameraError = CameraError.unknown;
  bool _alreadyCheckingImage = false;
  bool _isStreaming = false;
  bool _isDeactivate = false;

  @override
  void initState() {
    super.initState();
    _initialize();
  }

  Future<void> stop() async {
    if (widget.cameraController != null) {
      if (_lastImage != null && File(_lastImage).existsSync()) {
        await File(_lastImage).delete();
      }

      Directory tempDir = await getTemporaryDirectory();
      _lastImage = '${tempDir.path}/${DateTime.now().millisecondsSinceEpoch}';
      try {
        await widget.cameraController.takePicture(_lastImage);
      } on PlatformException catch (e) {
        debugPrint('$e');
      }

      _stop(false);
    }
  }

  void _stop(bool silently) {
    Future.microtask(() async {
      if (widget.cameraController?.value?.isStreamingImages == true && mounted) {
        await widget.cameraController.stopImageStream();
      }
    });

    if (silently) {
      _isStreaming = false;
    } else {
      setState(() {
        _isStreaming = false;
      });
    }
  }

  void start() {
    if (widget.cameraController != null) {
      _start();
    }
  }

  void _start() {
    widget.cameraController.startImageStream(_processImage);
    setState(() {
      _isStreaming = true;
    });
  }

  CameraValue get cameraValue => widget.cameraController?.value;
  ImageRotation get imageRotation => _rotation;

  Future<void> Function() get prepareForVideoRecording =>
      widget.cameraController.prepareForVideoRecording;

  Future<void> startVideoRecording(String path) async {
    await widget.cameraController.stopImageStream();
    return widget.cameraController.startVideoRecording(path);
  }

  Future<void> stopVideoRecording() async {
    await widget.cameraController.stopVideoRecording();
    await widget.cameraController.startImageStream(_processImage);
  }

//  Future<void> Function(String path) get takePicture =>
//      widget.cameraController.takePicture;

  Future<void> _initialize() async {
    if (Platform.isAndroid) {
      final deviceInfo = DeviceInfoPlugin();
      final androidInfo = await deviceInfo.androidInfo;
      if (androidInfo.version.sdkInt < 21) {
        debugPrint('Camera plugin doesn\'t support android under version 21');
        if (mounted) {
          setState(() {
            _cameraMlVisionState = _CameraState.error;
            _cameraError = CameraError.androidVersionNotSupported;
          });
        }
        return;
      }
    }

    CameraDescription description =
        await _getCamera(widget.cameraLensDirection);
    if (description == null) {
      _cameraMlVisionState = _CameraState.error;
      _cameraError = CameraError.noCameraAvailable;

      return;
    }
//    widget.cameraController = CameraController(
//      description,
//      widget.resolution ??
//          ResolutionPreset
//              .low, // As the doc says, better to set low when streaming images to avoid drop frames on older devices
//      enableAudio: false,
//    );
    if (!mounted) {
      return;
    }

    try {
      await widget.cameraController.initialize();
    } catch (ex, stack) {
      debugPrint('Can\'t initialize camera');
      debugPrint('$ex, $stack');
      if (mounted) {
        setState(() {
          _cameraMlVisionState = _CameraState.error;
          _cameraError = CameraError.cantInitializeCamera;
        });
      }
      return;
    }

    if (!mounted) {
      return;
    }

    setState(() {
      _cameraMlVisionState = _CameraState.ready;
    });
    _rotation = _rotationIntToImageRotation(
      description.sensorOrientation,
    );

    //FIXME hacky technique to avoid having black screen on some android devices
    await Future.delayed(Duration(milliseconds: 200));
    start();
  }

  @override
  void dispose() {
    if (widget.onDispose != null) {
      widget.onDispose();
    }
    if (_lastImage != null && File(_lastImage).existsSync()) {
      File(_lastImage).delete();
    }
    if (widget.cameraController != null) {
      widget.cameraController.dispose();
    }
    widget.cameraController = null;
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    if (_cameraMlVisionState == _CameraState.loading) {
      return widget.loadingBuilder == null
          ? Center(child: CircularProgressIndicator())
          : widget.loadingBuilder(context);
    }
    if (_cameraMlVisionState == _CameraState.error) {
      return widget.errorBuilder == null
          ? Center(child: Text('$_cameraMlVisionState $_cameraError'))
          : widget.errorBuilder(context, _cameraError);
    }

    Widget cameraPreview = AspectRatio(
      aspectRatio: widget.cameraController.value.aspectRatio,
      child: _isStreaming
          ? CameraPreview(
              widget.cameraController,
            )
          : _getPicture(),
    );
    if (widget.overlayBuilder != null) {
      cameraPreview = Stack(
        fit: StackFit.passthrough,
        children: [
          cameraPreview,
          widget.overlayBuilder(context),
        ],
      );
    }
    return VisibilityDetector(
      child: FittedBox(
        alignment: Alignment.center,
        fit: BoxFit.cover,
        child: SizedBox(
          width:
          widget.cameraController.value.previewSize.height *
              widget.cameraController.value.aspectRatio,
          height: widget.cameraController.value.previewSize.height,
          child: cameraPreview,
        ),
      ),
      onVisibilityChanged: (VisibilityInfo info) {
        if (info.visibleFraction == 0) {
          //invisible stop the streaming
          _isDeactivate = true;
          _stop(true);
        } else if (_isDeactivate) {
          //visible restart streaming if needed
          _isDeactivate = false;
          _start();
        }
      },
      key: _visibilityKey,
    );
  }

  void _processImage(CameraImage cameraImage) async {
    if (!_alreadyCheckingImage && mounted) {
      _alreadyCheckingImage = true;
      try {
        final T results =
            await _detect<T>(cameraImage, widget.detector, _rotation);
//        print("cameraImage width"+cameraImage.width.toString());
//        print("cameraImage height"+cameraImage.height.toString());
//        print("widget.cameraController.value.aspectRatio"+widget.cameraController.value.aspectRatio.toString());
//        print("widget.cameraController.value.previewSize width"+widget.cameraController.value.previewSize.width.toString());
//        print("widget.cameraController.value.previewSize height"+widget.cameraController.value.previewSize.height.toString());
        widget.onResult(results);
      } catch (ex, stack) {
        debugPrint('$ex, $stack');
      }
      _alreadyCheckingImage = false;
    }
  }

  void toggle() {
    if (_isStreaming && widget.cameraController.value.isStreamingImages) {
      stop();
    } else {
      start();
    }
  }

  Widget _getPicture() {
    if (_lastImage != null) {
      final file = File(_lastImage);
      if (file.existsSync()) {
        return Image.file(file);
      }
    }

    return Container();
  }
}

Usage:

FutureBuilder<void>(
              future: _initializeControllerFuture,
              builder: (context, snapshot) {
                if (snapshot.connectionState == ConnectionState.done) {
                  // If the Future is complete, display the preview.
                  return SafeArea(
                    child: SizedBox(
                      width: MediaQuery.of(context).size.width,
                      child: CustomPaint(
                          foregroundPainter:
                              new DetectedTextBoxPainter(_ValueNotifier),
                          //new TestPainter(_ValueNotifier),
                          child: new AspectRatio(
                              aspectRatio: deviceRatio,//_controller.value.aspectRatio,
                              //child: CameraPreview(_controller)
                              child: ZCameraMlVision<VisionText>(
                                cameraController: _controller,
                                resolution: ResolutionPreset.max,
                                detector: FirebaseVision.instance
                                    .textRecognizer()
                                    .processImage,
                                onResult: (VisionText vt) {
                                  if (!mounted) {
                                    return;
                                  }
                                  if (vt.text.length == 0) return;

//                              print(">>>>>>>>>>>>>>>>>> "+vt.text.toString());
                                  //triger repaint by updateimg the value

                                  print(
                                      "deviceRatio " + deviceRatio.toString());
                                  _ValueNotifier.value = new Tuple2(vt,
                                      deviceRatio); //_controller.value.aspectRatio
                                  //_controller.value.previewSize);
                                  //_ValueNotifier.notifyListeners();
                                },
                              ))),
                    ),
                  );
                } else {
                  // Otherwise, display a loading indicator.
                  return Center(child: CircularProgressIndicator());
                }
              },
            ),

To take a picture:

await _controller.stopImageStream();

// Attempt to take a picture and log where it's been saved.
await _controller.takePicture(path);
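
Put together, a capture handler against the exposed controller might look like this sketch; _controller, DisplayPicture, and the comments are placeholders based on the snippets above, not part of the package:

onPressed: () async {
  final path = join(
    (await getTemporaryDirectory()).path,
    '${DateTime.now().millisecondsSinceEpoch}.png',
  );
  try {
    // Stop the detection stream first; capturing while it is running is what
    // triggers the "unconfigured Input/Output Surface" exception.
    await _controller.stopImageStream();
    await _controller.takePicture(path);
  } catch (e) {
    debugPrint('takePicture failed: $e');
    return;
  }
  // Note: the preview stays frozen until startImageStream is called again
  // with the detection callback.
  Navigator.push(
    context,
    MaterialPageRoute(
      builder: (context) => DisplayPicture(imagePath: path),
    ),
  );
},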

@eliaweiss

This solution only works partially... sometimes it throws:

I/flutter (32399): CameraException(error, Attempt to invoke virtual method 'int android.hardware.camera2.CameraCaptureSession.capture(android.hardware.camera2.CaptureRequest, android.hardware.camera2.CameraCaptureSession$CaptureCallback, android.os.Handler)' on a null object reference)

@eliaweiss

Instead I used this code:

try {
  await _controller.takePicture(path);
} catch (e) {
  // Retry once if the first attempt throws.
  await _controller.takePicture(path);
}

@athlona64

+1 same error

@athlona64

OK, I was able to solve this.

I copied this library into my project and modified it.

Old stop function vs. new:

//old
Future<void> stop() async {
  if (_cameraController != null) {
    if (_lastImage != null && File(_lastImage).existsSync()) {
      await File(_lastImage).delete();
    }

    Directory tempDir = await getTemporaryDirectory();
    _lastImage = '${tempDir.path}/${DateTime.now().millisecondsSinceEpoch}';
    try {
      await _cameraController.takePicture(_lastImage);
    } on PlatformException catch (e) {
      debugPrint('$e');
    }

    _stop(false);
  }
}

// new
Future<void> stop() async {
  await _cameraController.stopImageStream(); // add this line
  if (_cameraController != null) {
    if (_lastImage != null && File(_lastImage).existsSync()) {
      await File(_lastImage).delete();
    }

    Directory tempDir = await getTemporaryDirectory();
    _lastImage = '${tempDir.path}/${DateTime.now().millisecondsSinceEpoch}';
    try {
      await _cameraController.takePicture(_lastImage);
    } on PlatformException catch (e) {
      debugPrint('$e');
    }

    _stop(false);
  }
}

And take a picture like this:

floatingActionButton: FloatingActionButton(
  child: Icon(Icons.camera_alt),
  // Provide an onPressed callback.
  onPressed: () async {
    // Take the picture in a try/catch block. If anything goes wrong,
    // catch the error.
    try {
      // Construct the path where the image should be saved using the path
      // package. Store the picture in the temp directory, found with the
      // `path_provider` plugin.
      final path = join(
        (await getTemporaryDirectory()).path,
        '${DateTime.now()}.png',
      );

      await _scanKey.currentState.stop();
      // Attempt to take a picture and log where it's been saved.
      await _scanKey.currentState.takePicture(path);
      await _scanKey.currentState.start();

      Navigator.push(
        context,
        MaterialPageRoute(
          builder: (context) => DisplayPictureScreen(imagePath: path),
        ),
      );
    } catch (e) {
      // If an error occurs, log the error to the console.
      print(e);
    }
  },
),

@MohammedAkhil

Hi there, I want to add an image-capture feature on the camera using the CameraMlVisionState via a global key: [...] The problem is that it doesn't take a picture, and it doesn't even navigate to the DisplayPicture page. [...]

I've got the same issue. Is it possible to expose the camera controller and release an update to the library?

@948911908

Is the problem solved? I have the same problem.

@bmabir17

bmabir17 commented Feb 8, 2020

@MohammedAkhil
https://github.com/rushio-consulting/flutter_camera_ml_vision/pull/66/files
This merge request fixes the problem and it worked for me, but the owner's approval is still pending, so the released library still throws the error.

@pratamatama

Is it fixed already? I got this error as well in ^2.2.4

Please have a look at the log from my try/catch block.

E/MethodChannel#plugins.flutter.io/camera(14862): Failed to handle method call
E/MethodChannel#plugins.flutter.io/camera(14862): java.lang.IllegalArgumentException: CaptureRequest contains unconfigured Input/Output Surface!
E/MethodChannel#plugins.flutter.io/camera(14862): 	at android.hardware.camera2.CaptureRequest.convertSurfaceToStreamId(CaptureRequest.java:674)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at android.hardware.camera2.impl.CameraDeviceImpl.submitCaptureRequest(CameraDeviceImpl.java:1056)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at android.hardware.camera2.impl.CameraDeviceImpl.capture(CameraDeviceImpl.java:926)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at android.hardware.camera2.impl.CameraCaptureSessionImpl.capture(CameraCaptureSessionImpl.java:173)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at io.flutter.plugins.camera.Camera.takePicture(Camera.java:253)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at io.flutter.plugins.camera.MethodCallHandlerImpl.onMethodCall(MethodCallHandlerImpl.java:77)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at io.flutter.plugin.common.MethodChannel$IncomingMethodCallHandler.onMessage(MethodChannel.java:226)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at io.flutter.embedding.engine.dart.DartMessenger.handleMessageFromDart(DartMessenger.java:85)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at io.flutter.embedding.engine.FlutterJNI.handlePlatformMessage(FlutterJNI.java:631)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at android.os.MessageQueue.nativePollOnce(Native Method)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at android.os.MessageQueue.next(MessageQueue.java:326)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at android.os.Looper.loop(Looper.java:165)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at android.app.ActivityThread.main(ActivityThread.java:6810)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at java.lang.reflect.Method.invoke(Native Method)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:547)
E/MethodChannel#plugins.flutter.io/camera(14862): 	at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:873)
I/flutter (14862): CameraException(error, CaptureRequest contains unconfigured Input/Output Surface!)
I/flutter (14862): /data/user/0/com.prisma.flutter_prismahr/cache/2020-03-11-070556.067318.png

And here is how I reproduce it.

import 'dart:async';

import 'package:firebase_ml_vision/firebase_ml_vision.dart';
import 'package:flutter/material.dart';
import 'package:flutter_camera_ml_vision/flutter_camera_ml_vision.dart';
import 'package:geocoder/geocoder.dart';
import 'package:geolocator/geolocator.dart';
import 'package:intl/intl.dart';
import 'package:path/path.dart';
import 'package:path_provider/path_provider.dart';

class CameraPage extends StatefulWidget {
  CameraPage({Key key}) : super(key: key);

  @override
  _CameraPageState createState() => _CameraPageState();
}

class _CameraPageState extends State<CameraPage> {
  final _scanKey = GlobalKey<CameraMlVisionState>();

  String _timeString;
  Timer _timer;
  List<Face> _faces;
  GeolocationStatus _geolocationStatus;
  Position _position;
  bool _isLocationServiceEnabled;
  Address _address;

  CameraLensDirection cameraLensDirection = CameraLensDirection.front;
  FaceDetector detector =
      FirebaseVision.instance.faceDetector(FaceDetectorOptions(
    enableTracking: true,
    mode: FaceDetectorMode.accurate,
    enableContours: true,
    enableLandmarks: true,
    enableClassification: true,
  ));

  String _formatDateTime(DateTime dateTime) =>
      DateFormat('HH:mm:ss').format(dateTime);

  @override
  void initState() {
    super.initState();
    _timeString = _formatDateTime(DateTime.now());
    _timer = Timer.periodic(Duration(seconds: 1), (Timer t) => _getTime());

    _getGeolocationStatus();
    _getCurrentLocation().then((position) {
      _getCurrentLocationName(position.latitude, position.longitude);
      setState(() => _position = position);
    });
  }

  @override
  void dispose() {
    _timer.cancel();
    super.dispose();
  }

  void _getGeolocationStatus() async {
    GeolocationStatus geolocationStatus =
        await Geolocator().checkGeolocationPermissionStatus();
    bool isLocationServiceEnabled =
        await Geolocator().isLocationServiceEnabled();

    setState(() {
      _geolocationStatus = geolocationStatus;
      _isLocationServiceEnabled = isLocationServiceEnabled;
    });

    print(_geolocationStatus);
    print(_isLocationServiceEnabled);
  }

  Future<Position> _getCurrentLocation() async {
    return await Geolocator()
        .getCurrentPosition(desiredAccuracy: LocationAccuracy.best);
  }

  void _getTime() {
    final DateTime now = DateTime.now();
    final String formattedDateTime = _formatDateTime(now);
    setState(() {
      _timeString = formattedDateTime;
    });
  }

  void _getCurrentLocationName(double latitude, double longitude) async {
    final coordinates = Coordinates(latitude, longitude);
    final addresses =
        await Geocoder.local.findAddressesFromCoordinates(coordinates);

    setState(() {
      _address = addresses.first;
    });
  }

  void _onSubmit() async {
    final String path =
        join((await getTemporaryDirectory()).path, '${DateTime.now()}.png')
            .replaceAll(' ', '-')
            .replaceAll(':', '');

    try {
      await _scanKey.currentState.takePicture(path);
    } catch (e) {
      print(e);
    }

    print(path);
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Stack(
        children: <Widget>[
          SizedBox.expand(
            child: CameraMlVision<List<Face>>(
              key: _scanKey,
              cameraLensDirection: cameraLensDirection,
              detector: detector.processImage,
              resolution: ResolutionPreset.high,
              overlayBuilder: (c) {
                return CustomPaint(
                  painter: FaceDetectorPainter(
                      _scanKey.currentState.cameraValue.previewSize.flipped,
                      _faces,
                      reflection:
                          cameraLensDirection == CameraLensDirection.front),
                );
              },
              onResult: (faces) {
                if (faces == null || faces.isEmpty || !mounted) {
                  setState(() {
                    _faces = faces;
                  });
                  return;
                }
                setState(() {
                  _faces = []..addAll(faces);
                });
              },
              onDispose: () {
                detector.close();
              },
            ),
          ),
          Align(
            alignment: Alignment.bottomLeft,
            child: Container(
              width: double.infinity,
              height: 180.0,
              padding: EdgeInsets.only(left: 20.0),
              child: Align(
                alignment: Alignment.topLeft,
                child: Row(
                  mainAxisAlignment: MainAxisAlignment.spaceBetween,
                  crossAxisAlignment: CrossAxisAlignment.start,
                  children: <Widget>[
                    Column(
                      crossAxisAlignment: CrossAxisAlignment.start,
                      children: <Widget>[
                        Text(
                          _timeString,
                          style: TextStyle(
                            color: Colors.white,
                            fontSize: 50.0,
                            fontFamily: 'Roboto',
                          ),
                        ),
                        Row(
                          children: <Widget>[
                            _address != null
                                ? Text(
                                    _address.addressLine.length > 45
                                        ? _address.addressLine
                                                .substring(0, 45) +
                                            '...'
                                        : _address.addressLine,
                                    style: TextStyle(color: Colors.white),
                                  )
                                : Container(),
                          ],
                        ),
                      ],
                    ),
                  ],
                ),
              ),
            ),
          ),
          Align(
            alignment: Alignment.bottomCenter,
            child: Container(
              width: double.infinity,
              height: 120.0,
              // color: Colors.white,
              padding: EdgeInsets.all(20.0),
              child: _faces != null && _faces.length > 0
                  ? _isLocationServiceEnabled != null &&
                          _isLocationServiceEnabled
                      ? _position != null && _position.mocked != true
                          ? _faces.length > 1
                              ? _buildMultipleFacesWarning()
                              : _buildCaptureButton(context)
                          : _buildGPSMockedWarning()
                      : _buildGPSWarning()
                  : _buildNoFaceWarning(),
            ),
          ),
        ],
      ),
    );
  }

  Widget _buildGPSMockedWarning() {
    return Align(
      alignment: Alignment.center,
      child: Container(
        width: double.infinity,
        padding: EdgeInsets.symmetric(vertical: 20.0),
        decoration: BoxDecoration(
          color: Colors.red,
          borderRadius: BorderRadius.circular(10.0),
        ),
        child: Text(
          'Fake location detected, unable to proceed.',
          style: TextStyle(color: Colors.white),
          textAlign: TextAlign.center,
        ),
      ),
    );
  }

  Widget _buildGPSWarning() {
    return Align(
      alignment: Alignment.center,
      child: Container(
        width: double.infinity,
        padding: EdgeInsets.symmetric(vertical: 20.0),
        decoration: BoxDecoration(
          color: Colors.red,
          borderRadius: BorderRadius.circular(10.0),
        ),
        child: Text(
'Please enable your GPS!',
          style: TextStyle(color: Colors.white),
          textAlign: TextAlign.center,
        ),
      ),
    );
  }

  Widget _buildNoFaceWarning() {
    return Align(
      alignment: Alignment.center,
      child: Container(
        width: double.infinity,
        padding: EdgeInsets.symmetric(vertical: 20.0),
        decoration: BoxDecoration(
          color: Colors.red,
          borderRadius: BorderRadius.circular(10.0),
        ),
        child: Text(
          'Please make sure your face is detected!',
          style: TextStyle(color: Colors.white),
          textAlign: TextAlign.center,
        ),
      ),
    );
  }

  Widget _buildMultipleFacesWarning() {
    return Align(
      alignment: Alignment.center,
      child: Container(
        width: double.infinity,
        padding: EdgeInsets.symmetric(vertical: 20.0),
        decoration: BoxDecoration(
          color: Colors.red,
          borderRadius: BorderRadius.circular(10.0),
        ),
        child: Text(
          'Please do not include your friends!',
          style: TextStyle(color: Colors.white),
          textAlign: TextAlign.center,
        ),
      ),
    );
  }

  Widget _buildCaptureButton(BuildContext context) {
    return Align(
      alignment: Alignment.center,
      child: SizedBox(
        width: double.infinity,
        height: 60.0,
        child: RaisedButton(
          elevation: 0.5,
          color: Theme.of(context).primaryColor,
          textColor: Colors.white,
          child: Icon(Icons.camera_alt),
          shape: RoundedRectangleBorder(
            borderRadius: BorderRadius.circular(10.0),
          ),
          onPressed: () {
            _onSubmit();
          },
        ),
      ),
    );
  }
}

class FaceDetectorPainter extends CustomPainter {
  FaceDetectorPainter(this.imageSize, this.faces, {this.reflection = false});

  final bool reflection;
  final Size imageSize;
  final List<Face> faces;

  @override
  void paint(Canvas canvas, Size size) {
    final Paint paint = Paint()
      ..style = PaintingStyle.stroke
      ..strokeWidth = 2.0
      ..color =
          faces != null && faces.length < 2 ? Colors.grey[200] : Colors.red;

    if (faces != null) {
      for (Face face in faces) {
        final faceRect =
            _reflectionRect(reflection, face.boundingBox, imageSize.width);
        canvas.drawRect(
          _scaleRect(
            rect: faceRect,
            imageSize: imageSize,
            widgetSize: size,
          ),
          paint,
        );
      }
    }
  }

  @override
  bool shouldRepaint(FaceDetectorPainter oldDelegate) {
    return oldDelegate.imageSize != imageSize || oldDelegate.faces != faces;
  }
}

Rect _reflectionRect(bool reflection, Rect boundingBox, double width) {
  if (!reflection) {
    return boundingBox;
  }
  final centerX = width / 2;
  final left = ((boundingBox.left - centerX) * -1) + centerX;
  final right = ((boundingBox.right - centerX) * -1) + centerX;
  return Rect.fromLTRB(left, boundingBox.top, right, boundingBox.bottom);
}

Rect _scaleRect({
  @required Rect rect,
  @required Size imageSize,
  @required Size widgetSize,
}) {
  final scaleX = widgetSize.width / imageSize.width;
  final scaleY = widgetSize.height / imageSize.height;

  final scaledRect = Rect.fromLTRB(
    rect.left.toDouble() * scaleX,
    rect.top.toDouble() * scaleY,
    rect.right.toDouble() * scaleX,
    rect.bottom.toDouble() * scaleY,
  );
  return scaledRect;
}

Here is also my pubspec.yaml

dependencies:
  flutter:
    sdk: flutter

  # The following adds the Cupertino Icons font to your application.
  # Use with the CupertinoIcons class for iOS style icons.
  cupertino_icons: ^0.1.2
  shared_preferences: ^0.5.6+2
  http: ^0.12.0+4
  intl: ^0.16.1
  fl_chart: ^0.8.1
  firebase_ml_vision: ^0.9.3+5
  flutter_camera_ml_vision: ^2.2.4
  flutter_calendar_carousel: ^1.4.11
  flutter_bloc: ^3.2.0
  equatable: ^1.1.0
  video_player: ^0.10.8+1
  geolocator: ^5.3.0
  permission_handler: ^4.4.0
  geocoder: ^0.2.1
  path: ^1.6.4
  path_provider: ^1.6.5

@chaochaox1990

With 2.2.4 I still get the same error: CameraException(error, CaptureRequest contains unconfigured Input/Output Surface!)

@Kleak Kleak reopened this Mar 31, 2020
@Kleak
Contributor

Kleak commented Apr 1, 2020

@pratamatama and @chaochaox1990 could you try the branch fix_take_picture and tell me if it fixes your problem?
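
(For anyone unsure how to test an unpublished branch: one way is a git dependency in pubspec.yaml pointing at the repository and the branch name mentioned above. The URL and ref below are assumptions based on this repo's GitHub links earlier in the thread.)

dependencies:
  flutter_camera_ml_vision:
    git:
      url: https://github.com/rushio-consulting/flutter_camera_ml_vision.git
      ref: fix_take_picture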

@chaochaox1990

@Kleak Thanks for your reply. I tried the fix_take_picture branch and used its takePicture method:

Future<void> takePicture(String path) async {
  await _stop(false);
  await _cameraController.takePicture(path);
  _start();
}

But I get a different error:

CameraException(error, Attempt to invoke virtual method 'int android.hardware.camera2.CameraCaptureSession.capture(android.hardware.camera2.CaptureRequest, android.hardware.camera2.CameraCaptureSession$CaptureCallback, android.os.Handler)' on a null object reference)

@chaochaox1990

@Kleak It works for me when I use:

Future<void> takePicture(String path) async {
  await _stop(false);
  try {
    await _cameraController.initialize();
    await _cameraController.takePicture(path);
  } on PlatformException catch (e) {
    debugPrint('$e');
  }
  _start();
}

@chaochaox1990

Thanks for your branch @Kleak

@Kleak Kleak closed this as completed Apr 3, 2020
@Kleak Kleak reopened this Apr 3, 2020
@Kleak
Contributor

Kleak commented Apr 3, 2020

Will update it soon 😉

@pratamatama

Hi @Kleak, sorry for the late reply. I've tried your branch and it works for me too, thanks, but then I got this error:

I/flutter (31882): CameraException(error, Attempt to invoke virtual method 'int android.hardware.camera2.CameraCaptureSession.capture(android.hardware.camera2.CaptureRequest, android.hardware.camera2.CameraCaptureSession$CaptureCallback, android.os.Handler)' on a null object reference)

@chaochaox1990 would you please share a snippet of your fix? I really have no idea what's happening here.

@Kleak
Contributor

Kleak commented Apr 11, 2020

Hi all,
sorry for the delay.
I have merged a fix into master (#110); if you could try it and tell me if it's OK, that would be awesome.
Thanks.

@pratamatama

pratamatama commented Apr 11, 2020

[Screenshot: Screen Shot 2020-04-11 at 20 32 06]

Hi @Kleak, your fix does work, but sometimes it produces the error I've shown above.
It turns into a white screen when I put it in a try/catch block.

@Kleak
Contributor

Kleak commented Apr 11, 2020

It looks like you don't have the latest version of master in that screenshot.

@pratamatama

[Screenshot: Screen Shot 2020-04-11 at 21 57 00]

Hi @Kleak, my mistake, I'm sorry for that. I have updated to the latest version of master and still get the error.

@catalin-apostu

Hi @Kleak!
I have also just tried the fix. Getting a completely black picture.
Cheers!

@chaochaox1990

@catalin-apostu Does the issue happen on an iOS device? My Android is working fine, but when taking a picture on iOS I just get a completely black picture; I already tried an iPhone 6, 6s, and 7. @Kleak could you help us check? Many thanks.

@catalin-apostu

catalin-apostu commented Apr 14, 2020

@chaochaox1990 This happened on an Android 10 phone. I was able to fix it in the meantime. When debugging flutter_camera_ml_vision.dart I noticed that if I set a breakpoint on await _cameraController.takePicture(path); and then continue, the black screen and black photos no longer happen. So I added this line of code https://github.com/rushio-consulting/flutter_camera_ml_vision/blob/master/lib/flutter_camera_ml_vision.dart#L239 before taking the picture. Reinitializing _cameraController is also no longer necessary.
Stopping the stream, waiting a short delay, taking the picture, and then starting the stream again seems to work. It works on iOS 13 too (tested with an iPad).

Future<void> takePicture(String path) async {
  await _stop(false);
  //FIXME hacky technique to avoid having black screen on some android devices
  await Future.delayed(Duration(milliseconds: 200));
  await _cameraController.takePicture(path);
  _start();
}

@chaochaox1990

@catalin-apostu

Future<void> takePicture(String path) async {
  await _stop(false);
  //FIXME hacky technique to avoid having black screen on some android devices
  await Future.delayed(Duration(milliseconds: 200));
  await _cameraController.takePicture(path);
  _start();
}

Thanks, it works for me.

@jaumard
Collaborator

jaumard commented Apr 30, 2021

Closing, as this should be fixed since then :D Feel free to reopen.

@jaumard jaumard closed this as completed Apr 30, 2021
@maykhid

maykhid commented Jan 7, 2023

It's not directly related, but according to this comment you have to stop the stream before taking a picture. [...] There is no way of doing that through CameraMlVisionState, as it doesn't expose CameraController.stopImageStream() and takePicture doesn't stop the stream automatically. [...]

I could give you a hug man... You just saved me!
