takePicture() Method does not work #62
Comments
Hi, can you wrap the code in a try/catch and give back the error message?
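A minimal sketch of that suggestion, assuming the `CameraMlVision` widget is keyed by a `GlobalKey<CameraMlVisionState>` named `_scanKey` as in the examples later in this thread:

```dart
// Sketch: surface the underlying CameraException instead of failing silently.
// `_scanKey` is assumed to be a GlobalKey<CameraMlVisionState> attached to
// the CameraMlVision widget.
Future<void> _capture(String path) async {
  try {
    await _scanKey.currentState.takePicture(path);
  } on CameraException catch (e) {
    // e.code and e.description expose the native-side failure.
    print('takePicture failed: ${e.code} ${e.description}');
  }
}
```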
I also tried to do the same and found the following error.
I'm also having the same issue.
I also have this issue; this is my stack trace from Android:
I get this calling takePicture.
Let me know if there is anything else that might help.
I have this same issue. I found a similar issue; it's not related, but according to a comment there you have to stop the streaming before taking a picture. There is no way of doing it through the current API. I can do further testing if needed.
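The pattern described above, stopping the image stream before capturing, can be sketched against an older `camera`-plugin API (the era where `takePicture` still took a file path; controller and function names here are assumptions for illustration):

```dart
// Sketch: stop the preview's image stream before capture, otherwise the
// CaptureRequest may still reference a stream surface that is no longer
// configured, producing the errors reported in this thread.
Future<void> takePictureSafely(CameraController controller, String path) async {
  if (controller.value.isStreamingImages) {
    await controller.stopImageStream(); // release the stream surface first
  }
  await controller.takePicture(path);
}
```

The catch, as noted above, is that `flutter_camera_ml_vision` wraps the controller, so the stream cannot be stopped from outside the package without exposing it.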
Here's how I solved it: this is a clone of
usage:
take pic
This solution only works partly... sometimes it throws:
I/flutter (32399): CameraException(error, Attempt to invoke virtual method 'int android.hardware.camera2.CameraCaptureSession.capture(android.hardware.camera2.CaptureRequest, android.hardware.camera2.CameraCaptureSession$CaptureCallback, android.os.Handler)' on a null object reference)
Instead I used this code:
+1, same error.
OK, I solved this. I migrated this library and added some logic to the old stop function,
and take a picture like this:
I've got the same issue. Is it possible to expose the camera controller and release an update to the library?
Is the problem solved? I have the same problem.
@MohammedAkhil
Is it fixed already? I got this error as well. Please have a look at the log from my try/catch block.
And here is how I reproduce it:
import 'dart:async';
import 'package:firebase_ml_vision/firebase_ml_vision.dart';
import 'package:flutter/material.dart';
import 'package:flutter_camera_ml_vision/flutter_camera_ml_vision.dart';
import 'package:geocoder/geocoder.dart';
import 'package:geolocator/geolocator.dart';
import 'package:intl/intl.dart';
import 'package:path/path.dart';
import 'package:path_provider/path_provider.dart';
class CameraPage extends StatefulWidget {
CameraPage({Key key}) : super(key: key);
@override
_CameraPageState createState() => _CameraPageState();
}
class _CameraPageState extends State<CameraPage> {
final _scanKey = GlobalKey<CameraMlVisionState>();
String _timeString;
Timer _timer;
List<Face> _faces;
GeolocationStatus _geolocationStatus;
Position _position;
bool _isLocationServiceEnabled;
Address _address;
CameraLensDirection cameraLensDirection = CameraLensDirection.front;
FaceDetector detector =
FirebaseVision.instance.faceDetector(FaceDetectorOptions(
enableTracking: true,
mode: FaceDetectorMode.accurate,
enableContours: true,
enableLandmarks: true,
enableClassification: true,
));
String _formatDateTime(DateTime dateTime) =>
DateFormat('HH:mm:ss').format(dateTime);
@override
void initState() {
super.initState();
_timeString = _formatDateTime(DateTime.now());
_timer = Timer.periodic(Duration(seconds: 1), (Timer t) => _getTime());
_getGeolocationStatus();
_getCurrentLocation().then((position) {
_getCurrentLocationName(position.latitude, position.longitude);
setState(() => _position = position);
});
}
@override
void dispose() {
_timer.cancel();
super.dispose();
}
void _getGeolocationStatus() async {
GeolocationStatus geolocationStatus =
await Geolocator().checkGeolocationPermissionStatus();
bool isLocationServiceEnabled =
await Geolocator().isLocationServiceEnabled();
setState(() {
_geolocationStatus = geolocationStatus;
_isLocationServiceEnabled = isLocationServiceEnabled;
});
print(_geolocationStatus);
print(_isLocationServiceEnabled);
}
Future<Position> _getCurrentLocation() async {
return await Geolocator()
.getCurrentPosition(desiredAccuracy: LocationAccuracy.best);
}
void _getTime() {
final DateTime now = DateTime.now();
final String formattedDateTime = _formatDateTime(now);
setState(() {
_timeString = formattedDateTime;
});
}
void _getCurrentLocationName(double latitude, double longitude) async {
final coordinates = Coordinates(latitude, longitude);
final addresses =
await Geocoder.local.findAddressesFromCoordinates(coordinates);
setState(() {
_address = addresses.first;
});
}
void _onSubmit() async {
final String path =
join((await getTemporaryDirectory()).path, '${DateTime.now()}.png')
.replaceAll(' ', '-')
.replaceAll(':', '');
try {
await _scanKey.currentState.takePicture(path);
} catch (e) {
print(e);
}
print(path);
}
@override
Widget build(BuildContext context) {
return Scaffold(
body: Stack(
children: <Widget>[
SizedBox.expand(
child: CameraMlVision<List<Face>>(
key: _scanKey,
cameraLensDirection: cameraLensDirection,
detector: detector.processImage,
resolution: ResolutionPreset.high,
overlayBuilder: (c) {
return CustomPaint(
painter: FaceDetectorPainter(
_scanKey.currentState.cameraValue.previewSize.flipped,
_faces,
reflection:
cameraLensDirection == CameraLensDirection.front),
);
},
            onResult: (faces) {
              if (!mounted) {
                // Avoid calling setState on a disposed widget.
                return;
              }
              if (faces == null || faces.isEmpty) {
                setState(() {
                  _faces = faces;
                });
                return;
              }
              setState(() {
                _faces = []..addAll(faces);
              });
            },
onDispose: () {
detector.close();
},
),
),
Align(
alignment: Alignment.bottomLeft,
child: Container(
width: double.infinity,
height: 180.0,
padding: EdgeInsets.only(left: 20.0),
child: Align(
alignment: Alignment.topLeft,
child: Row(
mainAxisAlignment: MainAxisAlignment.spaceBetween,
crossAxisAlignment: CrossAxisAlignment.start,
children: <Widget>[
Column(
crossAxisAlignment: CrossAxisAlignment.start,
children: <Widget>[
Text(
_timeString,
style: TextStyle(
color: Colors.white,
fontSize: 50.0,
fontFamily: 'Roboto',
),
),
Row(
children: <Widget>[
_address != null
? Text(
_address.addressLine.length > 45
? _address.addressLine
.substring(0, 45) +
'...'
: _address.addressLine,
style: TextStyle(color: Colors.white),
)
: Container(),
],
),
],
),
],
),
),
),
),
Align(
alignment: Alignment.bottomCenter,
child: Container(
width: double.infinity,
height: 120.0,
// color: Colors.white,
padding: EdgeInsets.all(20.0),
child: _faces != null && _faces.length > 0
? _isLocationServiceEnabled != null &&
_isLocationServiceEnabled
? _position != null && _position.mocked != true
? _faces.length > 1
? _buildMultipleFacesWarning()
: _buildCaptureButton(context)
: _buildGPSMockedWarning()
: _buildGPSWarning()
: _buildNoFaceWarning(),
),
),
],
),
);
}
Widget _buildGPSMockedWarning() {
return Align(
alignment: Alignment.center,
child: Container(
width: double.infinity,
padding: EdgeInsets.symmetric(vertical: 20.0),
decoration: BoxDecoration(
color: Colors.red,
borderRadius: BorderRadius.circular(10.0),
),
child: Text(
'Fake location detected, unable to proceed.',
style: TextStyle(color: Colors.white),
textAlign: TextAlign.center,
),
),
);
}
Widget _buildGPSWarning() {
return Align(
alignment: Alignment.center,
child: Container(
width: double.infinity,
padding: EdgeInsets.symmetric(vertical: 20.0),
decoration: BoxDecoration(
color: Colors.red,
borderRadius: BorderRadius.circular(10.0),
),
child: Text(
'Please enable you GPS!',
style: TextStyle(color: Colors.white),
textAlign: TextAlign.center,
),
),
);
}
Widget _buildNoFaceWarning() {
return Align(
alignment: Alignment.center,
child: Container(
width: double.infinity,
padding: EdgeInsets.symmetric(vertical: 20.0),
decoration: BoxDecoration(
color: Colors.red,
borderRadius: BorderRadius.circular(10.0),
),
child: Text(
'Please make sure your face is detected!',
style: TextStyle(color: Colors.white),
textAlign: TextAlign.center,
),
),
);
}
Widget _buildMultipleFacesWarning() {
return Align(
alignment: Alignment.center,
child: Container(
width: double.infinity,
padding: EdgeInsets.symmetric(vertical: 20.0),
decoration: BoxDecoration(
color: Colors.red,
borderRadius: BorderRadius.circular(10.0),
),
child: Text(
'Please do not include your friends!',
style: TextStyle(color: Colors.white),
textAlign: TextAlign.center,
),
),
);
}
Widget _buildCaptureButton(BuildContext context) {
return Align(
alignment: Alignment.center,
child: SizedBox(
width: double.infinity,
height: 60.0,
child: RaisedButton(
elevation: 0.5,
color: Theme.of(context).primaryColor,
textColor: Colors.white,
child: Icon(Icons.camera_alt),
shape: RoundedRectangleBorder(
borderRadius: BorderRadius.circular(10.0),
),
onPressed: () {
_onSubmit();
},
),
),
);
}
}
class FaceDetectorPainter extends CustomPainter {
FaceDetectorPainter(this.imageSize, this.faces, {this.reflection = false});
final bool reflection;
final Size imageSize;
final List<Face> faces;
@override
void paint(Canvas canvas, Size size) {
final Paint paint = Paint()
..style = PaintingStyle.stroke
..strokeWidth = 2.0
..color =
faces != null && faces.length < 2 ? Colors.grey[200] : Colors.red;
if (faces != null) {
for (Face face in faces) {
final faceRect =
_reflectionRect(reflection, face.boundingBox, imageSize.width);
canvas.drawRect(
_scaleRect(
rect: faceRect,
imageSize: imageSize,
widgetSize: size,
),
paint,
);
}
}
}
@override
bool shouldRepaint(FaceDetectorPainter oldDelegate) {
return oldDelegate.imageSize != imageSize || oldDelegate.faces != faces;
}
}
Rect _reflectionRect(bool reflection, Rect boundingBox, double width) {
if (!reflection) {
return boundingBox;
}
final centerX = width / 2;
final left = ((boundingBox.left - centerX) * -1) + centerX;
final right = ((boundingBox.right - centerX) * -1) + centerX;
return Rect.fromLTRB(left, boundingBox.top, right, boundingBox.bottom);
}
Rect _scaleRect({
@required Rect rect,
@required Size imageSize,
@required Size widgetSize,
}) {
final scaleX = widgetSize.width / imageSize.width;
final scaleY = widgetSize.height / imageSize.height;
final scaledRect = Rect.fromLTRB(
rect.left.toDouble() * scaleX,
rect.top.toDouble() * scaleY,
rect.right.toDouble() * scaleX,
rect.bottom.toDouble() * scaleY,
);
return scaledRect;
}
Here is also my pubspec.yaml:
On 2.2.4 I still get the same error: CameraException(error, CaptureRequest contains unconfigured Input/Output Surface!)
@pratamatama and @chaochaox1990, could you try the branch
@Kleak Thanks for your reply. I tried the branch fix_take_picture as well, using the method Future takePicture(String path) async {
@Kleak It works for me when I use
Thanks for your branch @Kleak
Will update it soon 😉
Hi @Kleak, sorry for the late reply. I've tried your branch and it works for me too, thanks! But then I got this error:
@chaochaox1990 would you please share a minimal example of your fix? I really have no idea what happened here.
Hi all,
Hi @Kleak, your fix does work, but sometimes it produces the error I've shown above.
It seems you don't have the latest version of master in the screenshot.
Hi @Kleak, my mistake, I'm sorry for that. I have updated it to the latest version of master and still got the error.
Hi @Kleak!
@catalin-apostu Does the issue happen on an iOS device? My Android is working fine, but when taking a picture on iOS I just get a completely black picture. I already tried an iPhone 6, 6s, and 7. @Kleak, could you help us check? Many thanks.
@chaochaox1990 This happened on an Android 10 phone. I was able to fix it meanwhile. When debugging
@catalin-apostu
Closing, as this should be OK since then :D Feel free to reopen.
I could give you a hug, man... You just saved me!
Hi there, I want to add an image-capturing feature to the camera using the CameraMlVisionState,
using a GlobalKey:
onTap: () async {
final path = join(
(await getTemporaryDirectory()).path,
'${DateTime.now()}.png',
);
await _scanKey.currentState.takePicture(path);
Navigator.push(
context,
MaterialPageRoute(
builder: (context) => DisplayPicture(imagePath: path),
),
);
},
Now the problem is that it doesn't take a picture, and it doesn't even navigate to the DisplayPicture page.
But when I removed the takePicture line, it navigated normally to the DisplayPicture page, but of course it showed a white screen since no photo was captured. Is there a solution, or did I just write the code wrong? That code is actually similar to the Flutter cookbook page that gives an example of capturing an image with the camera: https://flutter.dev/docs/cookbook/plugins/picture-using-camera.
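One way to see why the `await` never completes is to apply the try/catch advice from earlier in this thread and only navigate on success. A sketch reusing the names from the snippet above (`_scanKey`, `DisplayPicture`):

```dart
onTap: () async {
  final path = join(
    (await getTemporaryDirectory()).path,
    '${DateTime.now()}.png',
  );
  try {
    await _scanKey.currentState.takePicture(path);
  } on CameraException catch (e) {
    // The error code/description point at the native cause
    // (e.g. an unconfigured capture surface, as reported above).
    print('takePicture failed: ${e.code} ${e.description}');
    return; // don't navigate to a blank picture
  }
  if (!mounted) return; // the widget may have been disposed while awaiting
  Navigator.push(
    context,
    MaterialPageRoute(
      builder: (context) => DisplayPicture(imagePath: path),
    ),
  );
},
```

If the catch block never fires and nothing prints, the future itself is hanging, which would match the stream-not-stopped behavior discussed earlier in this thread.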