
[firebase_ml_vision] batchAnnotateImages call failed with exception: java.net.SocketTimeoutException: timeout on Cloud Text Recognizer #2724

Closed
kartikpateldev opened this issue Jun 8, 2020 · 5 comments
Labels
plugin: ml_vision type: bug Something isn't working

Comments

kartikpateldev commented Jun 8, 2020

Getting an **"Unhandled Exception: PlatformException(textRecognizerError, Cloud Vision batchAnnotateImages call failure, null)"** exception while processing an image with cloudTextRecognizer().

**StackTrace:**
W/com.foloos( 9647): Accessing hidden method Lsun/misc/Unsafe;->getObject(Ljava/lang/Object;J)Ljava/lang/Object; (greylist, linking, allowed)
E/ImageAnnotatorTask( 9647): **batchAnnotateImages call failed with exception:** 
E/ImageAnnotatorTask( 9647): **java.net.SocketTimeoutException: timeout**
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.okio.Okio$3.newTimeoutException(Okio.java:214)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.okio.AsyncTimeout.exit(AsyncTimeout.java:263)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.okio.AsyncTimeout$2.read(AsyncTimeout.java:217)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.okio.RealBufferedSource.indexOf(RealBufferedSource.java:307)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.okio.RealBufferedSource.indexOf(RealBufferedSource.java:301)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.okio.RealBufferedSource.readUtf8LineStrict(RealBufferedSource.java:197)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.internal.http.Http1xStream.readResponse(Http1xStream.java:188)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.internal.http.Http1xStream.readResponseHeaders(Http1xStream.java:129)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.internal.http.HttpEngine.readNetworkResponse(HttpEngine.java:750)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.internal.http.HttpEngine.readResponse(HttpEngine.java:622)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.internal.huc.HttpURLConnectionImpl.execute(HttpURLConnectionImpl.java:475)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.internal.huc.HttpURLConnectionImpl.getResponse(HttpURLConnectionImpl.java:411)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.internal.huc.HttpURLConnectionImpl.getResponseCode(HttpURLConnectionImpl.java:542)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.internal.huc.DelegatingHttpsURLConnection.getResponseCode(DelegatingHttpsURLConnection.java:106)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.internal.huc.HttpsURLConnectionImpl.getResponseCode(HttpsURLConnectionImpl.java:30)
E/ImageAnnotatorTask( 9647): 	at com.google.android.gms.internal.firebase_ml.zzhn.<init>(com.google.firebase:firebase-ml-vision@@24.0.1:5)
E/ImageAnnotatorTask( 9647): 	at com.google.android.gms.internal.firebase_ml.zzhk.zzgh(com.google.firebase:firebase-ml-vision@@24.0.1:43)
E/ImageAnnotatorTask( 9647): 	at com.google.android.gms.internal.firebase_ml.zzgu.zzgb(com.google.firebase:firebase-ml-common@@22.0.1:131)
E/ImageAnnotatorTask( 9647): 	at com.google.android.gms.internal.firebase_ml.zzgb.zzfj(com.google.firebase:firebase-ml-vision@@24.0.1:51)
E/ImageAnnotatorTask( 9647): 	at com.google.android.gms.internal.firebase_ml.zzrc.zza(com.google.firebase:firebase-ml-vision@@24.0.1:68)
E/ImageAnnotatorTask( 9647): 	at com.google.android.gms.internal.firebase_ml.zzqz.zza(com.google.firebase:firebase-ml-vision@@24.0.1:23)
E/ImageAnnotatorTask( 9647): 	at com.google.android.gms.internal.firebase_ml.zzpj.zza(com.google.firebase:firebase-ml-common@@22.0.1:31)
E/ImageAnnotatorTask( 9647): 	at com.google.android.gms.internal.firebase_ml.zzpl.call(Unknown Source:8)
E/ImageAnnotatorTask( 9647): 	at com.google.android.gms.internal.firebase_ml.zzpf.zza(com.google.firebase:firebase-ml-common@@22.0.1:32)
E/ImageAnnotatorTask( 9647): 	at com.google.android.gms.internal.firebase_ml.zzpe.run(Unknown Source:4)
E/ImageAnnotatorTask( 9647): 	at android.os.Handler.handleCallback(Handler.java:883)
E/ImageAnnotatorTask( 9647): 	at android.os.Handler.dispatchMessage(Handler.java:100)
E/ImageAnnotatorTask( 9647): 	at com.google.android.gms.internal.firebase_ml.zze.dispatchMessage(com.google.firebase:firebase-ml-common@@22.0.1:6)
E/ImageAnnotatorTask( 9647): 	at android.os.Looper.loop(Looper.java:228)
E/ImageAnnotatorTask( 9647): 	at android.os.HandlerThread.run(HandlerThread.java:67)
E/ImageAnnotatorTask( 9647): Caused by: java.net.SocketException: socket is closed
E/ImageAnnotatorTask( 9647): 	at com.android.org.conscrypt.ConscryptFileDescriptorSocket$SSLInputStream.read(ConscryptFileDescriptorSocket.java:554)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.okio.Okio$2.read(Okio.java:138)
E/ImageAnnotatorTask( 9647): 	at com.android.okhttp.okio.AsyncTimeout$2.read(AsyncTimeout.java:213)
E/ImageAnnotatorTask( 9647): 	... 27 more
E/flutter ( 9647): [ERROR:flutter/lib/ui/ui_dart_state.cc(157)] **Unhandled Exception: PlatformException(textRecognizerError, Cloud Vision batchAnnotateImages call failure, null)**
E/flutter ( 9647): #0      StandardMethodCodec.decodeEnvelope (package:flutter/src/services/message_codecs.dart:569:7)
E/flutter ( 9647): #1      MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:156:18)
E/flutter ( 9647): <asynchronous suspension>
E/flutter ( 9647): #2      MethodChannel.invokeMethod (package:flutter/src/services/platform_channel.dart:329:12)
E/flutter ( 9647): #3      MethodChannel.invokeMapMethod (package:flutter/src/services/platform_channel.dart:356:48)
E/flutter ( 9647): #4      TextRecognizer.processImage (package:firebase_ml_vision/src/text_recognizer.dart:40:38)
E/flutter ( 9647): #5      DocumentScanningState._getMRZ (package:Foloos/View/DocumentScanning.dart:454:56)
E/flutter ( 9647): #6      DocumentScanningState.recognizeTextAndFaces (package:Foloos/View/DocumentScanning.dart:502:46)
E/flutter ( 9647): <asynchronous suspension>
E/flutter ( 9647): #7      DocumentScanningState.onClicked (package:Foloos/View/DocumentScanning.dart:396:43)
E/flutter ( 9647): <asynchronous suspension>
E/flutter ( 9647): #8      DocumentScanningState.customButton.<anonymous closure> (package:Foloos/View/DocumentScanning.dart:358:22)
E/flutter ( 9647): #9      GestureRecognizer.invokeCallback (package:flutter/src/gestures/recognizer.dart:182:24)
E/flutter ( 9647): #10     TapGestureRecognizer.handleTapUp (package:flutter/src/gestures/tap.dart:504:11)
E/flutter ( 9647): #11     BaseTapGestureRecognizer._checkUp (package:flutter/src/gestures/tap.dart:282:5)
E/flutter ( 9647): #12     BaseTapGestureRecognizer.handlePrimaryPointer (package:flutter/src/gestures/tap.dart:217:7)
E/flutter ( 9647): #13     PrimaryPointerGestureRecognizer.handleEvent (package:flutter/src/gestures/recognizer.dart:475:9)
E/flutter ( 9647): #14     PointerRouter._dispatch (package:flutter/src/gestures/pointer_router.dart:76:12)
E/flutter ( 9647): #15     PointerRouter._dispatchEventToRoutes.<anonymous closure> (package:flutter/src/gestures/pointer_router.dart:122:9)
E/flutter ( 9647): #16     _LinkedHashMapMixin.forEach (dart:collection-patch/compact_hash.dart:379:8)
E/flutter ( 9647): #17     PointerRouter._dispatchEventToRoutes (package:flutter/src/gestures/pointer_router.dart:120:18)
E/flutter ( 9647): #18     PointerRouter.route (package:flutter/src/gestures/pointer_router.dart:106:7)
E/flutter ( 9647): #19     GestureBinding.handleEvent (package:flutter/src/gestures/binding.dart:218:19)
E/flutter ( 9647): #20     GestureBinding.dispatchEvent (package:flutter/src/gestures/binding.dart:198:22)
E/flutter ( 9647): #21     GestureBinding._handlePointerEvent (package:flutter/src/gestures/binding.dart:156:7)
E/flutter ( 9647): #22     GestureBinding._flushPointerEventQueue (package:flutter/src/gestures/binding.dart:102:7)
E/flutter ( 9647): #23     GestureBinding._handlePointerDataPacket (package:flutter/src/gestures/binding.dart:86:7)
E/flutter ( 9647): #24     _rootRunUnary (dart:async/zone.dart:1196:13)
E/flutter ( 9647): #25     _CustomZone.runUnary (dart:async/zone.dart:1085:19)
E/flutter ( 9647): #26     _CustomZone.runUnaryGuarded (dart:async/zone.dart:987:7)
E/flutter ( 9647): #27     _invoke1 (dart:ui/hooks.dart:275:10)
E/flutter ( 9647): #28     _dispatchPointerDataPacket (dart:ui/hooks.dart:184:5)

**My code:**
// imagePath is the path of a 1280x720 image file captured with CameraController.takePicture(imagePath)
FirebaseVisionImage visionImage = FirebaseVisionImage.fromFilePath(imagePath);
TextRecognizer textRecognizer = FirebaseVision.instance.cloudTextRecognizer();
VisionText visionText = await textRecognizer.processImage(visionImage);
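
For what it's worth, the Android-side java.net.SocketTimeoutException surfaces in Dart as the PlatformException shown above, so it can at least be caught and retried instead of crashing as an unhandled exception. The following is only a rough sketch, not plugin-provided behaviour; the helper name, retry count, and delay are arbitrary:

import 'package:firebase_ml_vision/firebase_ml_vision.dart';
import 'package:flutter/services.dart' show PlatformException;

// Hypothetical helper: retry the cloud call a few times before giving up.
Future<VisionText> recognizeWithRetry(String imagePath, {int attempts = 3}) async {
  final FirebaseVisionImage visionImage =
      FirebaseVisionImage.fromFilePath(imagePath);
  final TextRecognizer textRecognizer =
      FirebaseVision.instance.cloudTextRecognizer();
  for (var i = 0; i < attempts; i++) {
    try {
      return await textRecognizer.processImage(visionImage);
    } on PlatformException catch (e) {
      // 'textRecognizerError' is the error code seen in the log above.
      print('processImage attempt ${i + 1} failed: $e');
      if (i == attempts - 1) rethrow;
      await Future.delayed(const Duration(seconds: 2));
    }
  }
  throw StateError('unreachable');
}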

Flutter doctor

Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel stable, v1.17.1, on Linux, locale en_US.UTF-8)
 
[!] Android toolchain - develop for Android devices (Android SDK version 29.0.2)
    ✗ Android license status unknown.
      Try re-installing or updating your Android SDK Manager.
      See https://developer.android.com/studio/#downloads or visit https://flutter.dev/docs/get-started/install/linux#android-setup for
      detailed instructions.
[✓] Android Studio (version 3.5)
[✓] VS Code (version 1.38.1)
[!] Connected device
    ! No devices available

! Doctor found issues in 2 categories.

Thanks in advance.

@TahaTesser

Hi @kp67201
Can you please provide a minimal, complete, reproducible code sample?
Thank you

@TahaTesser TahaTesser added the blocked: customer-response Waiting for customer response, e.g. more information was requested. label Jun 8, 2020
@kartikpateldev
Author

Below is my source code:

import 'package:firebase_ml_vision/firebase_ml_vision.dart';
import 'package:flutter/material.dart';
import 'package:camera/camera.dart';
import 'package:path/path.dart' as path;
import 'package:path_provider/path_provider.dart';

class DocumentScanning extends StatefulWidget {

  @override
  DocumentScanningState createState() => DocumentScanningState();
}

class DocumentScanningState extends State<DocumentScanning> {

  CameraController _controller;
  Future<void> _initializeControllerFuture;

  @override
  void initState(){
    super.initState();
    WidgetsBinding.instance
        .addPostFrameCallback((_) => _startCamera());
  }

  void _startCamera() async {
    List<CameraDescription> cameras;
    cameras = await availableCameras();
    _controller = CameraController(cameras.first, ResolutionPreset.high, enableAudio: false);
    _initializeControllerFuture = _controller.initialize();
  }


  @override
  void dispose() {
    _controller?.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {

    return SafeArea(
      child: Scaffold(
        body: LayoutBuilder(
          builder: (context, constraints) => Container(
            child: Column(
              children: <Widget>[
                Center(child: cameraPreview(context)),
                RaisedButton(
                  onPressed: () async {
                    String imagePath = await captureImage(context);
                    FirebaseVisionImage visionImage = FirebaseVisionImage.fromFilePath(imagePath);
                    TextRecognizer textRecognizer = FirebaseVision.instance.cloudTextRecognizer();
                    VisionText visionText = await textRecognizer.processImage(visionImage);
                    String scannedText =  visionText.text;
                  },
                ),
              ],
            ),
          ),
        ),
      ),
    );
  }  // This widget is the root of your application.

  Widget cameraPreview(context){
    var screenWidth = MediaQuery.of(context).size.width * 0.9;
    return FutureBuilder<void>(
      future: _initializeControllerFuture,
      builder: (context, snapshot) {
        if (snapshot.connectionState == ConnectionState.done) {
          return Container(
//            color: AppColors.goldTextColor,
            width: screenWidth,
            height: screenWidth,
            child: ClipRect(
              child: OverflowBox(
                alignment: Alignment.center,
                child: FittedBox(
                  fit: BoxFit.fitWidth,
                  child: Container(
                    width: screenWidth,
                    height: screenWidth /_controller.value.aspectRatio,
                    child: CameraPreview(_controller),
                  ),
                ),
              ),
            ),
          );
        } else {
          // Otherwise, display a loading indicator.
          return Center(child: CircularProgressIndicator());
        }
      },
    ); // this is my CameraPreview
  }

  Future<String> captureImage(BuildContext context) async{
    String imagePath = "";
    try{
      // Ensure that the camera is initialized.
      await _initializeControllerFuture;
      // Construct the path where the image should be saved using the path
      imagePath = path.join((await getTemporaryDirectory()).path, '${DateTime.now()}.png',);
      await _controller.takePicture(imagePath);
    } catch (e) {
      print(e);
    }
    return imagePath;
  }
}

Thank you @TahaTesser for responding.

@TahaTesser TahaTesser added plugin: ml_vision type: bug Something isn't working and removed blocked: customer-response Waiting for customer response, e.g. more information was requested. labels Jun 9, 2020
@TahaTesser

flutter doctor -v
[✓] Flutter (Channel dev, 1.19.0-4.0.pre, on Mac OS X 10.15.5 19F101, locale en-GB)
    • Flutter version 1.19.0-4.0.pre at /Users/tahatesser/Code/flutter_dev
    • Framework revision 2f7a59a8da (4 days ago), 2020-06-05 03:44:02 -0700
    • Engine revision d17c84e7af
    • Dart version 2.9.0 (build 2.9.0-13.0.dev 02915ec5ce)

 
[✓] Android toolchain - develop for Android devices (Android SDK version 29.0.3)
    • Android SDK at /Users/tahatesser/Code/sdk
    • Platform android-29, build-tools 29.0.3
    • ANDROID_HOME = /Users/tahatesser/Code/sdk
    • Java binary at: /Applications/Android Studio.app/Contents/jre/jdk/Contents/Home/bin/java
    • Java version OpenJDK Runtime Environment (build 1.8.0_242-release-1644-b3-6222593)
    • All Android licenses accepted.

[✓] Xcode - develop for iOS and macOS (Xcode 11.5)
    • Xcode at /Applications/Xcode.app/Contents/Developer
    • Xcode 11.5, Build version 11E608c
    • CocoaPods version 1.9.3

[✓] Chrome - develop for the web
    • Chrome at /Applications/Google Chrome.app/Contents/MacOS/Google Chrome

[✓] Android Studio (version 4.0)
    • Android Studio at /Applications/Android Studio.app/Contents
    • Flutter plugin version 46.0.2
    • Dart plugin version 193.7361
    • Java version OpenJDK Runtime Environment (build 1.8.0_242-release-1644-b3-6222593)

[✓] VS Code (version 1.45.1)
    • VS Code at /Applications/Visual Studio Code.app/Contents
    • Flutter extension version 3.11.0

[✓] Connected device (4 available)
    • SM M305F   • 32003c30dc19668f • android-arm64  • Android 10 (API 29)
    • macOS      • macOS            • darwin-x64     • Mac OS X 10.15.5 19F101
    • Web Server • web-server       • web-javascript • Flutter Tools
    • Chrome     • chrome           • web-javascript • Google Chrome 83.0.4103.97

• No issues found!

@DziakStudio

Same issue here. TextRecognizer also fails with a similar error, but it says something like "Waiting to be downloaded...".

@Salakar
Member

Salakar commented May 10, 2021

Hey 👋, the firebase_ml_vision package is now discontinued, since its APIs have been deprecated and removed from the Android & iOS Firebase SDKs.

I'd recommend switching to the alternatives now:

  • for on-device vision APIs, switch to Google's standalone ML Kit library via google_ml_kit
  • for cloud vision APIs, the recommended approach now is to use Firebase Authentication and Functions, which gives you a managed, serverless gateway to the Google Cloud Vision APIs. For an example Functions project, see the vision-annotate-images sample project. You would deploy that project and then call the Cloud Function from your app with your image data to get the annotation result (a hedged sketch of such a call is shown below).
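
A minimal sketch of calling such a deployed function from a Flutter app, assuming the callable keeps the sample project's annotateImage name, a recent cloud_functions plugin, and a signed-in user; the payload follows the Cloud Vision request shape the sample expects, so treat the names and shapes here as assumptions rather than a definitive implementation:

import 'dart:convert';
import 'dart:io';

import 'package:cloud_functions/cloud_functions.dart';

// Hedged sketch: send an image to a deployed vision-annotate-images style
// callable function and return the raw annotation response.
Future<dynamic> annotateImage(String imagePath) async {
  final bytes = await File(imagePath).readAsBytes();
  // The sample function expects a JSON-encoded Cloud Vision request.
  final request = jsonEncode({
    'image': {'content': base64Encode(bytes)},
    'features': [
      {'type': 'TEXT_DETECTION'}
    ],
  });
  final callable = FirebaseFunctions.instance.httpsCallable('annotateImage');
  final result = await callable.call(request);
  // result.data holds whatever the deployed function returns; its exact shape
  // depends on that function, so it is left untyped here.
  return result.data;
}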

Apologies for any inconvenience here, and best of luck switching over to the new APIs. firebase_ml_custom is not affected by this deprecation.

Thanks

@Salakar Salakar closed this as completed May 10, 2021
@firebase firebase locked and limited conversation to collaborators Jun 10, 2021