
Unhandled Exception: PlatformException(Failed to run model, java.lang.NullPointerException: Attempt to invoke virtual method 'org.tensorflow.lite.Tensor org.tensorflow.lite.Interpreter.getInputTensor(int)' on a null object reference) #295

Open
khaiyue opened this issue Apr 28, 2024 · 0 comments

khaiyue commented Apr 28, 2024

This is the whole debug log. It happens after an image is chosen, whether it comes from the camera or the gallery.
```
D/VRI[MainActivity](31231): debugCancelDraw some OnPreDrawListener onPreDraw return false,cancelDraw=true,count=50,android.view.ViewRootImpl@2025556
D/FirebaseAuth(31231): Notifying id token listeners about user ( PCBsXViqy9U628twsSRcGPv1SGW2 ).
I/TensorFlowLite(31231): Loaded native library: tensorflowlite_jni
I/TensorFlowLite(31231): Didn't load native library: tensorflowlite_jni_gms_client
I/tflite (31231): Initialized TensorFlow Lite runtime.
W/libc (31231): Access denied finding property "ro.hardware.chipname"
I/tflite (31231): Created TensorFlow Lite XNNPACK delegate for CPU.
I/Quality (31231): Skipped: false 1 cost 29.663422 refreshRate 16683856 bit true processName com.example.fyp2.fyp2
I/flutter (31231): Failed to load model.
D/VRI[MainActivity](31231): registerCallbacksForSync syncBuffer=false
D/VRI[MainActivity](31231): Received frameCommittedCallback lastAttemptedDrawFrameNum=1 didProduceBuffer=true syncBuffer=false
W/Parcel (31231): Expecting binder but got null!
D/VRI[MainActivity](31231): debugCancelDraw cancelDraw=false,count = 220,android.view.ViewRootImpl@2025556
D/VRI[MainActivity](31231): draw finished.
I/Quality (31231): TaskTrackInfo {"Pkg":com.example.fyp2.fyp2,"Window":com.example.fyp2.fyp2.MainActivity,"Type":1,"Action":10,"Cost":4135,"FV":16,"JC":16-15-0-1-0-0-1-0-4-1-0-0-0-0-0-0-0-0-1-0-4124-3849-924470182-1538,"SF":no-need,"UITid":31231,"RTid":3532} TTI:Access fail
D/VRI[MainActivity](31231): onFocusEvent true
D/ProfileInstaller(31231): Installing profile for com.example.fyp2.fyp2
D/OplusInputMethodManagerInternal(31231): get inputMethodManager extension: com.android.internal.view.IInputMethodManager$Stub$Proxy@e971ea8
D/VRI[MainActivity](31231): onFocusEvent false
D/CompatibilityChangeReporter(31231): Compat change id reported: 78294732; UID 10446; state: ENABLED
D/VRI[MainActivity](31231): onFocusEvent true
I/flutter (31231): G O T !!!!!!! Image
E/flutter (31231): [ERROR:flutter/runtime/dart_vm_initializer.cc(41)] Unhandled Exception: PlatformException(Failed to run model, Attempt to invoke virtual method 'org.tensorflow.lite.Tensor org.tensorflow.lite.Interpreter.getInputTensor(int)' on a null object reference, java.lang.NullPointerException: Attempt to invoke virtual method 'org.tensorflow.lite.Tensor org.tensorflow.lite.Interpreter.getInputTensor(int)' on a null object reference
E/flutter (31231): 	at sq.flutter.flutter_tflite.TflitePlugin.feedInputTensor(TflitePlugin.java:368)
E/flutter (31231): 	at sq.flutter.flutter_tflite.TflitePlugin.feedInputTensorImage(TflitePlugin.java:430)
E/flutter (31231): 	at sq.flutter.flutter_tflite.TflitePlugin.detectObjectOnImage(TflitePlugin.java:624)
E/flutter (31231): 	at sq.flutter.flutter_tflite.TflitePlugin.onMethodCall(TflitePlugin.java:157)
E/flutter (31231): 	at io.flutter.plugin.common.MethodChannel$IncomingMethodCallHandler.onMessage(MethodChannel.java:267)
E/flutter (31231): 	at io.flutter.embedding.engine.dart.DartMessenger.invokeHandler(DartMessenger.java:292)
E/flutter (31231): 	at io.flutter.embedding.engine.dart.DartMessenger.lambda$dispatchMessageToQueue$0$io-flutter-embedding-engine-dart-DartMessenger(DartMessenger.java:319)
E/flutter (31231): 	at io.flutter.embedding.engine.dart.DartMessenger$$ExternalSyntheticLambda0.run(Unknown Source:12)
E/flutter (31231): 	at android.os.Handler.handleCallback(Handler.java:942)
E/flutter (31231): 	at android.os.Handler.dispatchMessage(Handler.java:99)
E/flutter (31231): 	at android.os.Looper.loopOnce(Looper.java:240)
E/flutter (31231): 	at android.os.Looper.loop(Looper.java:351)
E/flutter (31231): 	at android.app.ActivityThread.main(ActivityThread.java:8377)
E/flutter (31231): 	at java.lang.reflect.Method.invoke(Native Method)
E/flutter (31231): 	at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:584)
E/flutter (31231): 	at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1013)
E/flutter (31231): , null)
E/flutter (31231): #0      StandardMethodCodec.decodeEnvelope (package:flutter/src/services/message_codecs.dart:651:7)
E/flutter (31231): #1      MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:334:18)
E/flutter (31231): <asynchronous suspension>
E/flutter (31231): #2      Tflite.detectObjectOnImage (package:flutter_tflite/flutter_tflite.dart:113:12)
E/flutter (31231): <asynchronous suspension>
E/flutter (31231): #3      _CameraPageState.yolov2Tiny (package:fyp2/views/camera_page.dart:494:24)
E/flutter (31231): <asynchronous suspension>
E/flutter (31231): #4      _CameraPageState.predictImage (package:fyp2/views/camera_page.dart:421:5)
E/flutter (31231): <asynchronous suspension>
W/System  (31231): A resource failed to call close.
W/System  (31231): A resource failed to call close.
```

This is the complete code. I modified it from the example for my use case, but it is mostly the same.

import 'dart:async';
import 'dart:io';
import 'dart:typed_data';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:image/image.dart' as img;
import 'package:image_picker/image_picker.dart';
import 'package:flutter_tflite/flutter_tflite.dart';

class CameraPage extends StatefulWidget {
  const CameraPage({super.key});

  @override
  _CameraPageState createState() => _CameraPageState();
}

class _CameraPageState extends State<CameraPage> {
  final ImagePicker _picker = ImagePicker();
  XFile? _image;
  List<dynamic>? _recognitions;
  // Initialized to 0 instead of `late` so they can be read safely
  // before the image dimensions have been resolved.
  double _imageHeight = 0;
  double _imageWidth = 0;
  bool _busy = false;

  Future predictImagePicker(ImageSource source) async {
    var image = await _picker.pickImage(source: source);
    // Guard against using `context` after the widget was disposed
    // while the picker was open.
    if (!mounted) return;
    if (image == null) {
      ScaffoldMessenger.of(context).showSnackBar(
          const SnackBar(content: Text("Image selection cancelled")));
      return;
    }
    print('G O T !!!!!!! Image');
    setState(() {
      _busy = true;
    });
    await predictImage(image);
  }

  Future predictImage(XFile image) async {
    await yolov2Tiny(image);

    FileImage(File(image.path))
        .resolve(const ImageConfiguration())
        .addListener(ImageStreamListener((ImageInfo info, bool _) {
      // The listener can fire after the widget is gone.
      if (!mounted) return;
      setState(() {
        _imageHeight = info.image.height.toDouble();
        _imageWidth = info.image.width.toDouble();
      });
    }));

    if (!mounted) return;
    setState(() {
      _image = image;
      _busy = false;
    });
  }

  @override
  void initState() {
    super.initState();
    _busy = true;

    loadModel().then((val) {
      setState(() {
        _busy = false;
      });
    });
  }

  Future loadModel() async {
    await Tflite.close();
    try {
      String? res = await Tflite.loadModel(
        model: "assets/best-int8.tflite",
        labels: "assets/labels.txt",
        numThreads: 4,
        isAsset: true,
        useGpuDelegate: false,
      );
      print('Load model result: $res');
    } on PlatformException catch (e) {
      print('Failed to load model: $e');
    }
  }

  Uint8List imageToByteListFloat32(
      img.Image image, int inputSize, double mean, double std) {
    // NHWC float32 buffer: 1 x inputSize x inputSize x 3 (RGB).
    var convertedBytes = Float32List(1 * inputSize * inputSize * 3);
    int pixelIndex = 0;
    for (var i = 0; i < inputSize; i++) {
      for (var j = 0; j < inputSize; j++) {
        var pixel = image.getPixel(j, i);
        convertedBytes[pixelIndex++] = (img.getRed(pixel) - mean) / std;
        convertedBytes[pixelIndex++] = (img.getGreen(pixel) - mean) / std;
        convertedBytes[pixelIndex++] = (img.getBlue(pixel) - mean) / std;
      }
    }
    return convertedBytes.buffer.asUint8List();
  }

  Future yolov2Tiny(XFile image) async {
    int startTime = DateTime.now().millisecondsSinceEpoch;
    var recognitions = await Tflite.detectObjectOnImage(
      path: image.path,
      model: "YOLO",
      threshold: 0.3,
      imageMean: 0.0,
      imageStd: 255.0,
      numResultsPerClass: 3,
      asynch: true,
    );
    if (!mounted) return;
    setState(() {
      // detectObjectOnImage can return null; fall back to an empty list
      // instead of asserting non-null.
      _recognitions = recognitions ?? [];
    });
    int endTime = DateTime.now().millisecondsSinceEpoch;
    print("Inference took ${endTime - startTime}ms");
  }

  List<Widget> renderBoxes(Size screen) {
    // Skip until both recognitions and the image dimensions are available,
    // otherwise factorY would be a division by zero.
    if (_recognitions == null || _imageWidth == 0) return [];
    double factorX = screen.width;
    double factorY = _imageHeight / _imageWidth * screen.width;
    Color blue = const Color.fromRGBO(37, 213, 253, 1.0);
    return _recognitions!.map((re) {
      return Positioned(
        left: re["rect"]["x"] * factorX,
        top: re["rect"]["y"] * factorY,
        width: re["rect"]["w"] * factorX,
        height: re["rect"]["h"] * factorY,
        child: Container(
          decoration: BoxDecoration(
            borderRadius: const BorderRadius.all(Radius.circular(8.0)),
            border: Border.all(
              color: blue,
              width: 2,
            ),
          ),
          child: Text(
            "${re["detectedClass"]} ${(re["confidenceInClass"] * 100).toStringAsFixed(0)}%",
            style: TextStyle(
              background: Paint()..color = blue,
              color: Colors.white,
              fontSize: 12.0,
            ),
          ),
        ),
      );
    }).toList();
  }

  @override
  Widget build(BuildContext context) {
    Size size = MediaQuery.of(context).size;
    List<Widget> stackChildren = [];

    if(_image != null){
      stackChildren.add(Positioned(
        top: 0.0,
        left: 0.0,
        width: size.width,
        child: Image.file(File(_image!.path)),
      ));
    }

    if (_busy) {
      stackChildren.add(const Opacity(
        opacity: 0.3,
        child: ModalBarrier(dismissible: false, color: Colors.grey),
      ));
      stackChildren.add(const Center(child: CircularProgressIndicator()));
    }

    if (_recognitions != null && _recognitions!.isNotEmpty) {
      stackChildren.addAll(renderBoxes(size));
    }

    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(
          title: const Text('Scan Ingredients'),
        ),
        body: Center(
          child: Column(
            mainAxisAlignment: MainAxisAlignment.center,
            children: [
              ElevatedButton.icon(
                onPressed: (){predictImagePicker(ImageSource.camera);},
                icon: const Icon(Icons.camera, size: 24.0,),
                label: const Text('Capture image with camera'),
              ),
              const SizedBox(height: 16),
              ElevatedButton.icon(
                onPressed: (){predictImagePicker(ImageSource.gallery);},
                icon: const Icon(Icons.image, size: 24.0,),
                label: const Text('Choose from Gallery'),
              ),
            ],
          ),
        ),
      ),
    );
  }
}
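
Note that the log prints `I/flutter (31231): Failed to load model.` before the crash, so the native interpreter is never created, and the later `Tflite.detectObjectOnImage` call hits a null `Interpreter` inside the plugin (the `getInputTensor` NullPointerException above). A minimal sketch of a guard that skips inference when the load failed; the `_modelLoaded` flag is my addition, not part of the original code, and the `"success"` string should be verified against the return value of your `flutter_tflite` version:

```
  // Assumption: track whether loadModel actually succeeded.
  bool _modelLoaded = false;

  Future loadModel() async {
    await Tflite.close();
    try {
      String? res = await Tflite.loadModel(
        model: "assets/best-int8.tflite",
        labels: "assets/labels.txt",
        numThreads: 4,
        isAsset: true,
        useGpuDelegate: false,
      );
      // Only mark loaded when the plugin reports success; a quantized
      // int8 model or a bad asset path can make loadModel fail here.
      _modelLoaded = res == "success";
    } on PlatformException catch (e) {
      _modelLoaded = false;
      print('Failed to load model: $e');
    }
  }

  Future yolov2Tiny(XFile image) async {
    if (!_modelLoaded) {
      // Without this guard, detectObjectOnImage reaches the plugin with
      // a null interpreter and throws the NullPointerException seen above.
      print('Model not loaded; skipping inference.');
      return;
    }
    // ... existing Tflite.detectObjectOnImage call ...
  }
```

With this in place the crash is avoided, and the remaining question is why the load itself fails (asset path not declared in pubspec.yaml, or the int8-quantized model not being compatible with what the plugin expects are common causes to check).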