
Tflite regression model gives different output while input is the same #35

Closed
MojtabaYEG opened this issue Aug 19, 2020 · 4 comments

@MojtabaYEG

I have a tflite regression model. I tested the output of my tflite model in Python, and it gives the same result as the .h5 model. However, the tflite_flutter interpreter gives me wrong results. I randomly checked 10 different pixels and the input values are identical. So the input is the same and the models are the same, but the tflite_flutter interpreter produces a different result than the TensorFlow tflite interpreter.

Here is my code:
var interpreterOptions = InterpreterOptions()..addDelegate(NnApiDelegate());
final interpreter = await Interpreter.fromAsset('modelPath', options: interpreterOptions);
var imageBytes = (await rootBundle.load('imagePath')).buffer;
imageLib.Image oriImage = imageLib.decodePng(imageBytes.asUint8List());
imageLib.Image copyImage = imageLib.copyCrop(oriImage, 60, 0, 60, 30);
var resizedImage = copyImage.getBytes(format: imageLib.Format.rgb);
// Normalize each RGB byte to a double in [0, 1].
var input = [];
for (int i = 0; i < resizedImage.length; i++) {
  input.add(resizedImage[i].toDouble() / 255);
}
input = input.reshape([1, 30, 60, 3]);
var output = List(1).reshape([1, 1]);
interpreter.run(input, output);
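For anyone cross-checking the preprocessing outside Flutter, the loop above can be mirrored in plain Python (a sketch with synthetic pixel data; the names `raw` and `input_tensor` are illustrative, not from the thread):

```python
# Mirror the Dart preprocessing: 60x30 RGB crop -> normalized [1,30,60,3].
height, width, channels = 30, 60, 3

# Synthetic raw RGB bytes (0-255), one value per channel per pixel.
raw = [(i * 7) % 256 for i in range(height * width * channels)]

# Normalize to [0, 1] floats, as `resizedImage[i].toDouble() / 255` does.
flat = [v / 255 for v in raw]
assert len(flat) == 5400  # 1 * 30 * 60 * 3

# Rebuild the nested [1, 30, 60, 3] structure that
# `input.reshape([1, 30, 60, 3])` produces in Dart.
it = iter(flat)
input_tensor = [[[[next(it) for _ in range(channels)]
                  for _ in range(width)]
                 for _ in range(height)]]

assert len(input_tensor) == 1
assert len(input_tensor[0]) == 30
assert len(input_tensor[0][0]) == 60
assert len(input_tensor[0][0][0]) == 3
```

Comparing a few pixels of this structure against what the Dart side feeds the interpreter is a quick way to rule out a shape or normalization mismatch.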

Have you tested the package for DL regression models? I'm using a Deep CNN regressor.

@am15h
Owner

am15h commented Aug 19, 2020

Strange, the interpreter generally works on all tflite models, (not tested for custom ops till now). Please share your flutter doctor -v as well.

These kinds of issues are usually the result of an input/output format mismatch. I recommend using https://pub.dev/packages/tflite_flutter_helper to eliminate such errors.

@MojtabaYEG
Author

Here is my flutter doctor -v:

[√] Flutter (Channel stable, v1.17.5, on Microsoft Windows, locale en-CA)
• Flutter version 1.17.5 at C:\Path
• Framework revision 8af6b2f038 (7 weeks ago), 2020-06-30 12:53:55 -0700
• Engine revision ee76268252
• Dart version 2.8.4

[√] Android toolchain - develop for Android devices (Android SDK version 29.0.3)
• Android SDK at C:\Path
• Platform android-29, build-tools 29.0.3
• Java binary at: C:\Path
• Java version OpenJDK Runtime Environment (build 1.8.0_212-release-1586-b04)
• All Android licenses accepted.

[√] Android Studio (version 3.6)
• Android Studio at C:\Path
• Flutter plugin version 47.1.1
• Dart plugin version 192.8052
• Java version OpenJDK Runtime Environment (build 1.8.0_212-release-1586-b04)

[√] Connected device (1 available)
• Android SDK built for x86 • emulator-5554 • android-x86 • Android 10 (API 29) (emulator)

• No issues found!

I also tried tflite_flutter_helper. Here is my code:

var interpreterOptions = InterpreterOptions()..addDelegate(NnApiDelegate());
final interpreter = await Interpreter.fromAsset('path', options: interpreterOptions);
var imageBytes = (await rootBundle.load('path')).buffer;
imageLib.Image oriImage = imageLib.decodePng(imageBytes.asUint8List());
imageLib.Image copyImage = imageLib.copyCrop(oriImage, 60, 0, 60, 30);
TensorImage tensorImage = TensorImage.fromImage(copyImage);
TensorBuffer probabilityBuffer = TensorBuffer.createFixedSize([1, 1], TfLiteType.float32);
interpreter.run(tensorImage.buffer, probabilityBuffer.buffer);

which doesn't work and gives me the following error:
[ERROR:flutter/lib/ui/ui_dart_state.cc(157)] Unhandled Exception: Bad state: failed precondition

The error refers to this line of code:
interpreter.run(tensorImage.buffer, probabilityBuffer.buffer);

The problem goes away when I use the ImageProcessorBuilder. Then my code is:

var interpreterOptions = InterpreterOptions()..addDelegate(NnApiDelegate());
final interpreter = await Interpreter.fromAsset('path', options: interpreterOptions);
var imageBytes = (await rootBundle.load('path')).buffer;
imageLib.Image oriImage = imageLib.decodePng(imageBytes.asUint8List());
imageLib.Image copyImage = imageLib.copyCrop(oriImage, 60, 0, 60, 30);
TensorImage tensorImage = TensorImage.fromImage(copyImage);
ImageProcessor imageProcessor = ImageProcessorBuilder().add(DequantizeOp(0, 1 / 255.0)).build();
tensorImage = imageProcessor.process(tensorImage);
TensorBuffer probabilityBuffer = TensorBuffer.createFixedSize([1, 1], TfLiteType.float32);
interpreter.run(tensorImage.buffer, probabilityBuffer.buffer);

But all the input and output values are incorrect. The input should contain 1 * 30 * 60 * 3 = 5400 values, while after using the imageProcessor its length is 21600 (which is 5400 * 4). Also the output is [35, 204, 170, 74], and all of them are incorrect; they are integers instead of floats. The output should be a single float (around 60.5).
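A plausible explanation for these numbers (my inference; the thread does not spell it out): TensorImage holds float32 data, so the buffer length is measured in bytes, and the four integers are the raw bytes of a single float32 rather than four separate results. The arithmetic can be checked in Python:

```python
import struct

# 5400 float32 values occupy 4 bytes each, which matches
# the observed buffer length of 21600.
num_elements = 1 * 30 * 60 * 3
assert num_elements == 5400
assert num_elements * 4 == 21600  # float32 is 4 bytes per value

# Reading the reported output [35, 204, 170, 74] as the little-endian
# bytes of one float32 yields a single (wrong-valued) float, consistent
# with the preprocessing being off rather than the output being 4 ints.
value = struct.unpack('<f', bytes([35, 204, 170, 74]))[0]
print(value)
```

This is why the final code below reads the result with `asFloat32List()` instead of interpreting the buffer bytes directly.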

@MojtabaYEG
Author

I also changed the model from a regressor to a classifier by using 118 output neurons instead of 1, and trained the new classifier CNN. The outputs of the classifier are also different from what I get in Python from the .h5 and tflite models.

@am15h Did you get a chance to run some test experiments? Is it possible to add an example that classifies MNIST with the package?

@MojtabaYEG
Author

Finally, working with tflite_flutter_helper, the problem is gone using the following code:

var interpreterOptions = InterpreterOptions()..addDelegate(NnApiDelegate());
final interpreter = await Interpreter.fromAsset('ModelPath', options: interpreterOptions);
var imageBytes = (await rootBundle.load('ImagePath')).buffer;
imageLib.Image oriImage = imageLib.decodePng(imageBytes.asUint8List());
imageLib.Image copyImage = imageLib.copyCrop(oriImage, 60, 0, 60, 30);
TensorImage tensorImage = TensorImage.fromImage(copyImage);
ImageProcessor imageProcessor = ImageProcessorBuilder().add(NormalizeOp(0, 255)).build();
tensorImage = imageProcessor.process(tensorImage);
TensorBuffer probabilityBuffer = TensorBuffer.createFixedSize(<int>[1, 1], TfLiteType.float32);
interpreter.run(tensorImage.buffer, probabilityBuffer.buffer);
print(probabilityBuffer.buffer.asFloat32List());
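For reference, NormalizeOp(mean, std) in the TFLite support/helper libraries applies (x - mean) / std per element, so NormalizeOp(0, 255) reproduces the pixel / 255 scaling from the raw-list version at the top of this thread. A minimal sketch of that math:

```python
# NormalizeOp(mean, std) computes (x - mean) / std for each element.
# With mean = 0 and std = 255 this maps byte values 0..255 into [0.0, 1.0],
# matching `resizedImage[i].toDouble() / 255` in the original Dart code.
def normalize(x, mean=0.0, std=255.0):
    return (x - mean) / std

assert normalize(0) == 0.0
assert normalize(255) == 1.0
assert abs(normalize(128) - 128 / 255) < 1e-12
```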

JBBx2016 pushed a commit to JBBx2016/tflite_flutter_plugin that referenced this issue Aug 3, 2023