
Getting [java.lang.IllegalArgumentException: Internal error: Error applying delegate] error while applying GPU Delegate #65648

Open
walker-ai opened this issue Apr 14, 2024 · 7 comments

walker-ai commented Apr 14, 2024

Issue type

Bug

Have you reproduced the bug with TensorFlow Nightly?

Yes

Source

source

TensorFlow version

tf 2.12.0

Custom code

Yes

OS platform and distribution

Linux Ubuntu 20.04

Mobile device

Android 10.0 emulator

Python version

3.9

Bazel version

No response

GCC/compiler version

No response

CUDA/cuDNN version

CUDA 11.4 cuDNN 8

GPU model and memory

No response

Current behavior?

Sorry about the 'bug' issue type; I have no idea how to solve this, and I don't mean to mislabel it.

I am using the official TensorFlow example generative_ai from:
https://github.com/tensorflow/examples/tree/master/lite/examples/generative_ai/android.

I tried to add a GPU delegate to the interpreter; my dependency versions are:

tflite-gpu = { module = "org.tensorflow:tensorflow-lite-gpu", version = "2.12.0" }
tflite-gpu-api = { module = "org.tensorflow:tensorflow-lite-gpu-api", version = "2.12.0" }
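
For reference, these are Gradle version-catalog entries; a minimal sketch (assuming they live under [libraries] in gradle/libs.versions.toml) of how they would be referenced from the app module's build.gradle.kts:

// app/build.gradle.kts (hypothetical module file)
dependencies {
    implementation(libs.tflite.gpu)      // org.tensorflow:tensorflow-lite-gpu:2.12.0
    implementation(libs.tflite.gpu.api)  // org.tensorflow:tensorflow-lite-gpu-api:2.12.0
}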

The test case is:

override suspend fun initModel(): InitModelResult {
    return withContext(dispatcher) {
        // Load model file
        val loadResult = loadModelFile(context)

        /**
         * wyt: create interpreter options with the GPU delegate
         */
        val options = Interpreter.Options()
        val myDelegate = GpuDelegate()
        options.addDelegate(myDelegate)

        // Determine if load was successful
        if (loadResult.isFailure) {
            val exc = loadResult.exceptionOrNull()
            return@withContext if (exc is FileNotFoundException) {
                InitModelResult.Error(AutoCompleteServiceError.MODEL_FILE_NOT_FOUND)
            } else {
                InitModelResult.Error(AutoCompleteServiceError.MODEL_NOT_INITIALIZED)
            }
        }

        /**
         * wyt: pass these options to the interpreter below
         */

        // Instantiate interpreter with loaded model
        val model = loadResult.getOrNull()
        isInitialized = model?.let {
            interpreter = Interpreter(it, options)
            true
        } ?: false

        if (isInitialized) InitModelResult.Success
        else InitModelResult.Error(AutoCompleteServiceError.MODEL_NOT_INITIALIZED)
    }
}

I have only added these lines (the sections marked with wyt comments).

But with these lines the app no longer runs correctly: it crashes.

When I remove the options argument, i.e. interpreter = Interpreter(it), it works fine.

It stands to reason that the GPU delegate is somehow not supported.
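
For what it's worth, applying the delegate can be gated on device support before the interpreter is created; a minimal sketch using the CompatibilityList API from the same org.tensorflow:tensorflow-lite-gpu artifact (other names as in the test case above):

import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.CompatibilityList
import org.tensorflow.lite.gpu.GpuDelegate

// Attach the GPU delegate only when the runtime reports device support;
// otherwise keep the default CPU execution path.
val options = Interpreter.Options()
val compatList = CompatibilityList()
if (compatList.isDelegateSupportedOnThisDevice) {
    options.addDelegate(GpuDelegate(compatList.bestOptionsForThisDevice))
}

Even on a supported device, a model converted with SELECT_TF_OPS and custom ops (as this one is, per the conversion script below) may still fail to delegate, which could be what the error is reporting.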

Standalone code to reproduce the issue

The standalone code is as described above (under 'Current behavior').

Relevant log output

java.lang.IllegalArgumentException: Internal error: Error applying delegate: 
    at org.tensorflow.lite.NativeInterpreterWrapper.createInterpreter(Native Method)
    at org.tensorflow.lite.NativeInterpreterWrapper.init(NativeInterpreterWrapper.java:110)
    at org.tensorflow.lite.NativeInterpreterWrapper.<init>(NativeInterpreterWrapper.java:73)
    at org.tensorflow.lite.NativeInterpreterWrapperExperimental.<init>(NativeInterpreterWrapperExperimental.java:36)
    at org.tensorflow.lite.Interpreter.<init>(Interpreter.java:214)
    at com.google.tensorflowdemo.data.autocomplete.AutoCompleteServiceImpl$initModel$2.invokeSuspend(AutoCompleteService.kt:167)
    at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
    at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
    at kotlinx.coroutines.internal.LimitedDispatcher.run(LimitedDispatcher.kt:42)
    at kotlinx.coroutines.scheduling.TaskImpl.run(Tasks.kt:95)
    at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)

vytanase commented Apr 14, 2024

override suspend fun initModel(): InitModelResult {
    return withContext(dispatcher) {
        val loadResult = loadModelFile(context)

        if (loadResult.isFailure) {
            val exc = loadResult.exceptionOrNull()
            return@withContext if (exc is FileNotFoundException) {
                InitModelResult.Error(AutoCompleteServiceError.MODEL_FILE_NOT_FOUND)
            } else {
                InitModelResult.Error(AutoCompleteServiceError.MODEL_NOT_INITIALIZED)
            }
        }

        val options = Interpreter.Options()
        val myDelegate = GpuDelegate()
        options.addDelegate(myDelegate)

        try {
            val model = loadResult.getOrNull()
            isInitialized = model?.let {
                interpreter = Interpreter(it, options)
                true
            } ?: false

            return@withContext if (isInitialized) {
                InitModelResult.Success
            } else {
                InitModelResult.Error(AutoCompleteServiceError.MODEL_NOT_INITIALIZED)
            }
        } catch (e: Exception) {
            e.printStackTrace()
            return@withContext InitModelResult.Error(AutoCompleteServiceError.MODEL_INITIALIZATION_FAILED)
        }
    }
}
// Try this code snippet 
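
One way to extend the snippet is to fall back to a CPU-only interpreter when applying the delegate throws, instead of only reporting the error; a hypothetical sketch reusing the names above:

// Inside model?.let { ... }: try the GPU-delegate options first,
// then retry without a delegate if applying it fails.
interpreter = try {
    Interpreter(it, options)
} catch (e: IllegalArgumentException) {
    e.printStackTrace()
    Interpreter(it)  // plain CPU interpreter, no delegate
}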

walker-ai (Author) commented

Thank you for your reply. I have tried the code above, and it does catch the exception. However, the log level changes from 'Error' to 'Warning' (e.printStackTrace() writes the trace to System.err, which Logcat reports at the W level):

com.google.tensorflowdemo.debug      W  java.lang.IllegalArgumentException: Internal error: Error applying delegate: 
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at org.tensorflow.lite.NativeInterpreterWrapper.createInterpreter(Native Method)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at org.tensorflow.lite.NativeInterpreterWrapper.init(NativeInterpreterWrapper.java:110)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at org.tensorflow.lite.NativeInterpreterWrapper.<init>(NativeInterpreterWrapper.java:73)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at org.tensorflow.lite.NativeInterpreterWrapperExperimental.<init>(NativeInterpreterWrapperExperimental.java:36)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at org.tensorflow.lite.Interpreter.<init>(Interpreter.java:214)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at com.google.tensorflowdemo.data.autocomplete.AutoCompleteServiceImpl$initModel$2.invokeSuspend(AutoCompleteService.kt:156)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at kotlinx.coroutines.internal.LimitedDispatcher.run(LimitedDispatcher.kt:42)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at kotlinx.coroutines.scheduling.TaskImpl.run(Tasks.kt:95)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
2024-04-14 18:08:11.198 29026-29253 System.err              com.google.tensorflowdemo.debug      W  	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)

The problem seems to still exist.

sawantkumar commented

Hi @walker-ai,

I have been trying to replicate your issue, but I don't have the autocomplete.tflite file. Can you provide the autocomplete.tflite file or the script you used to create it?

sawantkumar added the Android and comp:lite-examples (TensorFlow Lite Examples) labels on May 7, 2024
walker-ai (Author) commented

Thank you for your reply. autocomplete.tflite was converted from quant_generate_tflite; you can check the example:

gpt2_lm.jit_compile = False
converter = tf.lite.TFLiteConverter.from_concrete_functions(
  [concrete_func],
  gpt2_lm)

converter.target_spec.supported_ops = [
  tf.lite.OpsSet.TFLITE_BUILTINS, # enable TFLite ops
  tf.lite.OpsSet.SELECT_TF_OPS, # enable TF ops
]
converter.allow_custom_ops = True
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.experimental_select_user_tf_ops = [
  "UnsortedSegmentJoin",
  "UpperBound"
]
converter._experimental_guarantee_all_funcs_one_use = True
quant_generate_tflite = converter.convert()
run_inference("I'm enjoying a", quant_generate_tflite)

with open('quantized_gpt2.tflite', 'wb') as f:
  f.write(quant_generate_tflite)  # autocomplete.tflite
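
As a side note, GPU-delegate compatibility can also be checked at conversion time with TFLite's model analyzer (available as tf.lite.experimental.Analyzer in TF 2.9 and later); a minimal sketch, assuming the variables from the conversion script above:

import tensorflow as tf

# Print a per-op report; ops outside the GPU delegate's supported set
# (e.g. SELECT_TF_OPS / custom ops) are flagged as not GPU-compatible.
tf.lite.experimental.Analyzer.analyze(
    model_content=quant_generate_tflite,
    gpu_compatibility=True,
)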

sawantkumar commented

Hi @walker-ai,

I followed the same example while creating the autocomplete.tflite file; for some reason the "generate" function under the @tf.function decorator gave me a NumPy-related error. I am trying to debug that, but if you have a dummy autocomplete.tflite file you could provide, it would save me some time.

walker-ai (Author) commented

Sorry, I hadn't provided enough details. You can open the Colab notebook and run through it to get autocomplete.tflite. I have also prepared the file for you.

sawantkumar commented

Hi @walker-ai,

Thank you for the model file. I replicated your issue on my emulator with Android 10, and it also crashed when I tried running the model on the GPU. I will try out some other TensorFlow examples that make use of GPU acceleration and get back to you.
