This repository has been archived by the owner on Jun 9, 2021. It is now read-only.

op type not registered NormalizeUTF8 initializing BERT #276

Open
jaismith opened this issue May 30, 2021 · 1 comment


jaismith commented May 30, 2021

Getting the following error when initializing a BERT model using this package:

FileNotFoundError: Op type not registered 'NormalizeUTF8' in binary running on <computer name>.local. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
 If trying to load on a different device from the computational device, consider using setting the `experimental_io_device` option on tf.saved_model.LoadOptions to the io_device such as '/job:localhost'.

Here's the code initializing the model:

# use ALBERT
tfhub_handle_encoder = 'https://tfhub.dev/tensorflow/albert_en_base/2'
tfhub_handle_preprocess = 'https://tfhub.dev/tensorflow/albert_en_preprocess/3'

bert_preprocess_model = hub.KerasLayer(tfhub_handle_preprocess)
bert_model = hub.KerasLayer(tfhub_handle_encoder)

I've tried installing via the quick install script and via the Conda instructions in #153; both throw the same error. I'd appreciate any insights.


Edit: modules are imported as follows:

import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text as text
import tensorflow_addons.optimizers as optimizers
@bksaini078

Hello @jaismith,
I tried the same thing and got the same error message.
Transformers from Hugging Face are not working either.
