
BanglaBERT is not loading due to problem in config.json file #1

Closed
MusfiqDehan opened this issue Dec 19, 2021 · 4 comments

@MusfiqDehan

Hi,
I tried to use your BanglaBERT model from Hugging Face with the code below. The error message suggests there is a problem with your config.json file. Could you please fix this issue?

!pip install git+https://github.com/csebuetnlp/normalizer
from transformers import AutoModelForPreTraining, AutoTokenizer
from normalizer import normalize
import torch

model = AutoModelForPreTraining.from_pretrained("csebuetnlp/banglabert")
tokenizer_bbert = AutoTokenizer.from_pretrained("csebuetnlp/banglabert")

text = "আমি বিদ্যালয়ে যাই ।"
text = normalize(text)

tokenizer_bbert.tokenize(text)

(screenshot of the error attached)

@abhik1505040
Collaborator

Hi, this seems to be a problem with your environment setup. Please provide the following information:

  1. Complete stack trace of the error.
  2. Your transformers version.
  3. Output of the following command in your current working directory: !ls csebuetnlp/banglabert.
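One quick way to gather item 2 is to query the installed package metadata. This is a minimal sketch using only the standard library (Python 3.8+); it is not part of the original thread, just a convenience:

```python
# Minimal sketch: look up an installed package's version via importlib.metadata
# (standard library in Python 3.8+). Returns None if the package is absent.
from importlib import metadata

def package_version(name):
    """Return the installed version string for `name`, or None if not installed."""
    try:
        return metadata.version(name)
    except metadata.PackageNotFoundError:
        return None

print(package_version("transformers"))  # e.g. "4.11.0", or None if not installed
```

In a Colab cell, `!pip show transformers` would report the same information.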

@MusfiqDehan
Author

Thanks for your response. I am using Google Colab, and my transformers version is 3.1.0.

(screenshot of the installed transformers version attached)

Complete error log


Collecting git+https://github.com/csebuetnlp/normalizer
  Cloning https://github.com/csebuetnlp/normalizer to /tmp/pip-req-build-72jjz983
  Running command git clone -q https://github.com/csebuetnlp/normalizer /tmp/pip-req-build-72jjz983
Requirement already satisfied: regex in /usr/local/lib/python3.7/dist-packages (from normalizer==0.0.1) (2019.12.20)
Collecting emoji==1.4.2
  Downloading emoji-1.4.2.tar.gz (184 kB)
     |████████████████████████████████| 184 kB 5.2 MB/s 
Collecting ftfy==6.0.3
  Downloading ftfy-6.0.3.tar.gz (64 kB)
     |████████████████████████████████| 64 kB 2.6 MB/s 
Requirement already satisfied: wcwidth in /usr/local/lib/python3.7/dist-packages (from ftfy==6.0.3->normalizer==0.0.1) (0.2.5)
Building wheels for collected packages: normalizer, emoji, ftfy
  Building wheel for normalizer (setup.py) ... done
  Created wheel for normalizer: filename=normalizer-0.0.1-py3-none-any.whl size=6860 sha256=c4dc3e4c0147faa8acd5f93d79cca936811edbbea7bbd06fadad4c0ec38ce8c5
  Stored in directory: /tmp/pip-ephem-wheel-cache-93ed0rzy/wheels/af/b1/ee/b9e2a2f2dd861976a357b6a6fa105aeedf2254016676f6cf8f
  Building wheel for emoji (setup.py) ... done
  Created wheel for emoji: filename=emoji-1.4.2-py3-none-any.whl size=186469 sha256=48a0f01a5c2f301755a4797cd2440f5166a5d92815b337c5e3d87ecc71d86758
  Stored in directory: /root/.cache/pip/wheels/e4/61/e7/2fc1ac8f306848fc66c6c013ab511f0a39ef4b1825b11363b2
  Building wheel for ftfy (setup.py) ... done
  Created wheel for ftfy: filename=ftfy-6.0.3-py3-none-any.whl size=41933 sha256=f632660cfd522d95271c3e19975736670269857e9a9b2c177c7efba1345b8fbd
  Stored in directory: /root/.cache/pip/wheels/19/f5/38/273eb3b5e76dfd850619312f693716ac4518b498f5ffb6f56d
Successfully built normalizer emoji ftfy
Installing collected packages: ftfy, emoji, normalizer
Successfully installed emoji-1.4.2 ftfy-6.0.3 normalizer-0.0.1
---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
/usr/local/lib/python3.7/dist-packages/transformers/configuration_utils.py in get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
    352             if resolved_config_file is None:
--> 353                 raise EnvironmentError
    354             config_dict = cls._dict_from_json_file(resolved_config_file)

OSError: 

During handling of the above exception, another exception occurred:

OSError                                   Traceback (most recent call last)
3 frames
/usr/local/lib/python3.7/dist-packages/transformers/configuration_utils.py in get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
    360                 f"- or '{pretrained_model_name_or_path}' is the correct path to a directory containing a {CONFIG_NAME} file\n\n"
    361             )
--> 362             raise EnvironmentError(msg)
    363 
    364         except json.JSONDecodeError:

OSError: Can't load config for 'csebuetnlp/banglabert'. Make sure that:

- 'csebuetnlp/banglabert' is a correct model identifier listed on 'https://huggingface.co/models'

- or 'csebuetnlp/banglabert' is the correct path to a directory containing a config.json file

@abhik1505040
Collaborator

Please upgrade your transformers installation to version 4.11.0 or above (e.g. `pip install -U transformers`). This should resolve the issue.
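To guard against this class of failure up front, a script can check the version before attempting to load the model. The sketch below (not from the thread; the 4.11.0 minimum is the one suggested above) does a simple numeric comparison on plain `major.minor.patch` version strings:

```python
# Hedged sketch: check that a version string meets the 4.11.0 minimum
# suggested above. Handles plain numeric versions like "3.1.0" or "4.11.0".

def meets_minimum(version, minimum=(4, 11, 0)):
    """Return True if `version` (a "x.y.z" string) is >= `minimum`."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts >= minimum

print(meets_minimum("3.1.0"))   # False: too old to resolve csebuetnlp/banglabert
print(meets_minimum("4.11.0"))  # True
```

For version strings with pre-release or local suffixes, `packaging.version.parse` is the more robust choice; the tuple comparison here is only a lightweight illustration.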

@MusfiqDehan
Author

Thank you so much. My issue was solved after upgrading the transformers library.
