Error in ctranslate #60

Closed
pr509 opened this issue Apr 2, 2024 · 1 comment

Comments

pr509 commented Apr 2, 2024

2024-04-02 10:58:39.967723: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:9261] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2024-04-02 10:58:39.967828: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:607] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2024-04-02 10:58:40.110310: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1515] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2024-04-02 10:58:43.291466: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
2024-04-02 10:58:47 | INFO | fairseq.tasks.text_to_speech | Please install tensorboardX: pip install tensorboardX
Traceback (most recent call last):
  File "/usr/local/bin/ct2-fairseq-converter", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/fairseq.py", line 341, in main
    converter.convert_from_args(args)
  File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/converter.py", line 50, in convert_from_args
    return self.convert(
  File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/converter.py", line 89, in convert
    model_spec = self._load()
  File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/fairseq.py", line 166, in _load
    spec = _get_model_spec(args)
  File "/usr/local/lib/python3.10/dist-packages/ctranslate2/converters/fairseq.py", line 32, in _get_model_spec
    model_name = fairseq.models.ARCH_MODEL_NAME_REGISTRY[args.arch]
KeyError: 'transformer_18_18'

I am getting this error while binarizing.

@PranjalChitale
Collaborator

usage: ct2-fairseq-converter [-h] --model_path MODEL_PATH --data_dir DATA_DIR [--user_dir USER_DIR]
                             [--fixed_dictionary FIXED_DICTIONARY] [--source_lang SOURCE_LANG] [--target_lang TARGET_LANG]
                             [--no_default_special_tokens] --output_dir OUTPUT_DIR [--vocab_mapping VOCAB_MAPPING]
                             [--quantization {int8,int8_float16,int16,float16}] [--force]

Since we are using a customized architecture, you need to pass the --user_dir flag to the converter and point it to the model_configs directory in your local clone, as in the example below.
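A minimal sketch of such an invocation, using only the flags from the usage text above (the checkpoint, data, clone, and output paths are placeholders for your own setup):

ct2-fairseq-converter --model_path checkpoints/checkpoint_best.pt \
                      --data_dir data_bin \
                      --user_dir /path/to/your/local/clone/model_configs \
                      --output_dir ct2_model \
                      --quantization int8

With --user_dir set, fairseq imports the modules in that directory at startup, which registers the custom transformer_18_18 architecture in ARCH_MODEL_NAME_REGISTRY, so the lookup that raised the KeyError above can succeed.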
