
Inference fails to run, what could be the cause? (推理跑不起来,这是什么原因呢?) #31

Open
chaorenai opened this issue Jan 4, 2024 · 5 comments

Comments

@chaorenai
(anytext) C:\Users\sunny\Documents\AnyText>python inference.py
2024-01-04 18:24:07,722 - modelscope - INFO - PyTorch version 2.1.2+cu121 Found.
2024-01-04 18:24:07,724 - modelscope - INFO - TensorFlow version 2.13.0 Found.
2024-01-04 18:24:07,724 - modelscope - INFO - Loading ast index from C:\Users\sunny\.cache\modelscope\ast_indexer
2024-01-04 18:24:07,772 - modelscope - INFO - Loading done! Current index file version is 1.10.0, with md5 25145d097e3652b81ca7902ed6ed4218 and a total number of 946 components indexed
2024-01-04 18:24:08,928 - modelscope - INFO - Use user-specified model revision: v1.1.0
2024-01-04 18:24:11,285 - modelscope - WARNING - ('PIPELINES', 'my-anytext-task', 'my-custom-pipeline') not found in ast index file
2024-01-04 18:24:11,286 - modelscope - INFO - initiate model from C:\Users\sunny\.cache\modelscope\hub\damo\cv_anytext_text_generation_editing
2024-01-04 18:24:11,286 - modelscope - INFO - initiate model from location C:\Users\sunny\.cache\modelscope\hub\damo\cv_anytext_text_generation_editing.
2024-01-04 18:24:11,287 - modelscope - INFO - initialize model from C:\Users\sunny\.cache\modelscope\hub\damo\cv_anytext_text_generation_editing
2024-01-04 18:24:11,289 - modelscope - WARNING - ('MODELS', 'my-anytext-task', 'my-custom-model') not found in ast index file
A matching Triton is not available, some optimizations will not be enabled.
Error caught was: No module named 'triton'
OMP: Error #15: Initializing libiomp5md.dll, but found libiomp5md.dll already initialized.
OMP: Hint This means that multiple copies of the OpenMP runtime have been linked into the program. That is dangerous, since it can degrade performance or cause incorrect results. The best thing to do is to ensure that only a single OpenMP runtime is linked into the process, e.g. by avoiding static linking of the OpenMP runtime in any library. As an unsafe, unsupported, undocumented workaround you can set the environment variable KMP_DUPLICATE_LIB_OK=TRUE to allow the program to continue to execute, but that may cause crashes or silently produce incorrect results. For more information, please see http://www.intel.com/software/products/support/.
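
As an aside, the workaround the log itself names (explicitly marked unsafe by Intel) can be applied from Python before the conflicting libraries load. This is a minimal sketch, not a real fix for the duplicate OpenMP runtime:

```python
import os

# Unsafe workaround from the OMP: Error #15 message above: allow duplicate
# OpenMP runtimes to coexist. Must be set BEFORE importing torch/modelscope.
os.environ["KMP_DUPLICATE_LIB_OK"] = "TRUE"

# ...then import modelscope and run inference as usual.
```

Equivalently, run `set KMP_DUPLICATE_LIB_OK=TRUE` in the Anaconda prompt before `python inference.py`.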

@chaorenai
Author

python inference.py
2024-01-04 18:56:02,430 - modelscope - INFO - PyTorch version 2.1.2+cu121 Found.
2024-01-04 18:56:02,432 - modelscope - INFO - TensorFlow version 2.13.0 Found.
2024-01-04 18:56:02,432 - modelscope - INFO - Loading ast index from C:\Users\sunny\.cache\modelscope\ast_indexer
2024-01-04 18:56:02,478 - modelscope - INFO - Loading done! Current index file version is 1.10.0, with md5 25145d097e3652b81ca7902ed6ed4218 and a total number of 946 components indexed
2024-01-04 18:56:03,577 - modelscope - INFO - Use user-specified model revision: v1.1.0
2024-01-04 18:56:05,925 - modelscope - WARNING - ('PIPELINES', 'my-anytext-task', 'my-custom-pipeline') not found in ast index file
2024-01-04 18:56:05,925 - modelscope - INFO - initiate model from C:\Users\sunny\.cache\modelscope\hub\damo\cv_anytext_text_generation_editing
2024-01-04 18:56:05,925 - modelscope - INFO - initiate model from location C:\Users\sunny\.cache\modelscope\hub\damo\cv_anytext_text_generation_editing.
2024-01-04 18:56:05,930 - modelscope - INFO - initialize model from C:\Users\sunny\.cache\modelscope\hub\damo\cv_anytext_text_generation_editing
2024-01-04 18:56:05,932 - modelscope - WARNING - ('MODELS', 'my-anytext-task', 'my-custom-model') not found in ast index file
A matching Triton is not available, some optimizations will not be enabled.
Error caught was: No module named 'triton'
ControlLDM: Running in eps-prediction mode
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is 768 and using 8 heads.
DiffusionWrapper has 859.52 M params.
making attention of type 'vanilla-xformers' with 512 in_channels
building MemoryEfficientAttnBlock with 512 in_channels...
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
making attention of type 'vanilla-xformers' with 512 in_channels
building MemoryEfficientAttnBlock with 512 in_channels...
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 320, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 640, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is 768 and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is None and using 8 heads.
Setting up MemoryEfficientCrossAttention. Query dim is 1280, context_dim is 768 and using 8 heads.
Loaded model config from [models_yaml/anytext_sd15.yaml]
Loaded state_dict from [C:\Users\sunny\.cache\modelscope\hub\damo\cv_anytext_text_generation_editing\anytext_v1.1.ckpt]
2024-01-04 18:56:26,972 - modelscope - INFO - initiate model from C:\Users\sunny\.cache\modelscope\hub\damo\cv_anytext_text_generation_editing\nlp_csanmt_translation_zh2en
2024-01-04 18:56:26,972 - modelscope - INFO - initiate model from location C:\Users\sunny\.cache\modelscope\hub\damo\cv_anytext_text_generation_editing\nlp_csanmt_translation_zh2en.
2024-01-04 18:56:26,975 - modelscope - INFO - initialize model from C:\Users\sunny\.cache\modelscope\hub\damo\cv_anytext_text_generation_editing\nlp_csanmt_translation_zh2en
{'hidden_size': 1024, 'filter_size': 4096, 'num_heads': 16, 'num_encoder_layers': 24, 'num_decoder_layers': 6, 'attention_dropout': 0.0, 'residual_dropout': 0.0, 'relu_dropout': 0.0, 'layer_preproc': 'layer_norm', 'layer_postproc': 'none', 'shared_embedding_and_softmax_weights': True, 'shared_source_target_embedding': True, 'initializer_scale': 0.1, 'position_info_type': 'absolute', 'max_relative_dis': 16, 'num_semantic_encoder_layers': 4, 'src_vocab_size': 50000, 'trg_vocab_size': 50000, 'seed': 1234, 'beam_size': 4, 'lp_rate': 0.6, 'max_decoded_trg_len': 100, 'device_map': None, 'device': 'cuda'}
2024-01-04 18:56:26,980 - modelscope - WARNING - No val key and type key found in preprocessor domain of configuration.json file.
2024-01-04 18:56:26,980 - modelscope - WARNING - Cannot find available config to build preprocessor at mode inference, current config: {'src_lang': 'zh', 'tgt_lang': 'en', 'src_bpe': {'file': 'bpe.zh'}, 'model_dir': 'C:\Users\sunny\.cache\modelscope\hub\damo\cv_anytext_text_generation_editing\nlp_csanmt_translation_zh2en'}. trying to build by task and model information.
2024-01-04 18:56:26,980 - modelscope - WARNING - No preprocessor key ('csanmt-translation', 'translation') found in PREPROCESSOR_MAP, skip building preprocessor.
Traceback (most recent call last):
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\utils\registry.py", line 212, in build_from_cfg
    return obj_cls(**args)
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\pipelines\nlp\translation_pipeline.py", line 54, in __init__
    self._src_vocab = dict([
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\pipelines\nlp\translation_pipeline.py", line 54, in <listcomp>
    self._src_vocab = dict([
UnicodeDecodeError: 'gbk' codec can't decode byte 0x84 in position 7: illegal multibyte sequence

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\utils\registry.py", line 210, in build_from_cfg
    return obj_cls._instantiate(**args)
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\models\base\base_model.py", line 67, in _instantiate
    return cls(**kwargs)
  File "C:\Users\sunny\.cache\modelscope\modelscope_modules\cv_anytext_text_generation_editing\ms_wrapper.py", line 43, in __init__
    self.init_model(**kwargs)
  File "C:\Users\sunny\.cache\modelscope\modelscope_modules\cv_anytext_text_generation_editing\ms_wrapper.py", line 225, in init_model
    self.trans_pipe = pipeline(task=Tasks.translation, model=os.path.join(self.model_dir, 'nlp_csanmt_translation_zh2en'))
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\pipelines\builder.py", line 170, in pipeline
    return build_pipeline(cfg, task_name=task)
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\pipelines\builder.py", line 65, in build_pipeline
    return build_from_cfg(
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\utils\registry.py", line 215, in build_from_cfg
    raise type(e)(f'{obj_cls.__name__}: {e}')
TypeError: function takes exactly 5 arguments (1 given)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\utils\registry.py", line 212, in build_from_cfg
    return obj_cls(**args)
  File "C:\Users\sunny\.cache\modelscope\modelscope_modules\cv_anytext_text_generation_editing\ms_wrapper.py", line 320, in __init__
    super().__init__(model=model, auto_collate=False)
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\pipelines\base.py", line 99, in __init__
    self.model = self.initiate_single_model(model)
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\pipelines\base.py", line 53, in initiate_single_model
    return Model.from_pretrained(
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\models\base\base_model.py", line 183, in from_pretrained
    model = build_model(model_cfg, task_name=task_name)
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\models\builder.py", line 35, in build_model
    model = build_from_cfg(
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\utils\registry.py", line 215, in build_from_cfg
    raise type(e)(f'{obj_cls.__name__}: {e}')
TypeError: MyCustomModel: function takes exactly 5 arguments (1 given)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\sunny\Documents\AnyText\inference.py", line 3, in <module>
    pipe = pipeline('my-anytext-task', model='damo/cv_anytext_text_generation_editing', model_revision='v1.1.0')
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\pipelines\builder.py", line 170, in pipeline
    return build_pipeline(cfg, task_name=task)
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\pipelines\builder.py", line 65, in build_pipeline
    return build_from_cfg(
  File "C:\Users\sunny\.conda\envs\anytext\lib\site-packages\modelscope\utils\registry.py", line 215, in build_from_cfg
    raise type(e)(f'{obj_cls.__name__}: {e}')
TypeError: MyCustomPipeline: MyCustomModel: function takes exactly 5 arguments (1 given)

(anytext) C:\Users\sunny\Documents\AnyText>

@chaorenai
Author

@tyxsspa What is causing this, and what should I do next?

@nerdyrodent

On Windows, the system's default code page is usually 'cp1252' (Windows-1252), an encoding that does not cover all Unicode characters.
This might work:

set PYTHONIOENCODING=utf-8

or possibly

import sys
sys.stdout.reconfigure(encoding='utf-8')

?

@chaorenai
Author

(quoting @nerdyrodent's suggestion above)

Thanks, I followed this, but it still reports the same error and won't run…
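
That result is consistent with the traceback: the failure is in a file read, not in stdout, so `PYTHONIOENCODING` (which only affects the standard streams) cannot help. On Chinese Windows, `open()` without an explicit `encoding` uses the locale code page (GBK), which is exactly the 'gbk' codec error above. A minimal sketch of the distinction, using a hypothetical UTF-8 vocab file:

```python
import os
import tempfile

# A small vocab file saved as UTF-8, standing in for the bpe/vocab files
# the translation pipeline loads (hypothetical content for illustration).
path = os.path.join(tempfile.mkdtemp(), "vocab.txt")
with open(path, "w", encoding="utf-8") as f:
    f.write("词表 0\n")

# Passing an explicit encoding works regardless of the locale code page;
# open(path) with no encoding would use GBK on Chinese Windows and can
# raise UnicodeDecodeError on multi-byte UTF-8 sequences.
with open(path, encoding="utf-8") as f:
    lines = f.read().splitlines()
print(lines)  # ['词表 0']
```

Two likely fixes (untested against this repo, so treat them as suggestions): enable Python's UTF-8 mode, which makes UTF-8 the default for `open()`, via `set PYTHONUTF8=1` or `python -X utf8 inference.py`; or add `encoding='utf-8'` to the `open()` call around line 54 of `modelscope\pipelines\nlp\translation_pipeline.py` shown in the traceback.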

@jsfgit

jsfgit commented Jan 9, 2024

Same error here.
