
Deploying bark-small fails: MindSpore operator does not support Python built-in data types #2129

Description

@yanboc

Describe the bug (Mandatory)
Running the bark-small text-to-speech pipeline through mindnlp fails while the model is being constructed: a MindSpore operator receives Python's built-in bool as a dtype and raises a KeyError.

import mindnlp                # importing mindnlp hooks transformers onto the MindSpore backend
import mindspore as ms
from transformers import pipeline
import scipy.io.wavfile       # the wavfile submodule is not loaded by a bare "import scipy"

# Build a text-to-speech pipeline from a local copy of suno/bark-small;
# the reported KeyError is raised here while the Bark model is constructed.
synthesiser = pipeline("text-to-speech", "<PATH_TO_MODEL>")

speech = synthesiser("Hello, my dog is cooler than you!", forward_params={"do_sample": False})

# Save the generated waveform to a WAV file.
scipy.io.wavfile.write("bark_out.wav", rate=speech["sampling_rate"], data=speech["audio"])

Error message:

Traceback (most recent call last):
  File "/code/tem/test_tts_ms/run_bark.py", line 6, in <module>
    synthesiser = pipeline("text-to-speech", "/loaded_models/suno/bark-small")
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/mindnlp/utils/decorators.py", line 15, in wrapper
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/pipelines/__init__.py", line 1008, in pipeline
    framework, model = infer_framework_load_model(
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/pipelines/base.py", line 292, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 600, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/mindnlp/utils/decorators.py", line 15, in wrapper
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 317, in _wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 4994, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/models/bark/modeling_bark.py", line 1368, in __init__
    self.semantic = BarkSemanticModel(config.semantic_config)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/models/bark/modeling_bark.py", line 389, in __init__
    self.layers = nn.ModuleList([BarkBlock(config, is_causal=True, layer_idx=i) for i in range(config.num_layers)])
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/models/bark/modeling_bark.py", line 389, in <listcomp>
    self.layers = nn.ModuleList([BarkBlock(config, is_causal=True, layer_idx=i) for i in range(config.num_layers)])
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/models/bark/modeling_bark.py", line 290, in __init__
    self.attn = BARK_ATTENTION_CLASSES[config._attn_implementation](
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/models/bark/modeling_bark.py", line 97, in __init__
    bias = torch.tril(torch.ones((block_size, block_size), dtype=bool)).view(1, 1, block_size, block_size)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/mindnlp/core/ops/creation.py", line 69, in ones
    output = execute('ones', size, dtype,
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/mindnlp/core/executor.py", line 5, in execute
    out, device = dispatcher.dispatch(func_name, *args, **kwargs)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/mindnlp/core/dispatcher.py", line 62, in dispatch
    return func(*args), device
           ^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/mindnlp/core/_prims/numpy.py", line 14, in ones
    return core.Tensor.from_numpy(np.ones(size, core.dtype2np[dtype]))
                                                ~~~~~~~~~~~~~^^^^^^^
KeyError: <class 'bool'>
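
Root cause, as far as the traceback shows: modeling_bark.py builds its causal mask with torch.ones((block_size, block_size), dtype=bool), passing Python's built-in bool as the dtype. mindnlp routes the call to its NumPy prim, which looks the dtype up in core.dtype2np, and that mapping has no entry for the built-in type, hence KeyError: <class 'bool'>. A minimal reproduction sketch, assuming the torch module in this environment is the shim mindnlp provides (which is what the dispatch in the traceback indicates):

# Minimal reproduction sketch (assumption: after "import mindnlp", torch.ones is
# dispatched through mindnlp/core/ops/creation.py, exactly as in the traceback).
import mindnlp
import torch

# Passing Python's built-in bool as dtype should hit the same lookup failure,
# i.e. KeyError: <class 'bool'> in mindnlp/core/_prims/numpy.py.
mask = torch.tril(torch.ones((4, 4), dtype=bool))

Presumably the same call with a framework dtype (e.g. torch.bool) resolves through core.dtype2np, which suggests the missing piece is a mapping from the Python built-ins (bool, int, float) to framework dtypes.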
  • Hardware Environment (Ascend/GPU/CPU):

    • NPU: 300I DUO
    • CPU: Kunpeng 920 aarch64
  • Software Environment (Mandatory):

    • MindSpore version: 2.6.0
    • Python version: 3.11.6
    • OS platform and distribution: OpenEuler 22.03 SP4
  • Execute Mode (Mandatory) (PyNative/Graph):

/mode pynative

To Reproduce (Mandatory)
Steps to reproduce the behavior:

  1. Run the script above
  2. See error

Expected behavior (Mandatory)

  • The model runs inference normally

Screenshots / Logs (Mandatory)

  • See the error message in the description above

Additional context (Optional)
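A possible user-side mitigation, sketched under the assumption that mindnlp.core exposes the dtype2np dict referenced in mindnlp/core/_prims/numpy.py and that it is a plain, mutable mapping (only the name in the traceback is known for sure): register NumPy equivalents for the Python built-in types before constructing the pipeline. The proper fix would likely be for mindnlp to normalize built-in dtypes inside its creation ops.

# Hypothetical, untested workaround: extend mindnlp's dtype-to-NumPy mapping so
# that Python built-in dtypes passed by transformers resolve instead of raising.
import numpy as np
import mindnlp
from mindnlp import core

for py_type, np_type in ((bool, np.bool_), (int, np.int64), (float, np.float32)):
    # setdefault keeps any mapping mindnlp may already define for these keys.
    core.dtype2np.setdefault(py_type, np_type)

# ...then build the text-to-speech pipeline as in the script above.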
