Closed
Description
System Info
- transformers version: 4.23.1
- Platform: Linux-5.15.0-53-generic-x86_64-with-glibc2.35
- Python version: 3.10.6
- Huggingface_hub version: 0.10.1
- PyTorch version (GPU?): 1.13.0+cu117 (False)
- Tensorflow version (GPU?): 2.10.0 (False)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?:
- Using distributed or parallel set-up in script?:
Who can help?
No response
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction
import torch
from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration
from datasets import load_dataset
import soundfile as sf
model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-librispeech-asr")
processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-librispeech-asr")
def map_to_array(batch):
    speech, _ = sf.read(batch["file"])
    batch["speech"] = speech
    return batch
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
ds = ds.map(map_to_array)
inputs = processor(ds["speech"][0], sampling_rate=16_000, return_tensors="pt")
generated_ids = model.generate(input_ids=inputs["input_features"], attention_mask=inputs["attention_mask"])
transcription = processor.batch_decode(generated_ids)
print(f'{transcription=}')
- the above code was copied from https://huggingface.co/docs/transformers/model_doc/speech_to_text
- run the script
- an exception is raised:
Traceback (most recent call last):
File "/home/ymq/tmp/pretrained-models/test/t.py", line 16, in <module>
generated_ids = model.generate(input_ids=inputs["input_features"], attention_mask=inputs["attention_mask"])
File "/home/ymq/py3/lib/python3.10/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
return func(*args, **kwargs)
File "/home/ymq/py3/lib/python3.10/site-packages/transformers/generation_utils.py", line 1208, in generate
self._validate_model_kwargs(model_kwargs.copy())
File "/home/ymq/py3/lib/python3.10/site-packages/transformers/generation_utils.py", line 910, in _validate_model_kwargs
raise ValueError(
ValueError: The following `model_kwargs` are not used by the model: ['input_ids'] (note: typos in the generate arguments will also show up in this list)
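The error comes from the `model_kwargs` validation that `generate()` runs before decoding: any keyword that the model's `forward()` does not accept is rejected, and the speech model takes `input_features` rather than `input_ids`, so a likely workaround is to pass the features positionally, e.g. `model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])`. Below is a minimal, hypothetical sketch of that validation mechanism (the names `forward` and `validate_model_kwargs` are stand-ins for illustration, not the actual transformers internals):

```python
import inspect

def forward(input_features=None, attention_mask=None):
    """Stand-in for the speech model's forward() signature."""
    pass

def validate_model_kwargs(model_kwargs):
    # Reject any keyword that forward() does not accept, mirroring the
    # check transformers performs in _validate_model_kwargs.
    accepted = set(inspect.signature(forward).parameters)
    unused = [k for k in model_kwargs if k not in accepted]
    if unused:
        raise ValueError(
            f"The following `model_kwargs` are not used by the model: {unused}"
        )

# Matches the model's signature: passes the check.
validate_model_kwargs({"input_features": None, "attention_mask": None})

# `input_ids` is not in the signature: rejected, as in the traceback above.
try:
    validate_model_kwargs({"input_ids": None})
except ValueError as e:
    print(e)
```

This illustrates why the documented snippet broke once stricter kwarg validation was added: the example passes `input_ids=`, which the speech model's forward signature never accepted.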
Expected behavior
The transcribed text (speech-to-text result) should be printed.