Information

Tasks

- An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction
The length of the `eval_preds` parameter received by the `compute_metrics` function differs from the original length of `eval_dataset`.
```python
def compute_metrics(eval_preds):
    preds, labels = eval_preds
    # In case the model returns more than the prediction logits
    if isinstance(preds, tuple):
        preds = preds[0]
    assert preds.shape[-1] == training_args.max_length
    assert preds.shape[0] == len(tokenized_datasets[-1])  # assertion error: preds.shape[0] = 1024, len(tokenized_datasets[-1]) = 1012
```
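The off-by-12 sizes (1024 vs. 1012) are consistent with a distributed sampler padding the eval set so that every process receives the same number of full batches. A minimal sketch of that arithmetic; the world size and per-device batch size below are assumed for illustration, not taken from the report:

```python
import math

def padded_eval_size(num_examples: int, world_size: int, per_device_batch: int) -> int:
    # Each process must see the same number of full batches, so the
    # sampler pads the dataset until the total divides evenly into
    # global batches of size world_size * per_device_batch.
    global_batch = world_size * per_device_batch
    return math.ceil(num_examples / global_batch) * global_batch

# Hypothetical setup: 4 GPUs with per-device eval batch size 16
print(padded_eval_size(1012, 4, 16))  # -> 1024, matching the assertion error
```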
System Info

transformers version: 4.40.0

Who can help?
@muellerzr
My training args are listed below:
Expected behavior
Everything works fine when using a single GPU, but not multiple GPUs.
I started my script with `accelerate launch scripy.py`.
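One workaround while this is open is to trim the padded rows inside `compute_metrics` before scoring, since the padded copies sit at the end of the gathered predictions. This is only a sketch; the helper name and array shapes are hypothetical:

```python
import numpy as np

def truncate_padded(eval_preds, num_real_examples):
    """Drop trailing rows that the distributed sampler added as padding."""
    preds, labels = eval_preds
    if isinstance(preds, tuple):
        preds = preds[0]
    return preds[:num_real_examples], labels[:num_real_examples]

# Hypothetical shapes mirroring the report: 1024 gathered rows, 1012 real examples
preds = np.zeros((1024, 128))
labels = np.zeros((1024, 128))
trimmed_preds, trimmed_labels = truncate_padded((preds, labels), 1012)
print(trimmed_preds.shape[0])  # -> 1012
```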