Had a question about the `max_seq_length` hyperparameter.

I just started training and set the config for SFT to be the below:

However, I got this warning:

```
[WARNING|tokenization_utils_base.py:3831] 2023-11-15 19:02:06,747 >> Token indices sequence length is longer than the specified maximum sequence length for this model (2576 > 2048). Running this sequence through the model will result in indexing errors
```

This was also from the logs:
Any thoughts as to what might be happening? I ran into the same thing when trying Llama-7B-hf, which has a default maximum sequence length of 4096.
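For context, my understanding is that the warning fires whenever the tokenized sequence is longer than the model's maximum length, and that truncation would cap it. A minimal plain-Python sketch of that check (the `MAX_SEQ_LENGTH` constant and the fake token-id list are made up for illustration, mirroring the 2576 > 2048 numbers in the warning):

```python
MAX_SEQ_LENGTH = 2048  # the model limit reported in the warning (2576 > 2048)

def truncate_ids(token_ids, max_len=MAX_SEQ_LENGTH):
    """Drop tokens beyond max_len, analogous to passing truncation=True
    to a Hugging Face tokenizer. Without this, positions past max_len
    can index outside the model's position embeddings."""
    if len(token_ids) > max_len:
        return token_ids[:max_len]
    return token_ids

# A fake 2576-token sequence, matching the length in the warning.
ids = list(range(2576))
print(len(truncate_ids(ids)))  # 2048
```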