I have fine-tuned the model to write a compliment for a person, given the person's profile, and it works pretty well. In the training examples, I haven't prepended a task string like 'summarize: ' to the source_string column entries. Is it necessary (does it lead to better results) to prepend a string indicating the task?
When fine-tuning a T5 model, the prefix is added to assist multi-task training. In your case, since you are training on a single task, the prefix is not necessary.
Based on my experiments, if you are training on a task T5 was pre-trained on, e.g. summarization or translation, adding the prefix might make sense because the model converges slightly faster. That's it; it has no impact on the final results.
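To make the two options concrete, here is a minimal preprocessing sketch. The field and prefix names (`source_string`, `"compliment: "`) are hypothetical, chosen to match the question; any short task tag would do.

```python
# Hypothetical task prefix for single-task fine-tuning; T5 only needs a
# prefix to disambiguate tasks when trained on several tasks at once.
TASK_PREFIX = "compliment: "

def build_input(source_string: str, use_prefix: bool = False) -> str:
    """Return the model input text, optionally tagged with a task prefix.

    For single-task fine-tuning the prefix is optional; the model learns
    the task from the data either way.
    """
    return (TASK_PREFIX + source_string) if use_prefix else source_string

# Without prefix (what the question describes) vs. with prefix:
print(build_input("Age: 30, hobbies: hiking, job: teacher"))
print(build_input("Age: 30, hobbies: hiking, job: teacher", use_prefix=True))
```

Whichever variant you pick, use it consistently at both training and inference time, since the model will expect inputs in the same shape it was fine-tuned on.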