I followed your code to write a ten-class text classification model and it failed: the output predictions are all the same, and the logits appear to be identical across inputs. I don't know why; any help would be appreciated, thank you.
My model is just this simple:
lbert(tokens)
dense(10)
But after I froze the BERT layers and added some additional layers (an LSTM, for example), it worked, though accuracy is still below my expectations.
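For anyone hitting the same symptom: identical logits across all inputs usually mean the preprocessing has collapsed every input into the same tensor, so any deterministic model can only produce one output. A minimal plain-Python illustration of that failure mode, using a hypothetical buggy tokenizer (the names and the toy "model" below are illustrative, not from this repo):

```python
def buggy_tokenize(text, vocab):
    # Bug: looks up whole words against a character-level vocab,
    # so every word misses and maps to the [UNK] id (0).
    return [vocab.get(word, 0) for word in text.split()]

def logits(token_ids, weights):
    # Toy deterministic "model": sum of per-token weight vectors.
    out = [0.0] * len(weights[0])
    for t in token_ids:
        row = weights[t % len(weights)]
        out = [a + b for a, b in zip(out, row)]
    return out

vocab = {c: i for i, c in enumerate("abcdefg", start=1)}  # char-level vocab
weights = [[0.1 * (i + j) for j in range(3)] for i in range(8)]

a = logits(buggy_tokenize("good movie", vocab), weights)
b = logits(buggy_tokenize("bad acting", vocab), weights)
# Both inputs tokenize to [0, 0], so the logits come out identical.
```

Checking what the tokenizer actually emits for two different sentences is usually the fastest way to rule this out before touching the model.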
Sorry! It was my mistake! I had written a buggy sequence padder and a buggy tokenizer.
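For reference, a correct fixed-length padder is only a few lines. A minimal sketch, where the pad id and maximum length are assumed values, not the ones this repo actually uses:

```python
PAD_ID = 0    # assumed padding token id
MAX_LEN = 8   # assumed fixed sequence length

def pad_sequence(token_ids, max_len=MAX_LEN, pad_id=PAD_ID):
    # Truncate sequences longer than max_len, right-pad shorter ones,
    # so every example has exactly max_len token ids.
    ids = token_ids[:max_len]
    return ids + [pad_id] * (max_len - len(ids))

print(pad_sequence([5, 6, 7]))        # → [5, 6, 7, 0, 0, 0, 0, 0]
print(pad_sequence(list(range(12))))  # → [0, 1, 2, 3, 4, 5, 6, 7]
```

The key property to verify is that different inputs still come out different after padding; if they don't, the padder (not the model) is the problem.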