Is this easy to incorporate using transformers? The procedure for using it looks quite involved. Could someone try it and see whether it can be integrated easily?
It is okay, I thought you might already have an idea of how to integrate it right away! I will run some experiments and tell you if it works. No pressure, since it may well not work at all. 👍
Hello, I have seen this new UltraFastBERT (https://github.com/pbelcak/UltraFastBERT), where they train a BERT in one day on a single GPU!
I was wondering whether UltraFastBERT could be incorporated into PL-BERT to make training much faster, and if so, how to do it.
Thanks in advance!
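For anyone who wants to experiment: the core idea in UltraFastBERT is the "fast feedforward" (FFF) layer, which replaces the dense FFN in each transformer block with a binary tree of neurons, so each token only evaluates log2(width) decision nodes plus one leaf neuron instead of the whole layer. Below is a minimal, dependency-free NumPy sketch of that inference-time routing for a single vector; the weight names (`node_w`, `leaf_w_in`, `leaf_w_out`) and the ReLU activation are my own illustrative choices, not the repo's actual API, so treat this as a conceptual sketch rather than a drop-in replacement for PL-BERT's FFN.

```python
import numpy as np

def fff_forward(x, node_w, leaf_w_in, leaf_w_out, depth):
    """Single fast-feedforward (FFF) pass for one input vector x.

    node_w:     (2**depth - 1, d) decision weights of the internal tree nodes
    leaf_w_in:  (2**depth, d) input weights of the leaf neurons
    leaf_w_out: (2**depth, d) output weights of the leaf neurons

    Only `depth` decisions and one leaf neuron are evaluated per input,
    instead of all 2**depth neurons of a dense FFN of the same width.
    """
    idx = 0
    for _ in range(depth):
        # route left/right by the sign of a learned linear decision
        go_right = float(x @ node_w[idx]) > 0.0
        idx = 2 * idx + 1 + int(go_right)
    leaf = idx - (2 ** depth - 1)               # tree index -> leaf index
    act = max(0.0, float(x @ leaf_w_in[leaf]))  # ReLU here for simplicity
    return act * leaf_w_out[leaf]

rng = np.random.default_rng(0)
d, depth = 8, 3
x = rng.standard_normal(d)
node_w = rng.standard_normal((2 ** depth - 1, d))
leaf_w_in = rng.standard_normal((2 ** depth, d))
leaf_w_out = rng.standard_normal((2 ** depth, d))
y = fff_forward(x, node_w, leaf_w_in, leaf_w_out, depth)
print(y.shape)
```

If this behaves, the integration question reduces to swapping PL-BERT's feedforward sublayer for a (trainable, batched) version of this routing; the training-time version in the paper is differentiable, which this hard-routing sketch is not.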