Provide the Google Colab ipynb #25
Comments
Yeah, it would be nice if anyone provided a Colab notebook |
It would be great to have the code working on Colab. Please share if you make it work. |
Hello! |
#25 (comment) Thanks. Can you explain why you use SSH in your notebook? |
Well, when I was training through the notebook, it glitched out because the output was too long for the web app to handle. But if you use SSH, everything works fine.
|
What dataset format do I need to pass to your notebook? Can you help me? |
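(For reference: if the notebook fine-tunes GPT-2 in the usual way, the dataset is typically a single plain UTF-8 text file, with documents separated by GPT-2's `<|endoftext|>` token. A minimal sketch of preparing such a file — the file name and sample texts are hypothetical, not from this repo:)

```python
# Hedged sketch: prepare a plain-text corpus file for GPT-2 fine-tuning.
# <|endoftext|> is GPT-2's standard end-of-document marker; the file name
# and the documents below are placeholders.
documents = [
    "First training document.",
    "Second training document.",
]

# Join documents with the end-of-text token so the model sees boundaries.
corpus = "<|endoftext|>".join(documents) + "<|endoftext|>"

with open("train.txt", "w", encoding="utf-8") as f:
    f.write(corpus)
```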
#33 (comment) |
Still no transformers-compliant tokenizer. |
I can't understand what you want.
There are reasons: I don't have hardware powerful enough to train GPT-2 from the ground up, and I want to fine-tune it on a Russian dataset.
Here, take a look at this |
Transformers'
Yeah, no. What about a direct way of doing this rather than relying on some script? Here's a raw GPT-2 sample; it does its thing well enough without requiring any script execution.
|
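(For reference: a direct route without any repo scripts is the transformers text-generation pipeline, which can sample from GPT-2 in a few lines. A sketch, assuming transformers is installed and using the stock `gpt2` checkpoint — the guard keeps it from crashing where the library or the model download is unavailable:)

```python
# Hedged sketch: sample from GPT-2 directly via the transformers pipeline,
# no repo-specific scripts needed. "gpt2" is the stock English checkpoint;
# substitute a fine-tuned model directory as needed.
try:
    from transformers import pipeline, set_seed

    set_seed(42)  # make sampling reproducible
    generator = pipeline("text-generation", model="gpt2")
    result = generator("Hello, world", max_new_tokens=20)[0]["generated_text"]
    print(result)
except Exception:
    # transformers not installed, or the model download failed
    result = None
```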
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. |
Please provide a working ipynb. I'm trying to run it in Google Colab (using transformers), but it's so messy it won't even run. And where's the transformers-compliant tokenizer?
Update 17.07: no response