Just piggybacking on this - it was interesting seeing Finnish, Hungarian and Polish documents in the samples. I sent them to a couple of friends and so far as they could tell, they were all grammatically correct and made complete sense. Apparently the few-shot translation test isn't reflecting its true performance in non-English languages. I'm wondering whether the model has learned deeper linguistic structures common to all the languages in the training dataset. Would the OpenAI team be able to prompt the model with some of the rarer languages that appear in CommonCrawl so we can see more examples? This page translates short phrases into lots of different languages at once, in case that's useful. Anyway, really fascinating stuff :)
P.S.: GPT-2-1.5B gave interesting results when I prompted it with constructed languages like Esperanto and dead languages like Gothic - it usually produced samples of grammatically correct-ish sentences built from plausible-looking but gibberish words. Not sure whether GPT-3 would do anything differently.
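For anyone who wants to try the same thing locally, here's a minimal sketch of one way to reproduce that kind of GPT-2-1.5B prompting, assuming the Hugging Face transformers library ("gpt2-xl" is the 1.5B-parameter checkpoint); the Esperanto prompt and the sampling settings are just illustrative, not the ones used for the samples above:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# "gpt2-xl" is the public 1.5B-parameter GPT-2 checkpoint.
tokenizer = AutoTokenizer.from_pretrained("gpt2-xl")
model = AutoModelForCausalLM.from_pretrained("gpt2-xl")

# Hypothetical Esperanto prompt; any constructed- or dead-language text works.
prompt = "La suno brilas kaj la birdoj kantas en la arbaro."
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation; these sampling settings are only an example.
output = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```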
It would be appreciated if a Chinese model were made available.