I read your paper with great interest. You seem to have a lot of novel ideas about how to improve pretraining, and some of the scores are really impressive. I would like to test some of these ideas on other corpora.
Have you considered making the code available as a HuggingFace module (TensorFlow/PyTorch/Flax)? I think this would lead to a lot more people looking into your ideas.
Any update on this? I notice that there is an out-of-the-box pretrained version of GLM-10B. Would like to know whether there are any future plans to upload other pretrained models (e.g. GLM-10B-Chinese)?
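For anyone landing here, below is a minimal sketch of how loading might look via `transformers`. It assumes the checkpoint is published on the HuggingFace Hub under the repo id `THUDM/glm-10b` with custom modeling code (hence `trust_remote_code=True`), and that the tokenizer ships a `build_inputs_for_generation` helper for GLM's blank-infilling format; both are assumptions on my part, so check the actual model card.

```python
# A minimal sketch, assuming the GLM-10B checkpoint lives at THUDM/glm-10b
# on the HuggingFace Hub and ships custom modeling/tokenizer code.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("THUDM/glm-10b", trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained("THUDM/glm-10b", trust_remote_code=True)
model.eval()

# GLM is trained to fill in [MASK] spans; build_inputs_for_generation
# (assumed to be part of the checkpoint's custom tokenizer) appends the
# generation prompt for the masked span.
inputs = tokenizer("Tsinghua University is located in [MASK].", return_tensors="pt")
inputs = tokenizer.build_inputs_for_generation(inputs, max_gen_length=64)
outputs = model.generate(**inputs, max_length=128, eos_token_id=tokenizer.eop_token_id)
print(tokenizer.decode(outputs[0].tolist()))
```

On GPU you would likely want `model.half().cuda()` and `inputs.to("cuda")` before generating, given the 10B parameter count.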