There appear to be only about 120 books on this site. While it's a potentially excellent resource for many applications, it would be infeasible to build a dataset of hundreds of gigabytes out of components this small: at perhaps half a megabyte of plain text per book, the whole site amounts to something on the order of 100 MB.
Based on my experience working on the Pile, I would strongly recommend putting on hold any data source with less than 5 GB of text, and accepting sources in the 5–10 GB range only if they're special. If the goal is for this to comprise one quarter of the training data for the multilingual model (a number I have in my head but don't know where it came from), we need at least 250 GB of text. It's going to be much less work to find 25 sources of 10 GB each than 250 sources of 1 GB each.
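To make that triage rule concrete, here is a minimal sketch of how candidate sources could be bucketed against the 250 GB target. All source names and sizes here are hypothetical placeholders, not real measurements:

```python
# Hypothetical candidates: name -> estimated plain-text size in GB.
candidate_sources = {
    "small_book_site": 0.1,
    "regional_news_archive": 7.5,
    "large_web_crawl": 40.0,
}

TARGET_GB = 250        # one quarter of the multilingual training data
HOLD_BELOW_GB = 5      # put on hold outright
SPECIAL_BELOW_GB = 10  # accept only with a special justification

accepted, needs_review, on_hold = [], [], []
for name, size_gb in candidate_sources.items():
    if size_gb < HOLD_BELOW_GB:
        on_hold.append(name)
    elif size_gb < SPECIAL_BELOW_GB:
        needs_review.append(name)  # 5-10 GB: case-by-case
    else:
        accepted.append(name)

accepted_gb = sum(candidate_sources[n] for n in accepted)
print(f"accepted: {accepted_gb:.1f} GB of {TARGET_GB} GB target")
print(f"on hold: {on_hold}")
print(f"needs review: {needs_review}")
```

The point of the thresholds is purely about effort per gigabyte: every source carries roughly fixed acquisition and cleaning cost, so fewer, larger sources get to the target faster.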
https://www.africanminds.co.za/