# Automated GGML conversions to the Hugging Face Hub

Currently, the following HF repos are created via these conversions:
- https://huggingface.co/rustformers/redpajama-3b-ggml
- https://huggingface.co/rustformers/pythia-ggml
- https://huggingface.co/rustformers/bloom-ggml
- https://huggingface.co/rustformers/bloomz-ggml

The following HF repos are quantized via these scripts:
- https://huggingface.co/rustformers/mpt-7b-ggml
- https://huggingface.co/rustformers/gpt-j-ggml
- https://huggingface.co/rustformers/stablelm-ggml
- https://huggingface.co/rustformers/dolly-v2-ggml
- https://huggingface.co/rustformers/gpt4all-j-ggml
- https://huggingface.co/rustformers/redpajama-7b-ggml
- https://huggingface.co/rustformers/open-llama-ggml
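
As a rough sketch of how a quantization run for these repos might be planned, the snippet below builds the per-model commands for each target format. The `quantize` binary name, the `-f16`/`-q4_0`/`-q5_1` filename convention, and the chosen quantization types are illustrative assumptions, not the repository's actual tooling.

```python
# Sketch: plan quantization commands for a converted model.
# The "quantize" CLI and the filename scheme below are hypothetical
# placeholders standing in for the real scripts.

QUANT_TYPES = ["q4_0", "q5_1"]  # assumed target quantization formats


def plan_quantize_commands(model: str, quant_types=QUANT_TYPES) -> list[str]:
    """Build the shell commands that would turn a converted f16 GGML
    file into one quantized file per requested format."""
    src = f"{model}-f16.bin"  # assumed name of the full-precision conversion
    return [
        f"quantize {src} {model}-{q}.bin {q}"  # hypothetical CLI invocation
        for q in quant_types
    ]


if __name__ == "__main__":
    # Print the planned commands for one of the models listed above.
    for cmd in plan_quantize_commands("mpt-7b"):
        print(cmd)
```

In a real pipeline these commands would be executed (e.g. via `subprocess.run`) and the resulting `.bin` files uploaded to the corresponding `rustformers/*-ggml` repo.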