-
I found that all OPT, T5-variant, and GPT-NeoX models work on M1.
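For reference, here is a minimal sketch of the kind of check I ran (my own example, assuming the Hugging Face transformers and torch packages are installed; your serving setup may differ), loading OPT-125m on the M1's MPS backend:

```python
# Minimal sketch: load a small OPT checkpoint on Apple Silicon (MPS) and generate.
# Assumes `pip install torch transformers`; falls back to CPU if MPS is unavailable.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "mps" if torch.backends.mps.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m").to(device)

inputs = tokenizer("Hello, my name is", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```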
-
Hey,
I am just experimenting for now, but I have tried a lot of models and only one works: the smallest, OPT-125m. Dolly-v2-3b starts up but never answers requests, and the others do not start at all.
I am running them inside a dev container (based on python:3.11.3-bullseye), so I assume there are not enough resources for them to run properly.
How can I get a rough idea of what size of model is suitable to run on a laptop?
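The only heuristic I have come up with so far (my own rough assumption, not something documented here) is parameter count × bytes per parameter for the weights alone, ignoring activations and the KV cache:

```python
# Back-of-envelope estimate of the RAM needed just to hold a model's weights.
# Rough heuristic only: it ignores activations, KV cache, and runtime overhead.
def weight_memory_gib(n_params: float, bytes_per_param: int = 4) -> float:
    """Approximate weight memory in GiB (fp32 = 4, fp16 = 2, int8 = 1 bytes/param)."""
    return n_params * bytes_per_param / (1024 ** 3)

for name, params in [("opt-125m", 125e6), ("dolly-v2-3b", 2.8e9)]:
    print(f"{name}: ~{weight_memory_gib(params):.1f} GiB fp32, "
          f"~{weight_memory_gib(params, 2):.1f} GiB fp16")
```

By that estimate Dolly-v2-3b (~2.8B parameters) needs roughly 10 GiB just for fp32 weights, which would already strain my container. Does that kind of estimate hold up in practice, or is there a better rule of thumb?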