🧠
Busy Creating Slop
Pinned
run-ollama-colab (Public)
A lightweight setup to run Ollama (for local LLMs like LLaMA 3, Mistral, Gemma, etc.) directly in Google Colab with free GPU/TPU support. Perfect for experimenting with open-source language models …
Jupyter Notebook
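A minimal sketch of the kind of setup the repo describes, assuming the usual pattern for running Ollama in a Colab VM: install the official Linux binary, start `ollama serve` in the background, pull a model, then query the local HTTP API. The exact cells, model name, and log file are illustrative assumptions, not the repo's actual notebook.

```python
# Sketch (assumptions): run these in a Colab notebook, one cell per step.
# !curl -fsSL https://ollama.com/install.sh | sh      # official Ollama install script
# !nohup ollama serve > ollama.log 2>&1 &             # background the Ollama server
# !ollama pull llama3                                 # download a model (name is illustrative)

import requests  # preinstalled in Colab

# Query Ollama's default local API (port 11434) once the server is up.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Hello!", "stream": False},
    timeout=300,
)
print(resp.json()["response"])
```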