Before using this applet, you must have Ollama installed on your system. To install it, run this in your terminal:
```sh
curl -fsSL https://ollama.com/install.sh | sh
```
Source: Ollama GitHub
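To confirm the install worked before moving on, you can check the CLI and the background service. On systemd-based distros like Pop!_OS the upstream installer typically registers a service named `ollama`; that service name is an assumption here, so adjust if your setup differs:

```sh
# Check that the Ollama CLI is on your PATH
ollama --version

# If the installer set up a systemd service named "ollama",
# this should print "active"
systemctl is-active ollama
```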
After installing Ollama, pull the models you would like to chat with, for example:
```sh
ollama pull llama3
```
You can find more models in the Ollama library: https://ollama.com/library
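Before opening the applet, it can help to confirm the model downloaded and responds from the terminal. The `llama3` name below just matches the pull example above; substitute whichever model you pulled:

```sh
# List the models Ollama has pulled locally
ollama list

# Send a one-off prompt to make sure the model responds
ollama run llama3 "Say hello in one sentence"
```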
Clone the repository and build with `just`. If you don't have `just` installed, it is available in the Pop!_OS repository, so you can install it with apt:
```sh
sudo apt install just
```
Now you can clone the repo and install the applet.
```sh
git clone https://github.com/elevenhsoft/cosmic-ext-applet-ollama.git
cd cosmic-ext-applet-ollama
```
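If you want to see what the build will do first, `just` can list the recipes defined in the repo's justfile. The exact recipe names you'll see depend on the project and aren't guaranteed here:

```sh
# Show all recipes available in the justfile
just --list
```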
Run `just` to build, then install:

```sh
just
sudo just install
```
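When a new version of the applet lands later, the same two recipes should work for updating, assuming the justfile keeps the same entry points (an assumption on my part, not something the repo promises):

```sh
# Pull the latest source, rebuild, and reinstall
git pull
just
sudo just install
```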
Done
From now on, you will be able to add the applet to your desktop panel/dock and chat with different models in real time :)
Cheers!
There are currently some rendering issues with the `wgpu` libcosmic feature on some (older?) GPUs. This doesn't affect Ollama itself, only the applet. If you are affected by this, you can build and install the applet with this feature disabled:
```sh
just build-no-wgpu
sudo just install
```
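If you're not sure which GPU and driver the applet would be rendering on, `glxinfo` (from the `mesa-utils` package on Pop!_OS/Ubuntu) prints a short summary. This is a general diagnostic suggestion, not something the applet itself requires:

```sh
# Install the diagnostic tool if needed
sudo apt install mesa-utils

# -B prints a brief summary, including the renderer in use
glxinfo -B
```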