Voice Chat with LLM models

An elaborate example showcasing how to use the Assistant module from pywhispercpp together with langchain.
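The combination can be sketched roughly as follows. This is a minimal illustration, not the repo's actual `llm_chatter/main.py`: the callback name `answer` is hypothetical, the `langchain_community` import path depends on your langchain version, and it assumes an Ollama server is running locally with the model already pulled.

```python
# Sketch: pywhispercpp's Assistant transcribes microphone audio and hands
# each utterance to a callback, which forwards it to a local Ollama model
# via langchain. Assumes Ollama is running with llama2:7b-chat-q4_0 pulled.
from pywhispercpp.examples.assistant import Assistant
from langchain_community.llms import Ollama

llm = Ollama(model="llama2:7b-chat-q4_0")

def answer(text: str) -> None:
    # Hypothetical callback: receives each transcribed utterance.
    print(f"You: {text}")
    print(f"LLM: {llm.invoke(text)}")

if __name__ == "__main__":
    my_assistant = Assistant(commands_callback=answer)
    my_assistant.start()
```

The Assistant listens continuously and only fires the callback once it detects a complete utterance, so the callback is a natural place to call the LLM.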
- A good microphone
- Install Ollama
- Pull and run `llama2:7b-chat-q4_0`
- Clone this repo
- Install the requirements
- Run `llm_chatter/main.py`
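In a terminal, the steps above might look like this (the repo URL placeholder and the `requirements.txt` file name are assumptions; `ollama pull` is Ollama's standard CLI command for downloading a model):

```shell
# Pull the chat model; the Ollama server must be running.
ollama pull llama2:7b-chat-q4_0

# Clone the repo and install its Python dependencies
# (repo URL and requirements file name are assumptions).
git clone <this-repo-url>
cd <this-repo>
pip install -r requirements.txt

# Start the voice chat loop.
python llm_chatter/main.py
```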