billwparker/chat_application

Chat Application using Streamlit and local models

Create a conda environment; we'll call it chat:

conda create -n chat python=3.11
conda activate chat

Install the dependencies:

pip install openai streamlit

Download Ollama

(Ollama only works on macOS and Linux right now. On Windows you could use LM Studio instead.) https://ollama.com/

Choose and run local model using Ollama

This command will download the mistral model for ollama to use

ollama pull mistral

This command will run mistral; Ollama serves an OpenAI-compatible API at http://localhost:11434/v1

ollama run mistral

Streamlit app

Build the Streamlit Python file; we'll call it app.py.

This command will run the Streamlit app

streamlit run app.py
