chatbot

This repository contains a Python chatbot project with a Streamlit web interface for conversations. It lets you interact with a conversational AI model such as LLama through an easy-to-use chat UI, with the model running locally via Ollama.

Overview

  • streamlit_app.py - Streamlit web application providing chatbot UI
  • src/model_interaction.py - Python module for calling LLama via Ollama
  • Uses Ollama, a tool that lets you run open-source large language models, such as Llama 2, locally
  • The Ollama server runs locally and serves the LLama model that powers the chatbot conversations

Getting Started

Prerequisites

  • Python 3.7+
  • Ollama
  • A LLama model installed and running locally through Ollama
  • Streamlit

pip install streamlit

To install Ollama, check their GitHub page for installation instructions:
Ollama GitHub page

Usage

  1. Start the LLama server with Ollama:

ollama run llama2

Instead of llama2, you can use any other model supported by Ollama (see the example below).
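For example, assuming the mistral model is available in the Ollama model library, you could run it instead of llama2 (depending on how the app is configured, the model name it requests may need to match):

ollama run mistral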

  2. Run the Streamlit app:

streamlit run streamlit_app.py

  3. Chat with the bot at http://localhost:8501!

Code Overview

model_interaction.py

  • Makes API calls to the local Ollama server
  • Handles the LLama response payload (a minimal sketch of this pattern follows)
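
A minimal sketch of how such a module might call Ollama's HTTP API, assuming the default local endpoint and the non-streaming /api/generate route; the names used here (get_response, OLLAMA_URL) are illustrative, and the actual module may be structured differently:

import requests

# Ollama's default local endpoint (assumption; adjust if your setup differs)
OLLAMA_URL = "http://localhost:11434/api/generate"

def get_response(prompt: str, model: str = "llama2") -> str:
    """Send a prompt to the local Ollama server and return the generated text."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # request one JSON object instead of a stream of chunks
    }
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    # With "stream": False, the full reply is returned under the "response" key.
    return resp.json()["response"]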

streamlit_app.py

  • Initializes Streamlit app
  • Manages chat message state in session_state
  • Displays chat UI using Streamlit components
  • Calls model_interaction to get bot responses
  • Displays both user messages and bot responses in the chat history (a minimal sketch follows)
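
A minimal sketch of this structure, assuming a get_response() helper like the hypothetical one sketched above; the real streamlit_app.py may be organized differently:

import streamlit as st

from src.model_interaction import get_response  # hypothetical helper, see the sketch above

st.title("LLama chatbot")

# Keep the conversation in session_state so it survives Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the chat history on every rerun.
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.write(message["content"])

# Read new input, call the model, and append both sides to the history.
prompt = st.chat_input("Say something")
if prompt:
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    reply = get_response(prompt)
    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.write(reply)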
