
Leveraged Ollama, Streamlit, and the open-source Mistral LLM to create a local chatbot that answers questions about given website(s)


Aabha-J/Website-Chatbot


Chat with Websites 🤖

A chatbot that uses Mistral 7B to parse website content and answer questions based only on information from those websites. You can modify after_rag_template to change this behavior.

To run it on your local computer, clone the repo or download the files and follow the instructions in setup.txt. The setup instructions are also copied below for convenience.
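The retrieval-augmented flow behind the app can be sketched roughly as follows. This is a minimal stdlib-only illustration of the idea, not the app's actual code: the real app uses LangChain, Chroma, and Ollama embeddings, the template text here only paraphrases the restrict-to-context intent of after_rag_template, and the chunking and keyword scoring are naive stand-ins for vector search.

```python
import json
import urllib.request

# Paraphrase of a restrict-to-context prompt template (illustrative, not the
# exact wording of after_rag_template in this repo).
AFTER_RAG_TEMPLATE = (
    "Answer the question based only on the following context:\n"
    "{context}\n\nQuestion: {question}"
)

def chunk_text(text, size=500, overlap=50):
    """Split page text into overlapping chunks; a stand-in for a text splitter."""
    step = size - overlap
    return [text[start:start + size]
            for start in range(0, max(len(text) - overlap, 1), step)]

def retrieve(chunks, question, k=2):
    """Naive keyword-overlap retrieval; the real app uses Chroma vector search."""
    q_words = set(question.lower().split())
    scored = sorted(chunks, key=lambda c: -len(q_words & set(c.lower().split())))
    return scored[:k]

def ask_mistral(context, question):
    """Send the filled-in template to a locally running Ollama server."""
    prompt = AFTER_RAG_TEMPLATE.format(context=context, question=question)
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default endpoint
        data=json.dumps({"model": "mistral", "prompt": prompt,
                         "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because the prompt only ever contains retrieved chunks, the model has nothing else to answer from; swapping the template for a more permissive one is what lets it fall back on general knowledge.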

Demo

1. Input Info 🔎


2. Wait for Results ⌛


3. Get Results 🎉


Note: Speed will vary depending on your processor.

Set Up

  1. Create a virtual environment (optional but recommended). In the command terminal (don't use PowerShell):

    1. python -m venv name
    2. Mac: source name/bin/activate
    3. Windows: .\name\Scripts\activate
  2. Install these packages with pip in your virtual environment:

    pip install langchain

    pip install chromadb tiktoken

    pip install streamlit

    pip install beautifulsoup4

  3. Download Ollama from https://ollama.com/

    Type this into your command terminal to get access to these models:

    ollama pull mistral

    ollama pull nomic-embed-text

  4. Run the app: streamlit run app.py
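If the app can't find the models, you can verify what Ollama has pulled. The sketch below queries Ollama's local REST API (the /api/tags endpoint, which backs `ollama list`) for installed model names; it assumes the server is running on the default port 11434.

```python
import json
import urllib.request

def installed_models(tags_json):
    """Extract model names from the JSON returned by Ollama's /api/tags endpoint."""
    return [m["name"] for m in tags_json.get("models", [])]

def check_ollama(host="http://localhost:11434"):
    """Return the list of pulled models from a locally running Ollama server."""
    with urllib.request.urlopen(host + "/api/tags") as resp:
        return installed_models(json.loads(resp.read()))

# Example (requires a running Ollama server):
# names = check_ollama()
# both "mistral" and "nomic-embed-text" should appear in the returned names
```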
