survivor-zik/ollama-fastapi-chainlit
My Python App

This is a simple Python application built with Ollama, FastAPI, and Chainlit.

Installation

To install the required dependencies, it's recommended to create a new Conda environment:

conda create -p venv python=3.11 -y
conda activate ./venv

Then install the remaining dependencies:

pip install -r requirements.txt

This project uses a custom LLM built from the Gemma-7b-it GGUF weights. Download the model (PowerShell):

Invoke-WebRequest -Uri "https://huggingface.co/mlabonne/gemma-7b-it-GGUF/resolve/main/gemma-7b-it.Q5_K_M.gguf" -OutFile "./model/llm/gemma-7b-it.Q5_K_M.gguf"
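On non-Windows systems, the same download can be done with a short standard-library Python script. The URL and target path below are copied from the PowerShell command above; the helper name is just illustrative:

```python
# Cross-platform alternative to the Invoke-WebRequest command above,
# using only the Python standard library.
import os
import urllib.request

MODEL_URL = (
    "https://huggingface.co/mlabonne/gemma-7b-it-GGUF/"
    "resolve/main/gemma-7b-it.Q5_K_M.gguf"
)
DEST = os.path.join("model", "llm", "gemma-7b-it.Q5_K_M.gguf")

def download_model(url: str = MODEL_URL, dest: str = DEST) -> str:
    """Download the GGUF weights to dest, creating directories as needed."""
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    urllib.request.urlretrieve(url, dest)  # streams the file to disk
    return dest
```

Call `download_model()` once; the file is several gigabytes, so expect the download to take a while.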

Pre-Requisites

  1. Use Modelfile.txt to add the custom model to Ollama's list of models.
  2. Run the following command in the terminal:
    ollama create gemma-updated -f ./models/modelfile.txt
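The repository's Modelfile.txt is not reproduced here, but a minimal Ollama Modelfile pointing at the downloaded GGUF weights typically looks like the sketch below. The FROM path matches the download location above; the parameter line is an illustrative assumption, not the repo's actual contents:

```
# Illustrative sketch only - the actual Modelfile.txt may differ.
FROM ./model/llm/gemma-7b-it.Q5_K_M.gguf
PARAMETER temperature 0.7
```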
    

Run Apps

  1. For FastAPI:

    python main.py

  2. Using Chainlit:

    1. For ConversationalChain:

       chainlit run app.py

    2. For LCEL:

       chainlit run test-app.py
