This repository showcases a suite of experimental LLM applications built using Streamlit.
- Streamlit (UI)
- LangChain
- OpenAI API (`gpt-3.5-turbo`)
```bash
$ git clone https://github.com/rexsimiloluwah/streamlit-llm-apps
$ cd streamlit-llm-apps
```
It is advisable to install the dependencies inside a virtual environment.
```bash
$ pip install -r requirements.txt
$ streamlit run src/main.py

# Using make
$ make run-app
```
This application enables you to perform question-answering over your PDF document. It uses the `RetrievalQA` chain and the in-memory `DocArray` vector store provided by LangChain.
This application enables you to perform question-answering over content loaded from a web page. It similarly uses the `RetrievalQA` chain and the in-memory `DocArray` vector store provided by LangChain.
This application enables you to chat over your PDF document. It uses the `ConversationalRetrievalChain` chain and the in-memory `DocArray` vector store provided by LangChain. The memory is managed externally.