Question Answering with Custom Files using LLMs


DocQA 🤖


DocQA 🤖 is a web application built with Streamlit 🔥 and the LangChain 🦜🔗 framework that lets users leverage the power of LLMs for generative question answering over their own documents. 🌟

Read More Here 👉 https://ai.plainenglish.io/️-langchain-streamlit-llama-bringing-conversational-ai-to-your-local-machine-a1736252b172

Installation

To run the LangChain web application locally, follow these steps:

  1. Clone this repository 🔗

     git clone https://github.com/afaqueumer/DocQA.git

  2. Create a virtual environment and install the required dependencies ⚙️

     Run ➡️ setup_env.bat

  3. Launch the Streamlit app 🚀

     Run ➡️ run_app.bat
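The batch scripts target Windows and their exact contents are not shown here. On a Unix-like system, a roughly equivalent manual setup might look like the sketch below (this assumes the repository ships a `requirements.txt` and that the Streamlit entry point is named `app.py` — both are assumptions, not verified details):

```shell
# Hypothetical manual equivalent of setup_env.bat / run_app.bat on Unix.
# The actual batch scripts may differ.
git clone https://github.com/afaqueumer/DocQA.git
cd DocQA
python3 -m venv venv                 # create an isolated virtual environment
source venv/bin/activate
pip install -r requirements.txt      # assumes a requirements.txt is present
streamlit run app.py                 # assumes the entry point is app.py
```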

Usage

Once you have the Streamlit web application up and running, you can perform the following steps:

  1. Upload a text file.
  2. Once the text file is loaded into the vector store database, a success alert "Document is Loaded" appears.
  3. Type your question in the "Ask" textbox and submit it for the LLM to generate an answer.
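Under the hood, "loading the file as a vector store" means splitting the document into chunks, embedding each chunk, and retrieving the chunk most similar to the question as context for the LLM. The following is a minimal pure-Python sketch of that retrieval step — it substitutes a toy bag-of-words similarity for the real LLM embeddings LangChain would use, so it illustrates the flow rather than the actual implementation:

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; DocQA would use real LLM embeddings.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def build_store(document, chunk_size=6):
    # Split the uploaded file into fixed-size word chunks and embed each one.
    words = document.split()
    chunks = [" ".join(words[i:i + chunk_size])
              for i in range(0, len(words), chunk_size)]
    return [(chunk, embed(chunk)) for chunk in chunks]

def retrieve(store, question):
    # Return the chunk most similar to the question; the LLM then answers
    # using this chunk as context.
    q = embed(question)
    return max(store, key=lambda item: cosine(item[1], q))[0]

store = build_store("Paris is the capital of France. Berlin is the capital of Germany.")
print(retrieve(store, "What is the capital of France?"))
# → Paris is the capital of France.
```

In the real app, the embedding model and vector store come from LangChain, but the shape of the pipeline — chunk, embed, retrieve, then answer — is the same.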

Contributing

Contributions to this app are welcome! If you have any ideas, suggestions, or bug fixes, please feel free to open an issue or submit a pull request. We appreciate your contributions.

License

This project is licensed under the MIT License.

🎉 Thank you 🤗 Happy question answering! 🌟
