This project consists of a React-based frontend that listens to user queries, sends them to a Flask backend, and displays responses generated by a locally hosted Llama3 language model. The backend handles the processing and communication with the LLM to provide intelligent answers to the queries. A system message in the main.py file inside the backend folder acts as the chatbot's persona; change it to have the bot answer according to your own content or needs.
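As a rough sketch of how such an endpoint could look — assuming the backend talks to Llama3 through a local Ollama server, and using an illustrative route name rather than the project's actual code:

```python
# Hypothetical sketch of the backend endpoint. The /api/query route, the
# Ollama server on localhost:11434, and all names here are assumptions,
# not the project's actual main.py.
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

# The persona of the chatbot; edit this string to change how the bot answers.
SYSTEM_MESSAGE = "You are a helpful assistant that answers questions concisely."

def build_messages(system_message: str, user_query: str) -> list:
    """Prepend the persona as a system message to the user's query."""
    return [
        {"role": "system", "content": system_message},
        {"role": "user", "content": user_query},
    ]

@app.route("/api/query", methods=["POST"])
def query():
    user_query = request.get_json()["query"]
    # Ollama's chat endpoint listens on localhost:11434 by default.
    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": "llama3",
            "messages": build_messages(SYSTEM_MESSAGE, user_query),
            "stream": False,
        },
    )
    answer = resp.json()["message"]["content"]
    return jsonify({"response": answer})
```

Changing SYSTEM_MESSAGE is all that is needed to repurpose the bot for a different domain.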
- frontend/: Contains the React application code.
- backend/: Contains the Flask application code and configuration for the Llama3 language model.
- Node.js and npm installed on your machine.
- Python and pip installed on your machine.
- Llama3 language model set up locally.
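Before starting the backend, you may want to verify that the local Llama3 model is actually reachable. A small check like the following works if you run Llama3 through Ollama (an assumption — adapt the URL if your local setup differs):

```python
# Check that a local Ollama server (one common way to run Llama3 locally;
# an assumption here) is up and has a llama3 model pulled.
import json
from urllib.request import urlopen

def has_model(tags_json: str, name: str) -> bool:
    """Return True if the /api/tags response lists a model starting with `name`."""
    models = json.loads(tags_json).get("models", [])
    return any(m["name"].startswith(name) for m in models)

if __name__ == "__main__":
    # Ollama's default port is 11434.
    with urlopen("http://localhost:11434/api/tags") as resp:
        print("llama3 available:", has_model(resp.read().decode(), "llama3"))
```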
cd frontend
npm install
npm start
The frontend should now be running on http://localhost:3000.
cd backend
pip install -r requirements.txt
flask run
The backend should now be running on http://localhost:5000.
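You can smoke-test the backend without the frontend using a short script. The /api/query route and the JSON shape below are assumptions — adjust them to match the actual Flask routes in main.py:

```python
# Hypothetical smoke test for the backend; the route name and JSON fields
# are assumptions, not taken from the project's actual code.
import json
from urllib import request

BACKEND_URL = "http://localhost:5000/api/query"

def make_request(query: str) -> request.Request:
    """Build a JSON POST request carrying the user's query."""
    return request.Request(
        BACKEND_URL,
        data=json.dumps({"query": query}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    with request.urlopen(make_request("Hello, who are you?")) as resp:
        print(json.loads(resp.read())["response"])
```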
1. Open the frontend application in your web browser (http://localhost:3000).
2. Press the voice button to start listening to your query.
3. The query is sent to the Flask backend, processed by the Llama3 language model, and the response is displayed on the frontend.
Feel free to contribute to this project by submitting issues or pull requests. Contributions are welcome!
This project is licensed under the MIT License. See the LICENSE file for details.
- React
- Flask
- Llama3