A simple local UI for LLMs available on Ollama, using LangChain, the LangSmith API, and Chainlit.

This local user interface (UI), built with Chainlit and Ollama, allows you to ask questions about astronomy using the powerful Gemma 7B language model.
- Ask questions about celestial objects, astronomical phenomena, space exploration, and more.
- Get informative and comprehensive answers directly from the Gemma 7b model.
- Explore various aspects of astronomy in an interactive and engaging way.
- You must have Python 3.10 or later installed. Earlier versions of Python may not be compatible.
- Ollama should be installed from the Ollama website.
- Pull the Gemma 7B model from the Ollama library using the following command:

```bash
ollama run gemma:7b
```
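
If you prefer to download the model without opening an interactive chat session, `ollama pull` works as well, and `ollama list` confirms the model is available locally (both are standard Ollama CLI commands):

```bash
ollama pull gemma:7b   # download the model without starting a chat session
ollama list            # verify gemma:7b appears among the local models
```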
- Fork this repository and create a codespace on GitHub as I showed in the YouTube video, OR clone it locally:

```bash
git clone https://github.com/wittyicon29/AstroGemma.git
cd AstroGemma
```
- Create a virtualenv and activate it:

```bash
python3 -m venv .venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate
```
- Copy example.env to .env:

```bash
cp example.env .env
```

Then fill in the environment variables from LangSmith. You need to create an account on the LangSmith website if you haven't already:

```
LANGCHAIN_TRACING_V2=true
LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
LANGCHAIN_API_KEY="your-api-key"
LANGCHAIN_PROJECT="your-project"
```
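
For reference, here is a minimal sketch of how these variables could be loaded in Python, assuming the python-dotenv package is used; the actual gemma.py may load them differently:

```python
# Minimal sketch: load the .env file and confirm LangSmith tracing is on.
# Assumes python-dotenv is installed; the repo's gemma.py may differ.
import os

from dotenv import load_dotenv

load_dotenv()  # reads the LANGCHAIN_* variables from .env into the environment

# LangSmith tracing in LangChain is driven entirely by these variables;
# once they are set, every chain run is logged to the configured project.
if os.environ.get("LANGCHAIN_TRACING_V2") == "true":
    print("Tracing enabled for project:", os.environ.get("LANGCHAIN_PROJECT"))
```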
- Run the following command in the terminal to install the necessary Python packages:

```bash
pip install -r requirements.txt
```
- Run the following command in your terminal to start the chat UI:

```bash
chainlit run gemma.py
```
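
As a rough idea of what such an app looks like, here is a minimal Chainlit + Ollama sketch; it illustrates the general pattern, not necessarily the actual contents of gemma.py:

```python
# Minimal Chainlit app wiring Gemma 7B (served by Ollama) into a chat UI.
# Illustrative sketch only; the repo's gemma.py may be structured differently.
import chainlit as cl
from langchain_community.llms import Ollama
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate


@cl.on_chat_start
async def on_chat_start():
    # Build a simple prompt -> model -> string pipeline and keep it in the session.
    llm = Ollama(model="gemma:7b")
    prompt = ChatPromptTemplate.from_template(
        "You are a helpful astronomy assistant.\n\nQuestion: {question}"
    )
    chain = prompt | llm | StrOutputParser()
    cl.user_session.set("chain", chain)


@cl.on_message
async def on_message(message: cl.Message):
    # Run the chain on the user's question and send the answer back to the UI.
    chain = cl.user_session.get("chain")
    answer = await chain.ainvoke({"question": message.content})
    await cl.Message(content=answer).send()
```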
- AstroGemma is still under development, and the accuracy of its responses may vary depending on the complexity of the question.
- It is recommended to frame your questions clearly and concisely for optimal results.
- For advanced users, the underlying code can be modified to customize the UI and integrate additional functionality; one possible customization is sketched below.
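
For example, the message handler in the sketch above could stream tokens to the UI as they are generated, instead of waiting for the full answer (again a hypothetical illustration, not the repo's actual code):

```python
# Hypothetical customization: stream tokens to the Chainlit UI as they arrive.
import chainlit as cl


@cl.on_message
async def on_message(message: cl.Message):
    chain = cl.user_session.get("chain")  # chain built in on_chat_start, as above
    msg = cl.Message(content="")
    async for token in chain.astream({"question": message.content}):
        await msg.stream_token(token)  # append each token to the message in place
    await msg.send()  # finalize the streamed message
```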