Wish For A Star Release
Release Notes 1.7.3
- Fixed a button glitch
- Filtered the "Continue an application" view to only show .java files
- Fixed a redundant-file issue in App History
- Improved the GUI design
- Improved application performance
Install AppWish Ollama
./install_ollama.sh
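If you want to verify the installation before starting AppWish, listing the locally available models with the Ollama CLI is a quick, optional check (not part of the official steps):

```bash
# Optional: confirm Ollama is installed and list the models it has pulled
ollama list
```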
Start AppWish Ollama
sudo java -jar appwish.jar
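AppWish is shipped as a runnable jar, so a Java runtime has to be on your PATH; a quick sanity check before launching (the required Java version is not stated in these notes) is:

```bash
# Confirm a Java runtime is available before launching appwish.jar
java -version
```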
Llama3 is back
The default model has been changed from codestral:22b to Llama3 so that more people can run AppWish on their hardware. The Codestral model is still superior, however, and it is the one I use. If you have the hardware for it, run it through WSL with an Nvidia GPU and you will see the difference.
How can I change the model to codestral:22b for the Linux AMD X64 version?
- Run the install script that installed Ollama, then pull the model by typing
  ollama pull codestral:22b
  in your terminal.
- Edit the file at the path
  src/main/resources/ollama_model.props
  changing
  MODEL_NAME=llama3:latest
  to
  MODEL_NAME=codestral:22b
  (a shell sketch of both steps follows below).
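A minimal shell sketch of the two steps above, assuming you run it from the project root and that the properties path is unchanged; the sed one-liner is only an illustration, and editing the file by hand works just as well:

```bash
# Pull the Codestral model (Ollama must already be installed by the install script)
ollama pull codestral:22b

# Switch the configured model from llama3:latest to codestral:22b
sed -i 's/^MODEL_NAME=llama3:latest$/MODEL_NAME=codestral:22b/' \
    src/main/resources/ollama_model.props
```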
Helper script for Windows Subsystem for Linux (WSL)
The script wsl_helper_script.sh is designed to help you run AppWish Ollama under WSL.
If you have a decent Nvidia GPU, you can run Nvidia CUDA under WSL without much setup. If you are looking for very fast app generation, this option is a good choice.
Running this script is not recommended if you have no intention of using WSL.
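A hedged usage sketch, assuming the script sits in the repository root and takes no arguments (neither detail is confirmed by these notes):

```bash
# From a WSL shell in the repository root, make the helper executable and run it
chmod +x wsl_helper_script.sh
./wsl_helper_script.sh
```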
🐲 🔮 🌌
What's Changed
- Opti2 by @pwgit-create in #135
- Update ollama_model.props by @pwgit-create in #136
- Develop by @pwgit-create in #137
- Sync branches by @pwgit-create in #139
- Version 1.7.3 by @pwgit-create in #140
- Develop by @pwgit-create in #141
Full Changelog: v1.7.2...v1.7.3