Moon Release #126
Replies: 5 comments
-
The ARM release and the GitHub package update will be published for you tomorrow!
-
The ARM release has been uploaded and the GitHub packages have been deployed.
-
The default model of the Linux amd64 release has been changed to the fantastic IBM Granite Code model. The Ollama installation script has been updated to download Ollama pre-release version 1.0.39, which is compatible with the Granite model. Note that the default model for the ARM (Raspberry Pi) release is still llama3, since it performs better on that hardware.
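The architecture-dependent default described above can be sketched as a small shell helper. This is a hypothetical illustration, not the release's actual install script: the `default_model` function name is an assumption, and the model tags mirror the ones named in this comment.

```shell
#!/bin/sh
# Hypothetical sketch: pick the default Ollama model by CPU architecture,
# mirroring the behaviour described above (function name is an assumption).
default_model() {
  case "$(uname -m)" in
    x86_64)         echo "granite-code" ;;  # Linux amd64: IBM Granite Code
    aarch64|armv7l) echo "llama3" ;;        # ARM / Raspberry Pi: llama3
    *)              echo "llama3" ;;        # conservative fallback
  esac
}

echo "Default model for this machine: $(default_model)"
```

The actual release scripts may select the model differently; this only illustrates the amd64-vs-ARM split.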
-
codestral:22b has taken over as the default model for the amd64 (Linux) version. This model has received very positive feedback from the Ollama community! ❤️
-
Release Notes 1.7.1
Install Appwish Ollama
Start Appwish Ollama
Helper script for Windows Subsystem for Linux (WSL)
This script (wsl_helper_script.sh) is designed to help you run Appwish Ollama under WSL.
If you have a reasonably powerful NVIDIA GPU, you can run NVIDIA CUDA under WSL without much setup. If you are looking for very fast app generation, this option is a good choice.
Running this script is not recommended if you have no intention of using WSL.
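The GPU check such a helper would need can be sketched as follows. This is a hypothetical example, not the contents of wsl_helper_script.sh: the `gpu_mode` function and its fallback logic are assumptions, though probing for `nvidia-smi` is the standard way to detect a visible NVIDIA driver under WSL.

```shell
#!/bin/sh
# Hypothetical sketch: probe for the NVIDIA driver and report whether
# CUDA-accelerated generation is available (function name is an assumption).
gpu_mode() {
  if command -v nvidia-smi >/dev/null 2>&1; then
    echo "cuda"   # NVIDIA driver visible inside WSL: use the GPU
  else
    echo "cpu"    # no driver found: fall back to CPU inference
  fi
}

echo "Inference mode: $(gpu_mode)"
```

On a WSL install with the NVIDIA driver passed through, this would report `cuda`; everywhere else it falls back to `cpu`.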
Have fun with the release and generate apps in a responsible manner. 🐲 🔮 🌌
This discussion was created from the release Moon Release.