Local LLM-powered NPCs with llama.cpp, Mistral 7B and StyleTTS

This repo contains the code for this demo of local LLM-powered NPCs in Unreal Engine 5.3. It is based on StyleTTS, llama.cpp and Mistral 7B.

Setup

  1. Clone this repo
  2. Download a .gguf build of Mistral 7B
  3. Install and build llama.cpp and run the server
  4. Install mrfakename's StyleTTS demo and run the Docker image
  5. Create an Unreal project. Unfortunately I couldn't include the Content folder because most of it is Marketplace assets and demo content from Epic, but you can create a new project based on the source code in the unreal folder. You will have to create a new Blueprint class based on the AICharacter class and set up your own interaction with it. I used the RuntimeAudioImporter plugin to load the audio files at runtime.
  6. Replace the hardcoded addresses in the node/index.js file with your own addresses for the neutral voice sample file, the Gradio link to the StyleTTS demo, and the llama.cpp server address, as well as the directory the audio outputs should be saved to. Do the same in AICharacter.cpp. (A configuration sketch follows this list.)
  7. When running the Node script you may run into issues with EventSource; the simplest fix is to patch the Gradio code so that the URLs it tries to parse are converted to strings (see the workaround sketch after this list).
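
For step 6, the shape of the configuration is roughly the sketch below. This is illustrative only, not the repo's actual code: the constant names (`LLAMA_SERVER_URL`, `STYLETTS_GRADIO_URL`, `NEUTRAL_VOICE_SAMPLE`, `AUDIO_OUTPUT_DIR`) and the `generateLine` helper are placeholders, and the request assumes the llama.cpp server's `/completion` endpoint on its default port.

```js
// Illustrative sketch for step 6 -- the constant names below are placeholders,
// not the identifiers used in node/index.js. Replace the values with your own
// hosts and paths.
const LLAMA_SERVER_URL = "http://127.0.0.1:8080";          // llama.cpp server from step 3
const STYLETTS_GRADIO_URL = "http://127.0.0.1:7860";       // StyleTTS Gradio demo from step 4
const NEUTRAL_VOICE_SAMPLE = "/path/to/neutral_voice.wav"; // reference voice sample for StyleTTS
const AUDIO_OUTPUT_DIR = "/path/to/audio_output";          // where synthesized audio is written

// Minimal call to the llama.cpp server's /completion endpoint (Node 18+ global fetch).
// The endpoint takes a prompt and returns the generated text in the `content` field.
async function generateLine(prompt) {
  const res = await fetch(`${LLAMA_SERVER_URL}/completion`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, n_predict: 128, temperature: 0.7 }),
  });
  const data = await res.json();
  return data.content;
}
```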
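
For step 7, if you would rather not hand-edit the Gradio client inside node_modules, one alternative is to wrap the EventSource polyfill so URL objects are coerced to strings before they are parsed. This is only a sketch of that swapped-in approach: it assumes you are using the `eventsource` npm package (v2) as a global polyfill and that the Gradio client picks up `globalThis.EventSource`.

```js
// Sketch of a workaround for the EventSource issue in step 7, as an alternative
// to patching the Gradio code directly. Assumes the `eventsource` package (v2)
// is installed. (Use require("eventsource") instead if node/index.js is CommonJS.)
import EventSource from "eventsource";

class StringifyingEventSource extends EventSource {
  constructor(url, init) {
    super(String(url), init); // coerce URL objects to plain strings before parsing
  }
}

// Register the wrapper globally so libraries that construct EventSource
// instances (such as the Gradio client) use it.
globalThis.EventSource = StringifyingEventSource;
```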
