AndyC00/UnityLLM-integration-project

A Unity integration project that embeds the smallest local language model (LM) I could find that is also supported by Unity's inference engine.

LM download URL: https://huggingface.co/onnx-community/TinyLlama-1.1B-Chat-v1.0-ONNX/tree/main/onnx

Setup steps:

  1. Download model_fp16.onnx and model_fp16.onnx_data from the URL above.
  2. Create a new Unity project (Unity 6.2 or newer is required).
  3. Clone this repo from GitHub and replace the files in the new project folder with the cloned ones.
  4. Place model_fp16.onnx and model_fp16.onnx_data in the Assets/LLM folder.
  5. Open Unity and wait for the scripts to compile.
  6. Select model_fp16.onnx in the Project window, click the "Serialize To StreamingAssets" button, and wait for serialization to finish.
  7. Assign the model to the Canvas/conversation script in the Unity Inspector (see the sketch after this list).
  8. Run the scene; it should be good to go.
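
The conversation script referenced in step 7 exposes a model field that you assign in the Inspector. As a rough illustration of what such a script can look like when built on Unity's inference package (Sentis / Inference Engine), here is a minimal sketch; the class name, field name, and backend choice are assumptions, not the repo's actual code.

```csharp
// Minimal sketch of the kind of script assigned in step 7. Assumptions:
// the Unity Sentis / Inference Engine package is installed (namespace
// Unity.Sentis in Sentis 2.x; newer Unity 6.2 releases rename it to
// Unity.InferenceEngine), and the class name "Conversation" is hypothetical.
using UnityEngine;
using Unity.Sentis;

public class Conversation : MonoBehaviour
{
    // Drag model_fp16.onnx onto this field in the Inspector (step 7).
    public ModelAsset modelAsset;

    Model runtimeModel;
    Worker worker;

    void Start()
    {
        // Load the serialized model and create a worker to execute it.
        runtimeModel = ModelLoader.Load(modelAsset);
        worker = new Worker(runtimeModel, BackendType.GPUCompute);
    }

    void OnDestroy()
    {
        // Workers hold native memory and must be disposed explicitly.
        worker?.Dispose();
    }
}
```

Tokenizing the prompt and decoding the model's output are handled by the project's own scripts; the sketch above only covers loading the assigned model asset and managing the worker's lifetime.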
