
ksadov/anansi


A React-based implementation of the Loom interface for LLM completion inference. You can try it out here.

Docs

Real documentation coming soon, hopefully. In the meantime, some pointers:

  • Graph view uses ReactFlow's default viewport controls
  • Model inference is designed for compatibility with the OpenAI v1 completions API. It should work with non-OpenAI URLs (like Together AI) as long as they implement that API.
  • That said, every v1 completions API provider seems to have idiosyncratic quirks in which arguments it accepts. To avoid having to track which provider does what, the Settings menu lets you specify generation parameters (like temperature and number of completions) on a per-model basis as a JSON dict.
  • The interface automatically saves tree data and settings to localStorage every second. Don't rely on this alone: download your trees (File > Export to savefile) and model settings (Settings > Models > Export settings) locally when you can.
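To make the two points above concrete, here is a hypothetical sketch (illustrative names, not anansi's actual code) of how a v1-style completion request can pass a per-model JSON dict of generation parameters straight through to the provider:

```javascript
// Hypothetical sketch: build an OpenAI v1-style completion request whose
// generation parameters come from a per-model JSON dict, as configured in
// the Settings menu. Field names here are illustrative assumptions.
function buildCompletionRequest(modelConfig, prompt) {
  return {
    url: `${modelConfig.baseUrl}/v1/completions`,
    body: {
      model: modelConfig.model,
      prompt,
      // Spread the per-model params last so the user-supplied JSON dict
      // (temperature, n, max_tokens, ...) reaches the provider verbatim,
      // whatever idiosyncratic subset of arguments it happens to accept.
      ...modelConfig.params,
    },
  };
}

// Example: a Together AI-style endpoint with per-model sampling settings.
const req = buildCompletionRequest(
  {
    baseUrl: "https://api.together.xyz", // any v1-compatible host
    model: "my-base-model",              // placeholder model name
    params: { temperature: 0.9, n: 3 },  // the per-model JSON dict
  },
  "Once upon a time"
);
// req.body now carries { model, prompt, temperature: 0.9, n: 3 }
```

Keeping the params as an opaque dict, rather than a fixed schema, is what sidesteps per-provider argument differences: whatever keys you put in the dict are forwarded as-is.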

Run locally

  1. cd into the project directory and run npm install
  2. npm start
  3. Open http://localhost:3000 in your browser. Changes to the code will be reflected automatically.

If you want to mess with the CSS, install Tailwind and run npx tailwindcss -i ./src/input.css -o ./src/output.css --watch

I used Create React App to scaffold this project, so the standard Create React App commands should work as documented.

Acknowledgements

Themes

About

GUI for branching LLM base model inference
