Fully-featured, beautiful web interface for Ollama LLMs - built with NextJS. Deploy with a single click.
Updated Jun 3, 2024 - TypeScript
Prebuilt Terraform CDK (cdktf) provider for the HashiCorp local provider.
Wingman is the fastest and easiest way to run Llama models on your PC or Mac.
Work with LLMs in a local environment using containers
🔐 Nuxt user authentication and sessions via authjs (next-auth), local, and refresh providers. nuxt-auth wraps NextAuth.js to bring the reliability and convenience of a 12k-star library to the Nuxt 3 ecosystem with a native developer experience (DX)
A simple local DynamoDB API, without the need for Docker.
Local package testing made easy
LocalChat is a ChatGPT-like chat that runs on your computer
React library full of simple APIs for cool features. Dragging, moving, animating, resizing, or interacting with livestreams? This and more may someday be achievable through this library. https://linktr.ee/LocalBoast
TVM Development Environment - Set up all the core developer tools and work with TVM blockchains (Everscale, TON, Venom) from a single interface
This package provides a centralized dotenv for a monorepo. It also includes extra features such as manipulating and saving changes to the dotenv file, a default centralized file, and a file loader with ordering and priorities.
Link local Node.js packages.
React Caches is a lightweight and easy-to-use package that simplifies the management of local storage and cache storage in your React-based applications. With this package, you can easily access and manage data stored in local storage, cache storage, and other browser storage used by JavaScript.
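To illustrate the kind of storage management such a package wraps, here is a minimal sketch of a JSON cache helper over the browser's localStorage. This is a hypothetical example, not the React Caches API itself; the setCache/getCache names and the in-memory fallback (used when localStorage is unavailable, e.g. under Node) are this sketch's own assumptions.

```javascript
// Hypothetical helper, not the React Caches API. Falls back to an
// in-memory Map when localStorage is not defined (e.g. in Node).
const memory = new Map();

const store = typeof localStorage !== "undefined"
  ? {
      get: (k) => localStorage.getItem(k),
      set: (k, v) => localStorage.setItem(k, v),
    }
  : {
      get: (k) => (memory.has(k) ? memory.get(k) : null),
      set: (k, v) => memory.set(k, String(v)),
    };

// Serialize a value to JSON and store it under a key.
function setCache(key, value) {
  store.set(key, JSON.stringify(value));
}

// Read a key back and parse it; null when the key is absent.
function getCache(key) {
  const raw = store.get(key);
  return raw === null ? null : JSON.parse(raw);
}

setCache("user", { name: "Ada" });
console.log(getCache("user").name); // prints "Ada"
```

Serializing through JSON is the usual design choice here: localStorage only stores strings, so structured values must be encoded on write and decoded on read.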