irresponsible innovation. Try now at https://chat.dev/
Updated May 14, 2024 · Python
Ultra-fast, low-latency LLM prompt injection/jailbreak detection ⛓️
User prompt attack detection system
Interact with your documents using the power of GPT, 100% privately, no data leaks
Use local LLMs to do Retrieval-Augmented Generation with a local Chroma store, function calls, and tool usage (a minimal retrieval sketch follows below)
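A minimal sketch of the local-RAG pattern described above, assuming the `chromadb` Python package is installed and that a locally served LLM (for example via Ollama or llama.cpp) consumes the final prompt. The collection name, sample documents, and the omitted generation step are illustrative assumptions, not code from the repository itself.

```python
# Minimal local-RAG sketch: index documents in a local Chroma collection,
# retrieve context for a question, and build a prompt for a local LLM.
import chromadb

client = chromadb.Client()  # in-memory; use chromadb.PersistentClient(path=...) to persist
collection = client.get_or_create_collection(name="docs")  # illustrative name

# Chroma embeds these documents with its default local embedding function.
collection.add(
    ids=["doc1", "doc2"],
    documents=[
        "Chroma is an open-source embedding database that runs locally.",
        "Retrieval-Augmented Generation grounds LLM answers in retrieved text.",
    ],
)

question = "How does Retrieval-Augmented Generation help an LLM?"
results = collection.query(query_texts=[question], n_results=2)
context = "\n".join(results["documents"][0])

# The augmented prompt would then be sent to a locally running LLM
# (e.g. through Ollama or llama.cpp); that call is omitted here.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```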