ComfyUI Docker images for use in GPU cloud and local environments. Includes the AI-Dock base for authentication and an improved user experience.
RunPod serverless worker for Fooocus-API. Runs standalone or with a network volume.
The Big List of Protests - An AI-assisted Protest Flyer parser and event aggregator
RunPod Serverless Worker for the Stable Diffusion WebUI Forge API
Runpod-LLM provides ready-to-use container scripts for running large language models (LLMs) easily on RunPod.
Headless three.js rendering using Puppeteer
RunPod serverless worker for vLLM text-generation inference. Simple, optimized, and customizable.
This project hosts the LLaMA 3.1 model (via llama.cpp) on RunPod's serverless platform using Docker. It provides a Python 3.11 environment with CUDA 12.2, enabling scalable AI request processing through configurable payload options and GPU support.
A Chrome extension that helps improve reading comprehension by generating an interactive, multiple choice quiz for any website
Python client script for sending prompts to A1111 serverless worker endpoints and saving the results
MLOps library for LLM deployment with the vLLM engine on RunPod's infrastructure.
This repository contains the RunPod serverless component of the SDGP project "quizzifyme".
RunPod serverless function for voice conversion using RVC-v2 (Retrieval-based Voice Conversion)
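Most of the serverless workers listed above (vLLM, A1111, RVC-v2, llama.cpp) follow the same basic pattern: a handler function registered with the RunPod Python SDK receives a job payload and returns a result. Below is a minimal sketch, assuming the official `runpod` package; the payload fields and the processing step are illustrative and vary per worker.

```python
# Minimal RunPod serverless handler sketch (assumes the `runpod` SDK is
# installed in the worker image). Payload field names are illustrative.
import runpod

def handler(job):
    """Receive a job payload and return a result dict."""
    payload = job.get("input", {})      # caller-supplied options
    prompt = payload.get("prompt", "")  # hypothetical field name
    # ... run the actual model (ComfyUI, vLLM, llama.cpp, RVC, etc.) here ...
    return {"output": f"processed: {prompt}"}

# Register the handler so RunPod can route incoming requests to it.
runpod.serverless.start({"handler": handler})
```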
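On the client side, scripts like the A1111 prompt sender above typically POST a JSON body to the endpoint's `/runsync` (or asynchronous `/run`) route and persist the response. A rough sketch using `requests`; the endpoint ID, API key, and input fields are placeholders that depend on the specific worker.

```python
# Hedged client sketch: submit a prompt to a RunPod serverless endpoint
# and save the response. The /runsync route and {"input": ...} body follow
# RunPod's public API; ENDPOINT_ID, API_KEY, and the payload are placeholders.
import json
import requests

ENDPOINT_ID = "your-endpoint-id"   # placeholder
API_KEY = "your-runpod-api-key"    # placeholder

url = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync"
headers = {"Authorization": f"Bearer {API_KEY}"}
body = {"input": {"prompt": "a photo of a red fox in the snow"}}

resp = requests.post(url, headers=headers, json=body, timeout=600)
resp.raise_for_status()

# Persist the full JSON response for later inspection.
with open("result.json", "w") as f:
    json.dump(resp.json(), f, indent=2)
```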
Add a description, image, and links to the runpod-serverless topic page so that developers can more easily learn about it.
To associate your repository with the runpod-serverless topic, visit your repo's landing page and select "manage topics."