API Proxy | API 代理 | Прокси | وكيل | OpenAI | Nvidia-NIM | Claude (Dockerfile, updated Nov 28, 2024)
Uses a Gradio interface to stream coding-related responses from local and cloud-based large language models. Pulls context from GitHub repos and local files.
An open-source AI chatbot app template built with Next.js, the Vercel AI SDK, and the newly released NVIDIA NIM inference API (the call pattern is sketched in the first example after this list).
Multimodal AI Chatbot
RAG demo with LangChain, Gemini, and NVIDIA NIM models.
Uses NVIDIA NIM to chat with PDF documents.
An NVIDIA and LlamaIndex project on human nutrition.
NVIDIA NIM-based RAG application deployed locally (LLM, embedding model, and reranking model), optimized to run on a GPU cluster.
AI-powered document retrieval and question-answering system utilizing NVIDIA DeepSeek AI and FAISS vector stores.
"A document Q&A application powered by NVIDIA NIM and LangChain, focused on Sri Lanka's Budget Speech 2025
Spring AI example tested against an NVIDIA NIM instance provided by Testcontainers.
Integration of LlamaIndex with NVIDIA NIM.
A simple Streamlit-based chatbot application that allows users to interact with their documents through a Retrieval-Augmented Generation (RAG) pipeline (the second sketch after this list outlines such a pipeline).
LLM Jupyter Notebook Examples
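
Several of the projects above, the API proxy and the Next.js chatbot template in particular, rely on the fact that NVIDIA NIM exposes an OpenAI-compatible endpoint, so a standard OpenAI client can reach it by swapping the base URL. Below is a minimal sketch of that pattern; the hosted endpoint URL, the `NVIDIA_API_KEY` environment variable, and the `meta/llama-3.1-8b-instruct` model id are illustrative assumptions, not taken from any specific repository.

```python
# Minimal sketch: calling an NVIDIA NIM endpoint through the OpenAI client.
# The base URL, model name, and NVIDIA_API_KEY variable are assumptions for
# illustration; a self-hosted NIM container would expose its own local URL.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",  # assumed hosted NIM endpoint
    api_key=os.environ["NVIDIA_API_KEY"],            # assumed env var name
)

# Stream a chat completion, the same request shape a proxy or chatbot
# frontend would forward to NIM.
stream = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",  # illustrative model id
    messages=[{"role": "user", "content": "Summarize what NVIDIA NIM is."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```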
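
Most of the RAG entries (the budget-speech Q&A, the FAISS retrieval system, and the Streamlit document chatbot) follow the same load, chunk, embed, retrieve, and generate steps. The second sketch below shows that pipeline with LangChain's NVIDIA NIM integrations and a FAISS vector store; the PDF path, chunk sizes, and model ids are assumptions chosen for illustration.

```python
# Minimal RAG sketch in the spirit of the document Q&A projects above, using
# LangChain's NVIDIA NIM integrations and a FAISS vector store. The file path,
# chunking parameters, and model ids are illustrative assumptions.
from langchain_community.document_loaders import PyPDFLoader
from langchain_community.vectorstores import FAISS
from langchain_nvidia_ai_endpoints import ChatNVIDIA, NVIDIAEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

# 1. Load and chunk the source document (hypothetical path).
docs = PyPDFLoader("budget_speech_2025.pdf").load()
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100
).split_documents(docs)

# 2. Embed the chunks with a NIM embedding model and index them in FAISS.
embeddings = NVIDIAEmbeddings(model="nvidia/nv-embedqa-e5-v5")  # illustrative model id
vectorstore = FAISS.from_documents(chunks, embeddings)

# 3. Retrieve the most relevant chunks and ask a NIM-hosted LLM to answer.
question = "What are the main spending priorities?"
context = "\n\n".join(
    doc.page_content for doc in vectorstore.similarity_search(question, k=4)
)
llm = ChatNVIDIA(model="meta/llama-3.1-8b-instruct")  # reads NVIDIA_API_KEY from the environment
answer = llm.invoke(f"Answer using only this context:\n{context}\n\nQuestion: {question}")
print(answer.content)
```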