Generate interactive quizzes from your notes using OpenAI's GPT-3.5 and GPT-4 models.
Updated May 26, 2024 - TypeScript
ChatGPT-like streaming chat bot powered by Azure OpenAI
FastGPT is a knowledge-based platform built on LLMs. It offers a comprehensive suite of out-of-the-box capabilities, such as data processing, RAG retrieval, and visual AI workflow orchestration, letting you easily develop and deploy complex question-answering systems without extensive setup or configuration.
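The RAG retrieval step such platforms rely on can be sketched in a few lines: rank stored chunks by cosine similarity to a query embedding and return the best matches. This is an illustrative sketch only, not FastGPT's actual code; the function names and the plain-array embedding format are assumptions.

```typescript
// Illustrative RAG retrieval core (not FastGPT code): rank text chunks
// by cosine similarity between their embeddings and a query embedding.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function topK(
  query: number[],
  chunks: { text: string; embedding: number[] }[],
  k: number,
): string[] {
  // Sort a copy so the caller's array is untouched, highest similarity first.
  return [...chunks]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k)
    .map((c) => c.text);
}
```

The retrieved chunk texts are then stuffed into the LLM prompt as context for answering the question.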
Full-stack Next.js 14 application. Uses React 18 client and server components, TypeScript, Prisma ORM, a Railway PostgreSQL database, NextAuth.js OAuth 2.0 authentication, the OpenAI GPT-3.5-Turbo API, and Stripe payments.
Fast ChatGPT UI with support for both OpenAI and Azure OpenAI.
Uses GPT to summarize official announcements for the Chinese server of Arknights (《明日方舟》) and to read the Chinese-server event schedule.
Test generation using large language models
Free ChatGPT 3.5 / ChatGPT 4 | Free OpenAI / ChatGPT API
Supercharged experience for multiple models such as ChatGPT, DALL-E and Stable Diffusion.
Le-AI, your open-source AI assistant hub, helping you boost your efficiency.
Lightweight ChatGPT bot built using Next.js and the OpenAI Streaming API.
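The OpenAI Streaming API delivers tokens as server-sent events, each a `data: {json}` line ending with `data: [DONE]`. A minimal sketch of extracting the text deltas from such a chunk might look like the following; the helper name is an assumption, not this repo's actual code.

```typescript
// Hypothetical sketch: pull the incremental text deltas out of an SSE
// chunk in the OpenAI chat-completions streaming format.
function extractDeltas(sseChunk: string): string[] {
  const deltas: string[] = [];
  for (const line of sseChunk.split("\n")) {
    if (!line.startsWith("data: ")) continue;       // skip blank/event lines
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break;                // end-of-stream sentinel
    const json = JSON.parse(payload);
    const text = json.choices?.[0]?.delta?.content; // token fragment, if any
    if (typeof text === "string") deltas.push(text);
  }
  return deltas;
}
```

A UI appends each delta to the visible message as it arrives, producing the familiar typing effect.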
A tool that transforms OpenAI API requests into Azure OpenAI API requests, allowing OpenAI-compatible applications to seamlessly use Azure OpenAI.
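The core of such a proxy is a URL and header rewrite: Azure OpenAI routes by deployment name under the resource's own host and authenticates with an `api-key` header instead of `Authorization: Bearer`. A sketch of that mapping follows; it is illustrative only (not this tool's code), and the resource/deployment names and `api-version` value are assumptions.

```typescript
// Hypothetical sketch of the OpenAI -> Azure OpenAI request mapping.
type OpenAIRequest = { path: string; apiKey: string; body: unknown };
type AzureRequest = { url: string; headers: Record<string, string>; body: unknown };

function toAzureRequest(
  req: OpenAIRequest,
  resource: string,          // assumed Azure resource name
  deployment: string,        // assumed deployment name for the model
  apiVersion = "2024-02-01", // assumed API version
): AzureRequest {
  // OpenAI: https://api.openai.com/v1/chat/completions
  // Azure:  https://{resource}.openai.azure.com/openai/deployments/{deployment}/chat/completions?api-version=...
  const suffix = req.path.replace(/^\/v1/, "");
  return {
    url: `https://${resource}.openai.azure.com/openai/deployments/${deployment}${suffix}?api-version=${apiVersion}`,
    headers: { "api-key": req.apiKey, "Content-Type": "application/json" },
    body: req.body,
  };
}
```

The proxy forwards the rewritten request and relays the response back unchanged, so the client never knows it talked to Azure.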
Chat with and ask questions about your own data. An accelerator to quickly upload your own enterprise data and use OpenAI services to chat with that uploaded data and ask questions of it.
Generate and publish your content from the command line with the help of AI (GPT) 🤯
Secure your code with AI-powered suggestions from OpenAI's GPT-3.5-Turbo LLM using this Visual Studio Code extension
A Community-driven Q&A platform for developers
Web-summarizer generates a summary of a given URL using a fine-tuned GPT-3.5 model and Puppeteer to scrape the DOM. It also uses Redis to cache summaries, reducing overhead costs and cutting request time from 4.4 seconds to 500 milliseconds. The full backend is containerized and deployed on Railway with integrated Jest unit tests running on Git
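The speedup described above is the classic cache-aside pattern: check Redis first, and only run the slow scrape-and-summarize pipeline on a miss. A minimal sketch follows; it is not the project's actual code, the Redis client is stubbed with an in-memory Map, and the key prefix and 24-hour TTL are assumptions.

```typescript
// Hypothetical cache-aside sketch; MemoryKV stands in for a Redis client.
interface KV {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds: number): Promise<void>;
}

class MemoryKV implements KV {
  private store = new Map<string, string>();
  async get(key: string) { return this.store.get(key) ?? null; }
  async set(key: string, value: string, _ttlSeconds: number) { this.store.set(key, value); }
}

// `summarize` stands in for the Puppeteer scrape + fine-tuned GPT-3.5 call.
async function cachedSummary(
  url: string,
  cache: KV,
  summarize: (url: string) => Promise<string>,
): Promise<{ summary: string; hit: boolean }> {
  const key = `summary:${url}`;                 // assumed key scheme
  const cached = await cache.get(key);
  if (cached !== null) return { summary: cached, hit: true }; // fast path
  const summary = await summarize(url);                        // slow path
  await cache.set(key, summary, 60 * 60 * 24);  // assumed 24 h TTL
  return { summary, hit: false };
}
```

With a real Redis client, `set` would pass the TTL via the `EX` option so stale summaries expire on their own.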