Demonstrates connecting Semantic Kernel to Ollama and Groq.
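As a rough sketch of what such a connection looks like in Semantic Kernel's Python SDK, the snippet below registers Ollama (local) and Groq (hosted) as OpenAI-compatible chat services. The connector name, its parameters, the endpoint URLs, and the model ids reflect the SK 1.x Python API and common defaults; they are assumptions, not this repository's actual code.

```python
# Sketch: pointing Semantic Kernel's OpenAI connector at Ollama and Groq
# through their OpenAI-compatible endpoints (assumed SK 1.x Python API).
import asyncio
import os

from openai import AsyncOpenAI
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

kernel = Kernel()

# Local Ollama server exposes an OpenAI-compatible API at /v1 by default.
kernel.add_service(OpenAIChatCompletion(
    service_id="ollama",
    ai_model_id="llama3",  # any model already pulled into Ollama
    async_client=AsyncOpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),
))

# Groq's hosted API is also OpenAI-compatible.
kernel.add_service(OpenAIChatCompletion(
    service_id="groq",
    ai_model_id="llama3-70b-8192",  # example Groq model id
    async_client=AsyncOpenAI(
        base_url="https://api.groq.com/openai/v1",
        api_key=os.environ["GROQ_API_KEY"],
    ),
))

async def main() -> None:
    # invoke_prompt uses the kernel's default service selection.
    answer = await kernel.invoke_prompt(prompt="In one sentence, what is Semantic Kernel?")
    print(answer)

asyncio.run(main())
```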
The UI repository for the SeKernel_for_LLM module.
Using AI to make data analysis more natural and simple.
LLM Text Completion using Semantic-Kernel
A semantic kernel for Streamlit Python projects.
A Python module for creating a semantic kernel in OpenAI-API-compatible chat applications.
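The module's own interface is not reproduced here; as a generic illustration of the pattern such a module wraps, the sketch below runs one kernel-backed chat turn using Semantic Kernel's ChatHistory and a chat completion service. Class and method names are assumptions based on the SK 1.x Python API.

```python
# Sketch of a kernel-backed chat turn (assumed SK 1.x Python API;
# illustrative only, not the SeKernel_for_LLM module's own interface).
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import (
    OpenAIChatCompletion,
    OpenAIChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory

kernel = Kernel()
# The API key is read from the OPENAI_API_KEY environment variable.
chat_service = OpenAIChatCompletion(service_id="chat", ai_model_id="gpt-4o-mini")
kernel.add_service(chat_service)

async def chat_once(user_message: str) -> str:
    history = ChatHistory()
    history.add_system_message("You are a concise assistant.")
    history.add_user_message(user_message)
    reply = await chat_service.get_chat_message_content(
        chat_history=history,
        settings=OpenAIChatPromptExecutionSettings(max_tokens=200),
    )
    return str(reply)

print(asyncio.run(chat_once("Summarize what a semantic kernel does.")))
```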
A Wordle-solver agent that uses Semantic Kernel to prompt large language models.
A collection of Semantic Kernel based experiments with Large Language Models.
A shopping assistant powered by LlamaCpp
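For orientation, a LlamaCpp-backed assistant of this kind typically revolves around a call like the llama-cpp-python sketch below; the model path, prompts, and parameters are placeholders, not the repository's actual setup.

```python
# Sketch of one assistant turn with llama-cpp-python
# (model path and prompts are placeholders).
from llama_cpp import Llama

llm = Llama(model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf", n_ctx=4096)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful shopping assistant."},
        {"role": "user", "content": "Suggest a budget laptop for note-taking."},
    ],
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
```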
'Talk to Your Factory' leverages Industrial IoT, Edge & Cloud Computing, and Generative AI to streamline factory operations. It allows real-time, natural language communication with factory systems, helping operators quickly identify issues, boost efficiency, and minimize downtime.
A quick way to build a private large language model (LLM) server with OpenAI-compatible interfaces.
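An OpenAI-compatible interface means existing OpenAI client libraries work against the private server once their base URL is overridden; a minimal client-side sketch (the URL, model name, and placeholder key are assumptions) looks like this:

```python
# Sketch: calling a private OpenAI-compatible server with the official
# openai client by overriding base_url (URL, model, and key are placeholders).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # the private server's endpoint
    api_key="not-needed-for-local",       # many local servers ignore the key
)

completion = client.chat.completions.create(
    model="my-private-model",
    messages=[{"role": "user", "content": "Hello from an OpenAI-compatible client."}],
)
print(completion.choices[0].message.content)
```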
A generalist agent that can go online and accomplish complex tasks using semantic-kernel and autogen.
An immersive workshop showcasing the potential of integrating state-of-the-art foundation models to enhance product experiences and streamline backend workflows. It leverages Microsoft's Copilot stack, Semantic Kernel, and Azure primitives to offer an engaging, comprehensive introduction to AI-infused app development and deployment.
A curated list of 🔎Azure OpenAI, 🦙Large Language Models, and 🌌 references with 🎋notes.