---
title: Using Vector Databases to Extend LLM Capabilities
description: Learn how vector databases extend LLM capabilities by storing and processing embeddings in .NET.
author: catbutler
ms.topic: concept-article
ms.date: 04/12/2024
---
This article explains how vector databases help you use embeddings to extend the data available to LLMs in .NET.
You can use a vector database to store embeddings that you generate with AI embedding models. Embeddings have dimensions that correspond to the learned features or attributes of the embedding model you use. An embedding contains a value for each dimension, providing a semantic and mathematical representation of the source text.
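For illustration, you can think of an embedding as an array of floating-point values, one per dimension. The sketch below uses a hypothetical four-dimensional vector; production models return far more values per input (for example, OpenAI's text-embedding-ada-002 model returns 1,536 dimensions).

```csharp
using System;

// A hypothetical 4-dimensional embedding for illustration only.
// A real embedding model returns many more values; for example,
// text-embedding-ada-002 returns 1,536 values per input.
float[] embedding = { 0.12f, -0.48f, 0.85f, 0.03f };

// Each element is the value for one learned dimension (feature).
Console.WriteLine($"Dimensions: {embedding.Length}");
```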
Vector databases can store embeddings for text, images, and other data types. You can then perform vector analysis on the embeddings to find semantic similarities in the source data, unlocking numerous AI use cases.
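A common form of vector analysis is cosine similarity, which measures the angle between two vectors: values near 1 indicate that the source texts are semantically similar. The following is a minimal sketch; the helper method and the toy three-dimensional vectors are illustrative, and a vector database performs this comparison at scale with specialized indexes.

```csharp
using System;

// Cosine similarity: dot product divided by the product of magnitudes.
static double CosineSimilarity(float[] a, float[] b)
{
    double dot = 0, normA = 0, normB = 0;
    for (int i = 0; i < a.Length; i++)
    {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return dot / (Math.Sqrt(normA) * Math.Sqrt(normB));
}

// Toy embeddings for illustration only.
float[] cat = { 0.9f, 0.1f, 0.0f };
float[] kitten = { 0.85f, 0.15f, 0.05f };
float[] airplane = { 0.0f, 0.2f, 0.95f };

// "cat" is closer to "kitten" than to "airplane".
Console.WriteLine(CosineSimilarity(cat, kitten) > CosineSimilarity(cat, airplane));
```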
You can use the following resources as vector database solutions in .NET:
Resource | Semantic Kernel support | Azure OpenAI support |
---|---|---|
Azure AI Search | ✔️ | ✔️ |
Azure Cache for Redis | ❌ | ✔️ |
Azure Cosmos DB for MongoDB vCore | ✔️ | ✔️ |
Azure Cosmos DB for NoSQL | ❌ | ✔️ |
Azure Cosmos DB for PostgreSQL | ❌ | ✔️ |
Azure Database for PostgreSQL - Flexible Server | ✔️ | ✔️ |
Azure SQL Database | ✔️ | ✔️ |
Open-source vector databases | ✔️ | ❌ |
You use connectors to access vector database solutions with Semantic Kernel. Because Semantic Kernel builds connectors into the kernel, you can use planners to orchestrate vector database functions.
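As a sketch, Semantic Kernel's memory abstraction pairs an embedding generator with a memory store connector. The example below assumes the experimental Semantic Kernel memory packages and an Azure OpenAI embedding deployment; the endpoint, deployment name, and key are placeholders, and the exact builder API varies by Semantic Kernel version.

```csharp
// Sketch only: assumes the Microsoft.SemanticKernel.Plugins.Memory
// package (experimental) and an Azure OpenAI embedding deployment.
using System;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using Microsoft.SemanticKernel.Memory;

var memory = new MemoryBuilder()
    .WithAzureOpenAITextEmbeddingGeneration(
        deploymentName: "text-embedding-ada-002",   // placeholder deployment
        endpoint: "https://contoso.openai.azure.com/", // placeholder endpoint
        apiKey: "<api-key>")                        // placeholder key
    // VolatileMemoryStore is in-memory; swap in a connector such as
    // Azure AI Search to use a persistent vector database.
    .WithMemoryStore(new VolatileMemoryStore())
    .Build();

// Store a fact; the memory generates and saves its embedding.
await memory.SaveInformationAsync("facts", id: "1",
    text: "Semantic Kernel supports vector database connectors.");

// Semantic search over the stored embeddings.
await foreach (var hit in memory.SearchAsync("facts", "Which connectors exist?"))
{
    Console.WriteLine($"{hit.Relevance:F2} {hit.Metadata.Text}");
}
```

Swapping the memory store is the only change needed to move from in-memory prototyping to a hosted vector database from the table above.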