Commit: Formatted pages into a consistent form. Added descriptions and links when needed.

Showing 8 changed files with 111 additions and 56 deletions.
# Activeloop Deep Lake

>[Activeloop Deep Lake](https://docs.activeloop.ai/) is a data lake for Deep Learning applications, allowing you to use it
> as a vector store.

## Why Deep Lake?

- More than just a (multi-modal) vector store. You can later use the dataset to fine-tune your own LLM models.
- Not only stores embeddings, but also the original data with automatic version control.
- Truly serverless. Doesn't require another service and can be used with major cloud providers (`AWS S3`, `GCS`, etc.)

`Activeloop Deep Lake` supports `SelfQuery Retrieval`:
[Activeloop Deep Lake Self Query Retrieval](/docs/integrations/retrievers/self_query/activeloop_deeplake_self_query)
## More Resources

1. [Ultimate Guide to LangChain & Deep Lake: Build ChatGPT to Answer Questions on Your Financial Data](https://www.activeloop.ai/resources/ultimate-guide-to-lang-chain-deep-lake-build-chat-gpt-to-answer-questions-on-your-financial-data/)
2. [Twitter the-algorithm codebase analysis with Deep Lake](https://github.com/langchain-ai/langchain/blob/master/cookbook/twitter-the-algorithm-analysis-deeplake.ipynb)
3. Here are the [whitepaper](https://www.deeplake.ai/whitepaper) and [academic paper](https://arxiv.org/pdf/2209.10785.pdf) for Deep Lake
4. Additional resources available for review: [Deep Lake](https://github.com/activeloopai/deeplake), [Get started](https://docs.activeloop.ai/getting-started) and [Tutorials](https://docs.activeloop.ai/hub-tutorials)
## Installation and Setup

Install the Python package:

```bash
pip install deeplake
```
## VectorStore

See a [usage example](/docs/integrations/vectorstores/activeloop_deeplake).

```python
from langchain_community.vectorstores import DeepLake
```
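To illustrate what semantic search over a vector store amounts to, here is a minimal, self-contained sketch using an in-memory list and cosine similarity. The texts and embedding values are made up; the real `DeepLake` store adds persistence, version control, and a real embedding model on top of this idea:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "store": texts paired with hypothetical precomputed embeddings.
store = [
    ("data lakes store raw data", [0.9, 0.1, 0.0]),
    ("vector stores enable semantic search", [0.1, 0.9, 0.2]),
]

query_embedding = [0.0, 1.0, 0.1]  # pretend embedding of the user's query
best_text = max(store, key=lambda item: cosine(item[1], query_embedding))[0]
print(best_text)  # the semantically closest document
```

A real `DeepLake` instance performs the same nearest-neighbor lookup, but over embeddings it computes and persists for you.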
# AI21 Labs

>[AI21 Labs](https://www.ai21.com/about) is a company specializing in Natural
> Language Processing (NLP), which develops AI systems
> that can understand and generate natural language.

This page covers how to use the `AI21` ecosystem within `LangChain`.
## Installation and Setup

- Get an AI21 API key and set it as an environment variable (`AI21_API_KEY`)
- Install the Python package:

```bash
pip install langchain-ai21
```
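The key can also be set from Python before constructing any AI21 objects; a minimal sketch (the key value below is a made-up placeholder):

```python
import os

# Placeholder value; substitute the real key from your AI21 account.
os.environ["AI21_API_KEY"] = "my-ai21-api-key"

# The langchain-ai21 integrations read this variable at runtime.
key = os.environ.get("AI21_API_KEY")
print(bool(key))  # True once the variable is set
```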
## LLMs

See a [usage example](/docs/integrations/llms/ai21).

```python
from langchain_community.llms import AI21
```
## Chat models

See a [usage example](/docs/integrations/chat/ai21).

```python
from langchain_ai21 import ChatAI21
```
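LangChain chat models, including `ChatAI21`, accept a list of `(role, content)` message pairs. The sketch below only builds the message list; the commented-out invocation requires `langchain-ai21` and a valid `AI21_API_KEY`, and the model name shown is an assumption:

```python
# Messages are (role, content) pairs; "system" sets behavior, "human" is the user turn.
messages = [
    ("system", "You are a helpful assistant that translates English to French."),
    ("human", "Translate this sentence: I love programming."),
]

# With credentials configured, an invocation might look like:
# from langchain_ai21 import ChatAI21
# chat = ChatAI21(model="jamba-instruct")  # model name is an assumption
# response = chat.invoke(messages)

for role, content in messages:
    print(f"{role}: {content}")
```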
## Embedding models

See a [usage example](/docs/integrations/text_embedding/ai21).

```python
from langchain_ai21 import AI21Embeddings
```
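Embedding classes in LangChain share a small interface: `embed_documents` maps a list of texts to one vector each, and `embed_query` embeds a single text. Here is a toy stand-in that mimics the shape of that interface; its "vectors" are just character statistics, not real embeddings like `AI21Embeddings` returns:

```python
# A toy class illustrating the LangChain embeddings interface.
class ToyEmbeddings:
    def embed_documents(self, texts):
        # One fixed-length vector per input text; here just character statistics.
        return [[len(t), t.count(" ")] for t in texts]

    def embed_query(self, text):
        return self.embed_documents([text])[0]

emb = ToyEmbeddings()
vectors = emb.embed_documents(["hello world", "hi"])
print(vectors)  # [[11, 1], [2, 0]]
```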
# AnalyticDB

>[AnalyticDB for PostgreSQL](https://www.alibabacloud.com/help/en/analyticdb-for-postgresql/latest/product-introduction-overview)
> is a massively parallel processing (MPP) data warehousing service
> from [Alibaba Cloud](https://www.alibabacloud.com/)
> that is designed to analyze large volumes of data online.
>`AnalyticDB for PostgreSQL` is developed based on the open-source `Greenplum Database`
> project and is enhanced with in-depth extensions by `Alibaba Cloud`. `AnalyticDB
> for PostgreSQL` is compatible with the ANSI SQL 2003 syntax and with the PostgreSQL and
> Oracle database ecosystems. It supports both row store and
> column store, processes petabytes of data offline with
> high performance, and supports high concurrency.

This page covers how to use the AnalyticDB ecosystem within LangChain.
## Installation and Setup

You need to install the `sqlalchemy` Python package.

```bash
pip install sqlalchemy
```
## VectorStore

See a [usage example](/docs/integrations/vectorstores/analyticdb).

```python
from langchain_community.vectorstores import AnalyticDB
```
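Because AnalyticDB for PostgreSQL speaks the PostgreSQL protocol, the vector store is typically configured with a SQLAlchemy-style connection string. Here is a sketch assembling one from placeholder parameters; the host, user, and password below are made up:

```python
# Placeholder connection parameters for an AnalyticDB for PostgreSQL instance.
params = {
    "driver": "psycopg2",
    "host": "your-instance.gpdb.rds.aliyuncs.com",
    "port": 5432,
    "database": "langchain",
    "user": "admin",
    "password": "secret",
}

# SQLAlchemy URL format: dialect+driver://user:password@host:port/database
conn_str = (
    f"postgresql+{params['driver']}://{params['user']}:{params['password']}"
    f"@{params['host']}:{params['port']}/{params['database']}"
)
print(conn_str)
```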