
langchain-postgres


The langchain-postgres package contains implementations of core LangChain abstractions using Postgres.

The package is released under the MIT license.

Feel free to use the abstractions as provided, or modify / extend them as appropriate for your own application.

Requirements

The package currently only supports the psycopg3 driver.

Installation

pip install -U langchain-postgres

Usage

Vectorstore

Note

PGVector is being deprecated. Please migrate to PGVectorStore, which offers improved performance and manageability. See the migration guide for details on how to migrate from PGVector to PGVectorStore.

Tip

All synchronous functions have corresponding asynchronous functions; an async sketch follows the example below.

from langchain_core.documents import Document
from langchain_core.embeddings import DeterministicFakeEmbedding
from langchain_postgres import PGEngine, PGVectorStore
import uuid

# Replace CONNECTION_STRING with your own Postgres connection string
engine = PGEngine.from_connection_string(url=CONNECTION_STRING)

VECTOR_SIZE = 768
embedding = DeterministicFakeEmbedding(size=VECTOR_SIZE)

TABLE_NAME = "destination_table"

engine.init_vectorstore_table(
    table_name=TABLE_NAME,
    vector_size=VECTOR_SIZE,
)

store = PGVectorStore.create_sync(
    engine=engine,
    table_name=TABLE_NAME,
    embedding_service=embedding,
)

all_texts = ["Apples and oranges", "Cars and airplanes", "Pineapple", "Train", "Banana"]
metadatas = [{"len": len(t)} for t in all_texts]
ids = [str(uuid.uuid4()) for _ in all_texts]
docs = [
    Document(id=ids[i], page_content=all_texts[i], metadata=metadatas[i]) for i in range(len(all_texts))
]

store.add_documents(docs)

query = "I'd like a fruit."
docs = store.similarity_search(query)
print(docs)

For a detailed example on PGVectorStore see here.
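As noted in the tip above, the synchronous calls have asynchronous counterparts. The following is a minimal async sketch, not a definitive example; it assumes ainit_vectorstore_table, the async PGVectorStore.create factory, aadd_documents, and asimilarity_search are the async counterparts of the calls shown above.

import asyncio
import uuid

from langchain_core.documents import Document
from langchain_core.embeddings import DeterministicFakeEmbedding
from langchain_postgres import PGEngine, PGVectorStore


async def main() -> None:
    # Replace CONNECTION_STRING with your own Postgres connection string
    engine = PGEngine.from_connection_string(url=CONNECTION_STRING)

    VECTOR_SIZE = 768
    embedding = DeterministicFakeEmbedding(size=VECTOR_SIZE)
    TABLE_NAME = "destination_table"

    # Assumed async counterpart of init_vectorstore_table
    await engine.ainit_vectorstore_table(
        table_name=TABLE_NAME,
        vector_size=VECTOR_SIZE,
    )

    # Assumed async factory counterpart of PGVectorStore.create_sync
    store = await PGVectorStore.create(
        engine=engine,
        table_name=TABLE_NAME,
        embedding_service=embedding,
    )

    docs = [
        Document(id=str(uuid.uuid4()), page_content=text)
        for text in ["Apples and oranges", "Pineapple", "Train"]
    ]
    await store.aadd_documents(docs)

    results = await store.asimilarity_search("I'd like a fruit.")
    print(results)


asyncio.run(main())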

ChatMessageHistory

The chat message history abstraction helps persist chat message history in a Postgres table.

PostgresChatMessageHistory is parameterized using a table_name and a session_id.

The table_name is the name of the table in the database where the chat messages will be stored.

The session_id is a unique identifier for the chat session. It can be assigned by the caller using uuid.uuid4().

import uuid

from langchain_core.messages import SystemMessage, AIMessage, HumanMessage
from langchain_postgres import PostgresChatMessageHistory
import psycopg

# Establish a synchronous connection to the database
# (or use psycopg.AsyncConnection for async)
conn_info = ... # Fill in with your connection info
sync_connection = psycopg.connect(conn_info)

# Create the table schema (only needs to be done once)
table_name = "chat_history"
PostgresChatMessageHistory.create_tables(sync_connection, table_name)

session_id = str(uuid.uuid4())

# Initialize the chat history manager
chat_history = PostgresChatMessageHistory(
    table_name,
    session_id,
    sync_connection=sync_connection
)

# Add messages to the chat history
chat_history.add_messages([
    SystemMessage(content="Meow"),
    AIMessage(content="woof"),
    HumanMessage(content="bark"),
])

print(chat_history.messages)
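For an asynchronous connection, the flow mirrors the synchronous one. Below is a minimal sketch, assuming acreate_tables, aadd_messages, and aget_messages are the async counterparts of the methods above and that the constructor accepts an async_connection keyword.

import asyncio
import uuid

import psycopg
from langchain_core.messages import SystemMessage, AIMessage, HumanMessage
from langchain_postgres import PostgresChatMessageHistory


async def main() -> None:
    # conn_info as in the synchronous example above
    async_connection = await psycopg.AsyncConnection.connect(conn_info)

    table_name = "chat_history"
    # Assumed async counterpart of create_tables
    await PostgresChatMessageHistory.acreate_tables(async_connection, table_name)

    chat_history = PostgresChatMessageHistory(
        table_name,
        str(uuid.uuid4()),
        async_connection=async_connection,  # assumed keyword for the async connection
    )

    # Assumed async counterparts of add_messages / messages
    await chat_history.aadd_messages([
        SystemMessage(content="Meow"),
        AIMessage(content="woof"),
        HumanMessage(content="bark"),
    ])
    print(await chat_history.aget_messages())


asyncio.run(main())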

Vectorstore

See an example for the (deprecated) PGVector vectorstore here.

Google Cloud Integrations

Google Cloud provides Vector Store, Chat Message History, and Data Loader integrations for AlloyDB and Cloud SQL for PostgreSQL databases via the following PyPI packages:

  • langchain-google-alloydb-pg
  • langchain-google-cloud-sql-pg

Using the Google Cloud integrations provides the following benefits:

  • Enhanced Security: Securely connect to Google Cloud databases utilizing IAM for authorization and database authentication without needing to manage SSL certificates, configure firewall rules, or enable authorized networks.
  • Simplified and Secure Connections: Connect to Google Cloud databases effortlessly using the instance name instead of complex connection strings. The integrations create a secure connection pool that can be easily shared across your application using the engine object.
Vector Store feature comparison (Metadata filtering, Async support, Schema Flexibility, Improved metadata handling, Hybrid Search):

  • Google AlloyDB
  • Google Cloud SQL Postgres