# langchain-wenxin - Langchain Baidu WENXINWORKSHOP wrapper



**Table of Contents**

- [Installation](#installation)
- [Document](#document)
- [How to use](#how-to-use)
- [Supported models](#supported-models)
- [Qianfan Private](#qianfan-private)
- [Development](#development)
- [License](#license)

## Installation

```bash
pip install langchain-wenxin
```

## Document

WENXINWORKSHOP API: https://cloud.baidu.com/doc/WENXINWORKSHOP/s/flfmc9do2
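The wrapper is driven by the `BAIDU_API_KEY` / `BAIDU_SECRET_KEY` credentials used in the next section. As a rough, non-authoritative sketch of the flow described in those docs, the snippet below exchanges the two credentials for a short-lived access token via Baidu's OAuth endpoint; the `requests`-based helper is illustrative only and is not part of this package, which is assumed to handle token management for you.

```python
# Illustrative sketch only (not this package's API): exchange the API key and
# secret key for a WENXINWORKSHOP access token, as described in the docs above.
import os
import requests  # assumption: requests is installed


def fetch_access_token() -> str:
    """Return a short-lived access token for the WENXINWORKSHOP API."""
    resp = requests.post(
        "https://aip.baidubce.com/oauth/2.0/token",
        params={
            "grant_type": "client_credentials",
            "client_id": os.environ["BAIDU_API_KEY"],
            "client_secret": os.environ["BAIDU_SECRET_KEY"],
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```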

## How to use

```bash
export BAIDU_API_KEY="xxxxx"
export BAIDU_SECRET_KEY="xxxxx"
```

```python
from langchain_wenxin.llms import Wenxin

# Wenxin model
llm = Wenxin(model="ernie-bot-turbo")
print(llm("你好"))

# stream call
for i in llm.stream("你好"):
    print(i)

# async call
import asyncio
print(asyncio.run(llm._acall("你好")))

# Wenxin chat model
from langchain_wenxin.chat_models import ChatWenxin
from langchain.schema import HumanMessage
llm = ChatWenxin()
print(llm([HumanMessage(content="你好")]))

# Wenxin embeddings model
from langchain_wenxin.embeddings import WenxinEmbeddings
wenxin_embed = WenxinEmbeddings(truncate="END")
print(wenxin_embed.embed_query("hello"))
print(wenxin_embed.embed_documents(["hello"]))
```
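Since `Wenxin` follows the standard LangChain LLM interface, it should also work inside ordinary chains. The following sketch is not taken from this repository (the prompt text and variable names are made up); it simply plugs the model into an `LLMChain` with a `PromptTemplate`:

```python
# Hypothetical usage sketch: combine Wenxin with a regular LangChain chain.
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
from langchain_wenxin.llms import Wenxin

prompt = PromptTemplate(
    input_variables=["topic"],
    template="Write one sentence about {topic}.",
)
chain = LLMChain(llm=Wenxin(model="ernie-bot-turbo"), prompt=prompt)
print(chain.run(topic="LangChain"))
```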

## Supported models

See the WENXINWORKSHOP API documentation linked above for the models that can be passed as `model` (the examples use `ernie-bot-turbo`).

## Qianfan Private

If you use a privately deployed Qianfan service, set the `BAIDU_API_URL` and `BAIDU_ACCESS_CODE` environment variables.
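A minimal sketch of what that might look like, assuming the wrapper reads both variables when the model is constructed (the URL and access code below are placeholders, not real values):

```python
# Placeholder values only: point the wrapper at a private Qianfan deployment.
import os

os.environ["BAIDU_API_URL"] = "https://qianfan.example.internal"  # placeholder
os.environ["BAIDU_ACCESS_CODE"] = "your-access-code"              # placeholder

from langchain_wenxin.llms import Wenxin

llm = Wenxin(model="ernie-bot-turbo")
```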

## Development

```bash
# Create virtual environment
hatch env create
# Activate virtual environment
hatch shell
# Run tests (valid BAIDU_API_KEY/BAIDU_SECRET_KEY required)
export BAIDU_API_KEY="xxxxxxxx"
export BAIDU_SECRET_KEY="xxxxxxxx"
hatch run test
```

## License

`langchain-wenxin` is distributed under the terms of the MIT license.