
langchain-wenxin - Langchain Baidu WENXINWORKSHOP wrapper



Table of Contents

Installation
Documentation
How to use
Development
License

Installation

pip install langchain-wenxin

Documentation

WENXINWORKSHOP API: https://cloud.baidu.com/doc/WENXINWORKSHOP/s/flfmc9do2

How to use

Set your Baidu Qianfan credentials in the environment first:

export BAIDU_API_KEY="xxxxx"
export BAIDU_SECRET_KEY="xxxxx"

Then use the models from Python:

from langchain_wenxin.llms import Wenxin

# Wenxin model
llm = Wenxin(model="ernie-bot-turbo")
print(llm("你好"))

# stream call
for i in llm.stream("你好"):
    print(i)

# async call
import asyncio
print(asyncio.run(llm._acall("你好")))

# Wenxin chat model
from langchain_wenxin.chat_models import ChatWenxin
from langchain.schema import HumanMessage
llm = ChatWenxin()
print(llm([HumanMessage(content="你好")]))

# Wenxin embeddings model
from langchain_wenxin.embeddings import WenxinEmbeddings
wenxin_embed = WenxinEmbeddings(truncate="END")
print(wenxin_embed.embed_query("hello"))
print(wenxin_embed.embed_documents(["hello"]))
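
Because Wenxin implements LangChain's standard LLM interface, it should also plug into ordinary chains. A minimal sketch (the prompt text and the langchain imports below are illustrative, not part of this package):

# Use Wenxin as the LLM of a standard LLMChain (illustrative sketch)
from langchain import LLMChain, PromptTemplate
from langchain_wenxin.llms import Wenxin

prompt = PromptTemplate.from_template("Translate the following text to English: {text}")
chain = LLMChain(llm=Wenxin(model="ernie-bot-turbo"), prompt=prompt)
print(chain.run(text="你好"))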

Supported models:

Qianfan Private

For a private Qianfan deployment, set the BAIDU_API_URL and BAIDU_ACCESS_CODE environment variables.
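
For example (the URL and access code below are placeholders for your own deployment):

# Point langchain-wenxin at a private Qianfan endpoint (placeholder values)
export BAIDU_API_URL="https://your-private-qianfan-endpoint/api"
export BAIDU_ACCESS_CODE="xxxxx"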

Development

# Create virtual environment
hatch env create
# Activate virtual environment
hatch shell
# Run tests (requires valid Baidu credentials)
export BAIDU_API_KEY="xxxxxxxx"
export BAIDU_SECRET_KEY="xxxxxxxx"
hatch run test

License

langchain-wenxin is distributed under the terms of the MIT license.
