English | 简体中文
Unofficial HuggingChat Python API, extensible for chatbots and other applications.
Note

Recent updates:

- Web search
- Context memorization
- Switching between supported LLMs. See more at Soulter#56 (v0.0.9)
```shell
pip install hugchat
```

or

```shell
pip3 install hugchat
```

```python
from hugchat import hugchat
from hugchat.login import Login

# Log in to huggingface and grant authorization to huggingchat
sign = Login(email, passwd)
cookies = sign.login()

# Save cookies to the local directory
cookie_path_dir = "./cookies_snapshot"
sign.saveCookiesToDir(cookie_path_dir)

# Load cookies when you restart your program:
# sign = Login(email, None)
# cookies = sign.loadCookiesFromDir(cookie_path_dir)  # This detects whether the JSON file exists; it returns the cookies if it does and raises an Exception if it does not.
```
```python
# Create a ChatBot
chatbot = hugchat.ChatBot(cookies=cookies.get_dict())  # or cookie_path="usercookies/<email>.json"

# Non-stream response
query_result = chatbot.query("Hi!")
print(query_result)  # or query_result.text or query_result["text"]
```
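As the comment suggests, the result's text is reachable both as an attribute and by key. A minimal sketch of such a dual-access wrapper (illustrative only — not hugchat's actual result class):

```python
class Message:
    """Illustrative result wrapper: text is reachable as .text or ["text"]."""

    def __init__(self, text: str):
        self.text = text

    def __getitem__(self, key: str):
        # Delegate subscript access to attributes, so msg["text"] == msg.text
        return getattr(self, key)

    def __str__(self) -> str:
        # print(msg) shows the text itself
        return self.text
```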
```python
# Stream response
for resp in chatbot.query(
    "Hello",
    stream=True
):
    print(resp)

# Use web search *new
query_result = chatbot.query("Hi!", web_search=True)
print(query_result)  # or query_result.text or query_result["text"]
for source in query_result.web_search_sources:
    print(source.link)
    print(source.title)
    print(source.hostname)
```
```python
# Create a new conversation (renamed from `id` to avoid shadowing the builtin)
conversation_id = chatbot.new_conversation()
chatbot.change_conversation(conversation_id)

# Get conversation list
conversation_list = chatbot.get_conversation_list()

# Switch model (default: meta-llama/Llama-2-70b-chat-hf)
chatbot.switch_llm(0)  # Switch to `OpenAssistant/oasst-sft-6-llama-30b-xor`
chatbot.switch_llm(1)  # Switch to `meta-llama/Llama-2-70b-chat-hf`
```

The `query()` function accepts these parameters:
- `text`: Required[str].
- `temperature`: Optional[float]. Default is 0.9
- `top_p`: Optional[float]. Default is 0.95
- `repetition_penalty`: Optional[float]. Default is 1.2
- `top_k`: Optional[int]. Default is 50
- `truncate`: Optional[int]. Default is 1024
- `watermark`: Optional[bool]. Default is False
- `max_new_tokens`: Optional[int]. Default is 1024
- `stop`: Optional[list]. Default is ["</s>"]
- `return_full_text`: Optional[bool]. Default is False
- `stream`: Optional[bool]. Default is True
- `use_cache`: Optional[bool]. Default is False
- `is_retry`: Optional[bool]. Default is False
- `retry_count`: Optional[int]. Number of retries for requesting huggingchat. Default is 5
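The defaults above can be collected into a dict and overridden per call. `QUERY_DEFAULTS` and `build_query_params` below are illustrative helpers built from the documented values, not part of hugchat itself:

```python
# Documented query() defaults, taken from the parameter list above
QUERY_DEFAULTS = {
    "temperature": 0.9,
    "top_p": 0.95,
    "repetition_penalty": 1.2,
    "top_k": 50,
    "truncate": 1024,
    "watermark": False,
    "max_new_tokens": 1024,
    "stop": ["</s>"],
    "return_full_text": False,
    "stream": True,
    "use_cache": False,
    "is_retry": False,
    "retry_count": 5,
}

def build_query_params(**overrides) -> dict:
    """Merge per-call overrides over the documented defaults, rejecting unknown names."""
    unknown = set(overrides) - set(QUERY_DEFAULTS)
    if unknown:
        raise ValueError(f"Unknown query parameters: {sorted(unknown)}")
    return {**QUERY_DEFAULTS, **overrides}
```

Something like `chatbot.query("Hi!", **build_query_params(temperature=0.5))` would then pass the full, explicit parameter set.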
version 0.0.5.2 or newer

Simply run the following command in your terminal to start the CLI mode:

```shell
python -m hugchat.cli
```

CLI params:

- `-u <your huggingface email>`: Provide account email to log in.
- `-p`: Force password prompt for login, ignoring saved cookies.
- `-s`: Enable streaming mode output in CLI.
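The flags above map naturally onto a standard `argparse` setup. The sketch below mirrors the documented flags only; hugchat's actual CLI parser may differ:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Illustrative parser mirroring the documented CLI flags
    parser = argparse.ArgumentParser(prog="hugchat.cli")
    parser.add_argument("-u", metavar="EMAIL",
                        help="account email to log in")
    parser.add_argument("-p", action="store_true",
                        help="force password prompt, ignore saved cookies")
    parser.add_argument("-s", action="store_true",
                        help="enable streaming mode output")
    return parser
```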
Commands in CLI mode:

- `/new`: Create and switch to a new conversation.
- `/ids`: Show a list of all ID numbers and ID strings in the current session.
- `/switch <id>`: Switch to the ID number or ID string passed.
- `/del <id>`: Delete the ID number or ID string passed. Will not delete the active session.
- `/clear`: Clear the terminal.
- `/llm`: Get available models you can switch to.
- `/llm <index>`: Switch model to the given model index based on `/llm`.
- `/sharewithauthor <on|off>`: Change settings for sharing data with the model author. On by default.
- `/exit`: Close the CLI environment.
- `/stream <on|off>`: Toggle streaming the response.
- `/web <on|off>`: Toggle web search.
- `/web-hint <on|off>`: Toggle displaying the web search hint.
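A slash-command loop like the one above boils down to splitting the input into a command and an optional argument. The parsing helper below is an illustrative sketch, not hugchat's CLI code:

```python
def parse_command(line: str):
    """Split a '/cmd arg' line into (command, argument or None).

    Returns None for input that is not a slash command, so the caller
    can treat it as a regular chat message instead.
    """
    line = line.strip()
    if not line.startswith("/"):
        return None
    parts = line[1:].split(maxsplit=1)
    command = parts[0]
    argument = parts[1] if len(parts) > 1 else None
    return command, argument
```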
- AI is an area of active research with known problems such as biased generation and misinformation. Do not use this application for high-stakes decisions or advice.
- Server resources are precious; it is not recommended to request this API at a high frequency. (Hugging Face's CTO 🤗 just liked the suggestion.)
This is not an official Hugging Face product. This is a personal project and is not affiliated with Hugging Face in any way. Don't sue us.
