# Releases: atomiechen/HandyLLM

## HandyLLM v0.8.2
### Added

- `hprompt`: load methods now support the `cls` parameter for prompt type specification
- `ChatPrompt` and `CompletionsPrompt` support optional request and meta
- `ChatPrompt`:
  - supports adding a dict
  - add `add_message(...)` method
- `CompletionsPrompt`:
  - add `add_text(...)` method
- `PromptConverter`: `yaml.dump` uses the `allow_unicode=True` option
- move all type definitions to `_types.py`
- support for package development:
  - add `requirement.txt` for development
  - add `scripts/test.sh` for running tests
  - add test scripts in the `tests` folder
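The idea behind the new `add_message(...)` method can be sketched in plain Python (a minimal stand-in class; the real `ChatPrompt` signature in handyllm may differ):

```python
from typing import Any, Dict, List

class ChatPromptSketch:
    """Hypothetical stand-in illustrating add_message(...);
    not handyllm's actual ChatPrompt implementation."""

    def __init__(self) -> None:
        self.messages: List[Dict[str, Any]] = []

    def add_message(self, role: str, content: str) -> None:
        # Append one chat message in the common role/content shape.
        self.messages.append({"role": role, "content": content})

prompt = ChatPromptSketch()
prompt.add_message("user", "Hello!")
# prompt.messages == [{"role": "user", "content": "Hello!"}]
```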
### Fixed

- `HandyPrompt.eval(...)` should not make directories for output paths
- `CompletionsPrompt._run_with_client(...)`: misplaced `run_config` param
- `PromptConverter`:
  - fix variable replacement for `content_array` message
  - fix wrong return type of `stream_msgs2raw` and `astream_msgs2raw`
- `requestor`:
  - `httpx.Response` should use `reason_phrase` to get the error reason
  - `acall()`: fix missing brackets for await
  - `_call_raw()` and `_acall_raw()`: intercept and raise a new exception without the original one
  - `_acall_raw()`: read the response first to prevent `httpx.ResponseNotRead` before getting the error message
- `_utils.exception2err_msg(...)` should append the error message instead of printing it
- change `io.IOBase` to `IO[str]` for file descriptors (e.g. `RunConfig.output_fd`)
- fix other type hints
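The `io.IOBase` → `IO[str]` change can be illustrated with a small typing example (generic sketch, not handyllm code):

```python
import io
from typing import IO

def write_output(fd: IO[str], text: str) -> None:
    # IO[str] tells type checkers the object is a *text-mode* stream,
    # which bare io.IOBase does not convey; io.StringIO, open(..., "w"),
    # and sys.stdout all satisfy IO[str].
    fd.write(text)

buf = io.StringIO()
write_output(buf, "hello")
print(buf.getvalue())  # hello
```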
### Changed

- move all old files in the `tests` folder to the `examples` folder
## HandyLLM v0.8.1

### Fixed

- fix the debug print issue when outputting to a file in stream mode
## HandyLLM v0.8.0

### Added

- CLI: output to stderr without buffering
- add `RunConfig.output_path_buffering` for controlling buffering of the output file
- add this changelog

### Fixed

- fix `_post_check_output(...)` not using the evaluated `run_config` (may cause `output_path` or `output_fd` to be ignored)

### Changed

- rename internal constants to remove the leading `_` of `API_xxx` constants

### Removed

- remove unused files in the `deprecated` folder
## HandyLLM v0.7.6

### Added

- add `RunConfig.on_chunk` as a callback for streamed chunks
- add Azure TTS example
- add `VM` method to transform kwargs into a `%`-wrapped variable map dict
- add `var_map` arg to `eval(...)`, `run(...)` and `arun(...)` for convenience
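The `%`-wrapped variable map transformation that `VM` performs can be sketched in plain Python (an illustration of the idea only; the helper names and the exact placeholder convention are assumptions, not handyllm's implementation):

```python
def vm(**kwargs):
    """Hypothetical sketch: wrap each kwarg name in % signs to
    build a %variable% placeholder map."""
    return {f"%{key}%": value for key, value in kwargs.items()}

def substitute(template, var_map):
    """Replace every %name% placeholder in the template with its value."""
    for placeholder, value in var_map.items():
        template = template.replace(placeholder, value)
    return template

var_map = vm(city="Paris", topic="museums")
# var_map == {"%city%": "Paris", "%topic%": "museums"}
prompt = substitute("List the best %topic% in %city%.", var_map)
# prompt == "List the best museums in Paris."
```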
### Changed

- merge the different versions of `var_map` from the method argument or from another `RunConfig`, instead of replacing it as a whole
- rename `RunConfig.to_dict`'s `retain_fd` arg to `retain_object`
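The merge-instead-of-replace behavior above can be illustrated in plain Python (a sketch under the assumption that per-key overrides win while unrelated base entries survive; not the library's code):

```python
def merge_var_maps(base, override):
    """Hypothetical sketch: combine two variable maps per key,
    rather than discarding `base` entirely when `override` is given."""
    merged = dict(base or {})
    merged.update(override or {})
    return merged

base = {"%city%": "Paris", "%topic%": "museums"}
override = {"%topic%": "cafes"}
print(merge_var_maps(base, override))
# {'%city%': 'Paris', '%topic%': 'cafes'}
```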
## HandyLLM v0.7.5

### Added

- `OpenAIClient`: add audio speech (TTS) API support
- add Azure support for audio speech and transcriptions
- add TTS test script

### Changed

- prioritize `RunConfig.output_evaled_prompt_fd` over `RunConfig.output_evaled_prompt_path`
- `eval(...)`:
  - always return a new object
  - give the `run_config` arg a default value
  - accept kwargs, same as `run(...)`
- when dumping, always filter the request
- credential file does not overwrite existing request args

### Fixed

- non-stream mode: prioritize `RunConfig.output_fd` over `RunConfig.output_path`
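The fd-over-path priority rule can be sketched with a hypothetical helper (illustration of the stated precedence only, not handyllm's code):

```python
import io

def resolve_output(output_fd=None, output_path=None):
    """Hypothetical helper: an explicitly supplied file descriptor
    takes priority; the path is only opened as a fallback."""
    if output_fd is not None:
        return output_fd               # fd wins over path
    if output_path is not None:
        return open(output_path, "w")  # fall back to opening the path
    return None

fd = io.StringIO()
out = resolve_output(output_fd=fd, output_path="ignored.txt")
assert out is fd  # the path is never opened when a fd is given
```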
## HandyLLM v0.7.4

## HandyLLM v0.7.3

## HandyLLM v0.7.2

## HandyLLM v0.7.1

## HandyLLM v0.7.0