Releases: agentuity/llmproxy
v0.0.8
16 Apr 03:07
Changelog
1bd7320 feat: Responses API streaming, WebSocket mode, and reasoning token support (#9)
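The reasoning token support in v0.0.8 implies the proxy now reads the nested usage details that reasoning models report. A minimal sketch of decoding such a payload, assuming the OpenAI-style `completion_tokens_details.reasoning_tokens` shape (the proxy's actual struct names are not shown in this changelog):

```go
package main

import "encoding/json"

// Usage mirrors an OpenAI-style usage object. The nested details
// block carries reasoning tokens for reasoning-capable models and
// is simply absent for others.
type Usage struct {
	PromptTokens            int `json:"prompt_tokens"`
	CompletionTokens        int `json:"completion_tokens"`
	CompletionTokensDetails struct {
		ReasoningTokens int `json:"reasoning_tokens"`
	} `json:"completion_tokens_details"`
}

// parseUsage decodes a usage JSON blob; reasoning tokens default
// to zero when the provider omits the details block entirely.
func parseUsage(raw []byte) (Usage, error) {
	var u Usage
	err := json.Unmarshal(raw, &u)
	return u, err
}
```

Because the details struct is embedded with a zero value, providers that never send `completion_tokens_details` still decode cleanly.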
v0.0.7
16 Apr 01:47
Changelog
94d029e fix: handle multimodal content arrays and preserve non-standard message fields (#8)
v0.0.6
14 Apr 05:12
Changelog
b1ae20e feat: add AutoRouter, Responses API support, and provider detection (#6)
b213390 feat: add SSE streaming support with billing extraction (#7)
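"SSE streaming support with billing extraction" suggests the proxy scans the event stream and pulls token usage from whichever chunk carries it (typically the last one). A simplified sketch, assuming OpenAI-style `data:` lines and usage field names — the real parser surely handles more provider variants:

```go
package main

import (
	"bufio"
	"encoding/json"
	"strings"
)

// extractUsage scans an SSE body line by line and returns the
// usage from the last chunk that carries one. Lines without a
// usage object (the normal delta chunks) and the [DONE] sentinel
// are skipped.
func extractUsage(body string) (prompt, completion int, found bool) {
	type chunk struct {
		Usage *struct {
			PromptTokens     int `json:"prompt_tokens"`
			CompletionTokens int `json:"completion_tokens"`
		} `json:"usage"`
	}
	sc := bufio.NewScanner(strings.NewReader(body))
	for sc.Scan() {
		line := sc.Text()
		if !strings.HasPrefix(line, "data: ") || line == "data: [DONE]" {
			continue
		}
		var c chunk
		payload := strings.TrimPrefix(line, "data: ")
		if err := json.Unmarshal([]byte(payload), &c); err != nil || c.Usage == nil {
			continue
		}
		prompt, completion, found = c.Usage.PromptTokens, c.Usage.CompletionTokens, true
	}
	return
}
```

In a proxy this scan would run as a tee on the response body, so billing happens without buffering the whole stream away from the client.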
v0.0.5
13 Apr 04:25
Changelog
e1eebce fix: handle Anthropic-style token reporting in cached billing (#5)
v0.0.4
13 Apr 04:17
Changelog
12ae01a feat: split billing for cached vs non-cached prompt tokens (#4)
0d87e7c fix: add omitempty to Temperature field in models.dev adapter
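The split-billing change in #4 amounts to pricing cache hits at a discounted rate and only the remainder of the prompt at the full rate. A sketch of that arithmetic — the rate values and struct names here are placeholders, not the proxy's actual tariff:

```go
package main

// Rates holds per-million-token prices. The cached discount and
// the exact numbers are illustrative, not real provider pricing.
type Rates struct {
	Input       float64 // rate for uncached prompt tokens
	CachedInput float64 // discounted rate for cache hits
}

// promptCost bills cached and uncached prompt tokens separately.
// promptTokens is the full prompt count as reported by the
// provider; cachedTokens is the subset served from the cache.
func promptCost(promptTokens, cachedTokens int, r Rates) float64 {
	uncached := promptTokens - cachedTokens
	if uncached < 0 { // defensive: some providers report oddly
		uncached = 0
	}
	return float64(uncached)*r.Input/1e6 + float64(cachedTokens)*r.CachedInput/1e6
}
```

The follow-up fix in v0.0.5 (#5) exists because Anthropic reports cached tokens *separately from* `input_tokens` rather than as a subset, so the subtraction above must not be applied uniformly across providers.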
v0.0.3
13 Apr 02:35
Changelog
681061e feat: add prompt caching interceptor for 6 LLM providers (#3)
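An "interceptor" here presumably means a hook that rewrites the outbound request body per provider before it is forwarded. The interface below is an assumption for illustration; the provider-specific marker uses Anthropic's real `cache_control` field, but the simplified body shape (typed slices rather than decoded `[]any`) is not the proxy's code:

```go
package main

// Interceptor rewrites an outbound request body before it is
// forwarded to a provider (hypothetical interface).
type Interceptor interface {
	Apply(body map[string]any) map[string]any
}

// anthropicCache marks the last system block as cacheable using
// Anthropic's cache_control field (an ephemeral cache breakpoint
// placed after the stable prefix of the prompt).
type anthropicCache struct{}

func (anthropicCache) Apply(body map[string]any) map[string]any {
	blocks, ok := body["system"].([]map[string]any)
	if !ok || len(blocks) == 0 {
		return body // nothing to mark; pass through unchanged
	}
	blocks[len(blocks)-1]["cache_control"] = map[string]any{"type": "ephemeral"}
	return body
}
```

With one such interceptor per provider family, the proxy can opt requests into caching without callers knowing each provider's dialect.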
v0.0.2
13 Apr 01:19