## Changelog
### Features
- ✨ #81: Allow chat message overrides for specific models (@mkrueger12)
### Improvements
- 🔧 #78: Normalize response latency by response token count (@roma-glushko)
- 📝 #112: Add the CLI banner info (@roma-glushko)
### Miscellaneous
- 📝 #114: Update links across the project (@roma-glushko)
We need to find a way to track model latency that is independent of the response size (see the sketch below).
Options:
Reference:
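One common way to decouple latency from response size is to normalize it by the number of generated tokens and smooth the result over time. The following is a minimal Go sketch of that idea; the names (`tokenLatency`, `movingAvg`) are hypothetical and not taken from the project's actual code.

```go
package main

import (
	"fmt"
	"time"
)

// tokenLatency normalizes a response's total latency by the number of
// generated tokens, so long responses are not penalized against short ones.
func tokenLatency(total time.Duration, respTokens int) time.Duration {
	if respTokens <= 0 {
		return total // avoid division by zero; fall back to raw latency
	}
	return total / time.Duration(respTokens)
}

// movingAvg keeps an exponential moving average of per-token latency,
// smoothing out individual slow responses when comparing models.
type movingAvg struct {
	alpha  float64 // smoothing factor in (0, 1]
	value  float64 // current average, in milliseconds per token
	seeded bool
}

func (m *movingAvg) add(sample time.Duration) {
	ms := float64(sample.Microseconds()) / 1000.0
	if !m.seeded {
		m.value, m.seeded = ms, true
		return
	}
	m.value = m.alpha*ms + (1-m.alpha)*m.value
}

func main() {
	avg := movingAvg{alpha: 0.2}
	// Two responses of very different sizes yield comparable per-token latency
	// even though their raw latencies differ widely.
	avg.add(tokenLatency(1200*time.Millisecond, 300)) // long response
	avg.add(tokenLatency(90*time.Millisecond, 20))    // short response
	fmt.Printf("avg latency: %.2f ms/token\n", avg.value)
}
```

A per-token metric like this still mixes time-to-first-token with generation speed; subtracting the first-token latency before dividing is a possible refinement, depending on what the routing decision should optimize for.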