
Implemented batch inference support #110

Merged
patrickfleith merged 2 commits into patrickfleith:main from LamQam:feature/batch-inference
Sep 19, 2025

Conversation


@LamQam LamQam commented Sep 16, 2025

Added batch inference support to the generate method via LiteLLM, and added corresponding tests in test_llms for batch prompts and batch message inputs, mirroring the existing single prompt/message format. All tests pass.

@patrickfleith
Owner

Thanks a lot @LamQam! I'll review it in the coming days.
We just added OpenRouter support, so there are some conflicts in tests/test_llms.py if you want to look into them. Otherwise I'll handle it, no problem.

@patrickfleith
Owner

Conflict fixed, and the code changes look good to me. All tests passed 🎉
Thanks for your contribution to datafast

@patrickfleith patrickfleith merged commit 173403f into patrickfleith:main Sep 19, 2025
@patrickfleith
Owner

closes #103

