Improve and add more tests #807
Conversation
Force-pushed from 3cfe768 to 5797454
Codecov Report: All modified lines are covered by tests ✅
... and 9 files with indirect coverage changes
pyproject.toml
Outdated
@@ -120,6 +120,7 @@ ftfy = { version = "6.1.1", optional = true }
 regex = { version = "2023.8.8", optional = true }
 huggingface_hub = { version = "^0.17.3", optional = true }
 pymilvus = "2.3.1"
+pytest-asyncio = "^0.21.1"
It should be part of the dev dependencies.
Also, I am not sure if we really need this package btw.
@deshraj thank you, I've moved it to the dev dependencies. It's actually needed by the Poe bot test, which uses async code; without pytest-asyncio that test can't run and is currently being skipped.
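For context, here is a minimal sketch of the kind of async test that needs pytest-asyncio (the `@pytest.mark.asyncio` marker comes from that plugin; the bot object and `get_answer` method are illustrative placeholders, not the actual Poe bot API in this repo):

```python
# Minimal sketch, assuming pytest-asyncio is installed as a dev dependency.
# The bot and its get_answer method are illustrative stand-ins, not the real API.
import pytest
from unittest.mock import AsyncMock


@pytest.mark.asyncio
async def test_poe_bot_answers_async():
    bot = AsyncMock()                          # stand-in for the Poe bot instance
    bot.get_answer.return_value = "Test answer"

    answer = await bot.get_answer("Test query")

    assert answer == "Test answer"
    bot.get_answer.assert_awaited_once_with("Test query")
```

Without the plugin, pytest has no way to run the coroutine, which is why the test ends up skipped.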
tests/llm/test_huggingface.py
Outdated
assert answer == "Test answer"
mock_llm_instance.assert_called_once_with("Test query")
mock_llm_instance.assert_called_once_with("Test query")
mock_llm_instance.assert_called_once_with("Test query")
Do we need to call this three times? I am a bit confused here 😅
Sorry, I missed that. Yup, you're right! I've removed the duplicate assertions :)
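For reference, a sketch of what the de-duplicated check reduces to; the mock setup below is illustrative rather than the exact test in this PR:

```python
# Rough sketch of the corrected assertion block, assuming the LLM callable
# is replaced with a MagicMock; the setup here is illustrative only.
from unittest.mock import MagicMock


def test_get_llm_model_answer_called_once():
    mock_llm_instance = MagicMock(return_value="Test answer")

    answer = mock_llm_instance("Test query")

    assert answer == "Test answer"
    mock_llm_instance.assert_called_once_with("Test query")  # one assertion is enough
```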
Force-pushed from 571f2eb to b793d21
Description
This PR fixes the OpenAI LLM for streaming and adds more tests.
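As a rough illustration of what the streaming tests can exercise, here is a minimal sketch that fakes a chunked response and checks that the pieces are joined into the full answer; the generator and the mock are assumptions, not the actual OpenAI client or the code changed in this PR:

```python
# Sketch of a streaming-style test: the generator stands in for the chunks
# a streamed chat completion would yield; all names are illustrative.
from unittest.mock import MagicMock


def fake_stream():
    for chunk in ["Hello", ", ", "world"]:
        yield chunk


def test_streaming_answer_is_concatenated():
    mock_llm = MagicMock(return_value=fake_stream())

    answer = "".join(mock_llm("Test query"))

    assert answer == "Hello, world"
    mock_llm.assert_called_once_with("Test query")
```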