
fix(instructor/patch.py, tests/openai/test_multitask.py): Correct model checking and add new tests #413

Merged
merged 2 commits into from
Feb 7, 2024

Conversation

@jxnl (Collaborator) commented Feb 7, 2024

fixes #412


Ellipsis 🚀 This PR description was created by Ellipsis for commit 3239f40.

Summary:

This PR fixes a bug in instructor/patch.py and adds new tests in tests/openai/test_multitask.py for streaming and non-streaming scenarios.

Key points:

  • Modified process_response and process_response_async in instructor/patch.py to check model instead of response_model.
  • Added new tests in tests/openai/test_multitask.py for streaming and non-streaming scenarios, both synchronous and asynchronous.
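The dispatch the key points describe can be sketched with a minimal, hypothetical stand-in (the `IterableBase` class, `process_response` signature, and `tasks` attribute here are simplified illustrations, not instructor's real implementation):

```python
class IterableBase:
    """Stand-in marker class for multitask (iterable) response models."""
    def __init__(self, tasks):
        self.tasks = tasks

def process_response(response, model):
    # If the parsed model is a multitask iterable, unwrap and return its tasks
    if isinstance(model, IterableBase):
        return model.tasks
    # Otherwise return the parsed model unchanged
    return model

users = IterableBase(tasks=["Jason", "Sarah"])
assert process_response(None, users) == ["Jason", "Sarah"]
```

The point of the fix is only which name is passed to `isinstance`; the unwrapping behavior itself is unchanged.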

Generated with ❤️ by ellipsis.dev

@jxnl changed the title from "..." to "fix: Adding tests to make sure non-stream iterables work" Feb 7, 2024
@jxnl jxnl merged commit f9389c1 into main Feb 7, 2024
17 checks passed
@jxnl jxnl deleted the fix-iterable branch February 7, 2024 14:46
@ellipsis-dev bot changed the title from "fix: Adding tests to make sure non-stream iterables work" to "feat(instructor/patch.py, tests/openai/test_multitask.py): modify isinstance checks and add new test functions" Feb 7, 2024
@ellipsis-dev bot changed the title from "feat(instructor/patch.py, tests/openai/test_multitask.py): modify isinstance checks and add new test functions" to "fix(instructor/patch.py, tests/openai/test_multitask.py): Correct model checking and add new tests" Feb 7, 2024

@ellipsis-dev ellipsis-dev bot left a comment


Changes requested.


Something look wrong? You can customize Ellipsis by editing the ellipsis.yaml for this repository.


@@ -213,8 +213,7 @@ def process_response(

     # ? This really hints at the fact that we need a better way of
     # ? attaching usage data and the raw response to the model we return.
-    if isinstance(response_model, IterableBase):
-        #! If the response model is a multitask, return the tasks
+    if isinstance(model, IterableBase):

The variable 'model' is not defined in the scope of this function. This will cause a NameError at runtime. Please revert the change to 'response_model'.
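The failure mode the reviewer flags can be reproduced with a minimal, illustrative stand-in (these names are hypothetical, not the real patch.py code):

```python
class IterableBase:
    """Stand-in for the marker class used in patch.py."""

def process_response(response, response_model):
    # 'model' is neither a parameter nor a local variable here, so Python
    # raises NameError the moment this isinstance check is evaluated
    if isinstance(model, IterableBase):
        return response_model
    return response_model

try:
    process_response(None, object())
except NameError as exc:
    print("NameError:", exc)  # name 'model' is not defined
```

Because the name is only resolved at call time, the module imports cleanly and the bug surfaces only when the function runs, which is why tests (or a reviewer) are needed to catch it.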

@@ -267,7 +266,7 @@

     # ? This really hints at the fact that we need a better way of
     # ? attaching usage data and the raw response to the model we return.
-    if isinstance(response_model, IterableBase):
+    if isinstance(model, IterableBase):
The variable 'model' is not defined in the scope of this function. This will cause a NameError at runtime. Please revert the change to 'response_model'.


@ellipsis-dev ellipsis-dev bot left a comment


Changes requested.



@@ -213,8 +214,7 @@ def process_response(

     # ? This really hints at the fact that we need a better way of
     # ? attaching usage data and the raw response to the model we return.
-    if isinstance(response_model, IterableBase):
-        #! If the response model is a multitask, return the tasks
+    if isinstance(model, IterableBase):

The variable model is not defined in the scope of this function. This will likely cause a NameError at runtime. Did you mean to use response_model instead?

    resp = []
    async for user in await stream_extract(input="Jason is 20, Sarah is 30"):
        resp.append(user)
    print(resp)
Please remove the print statement. It's not a good practice to have print statements in test cases. If you want to check the value of resp, consider using assertions or logging.
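The assertion style the reviewer suggests could look like the sketch below. It uses a hypothetical stand-in async generator in place of the real `stream_extract` call, so the names and yielded shapes are illustrative only:

```python
import asyncio

async def stream_extract(input):
    # Stand-in for the real streaming extraction: yields one parsed user at a time
    for name, age in [("Jason", 20), ("Sarah", 30)]:
        yield {"name": name, "age": age}

async def run_test():
    resp = []
    async for user in stream_extract(input="Jason is 20, Sarah is 30"):
        resp.append(user)
    # Assert on the collected results instead of printing them
    assert len(resp) == 2
    assert resp[0] == {"name": "Jason", "age": 20}
    assert resp[1] == {"name": "Sarah", "age": 30}

asyncio.run(run_test())
```

Assertions make the test fail loudly on regressions, whereas a `print` only clutters test output and verifies nothing.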

Successfully merging this pull request may close these issues.

Iterables extraction not working in 0.5.0