
Fixed examples under Fast API section #256

Closed
waseemhnyc wants to merge 3 commits

Conversation

@waseemhnyc (Contributor) commented Dec 6, 2023

Results:

  • https://jxnl.github.io/instructor/concepts/fastapi/#code-example-starting-a-fastapi-app-with-a-post-request (screenshot of the rendered example attached)
  • https://jxnl.github.io/instructor/concepts/fastapi/#streaming-responses-with-fastapi (screenshot of the rendered example attached)

Summary by CodeRabbit

  • New Features

    • Introduced new API endpoint /extract for handling streaming events and returning user data.
  • Documentation

    • Updated documentation to reflect the use of OpenAI library and new data models.
  • Refactor

    • Transitioned from asynchronous to synchronous calls for the OpenAI service interaction.
  • New Data Models

    • Added UserData and UserDetail models to standardize API input and output.

coderabbitai bot (Contributor) commented Dec 6, 2023

Walkthrough

The FastAPI application has been refactored to use the synchronous OpenAI library for API interactions, replacing the asynchronous variant. New data models for user input and output have been introduced, enhancing the structure and clarity of data handling. Additionally, a fresh endpoint has been added to facilitate streaming responses for large data sets, marking a significant update to the service's capabilities.

Changes

File Path: .../fastapi.md
Change Summary:
  • Switched from the AsyncOpenAI client to the OpenAI library
  • Changed method calls from asynchronous to synchronous
  • Added UserData and UserDetail data models
  • Introduced a new /extract endpoint for SSE events
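
For reference, the synchronous pattern described above looks roughly like the sketch below. This is an illustration only, assuming an instructor-patched OpenAI client and made-up model fields, prompts, and route names; it is not the exact code from docs/concepts/fastapi.md.

```python
# Illustrative sketch only; field names and prompts are assumptions, not the docs' code.
from typing import Iterable

import instructor
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from openai import OpenAI
from pydantic import BaseModel

app = FastAPI()
# instructor.patch adds the response_model keyword to the OpenAI client
client = instructor.patch(OpenAI())


class UserData(BaseModel):
    # Request body: free-form text to extract structured data from
    query: str


class UserDetail(BaseModel):
    # Structured output returned by the model
    name: str
    age: int


@app.post("/endpoint", response_model=UserDetail)
def extract_user(data: UserData) -> UserDetail:
    # Synchronous call: plain OpenAI client, no await
    return client.chat.completions.create(
        model="gpt-3.5-turbo",
        response_model=UserDetail,
        messages=[{"role": "user", "content": f"Extract: {data.query}"}],
    )


@app.post("/extract")
def extract_stream(data: UserData):
    # stream=True with an Iterable response_model yields parsed models incrementally
    users = client.chat.completions.create(
        model="gpt-3.5-turbo",
        response_model=Iterable[UserDetail],
        stream=True,
        messages=[{"role": "user", "content": data.query}],
    )

    def event_stream():
        # Server-sent events: one "data:" line per parsed model
        for user in users:
            yield f"data: {user.model_dump_json()}\n\n"

    return StreamingResponse(event_stream(), media_type="text/event-stream")
```

Serving the stream through StreamingResponse keeps memory flat for large extractions, which is the motivation the walkthrough gives for the new /extract endpoint.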

🐇✨
To code, a change we bring,
With OpenAI, our app does sing.
New models dance in line,
As endpoints new, intertwine.
🎉🌟



coderabbitai bot (Contributor) left a comment


Review Status

Actionable comments generated: 0

Configuration used: CodeRabbit UI

Commits reviewed: files that changed from the base of the PR and between ba1070a and c0fb798.
Files selected for processing (1)
  • docs/concepts/fastapi.md (3 hunks)
Additional comments: 4
docs/concepts/fastapi.md (4)
  • 18-20: The summary mentions a change from AsyncOpenAI to OpenAI, but the code snippets do not show any import statement changes. Please verify if the summary is accurate or if the import statement change is missing in the hunks.

  • 74-75: The summary indicates that method invocations have been changed from asynchronous to synchronous, but the extract function is still defined with the async keyword. Please verify if the function should be synchronous as per the summary.

  • 26-26: The addition of UserData and UserDetail models aligns with the summary and is a good practice for defining structured data in FastAPI.

  • 74-74: The addition of the /extract endpoint is consistent with the summary and is a valuable addition for handling streaming responses in FastAPI.

@waseemhnyc changed the title from "Get examples working under Fast API section" to "Fixed examples under Fast API section" on Dec 6, 2023
@jxnl (Collaborator) commented Dec 6, 2023

should fundamentally be using async tho
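
For context, the async variant would look roughly like this. Again a sketch only, assuming the same instructor-patched client and the illustrative model and field names used above:

```python
# Hypothetical async counterpart to the synchronous sketch above.
import instructor
from fastapi import FastAPI
from openai import AsyncOpenAI
from pydantic import BaseModel

app = FastAPI()
# The async client is patched the same way as the sync one
client = instructor.patch(AsyncOpenAI())


class UserData(BaseModel):
    query: str


class UserDetail(BaseModel):
    name: str
    age: int


@app.post("/endpoint", response_model=UserDetail)
async def extract_user(data: UserData) -> UserDetail:
    # With AsyncOpenAI the call must be awaited
    return await client.chat.completions.create(
        model="gpt-3.5-turbo",
        response_model=UserDetail,
        messages=[{"role": "user", "content": f"Extract: {data.query}"}],
    )
```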

@jxnl (Collaborator) commented Dec 6, 2023

@Anmol6 can you take a look here?

@Anmol6 (Contributor) commented Dec 7, 2023

Good catch @waseemhnyc! Turns out async OpenAI calls with stream=True return an async generator. I'll put up a fix to handle this. I don't think the original example needs to change though!
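
In other words: with the async client, stream=True hands back an async generator, so the streaming endpoint has to consume it with async for rather than a plain for loop. A hedged sketch of that handling (the Iterable response_model, await placement, and field names are assumptions for illustration, not Anmol6's actual fix):

```python
# Sketch of consuming the async generator produced when stream=True is combined
# with an Iterable response_model on an instructor-patched AsyncOpenAI client.
from typing import Iterable

import instructor
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from openai import AsyncOpenAI
from pydantic import BaseModel

app = FastAPI()
client = instructor.patch(AsyncOpenAI())


class UserData(BaseModel):
    query: str


class UserDetail(BaseModel):
    name: str
    age: int


@app.post("/extract")
async def extract(data: UserData):
    # The awaited call yields an async generator of UserDetail objects, not a list
    users = await client.chat.completions.create(
        model="gpt-3.5-turbo",
        response_model=Iterable[UserDetail],
        stream=True,
        messages=[{"role": "user", "content": data.query}],
    )

    async def event_stream():
        # async for is required because the streaming call yields asynchronously
        async for user in users:
            yield f"data: {user.model_dump_json()}\n\n"

    return StreamingResponse(event_stream(), media_type="text/event-stream")
```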

@waseemhnyc (Contributor, Author) commented Dec 7, 2023

Oh nice - thanks for taking a look at this @Anmol6. Looking forward to reading your fix/PR. Still getting familiar with some of this async Python stuff 😬

@jxnl closed this on Dec 11, 2023