
chore: Improve OpenAI mock server streams #1890

Merged
merged 1 commit into newrelic:main on Nov 29, 2023

Conversation

jsumners-nr
Contributor

The OpenAI folks helped me realize that I didn't quite design the error stream case in our mock server. This PR refactors the mock server to induce an error at the client side that will be handled correctly by the OpenAI client, and in turn bubbled up to our code as a legitimate connection error.


codecov bot commented Nov 29, 2023

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparing base (d11d100) at 96.87% to head (db421f7) at 96.87%.

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #1890   +/-   ##
=======================================
  Coverage   96.87%   96.87%           
=======================================
  Files         209      209           
  Lines       39768    39768           
=======================================
  Hits        38527    38527           
  Misses       1241     1241           
Flag                     Coverage          Δ
integration-tests-16.x   78.75% <ø>        (ø)
integration-tests-18.x   79.02% <ø>        (-0.02%) ⬇️
integration-tests-20.x   79.03% <ø>        (-0.01%) ⬇️
unit-tests-16.x          90.99% <ø>        (ø)
unit-tests-18.x          90.97% <ø>        (ø)
unit-tests-20.x          90.97% <ø>        (ø)
versioned-tests-16.x     73.72% <ø>        (-0.03%) ⬇️
versioned-tests-18.x     73.72% <ø>        (-0.03%) ⬇️
versioned-tests-20.x     73.73% <ø>        (-0.03%) ⬇️

Flags with carried forward coverage won't be shown.


@jsumners-nr jsumners-nr added the dev:tests Indicates only changes to tests label Nov 29, 2023
@@ -105,28 +128,25 @@ function goodStream(dataToStream, chunkTemplate) {
this.push(null)
}
}
})
}).pause()
Member

OK, so because the stream was finished we couldn't destroy it, which caused the openai library to not handle the errors properly? Do we need to pause the stream in the good case, though?

Contributor Author

Streams start filling up their internal buffer as soon as they are created. Hitting pause on it until the stream.pipe(res) happens just gives some breathing room between stream creation and consumption. I am convinced it is necessary for the error case stream. I am open to removing .pause() from the normal case, though, if you feel strongly about it.

Member
OK, makes sense. I don't feel strongly about it, just calling it out. I'll approve.

@bizob2828 bizob2828 merged commit d12bfe4 into newrelic:main Nov 29, 2023
24 checks passed
Node.js Engineering Board automation moved this from Needs PR Review to Done: Issues recently completed Nov 29, 2023
@jsumners-nr jsumners-nr deleted the mock-server-improvement branch November 29, 2023 17:27
@github-actions github-actions bot mentioned this pull request Dec 7, 2023
Labels
dev:tests Indicates only changes to tests
Projects
Node.js Engineering Board
Done: Issues recently completed

2 participants