
feat: Backend - prompt service plumbing for error handling #315

Merged
merged 1 commit into from
May 7, 2024

Conversation

chandrasekharan-zipstack
Contributor

What

  • Added support in the prompt-service to log exceptions similarly to the backend and wrap them in a valid response format:
    { "code": 500, "name": "PromptServiceError", "description": "Some user-friendly message" }
  • Updated the prompt-service's FE status updates to include the doc_name
  • Added support in the backend to parse and report this as an error, instead of suppressing errors
  • MINOR: Removed some unused code in the backend
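The wrapping described in the first bullet could look roughly like the sketch below. This is an illustrative assumption, not the actual prompt-service code: the `PromptServiceError` class shape and the `to_error_response` helper are hypothetical, showing only how an exception might be logged and converted into the documented `{code, name, description}` payload.

```python
import logging

logger = logging.getLogger(__name__)


class PromptServiceError(Exception):
    """Illustrative base error carrying a user-friendly description."""

    def __init__(self, description: str, code: int = 500) -> None:
        super().__init__(description)
        self.code = code
        self.description = description


def to_error_response(exc: Exception) -> tuple[dict, int]:
    """Log the exception, then wrap it into the documented payload."""
    logger.exception("Error in prompt service: %s", exc)
    if isinstance(exc, PromptServiceError):
        code, description = exc.code, exc.description
    else:
        # Fall back to a generic 500 for uncategorized exceptions
        code, description = 500, str(exc) or "Internal server error"
    return {
        "code": code,
        "name": type(exc).__name__,
        "description": description,
    }, code
```

A handler at the service boundary (e.g. a Flask `errorhandler`, if that is the framework in use) would return this payload as the HTTP response body with the matching status code.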

Why

  • Previously there was no exception handling around prompt-service calls, so errors were not reported back to the backend

How

  • Passed the error message back to the backend and also pushed status updates to the FE
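On the backend side, parsing the error payload described above could be sketched as follows. The function names (`extract_error_message`, `raise_for_error`) and the `PromptServiceCallError` exception are hypothetical stand-ins, not the actual backend code; they only show the idea of surfacing the user-friendly description instead of suppressing the failure.

```python
import json


class PromptServiceCallError(Exception):
    """Raised when the prompt service returns an error response."""


def extract_error_message(body: str) -> str:
    """Pull the user-friendly description out of an error response body."""
    try:
        error = json.loads(body)
        return error.get("description") or error.get("name", "Unknown error")
    except (ValueError, AttributeError):
        # Body was not a JSON object; fall back to the raw text
        return body


def raise_for_error(status_code: int, body: str) -> None:
    """Raise instead of silently swallowing the prompt-service error."""
    if status_code >= 400:
        message = extract_error_message(body)
        raise PromptServiceCallError(
            f"Prompt service failed ({status_code}): {message}"
        )
```

The caller would invoke `raise_for_error(response.status_code, response.text)` after each prompt-service request and let the exception propagate to the backend's own error reporting.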

Can this PR break any existing features? If yes, please list possible items. If no, please explain why. (PS: Admins, do not merge the PR without this section filled)

  • No. There are no DB migrations, it doesn't introduce breaking changes to any APIs, and all function signature changes were updated at their call sites

Relevant Docs

Notes on Testing

  • Tried reproducing some prompt studio errors locally and observed the reported error. The error handling isn't 100% yet: we sometimes include raw JSON in the error message, and the response should be beautified further wherever possible
  • Subsequent improvements require changes in the SDK and adapters which can follow in a future PR

Screenshots

[3 screenshots]

Checklist

I have read and understood the Contribution Guidelines.


sonarcloud bot commented May 6, 2024

Quality Gate passed

Issues
3 New issues
0 Accepted issues

Measures
0 Security Hotspots
No data about Coverage
0.0% Duplication on New Code

See analysis details on SonarCloud

Contributor

@Deepak-Kesavan left a comment

@chandrasekharan-zipstack In screenshots 2 and 3, we need to show the error description without JSON. Is that achievable only if we get rid of the general exception catch, or is it because the error happens outside our code?

Contributor Author

chandrasekharan-zipstack commented May 7, 2024

@chandrasekharan-zipstack In screenshots 2 and 3, we need to show the error description without JSON. Is that achievable only if we get rid of the general exception catch, or is it because the error happens outside our code?

@Deepak-Kesavan the error gets triggered from

openAI -> llama_index methods -> SDK -> Prompt Service

Ideally our LLM adapter should expose the .complete() method and the error should flow as
openAI -> llama_index methods -> Adapter -> SDK -> Prompt Service
  1. For now we can only implement error handling in the SDK or the prompt service, and doing it in the SDK makes the most sense so that it applies to all scenarios where we call an LLM.
  2. We'll have to do this for every LLM provider: go through their code base's error classes and find a way to parse the error message (preferably via their base class, so that all errors can be handled). I'll raise an SDK PR handling it for OpenAI.
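The adapter-level handling in point 1 could be sketched as below. The class names (`SdkError`, `OpenAIAdapter`) and the wrapped `llm` interface are illustrative assumptions, not the planned SDK code; the point is that the adapter's `.complete()` catches the provider's base error class and re-raises a single SDK error type with a parsed message.

```python
class SdkError(Exception):
    """Single error type surfaced from the SDK to the prompt service."""


class OpenAIAdapter:
    """Hypothetical adapter wrapping an llama_index-style LLM object."""

    def __init__(self, llm):
        self._llm = llm

    def complete(self, prompt: str):
        try:
            return self._llm.complete(prompt)
        except Exception as exc:
            # In the real adapter this would catch the provider's base
            # error class (e.g. openai.OpenAIError) so that every error
            # from that provider is parsed in one place, then re-raise
            # as the SDK's own error with a user-friendly message.
            raise SdkError(f"LLM call failed: {exc}") from exc
```

Repeating this per provider, each adapter maps its provider's error hierarchy to `SdkError`, so the prompt service only ever has to handle one exception type.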

Contributor

@muhammad-ali-e left a comment

LGTM

@nehabagdia merged commit 57d1470 into main May 7, 2024
4 checks passed
@nehabagdia deleted the feat/error-handling-improvements-prompt-service branch May 7, 2024 06:13
4 participants