azure_openai llm always read timeout, need to customize the value of timeout #4342

Open
atang220 opened this issue May 13, 2024 · 1 comment
Labels
💪 enhancement New feature or request

Comments

@atang220

Self Checks

  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agreed to the Language Policy).
  • Please do not modify this template :) and fill in all the required fields.

1. Is this request related to a challenge you're experiencing?

Yes. Most of my LLM requests take between 60 and 600 seconds to complete, so most of my workflows fail with a read timeout.

2. Describe the feature you'd like to see

When adding or editing an LLM configuration, there should be a timeout setting so the read timeout can be customized, as sketched below.
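
For context on what such a setting could control: the OpenAI Python SDK, which Azure OpenAI integrations are typically built on, already accepts a per-client timeout, so a configurable value could be passed through along these lines. This is only a sketch under that assumption, not the project's actual implementation; the key, endpoint, API version, deployment name, and timeout values are placeholders.

```python
# Sketch only: the OpenAI Python SDK (>= 1.0) lets callers set a read timeout
# per client. All concrete values below are placeholders for illustration.
import httpx
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="<your-azure-openai-key>",                   # placeholder
    api_version="2024-02-01",                            # example API version
    azure_endpoint="https://example.openai.azure.com",   # placeholder endpoint
    # Generous read timeout for long generations, short connect timeout.
    timeout=httpx.Timeout(600.0, connect=10.0),
)

response = client.chat.completions.create(
    model="my-gpt4-deployment",                          # placeholder deployment name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

A user-facing timeout field in the LLM configuration form could simply feed a value like the 600 seconds above into the client construction instead of a hard-coded default.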

3. How will this feature improve your workflow or experience?

Most of my workflows would be able to run to completion instead of failing on the default read timeout.

4. Additional context or comments

No response

5. Can you help us with this feature?

  • I am interested in contributing to this feature.
@atang220
Author

[Screenshot attachment: 20240513-195758]

atang220 changed the title from "azure_openai llm always read timeout,need to customize readtimeout" to "azure_openai llm always read timeout, need to customize the value of timeout" on May 13, 2024
dosubot added the 💪 enhancement New feature or request label on May 13, 2024