
Community: Add mistral oss model support to azureml endpoints, plus configurable timeout #19123

Merged
11 commits merged into langchain-ai:master on Mar 19, 2024

Conversation

@tjaffri (Contributor) commented Mar 15, 2024

  • Description: There was no content formatter for Mistral models on Azure ML endpoints. This PR adds one, plus a configurable request timeout (previously hard-coded); a usage sketch follows below.
  • Dependencies: none
  • Twitter handle: @tjaffri @docugami
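For context, here is a minimal usage sketch of what this change appears to enable. The names used below (`MistralChatContentFormatter`, the `timeout` keyword, and the placeholder endpoint URL/key) are assumptions inferred from the PR title and description, not confirmed against the merged code.

```python
# Hedged sketch: MistralChatContentFormatter and the `timeout` kwarg are assumed
# to be what this PR introduces; replace the placeholder URL and key with real values.
from langchain_community.chat_models.azureml_endpoint import (
    AzureMLChatOnlineEndpoint,
    MistralChatContentFormatter,  # assumed: Mistral formatter added by this PR
)
from langchain_core.messages import HumanMessage

chat = AzureMLChatOnlineEndpoint(
    endpoint_url="https://<your-endpoint>.<region>.inference.ml.azure.com/score",
    endpoint_api_key="<your-api-key>",
    content_formatter=MistralChatContentFormatter(),
    timeout=120,  # assumed: request timeout made configurable by this PR (was hard-coded)
)

# Send a single chat turn to the Mistral model hosted on the Azure ML endpoint.
response = chat.invoke([HumanMessage(content="Summarize what an Azure ML online endpoint is.")])
print(response.content)
```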

@dosubot dosubot bot added the size:M This PR changes 30-99 lines, ignoring generated files. label Mar 15, 2024

@dosubot dosubot bot added Ɑ: models Related to LLMs or chat model modules 🤖:improvement Medium size change to existing code to handle new use-cases labels Mar 15, 2024
@dosubot dosubot bot added the lgtm PR looks good. Use to confirm that a PR is ready for merging. label Mar 19, 2024
@hwchase17 hwchase17 merged commit 044bc22 into langchain-ai:master Mar 19, 2024
59 checks passed
rahul-trip pushed a commit to daxa-ai/langchain that referenced this pull request Mar 27, 2024
…onfigurable timeout (langchain-ai#19123)
bechbd pushed a commit to bechbd/langchain that referenced this pull request Mar 29, 2024
…onfigurable timeout (langchain-ai#19123)
gkorland pushed a commit to FalkorDB/langchain that referenced this pull request Mar 30, 2024
…onfigurable timeout (langchain-ai#19123)
chrispy-snps pushed a commit to chrispy-snps/langchain that referenced this pull request Mar 30, 2024
…onfigurable timeout (langchain-ai#19123)
chrispy-snps pushed a commit to chrispy-snps/langchain that referenced this pull request Mar 30, 2024
…onfigurable timeout (langchain-ai#19123)
hinthornw pushed a commit that referenced this pull request Apr 26, 2024
…onfigurable timeout (#19123)
Labels
🤖:improvement Medium size change to existing code to handle new use-cases
lgtm PR looks good. Use to confirm that a PR is ready for merging.
Ɑ: models Related to LLMs or chat model modules
size:M This PR changes 30-99 lines, ignoring generated files.
3 participants