
Added Llama Bedrock client #1943

Merged · 6 commits · Jun 4, 2024

Conversation

kevinmessiaen (Member)
Description

Added Llama Bedrock client

Related Issue

Type of Change

  • 📚 Examples / docs / tutorials / dependencies update
  • 🔧 Bug fix (non-breaking change which fixes an issue)
  • 🥂 Improvement (non-breaking change which improves an existing feature)
  • 🚀 New feature (non-breaking change which adds functionality)
  • 💥 Breaking change (fix or feature that would cause existing functionality to change)
  • 🔐 Security fix

Checklist

  • I've read the CODE_OF_CONDUCT.md document.
  • I've read the CONTRIBUTING.md guide.
  • I've written tests for all new methods and classes that I created.
  • I've written the docstring in Google format for all the methods and classes that I used.
  • I've updated pdm.lock by running pdm update-lock (only applicable when pyproject.toml has been
    modified)

sonarcloud bot commented May 29, 2024:

Please retry analysis of this Pull-Request directly on SonarCloud

@kevinmessiaen kevinmessiaen marked this pull request as ready for review June 4, 2024 06:06
sonarcloud bot commented Jun 4, 2024:

Please retry analysis of this Pull-Request directly on SonarCloud

henchaves (Member) left a comment:

It's working well!

By the way, I found a small error in the Claude client at line 91: we should replace
sampled_tokens=completion["usage"]["input_tokens"]
with
sampled_tokens=completion["usage"]["output_tokens"]
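The fix follows from the shape of Anthropic's Messages API response body on Bedrock: usage.input_tokens counts prompt tokens, while usage.output_tokens counts generated (sampled) tokens. A minimal sketch of the corrected parsing, where the LLMOutput wrapper and parse_completion helper are hypothetical stand-ins for the client's actual return type:

```python
# Sketch: extracting token counts from an Anthropic-on-Bedrock response body.
# The dict shape follows Anthropic's Messages API; LLMOutput and
# parse_completion are illustrative, not the client's real names.
from dataclasses import dataclass


@dataclass
class LLMOutput:
    text: str
    prompt_tokens: int
    sampled_tokens: int


def parse_completion(completion: dict) -> LLMOutput:
    return LLMOutput(
        text=completion["content"][0]["text"],
        prompt_tokens=completion["usage"]["input_tokens"],
        # The bug: this field previously read "input_tokens" as well,
        # so generated tokens were reported as the prompt size.
        sampled_tokens=completion["usage"]["output_tokens"],
    )


completion = {
    "content": [{"type": "text", "text": "Hello!"}],
    "usage": {"input_tokens": 12, "output_tokens": 5},
}
out = parse_completion(completion)
print(out.sampled_tokens)  # 5, not 12
```

With the old code, any token-usage accounting built on sampled_tokens would double-count the prompt and ignore the generation length entirely.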

@henchaves (Member):

Also, it would be better to have a common class called BedrockLLMClient:
it seems that both LLamaBedrockClient and ClaudeBedrockClient reuse several pieces of code.
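A sketch of what such a shared base could look like, factoring the boto3 invoke/serialize cycle out of both clients. All names below (BedrockLLMClient, _format_body, _parse_response, complete) are illustrative assumptions, not the actual Giskard refactor:

```python
# Sketch: a common base class for Bedrock-backed LLM clients.
# Subclasses supply only the model-specific request/response shapes;
# the invoke_model plumbing lives in one place.
import json
from abc import ABC, abstractmethod


class BedrockLLMClient(ABC):
    def __init__(self, client, model_id: str):
        self._client = client  # a boto3 "bedrock-runtime" client
        self._model_id = model_id

    @abstractmethod
    def _format_body(self, prompt: str) -> dict:
        """Build the model-specific request body."""

    @abstractmethod
    def _parse_response(self, response_body: dict) -> str:
        """Extract the completion text from the model-specific response."""

    def complete(self, prompt: str) -> str:
        # Shared logic that the Llama and Claude clients would
        # otherwise each duplicate.
        response = self._client.invoke_model(
            modelId=self._model_id,
            body=json.dumps(self._format_body(prompt)),
        )
        return self._parse_response(json.loads(response["body"].read()))


class LLamaBedrockClient(BedrockLLMClient):
    def _format_body(self, prompt: str) -> dict:
        # Llama-family Bedrock models take a "prompt" field.
        return {"prompt": prompt, "max_gen_len": 512}

    def _parse_response(self, response_body: dict) -> str:
        return response_body["generation"]
```

A ClaudeBedrockClient would then only override the same two hooks with Anthropic's messages/content shapes, keeping the token-accounting and invocation code in one audited place.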


@henchaves henchaves merged commit 3949c9d into main Jun 4, 2024
15 of 16 checks passed
@henchaves henchaves deleted the feature/llama-bedrock-support branch June 4, 2024 10:55