
Opens sentence transformer backend to edit batch_size param #210

Merged 2 commits into MaartenGr:master on Feb 28, 2024

Conversation

@adhadse (Contributor) commented on Feb 26, 2024

- Updates the flake8 pre-commit config to use the GitHub repo
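For context on the change named in the title: the goal is to let a user-chosen batch_size reach SentenceTransformer.encode() when KeyBERT embeds documents. Below is a minimal sketch of that idea using a custom backend; the class name, constructor signature, and defaults are illustrative assumptions, not necessarily the exact code merged in this PR.

```python
# Illustrative sketch only: names and signatures below are assumptions,
# not necessarily the code merged in this PR.
from keybert import KeyBERT
from keybert.backend import BaseEmbedder
from sentence_transformers import SentenceTransformer


class BatchedSentenceTransformerBackend(BaseEmbedder):
    """Hypothetical backend that forwards a configurable batch_size to encode()."""

    def __init__(self, embedding_model: SentenceTransformer, batch_size: int = 32):
        super().__init__()
        self.embedding_model = embedding_model
        self.batch_size = batch_size

    def embed(self, documents, verbose=False):
        # SentenceTransformer.encode() accepts batch_size directly.
        return self.embedding_model.encode(
            documents, batch_size=self.batch_size, show_progress_bar=verbose
        )


backend = BatchedSentenceTransformerBackend(
    SentenceTransformer("all-MiniLM-L6-v2"), batch_size=64
)
kw_model = KeyBERT(model=backend)
keywords = kw_model.extract_keywords("KeyBERT extracts keywords with BERT embeddings.")
```

Raising batch_size mainly trades GPU memory for throughput when embedding many documents at once.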
@MaartenGr (Owner) left a comment


Thanks for your work on this PR! A few minor points of feedback here and there relating to the model and backend parameters.

Resolved review threads on: keybert/__init__.py, keybert/_model.py, keybert/backend/__init__.py, tests/test_backend.py (two threads)
@adhadse (Contributor, Author) commented on Feb 26, 2024

Thanks for the valuable feedback, I've updated the PR with the suggestions.

@MaartenGr (Owner):

Thanks for the changes, LGTM! Thank you for the PR, it is appreciated.

@MaartenGr merged commit dcf31dd into MaartenGr:master on Feb 28, 2024
2 checks passed
Development

Successfully merging this pull request may close these issues:
Allow KeyBERT to pass batch_size to llm.encode() method