This repository has been archived by the owner on Jan 24, 2024. It is now read-only.

build(deps): update bitsandbytes requirement from ~=0.39.1 to ~=0.40.0 #226

Merged
peakji merged 1 commit into master from dependabot/pip/bitsandbytes-approx-eq-0.40.0 on Jul 12, 2023

Conversation

dependabot[bot]
Contributor

@dependabot dependabot bot commented on behalf of github Jul 11, 2023

Updates the requirements on bitsandbytes to permit the latest version.
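For reference, here is a minimal sketch of what the `~=` (compatible release) specifier permits, using the third-party packaging library; the snippet is illustrative only and not part of this PR:

```python
# Sketch: what the "compatible release" (~=) specifier permits.
# Uses the third-party `packaging` library; illustrative only.
from packaging.specifiers import SpecifierSet

old_spec = SpecifierSet("~=0.39.1")  # equivalent to >=0.39.1, ==0.39.*
new_spec = SpecifierSet("~=0.40.0")  # equivalent to >=0.40.0, ==0.40.*

print("0.40.0" in old_spec)  # False: the old pin excludes the latest release
print("0.40.0" in new_spec)  # True:  the updated pin permits 0.40.x releases
print("0.41.0" in new_spec)  # False: the next minor series is still excluded
```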

Changelog

Sourced from bitsandbytes's changelog.

0.0.21:

  • Ampere, RTX 30 series GPUs now compatible with the library.

0.0.22:

  • Fixed a bug where a reset_parameters() call on the StableEmbedding would lead to an error in older PyTorch versions (from 1.7.0).

0.0.23:

Bugs:

  • Unified quantization API: each quantization function now returns Q, S, where Q is the quantized tensor and S is the quantization state, which may hold absolute max values, a quantization map, or more. For dequantization, all functions now accept the inputs Q, S, so that Q is dequantized with the quantization state S (see the sketch after this list).
  • Fixed an issue where the CUDA 11.1 binary was not compiled with the right headers
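A minimal sketch of the Q, S convention described above, assuming the blockwise helpers in bitsandbytes.functional (quantize_blockwise / dequantize_blockwise) follow it; the tensor shape and arguments here are illustrative rather than quoted from this release:

```python
# Illustrative sketch of the unified Q, S quantization API (not from the changelog).
import torch
import bitsandbytes.functional as F

x = torch.randn(4096)  # CPU tensor; block-wise routines also accept CPU tensors

# Quantize: returns the quantized tensor Q and the quantization state S
# (e.g. per-block absolute max values and/or a quantization map).
Q, S = F.quantize_blockwise(x)

# Dequantize: pass the same state S back alongside Q.
x_restored = F.dequantize_blockwise(Q, S)

print((x - x_restored).abs().max())  # small block-wise quantization error
```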

API changes:

  • Block-wise quantization for optimizers now enabled by default

Features:

  • Block-wise quantization routines now support CPU Tensors.

0.0.24:

  • Fixed a bug where a float/half conversion led to a compilation error for CUDA 11.1 on Turing GPUs.
  • Removed the Apex dependency for bnb LAMB.

0.0.25:

Features:

  • Added skip_zeros for block-wise and 32-bit optimizers. This ensures correct updates for sparse gradients and sparse models.
  • Added support for Kepler GPUs. (#4)
  • Added Analysis Adam to track 8-bit vs 32-bit quantization errors over time.
  • Made compilation more user-friendly.

Bug fixes:

  • fixed "undefined symbol: __fatbinwrap_38" error for P100 GPUs on CUDA 10.1 (#5)

Docs:

  • Added docs with instructions to compile from source.

0.26.0:

Features:

  • Added Adagrad (without grad clipping) as 32-bit and 8-bit block-wise optimizer.
  • Added AdamW (a copy of Adam with weight decay initialized to 1e-2). #10
  • Introduced ModuleConfig overrides, which can seamlessly be used at module initialization time.
  • Added the bnb.nn.Embedding layer, which runs at 32-bit but without the layernorm. This works well if you need to fine-tune pretrained models that do not have an embedding layer norm. #19 (See the sketch after this list.)
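As a rough illustration of the bnb.nn.Embedding and AdamW additions above, here is a minimal sketch; the model shape, hyperparameters, and training step are illustrative assumptions, not taken from this changelog:

```python
# Illustrative sketch of the 0.26.0 additions: bnb.nn.Embedding (32-bit, no
# layernorm) and the AdamW optimizer. Shapes and hyperparameters are made up.
import torch
import bitsandbytes as bnb

model = torch.nn.Sequential(
    bnb.nn.Embedding(num_embeddings=10_000, embedding_dim=128),
    torch.nn.Linear(128, 2),
).cuda()  # bitsandbytes optimizer kernels target CUDA GPUs

# AdamW as described above: Adam with decoupled weight decay (default 1e-2).
optimizer = bnb.optim.AdamW(model.parameters(), lr=1e-3)

tokens = torch.randint(0, 10_000, (8, 16), device="cuda")
loss = model(tokens).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```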

Bug fixes:

  • Fixed a bug where weight decay was incorrectly applied to 32-bit Adam. #13

... (truncated)

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Updates the requirements on [bitsandbytes](https://github.com/TimDettmers/bitsandbytes) to permit the latest version.
- [Release notes](https://github.com/TimDettmers/bitsandbytes/releases)
- [Changelog](https://github.com/TimDettmers/bitsandbytes/blob/main/CHANGELOG.md)
- [Commits](https://github.com/TimDettmers/bitsandbytes/commits)

---
updated-dependencies:
- dependency-name: bitsandbytes
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on Jul 11, 2023
@peakji peakji merged commit 33080a0 into master on Jul 12, 2023
@peakji peakji deleted the dependabot/pip/bitsandbytes-approx-eq-0.40.0 branch on July 12, 2023 at 02:08