# Embedding

The embedding classes are used to store and retrieve word embeddings from their indices. There are two types of embeddings in bitsandbytes: the standard PyTorch [`Embedding`] class and the [`StableEmbedding`] class.

The [`StableEmbedding`] class was introduced in the *8-bit Optimizers via Block-wise Quantization* paper to reduce the gradient variance caused by the non-uniform distribution of input tokens. This class is designed to support quantization.
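Below is a minimal usage sketch (not part of the API reference) showing [`StableEmbedding`] as a drop-in replacement for a standard embedding layer; both classes follow the `torch.nn.Embedding` constructor signature, and the vocabulary and hidden sizes are arbitrary example values.

```python
import torch
import bitsandbytes as bnb

vocab_size, hidden_dim = 32_000, 768  # example sizes, chosen for illustration

# Standard embedding lookup.
emb = bnb.nn.Embedding(vocab_size, hidden_dim)

# Drop-in replacement with layer normalization and stable initialization,
# intended for training with the bitsandbytes 8-bit optimizers.
stable_emb = bnb.nn.StableEmbedding(vocab_size, hidden_dim)

token_ids = torch.randint(0, vocab_size, (4, 128))  # (batch, sequence)
hidden_states = stable_emb(token_ids)               # (4, 128, hidden_dim)
```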

## Embedding

[[autodoc]] bitsandbytes.nn.Embedding
    - __init__

## StableEmbedding

[[autodoc]] bitsandbytes.nn.StableEmbedding
    - __init__