The embedding class is used to store and retrieve word embeddings from their indices. There are two types of embeddings in bitsandbytes: the standard PyTorch [`Embedding`] class and the [`StableEmbedding`] class.
The [`StableEmbedding`] class was introduced in the 8-bit Optimizers via Block-wise Quantization paper to reduce the gradient variance that arises from the non-uniform distribution of input tokens. This class is designed to support quantization.
[[autodoc]] bitsandbytes.nn.Embedding
    - __init__

[[autodoc]] bitsandbytes.nn.StableEmbedding
    - __init__