This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

'NoneType' object has no attribute 'cdequantize_blockwise_cpu_fp32' #31

HumzaSami00 opened this issue Nov 15, 2022 · 0 comments
I am trying to train GPT-J with 8-bit weights. It works well on GPU, but when I try to run it on the CPU, it gives this error:

'NoneType' object has no attribute 'cdequantize_blockwise_cpu_fp32'

I use dequantize_blockwise from bitsandbytes.functional. The following is the class in which it is used:

import torch
import torch.nn.functional as F
from bitsandbytes.functional import dequantize_blockwise


class DequantizeAndLinear(torch.autograd.Function):

    @staticmethod
    def forward(ctx, input: torch.Tensor, weights_quantized: torch.ByteTensor,
                absmax: torch.FloatTensor, code: torch.FloatTensor, bias: torch.FloatTensor):
        # Dequantize the 8-bit weights, then apply an ordinary linear layer.
        weights_deq = dequantize_blockwise(weights_quantized, absmax=absmax, code=code)
        ctx.save_for_backward(input, weights_quantized, absmax, code)
        ctx._has_bias = bias is not None
        return F.linear(input, weights_deq, bias)

    @staticmethod
    def backward(ctx, grad_output: torch.Tensor):
        # Gradients flow only to the input (and bias); the frozen
        # quantized weights, absmax, and codebook get no gradient.
        assert not ctx.needs_input_grad[1] and not ctx.needs_input_grad[2] and not ctx.needs_input_grad[3]
        input, weights_quantized, absmax, code = ctx.saved_tensors
        # grad_output: [*batch, out_features]
        weights_deq = dequantize_blockwise(weights_quantized, absmax=absmax, code=code)
        grad_input = grad_output @ weights_deq
        grad_bias = grad_output.flatten(0, -2).sum(dim=0) if ctx._has_bias else None
        return grad_input, None, None, None, grad_bias
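For context on what the missing kernel would do: conceptually, dequantize_blockwise is a per-block codebook lookup — each stored byte indexes a 256-entry code table, and the result is rescaled by that block's absmax. Below is a pure-Python sketch of that idea (my reading of the operation, not bitsandbytes' actual implementation; the function name, toy codebook, and blocksize default are assumptions):

```python
# Hypothetical pure-Python reference for blockwise dequantization.
# "quantized" holds uint8 codebook indices, "code" is a 256-entry
# codebook, and "absmax" holds one scale per block of `blocksize` values.
def dequantize_blockwise_ref(quantized, absmax, code, blocksize=4096):
    out = []
    for i, q in enumerate(quantized):
        # Look the byte up in the codebook, then rescale by the
        # block's maximum absolute value.
        out.append(code[q] * absmax[i // blocksize])
    return out

# Toy example: a linear codebook over [-1, 1] and a single block scaled by 2.0.
codebook = [i / 255 * 2 - 1 for i in range(256)]
print(dequantize_blockwise_ref([0, 255], [2.0], codebook))  # [-2.0, 2.0]
```

In bitsandbytes this loop runs in compiled C/CUDA; the AttributeError above suggests the compiled library was never loaded on your CPU-only setup, leaving the ctypes handle as None, so there is no `cdequantize_blockwise_cpu_fp32` symbol to call.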

Is it possible to run this on the CPU, or does it only run on the GPU?
