
Bug in fixed_point_quantize #21

Closed
calvinqi opened this issue May 30, 2019 · 3 comments

Comments

@calvinqi

Hi again! I'm seeing some strange results from fixed_point_quantize that I don't think are correct behavior. Here is an example:

import torch
from qtorch.quant import fixed_point_quantize

tensor = torch.Tensor([1, 2, -1, -2])
fixed_point_tensor = fixed_point_quantize(tensor, wl=8, fl=8, rounding='nearest')
print('full precision tensor:', tensor)
print('fixed point quantized tensor:', fixed_point_tensor)

The output of this is:

full precision tensor: tensor([ 1.,  2., -1., -2.])
fixed point quantized tensor: tensor([ 0.4961,  0.4961, -0.5000, -0.5000])

However, with 8 integer bits and 8 fractional bits, the original tensor values should definitely be representable, so the result should be exactly the same. It would be great if you could take a look and see what's going on!

@Tiiiger
Owner

Tiiiger commented May 31, 2019

Hi @calvinqi,

This is more of a naming issue...

wl stands for word length, which is the total number of bits used to represent each number, including the sign bit. In other words, wl = 1 + integer bits + fractional bits.

So in this example the number of integer bits is actually wl - 1 - fl = 8 - 1 - 8 = -1, which makes the representable range [-2^(-1), 2^(-1) - 2^(-8)] = [-0.5, 0.49609375]. That is exactly why all four values saturate to -0.5 or 0.4961.
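To make the arithmetic above concrete, here is a minimal framework-free sketch of these fixed-point semantics (an illustration of the convention described here, not qtorch's actual implementation): values are rounded to a signed integer grid of 2^wl steps with step size 2^(-fl), and saturated to that range.

```python
def fixed_point_quantize_sketch(x, wl, fl):
    """Sketch of fixed-point quantization under the assumed convention:
    wl is the total word length including the sign bit, so the signed
    integer codes lie in [-2**(wl-1), 2**(wl-1) - 1], scaled by 2**-fl."""
    scale = 2 ** fl
    qmin, qmax = -(2 ** (wl - 1)), 2 ** (wl - 1) - 1
    q = round(x * scale)           # round to nearest integer code
    q = max(qmin, min(qmax, q))    # saturate to the representable range
    return q / scale

values = [1, 2, -1, -2]
# wl=8, fl=8 leaves -1 integer bits, so everything saturates
# to [-0.5, 0.49609375], matching the output in the issue
print([fixed_point_quantize_sketch(v, wl=8, fl=8) for v in values])
# wl=16, fl=8 leaves 7 integer bits, so these values are exact
print([fixed_point_quantize_sketch(v, wl=16, fl=8) for v in values])
```

With wl=16, fl=8 the same tensor round-trips exactly, which is what the original example was expecting.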

@Tiiiger
Owner

Tiiiger commented May 31, 2019

Thanks for pointing this confusion out. I will update the documentation and examples to reflect this.

@calvinqi

Oh I see. Thank you so much for the clarification and for being so responsive!
