
Rethinking the effect of deleting qformat() #13

Closed
majianjia opened this issue Mar 13, 2019 · 4 comments
@majianjia
Owner

Currently, the Q format is handled by the scripts.
NNoM itself cannot really know the Q format of each layer's output, only its output shift.

This is fine for most layers, such as conv, dense, and relu.
However, activations such as sigmoid and tanh must know the current Q format, because they do arithmetic based on the real-number value. There is a "num int bit" argument, which should be the m in the Qm.n format.

I am still thinking about a solution.
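
To illustrate the point (a minimal standalone sketch, not NNoM code; q7_t and the example values are only for illustration): the same raw byte maps to different real values depending on the Qm.n format, while ReLU and plain shifts give the same answer regardless.

#include <stdint.h>
#include <stdio.h>

typedef int8_t q7_t;

/* ReLU works on the raw integer, so the result is correct for any Qm.n
 * format; only the output shift matters for layers like this. */
static q7_t relu_q7(q7_t x) { return x > 0 ? x : 0; }

int main(void)
{
    q7_t raw = 64;
    /* real value = raw * 2^-n, so the format must be known to recover it,
     * which is exactly what tanh/sigmoid need: */
    printf("as Q0.7: %f\n", raw / 128.0);  /* 0.5 */
    printf("as Q3.4: %f\n", raw / 16.0);   /* 4.0 */
    printf("relu:    %d\n", relu_q7(raw)); /* 64, whatever the format */
    return 0;
}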

@parai
Do you have any suggestions?
Thanks

@parai
Contributor

parai commented Mar 13, 2019

Oh, let me think and get back to you.

@parai
Contributor

parai commented Mar 14, 2019

I couldn't think up a good way to keep NNoM looking clean and pretty without using a Q format for each layer.
The ugly way is to add a parameter to the TanH/Sigmoid layers, and this is very easy to implement in the scripts; the code below could be a demo:

layer[...] = TanH(INPUT_LAYER_Q);

But I think this is really ugly.

On the other hand, if we restore the Q format for each layer, it would be unnecessary for most of the layers.

So on balance I prefer the ugly way.
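
To make the idea concrete (a hypothetical sketch only; tanh_q7 and int_bits are illustrative names, not the actual NNoM or CMSIS-NN API): the extra parameter tells the activation how many integer bits (the m in Qm.n) the input carries, so the kernel can map raw values to real numbers before applying tanh.

#include <stdint.h>
#include <math.h>

typedef int8_t q7_t;

/* Hypothetical 8-bit tanh kernel: int_bits is the m in Qm.n (so n = 7 - m).
 * Without int_bits the raw value cannot be mapped back to a real number,
 * which is why an output shift alone is not enough here. */
static void tanh_q7(q7_t *data, uint32_t size, uint32_t int_bits)
{
    const float scale = 1.0f / (float)(1 << (7 - int_bits)); /* 2^-n */
    for (uint32_t i = 0; i < size; i++) {
        float x = (float)data[i] * scale;    /* fixed-point -> real */
        float y = tanhf(x);                  /* real-number arithmetic */
        data[i] = (q7_t)lrintf(y * 127.0f);  /* output is Q0.7 in [-1, 1) */
    }
}

The generated call would then look like the TanH(INPUT_LAYER_Q) line above, with the scripts filling in the input layer's Q format.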

@majianjia
Owner Author

Thanks, @parai
It's time to fix this issue.
I am now planning to do it the way you suggested, together with the Mult() and Add() layers.
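
As a side note (a hypothetical sketch, not NNoM's actual kernel; add_q7 and shift_b are illustrative names): element-wise Add has a related need, because the two inputs can only be summed once their Q formats are aligned, e.g. by shifting one operand to match the other.

#include <stdint.h>

typedef int8_t q7_t;

/* Add two q7 tensors whose Q formats differ: b (Qmb.nb) is shifted down to
 * a's format (Qma.na), assuming na <= nb and shift_b = nb - na. */
static void add_q7(const q7_t *a, const q7_t *b, q7_t *out,
                   uint32_t size, uint32_t shift_b)
{
    for (uint32_t i = 0; i < size; i++) {
        int16_t sum = (int16_t)a[i] + (int16_t)(b[i] >> shift_b);
        if (sum > 127) sum = 127;    /* saturate to the q7 output range */
        if (sum < -128) sum = -128;
        out[i] = (q7_t)sum;
    }
}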

@majianjia
Owner Author

majianjia commented Apr 11, 2019

This issue is solved by #30.
