Conversation

@jspark1105
Contributor

Summary:
Per-group and per-channel quantization in fbgemm
This diff also cleans up explicit template instantiation using macro expansion
Using this in DNNLOWP operators will be done in a separate diff.

Differential Revision: D13176386

Summary:
Pull Request resolved: pytorch#14340

Pull Request resolved: pytorch/FBGEMM#25

Per-group and per-channel quantization in fbgemm
This diff also cleans up explicit template instantiation using macro expansion
This diff also changes the randFill interface, which made it easy to mistakenly generate integer random numbers for floating-point vectors.

Using this in DNNLOWP operators will be done in a separate diff.

Differential Revision: D13176386

fbshipit-source-id: 3137039d2822e42a16881638d54897d9c8bc75f4
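The per-channel scheme named in the summary can be sketched as follows. This is a minimal illustration under assumed names (`QuantParams`, `chooseParams`, `quantizePerChannel` are hypothetical, not fbgemm's actual `QuantUtils` API): each output channel of a weight matrix gets its own scale and zero point, instead of one pair for the whole tensor.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Hypothetical sketch of per-channel affine quantization; fbgemm's real
// TensorQuantizationParams / Quantize interfaces differ.
struct QuantParams {
  float scale;
  std::int32_t zero_point;
};

// Pick scale/zero_point so that [min, max] (always including 0) maps to [0, 255].
QuantParams chooseParams(const float* x, int n) {
  float mn = std::min(0.0f, *std::min_element(x, x + n));
  float mx = std::max(0.0f, *std::max_element(x, x + n));
  float scale = (mx - mn) / 255.0f;
  if (scale == 0.0f) scale = 1.0f;
  std::int32_t zp = static_cast<std::int32_t>(std::round(-mn / scale));
  return {scale, zp};
}

// Quantize a row-major [channels x k] matrix with one QuantParams per channel.
std::vector<std::uint8_t> quantizePerChannel(const std::vector<float>& w,
                                             int channels, int k,
                                             std::vector<QuantParams>& params) {
  std::vector<std::uint8_t> q(w.size());
  params.resize(channels);
  for (int c = 0; c < channels; ++c) {
    const float* row = w.data() + c * k;
    params[c] = chooseParams(row, k);
    for (int i = 0; i < k; ++i) {
      float v = std::round(row[i] / params[c].scale) + params[c].zero_point;
      q[c * k + i] =
          static_cast<std::uint8_t>(std::min(255.0f, std::max(0.0f, v)));
    }
  }
  return q;
}
```

Per-group quantization is the same idea applied at a coarser granularity: channels are partitioned into groups and each group shares one scale/zero-point pair.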
jspark1105 added a commit to jspark1105/FBGEMM that referenced this pull request Nov 26, 2018
Summary:
Pull Request resolved: pytorch/pytorch#14340

Pull Request resolved: pytorch#25

Per-group and per-channel quantization in fbgemm

Differential Revision: D13176386

fbshipit-source-id: e08c676b6b9cf301f76b87cdb901ecc51c4cc8a4
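The randFill pitfall described in the summary can be illustrated with a sketch (hypothetical signature, not fbgemm's actual interface): dispatching the distribution on the element type keeps a floating-point vector from being silently filled with integer values.

```cpp
#include <random>
#include <type_traits>
#include <vector>

// Hypothetical type-safe randFill; the real fbgemm interface differs.
template <typename T>
void randFill(std::vector<T>& vec, T low, T high, std::mt19937& gen) {
  if constexpr (std::is_integral<T>::value) {
    std::uniform_int_distribution<T> dist(low, high);
    for (auto& v : vec) v = dist(gen);
  } else {
    // Floating-point element types get a real distribution, so values are
    // not truncated to integers by accident.
    std::uniform_real_distribution<T> dist(low, high);
    for (auto& v : vec) v = dist(gen);
  }
}
```

With a single untyped fill routine, a caller could easily draw from an integer distribution into a `std::vector<float>`; tying the distribution to the element type removes that class of mistake.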
facebook-github-bot pushed a commit that referenced this pull request Nov 27, 2018
Summary:
Pull Request resolved: #14340

Pull Request resolved: pytorch/FBGEMM#25

Per-group and per-channel quantization in fbgemm

Reviewed By: dskhudia

Differential Revision: D13176386

fbshipit-source-id: e46c53e31e21520bded71b8ed86e8b19e010e2dd
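The template-instantiation cleanup mentioned in the summary follows a common C++ pattern, sketched here with an invented function (`clampedAdd` and the macro name are illustrative, not fbgemm's code): a macro expands one explicit instantiation per element type instead of repeating the full declaration.

```cpp
#include <cstdint>
#include <limits>

// Illustrative template; fbgemm's actual instantiated functions differ.
// Saturating add for unsigned integer types.
template <typename T>
T clampedAdd(T a, T b) {
  T sum = static_cast<T>(a + b);
  // For unsigned T, wraparound makes the sum smaller than an operand.
  return sum < a ? std::numeric_limits<T>::max() : sum;
}

// Instead of writing "template std::uint8_t clampedAdd<std::uint8_t>(...);"
// once per type, a macro expands each explicit instantiation:
#define INSTANTIATE_CLAMPED_ADD(T) template T clampedAdd<T>(T, T);
INSTANTIATE_CLAMPED_ADD(std::uint8_t)
INSTANTIATE_CLAMPED_ADD(std::uint16_t)
INSTANTIATE_CLAMPED_ADD(std::uint32_t)
#undef INSTANTIATE_CLAMPED_ADD
```

This keeps the instantiation list in one place, so adding a type is a one-line change rather than another copy of the full signature.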
@ezyang added the merged label Jun 25, 2019