
[Operators] preliminary symmetric weight quantization #298

Merged: 10 commits into hidet-org:main from linear-layer-quantization, Jul 4, 2023

Conversation

@Aalanli (Collaborator) commented on Jun 30, 2023:

  • Added preliminary symmetric weight quantization to tasks.
  • Changed Operator._run to Operator.symbolic_run, so that constructing an operator runs symbolic_run rather than imperative_run. This prevents constant folding from firing across the barrier op and gives the intended behaviour for quantization layers. This should be safe, since all imperative calls go through op.get_output(...), which in turn calls imperative_run.
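For context, symmetric weight quantization maps float weights to integers with a single scale and no zero point, so the quantized grid is symmetric around zero. A minimal NumPy sketch of the idea (this is illustrative only, not the hidet implementation; the function names are made up):

```python
import numpy as np

def symmetric_quantize(w: np.ndarray, bits: int = 8):
    """Quantize weights symmetrically: wq = round(w / scale), no zero point."""
    qmax = 2 ** (bits - 1) - 1            # 127 for int8
    scale = np.abs(w).max() / qmax        # one per-tensor scale
    wq = np.clip(np.round(w / scale), -qmax, qmax).astype(np.int8)
    return wq, scale

def symmetric_dequantize(wq: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original weights."""
    return wq.astype(np.float32) * scale

w = np.array([[-1.5, 0.3], [0.75, 1.5]], dtype=np.float32)
wq, scale = symmetric_quantize(w)
w_hat = symmetric_dequantize(wq, scale)   # round-trip error is at most scale / 2
```

A per-channel variant would compute `scale` along one axis instead of over the whole tensor; the per-tensor form above is just the simplest case.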

@Aalanli Aalanli requested a review from yaoyaoding June 30, 2023 19:27
@yaoyaoding (Member) left a comment:

Thanks @Aalanli !

I left some comments and made some updates directly on this PR.

I refactored run, symbolic_run, and imperative_run a bit.
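As a rough illustration of the pattern under discussion (a toy sketch, not hidet's actual classes): operator construction stays symbolic and produces placeholder outputs, while the real computation runs only when an output is requested. This is what keeps constant folding from eagerly evaluating through a barrier-like op.

```python
class SymbolicTensor:
    """Placeholder output produced at construction time; holds no data."""
    def __init__(self, op):
        self.op = op

class AddOp:
    """Toy operator: construction calls symbolic_run, not imperative_run."""
    def __init__(self, a, b):
        self.a, self.b = a, b
        self.outputs = self.symbolic_run()   # no computation happens here

    def symbolic_run(self):
        return [SymbolicTensor(self)]        # placeholder only

    def imperative_run(self):
        return [self.a + self.b]             # the actual computation

    def get_output(self, idx):
        # computation is deferred until an output is actually needed
        return self.imperative_run()[idx]
```

Under this split, anything that only inspects `outputs` sees symbolic placeholders, and evaluation is triggered solely through `get_output`.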

Resolved review threads on:
  • python/hidet/graph/ops/quant/symmetric.py (2 threads)
  • python/hidet/ir/compute/cops/reduce.py
  • python/hidet/graph/nn/linear.py
@Aalanli (Collaborator, Author) left a comment:

Thanks for the changes; makes sense.

@yaoyaoding (Member):

Thanks @Aalanli !

@yaoyaoding yaoyaoding merged commit f937ad2 into hidet-org:main Jul 4, 2023
2 checks passed
@Aalanli Aalanli deleted the linear-layer-quantization branch September 27, 2023 18:09

2 participants