This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Conversation

@markurtz (Member) commented Feb 2, 2022

No description provided.

@markurtz requested a review from a team February 2, 2022 13:26
@markurtz self-assigned this Feb 2, 2022
@markurtz requested review from bfineran, dbogunowicz and natuan and removed request for a team February 2, 2022 13:26
bfineran previously approved these changes Feb 2, 2022
@markurtz merged commit 8f4012e into main Feb 2, 2022
bfineran pushed a commit that referenced this pull request Feb 2, 2022
Fix model load bug and add logging to catch potential future issues (#537)

* Fix model load bug and add logging to catch potential future issues

* initial migration to generalize module sparsification information

* propagate ModuleSparsificationInfo

* report type of input tensors in export.py

* minor bug fixes

* ModuleSparsificationInfo docs (see the sparsity-reporting sketch after this commit message)

* export onnx bugfix

* bug fixes

* make style

* bug fix for quantization

* revert to use ScheduledOptimizer due to bug with torch LambdaLR (see the LambdaLR sketch after this commit message)

* remove language_modeling script

* add end model sparsification log

Co-authored-by: Benjamin <ben@neuralmagic.com>
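
Several bullets in the commit message above (generalizing module sparsification information, propagating ModuleSparsificationInfo, and the end-of-training sparsification log) center on summarizing how sparse a module's parameters actually are. Below is a minimal sketch of that idea in plain PyTorch; the class and property names are illustrative assumptions, not SparseML's actual ModuleSparsificationInfo API.

```python
import torch
from torch.nn import Module


class SparsificationInfoSketch:
    """Hypothetical helper that summarizes parameter sparsity for a module."""

    def __init__(self, module: Module):
        self.module = module

    @property
    def params_total(self) -> int:
        # total number of parameter elements across the module
        return sum(p.numel() for p in self.module.parameters())

    @property
    def params_zero(self) -> int:
        # pruned weights are stored as exact zeros, so count them directly
        return sum(int((p == 0).sum().item()) for p in self.module.parameters())

    @property
    def sparsity_percent(self) -> float:
        total = self.params_total
        return 100.0 * self.params_zero / total if total else 0.0


# usage, in the spirit of the "add end model sparsification log" bullet:
model = torch.nn.Linear(16, 4)
info = SparsificationInfoSketch(model)
print(f"sparsity: {info.sparsity_percent:.2f}% of {info.params_total} params")
```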
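
The revert bullet above concerns torch's LambdaLR, the stock scheduler that scales the base learning rate by a user-supplied multiplier each epoch. A self-contained example of the plain torch API follows (the ScheduledOptimizer side is SparseML-specific and not reproduced here):

```python
import torch
from torch.optim.lr_scheduler import LambdaLR

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# LambdaLR multiplies the base LR by the lambda's return value;
# this schedule halves the LR every 10 epochs.
scheduler = LambdaLR(optimizer, lr_lambda=lambda epoch: 0.5 ** (epoch // 10))

for _ in range(25):
    optimizer.step()  # training step elided
    scheduler.step()

print(scheduler.get_last_lr())  # [0.025] after 25 epochs
```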
markurtz added a commit that referenced this pull request Feb 2, 2022
* Refactor of Transformers SparseML CLI and integrations (#536)

* Refactor of Transformers SparseML CLI and integrations

* Refactor export.py to use new pathways, fix make quality

* Update src/sparseml/optim/manager.py

Co-authored-by: Rahul Tuli <rahul@neuralmagic.com>

* Update src/sparseml/transformers/utils/model.py

Co-authored-by: Rahul Tuli <rahul@neuralmagic.com>

* fixes from review

* fixes from review and testing

* bug fixes and logging

* bug fixes for export and distillation

* review fixes, quality fixes, style fixes

* fix dependency issue

* fix distillation tests

* fix distillation tests

* fix distillation tests

* fill in docs and update style

* fix issue with distillation improperly updating the student's inputs

* fix quality

* Update src/sparseml/pytorch/optim/modifier_distillation.py

* add better logging for missing and unexpected keys in model reload for the transformers trainer (see the sketch after this commit message)

* fix logging for transformers export

Co-authored-by: Rahul Tuli <rahul@neuralmagic.com>

* Fix model load bug and add logging to catch potential future issues (#537)

* Fix model load bug and add logging to catch potential future issues

* initial migration to generalize module sparsification information

* propagate ModuleSparsificationInfo

* report type of input tensors in export.py

* minor bug fixes

* ModuleSparsificationInfo docs

* export onnx bugfix

* bug fixes

* make style

* bug fix for quantization

* revert to use ScheduledOptimizer due to bug with torch LambdaLR

* remove language_modeling script

* add end model sparsification log

Co-authored-by: Benjamin <ben@neuralmagic.com>

Co-authored-by: Mark Kurtz <mark@neuralmagic.com>
Co-authored-by: Rahul Tuli <rahul@neuralmagic.com>
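
The "better logging for missing and unexpected keys" bullet above maps onto a stock PyTorch pattern: load_state_dict(strict=False) returns named lists of missing and unexpected keys that can be logged instead of silently dropped. A minimal sketch with an intentionally mismatched state dict; the logger setup and names are illustrative:

```python
import logging

import torch

logging.basicConfig(level=logging.INFO)
_LOGGER = logging.getLogger(__name__)

model = torch.nn.Sequential(torch.nn.Linear(8, 8), torch.nn.ReLU())

# deliberately mismatched: "0.bias" is missing from the state dict,
# and "classifier.weight" does not exist in the model
state_dict = {
    "0.weight": torch.zeros(8, 8),
    "classifier.weight": torch.zeros(2, 8),
}

# strict=False tolerates mismatches and reports them instead of raising
result = model.load_state_dict(state_dict, strict=False)
if result.missing_keys:
    _LOGGER.warning("missing keys on model reload: %s", result.missing_keys)
if result.unexpected_keys:
    _LOGGER.warning("unexpected keys on model reload: %s", result.unexpected_keys)
```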
@jeanniefinks deleted the transformers_refactor_fixes branch February 10, 2022 18:43
KSGulin pushed a commit that referenced this pull request Feb 15, 2022
Fix model load bug and add logging to catch potential future issues (#537)
