This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Conversation

@bfineran
Contributor

@bfineran bfineran commented Feb 5, 2021

This change will enable SparseML to parse multiple lists of modifiers (groups) from a recipe. The resulting structure will look like this:

training_modifiers:
  - !EpochRangeModifier
    ...

  - !SetLearningRateModifier
    ...

pruning_modifiers:
  - !GMPruningModifier
    ...
  - !GMPruningModifier
    ...
  - !GMPruningModifier
    ...

quantization_modifiers:
  - !QuantizationModifier
    ...
  - !SetWeightDecayModifier
    ...

All modifiers will be parsed into a single Modifier list at run time. No filtering is applied to check that the list entries are actually modifiers, so the user will see any errors directly if the recipe is incorrect.
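
For illustration, here is a minimal sketch of that flattening step, assuming the recipe YAML has already been loaded into a plain dict; the BaseModifier placeholder and the function name are hypothetical, not taken from this PR:

    # Minimal sketch (not the exact SparseML implementation) of flattening
    # grouped modifier lists from a loaded recipe dict into one flat list.
    from typing import Any, Dict, List


    class BaseModifier:  # placeholder for the real SparseML modifier base class
        pass


    def flatten_modifier_groups(container: Dict[str, Any]) -> List[BaseModifier]:
        modifiers = []
        for key, value in container.items():
            # only keys containing "modifiers" are treated as modifier groups
            if "modifiers" in key and isinstance(value, list):
                modifiers.extend(value)
        return modifiers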

mgoin
mgoin previously approved these changes Feb 5, 2021
Contributor

@natuan natuan left a comment


Given this will get more interesting, should we have some unit tests for it?

Member

@markurtz markurtz left a comment


I like the simplicity. We need a bit more visibility into the parsing for the user, and I agree with Tuan on getting some unit tests in.

else:  # Dict
    modifiers = []
    for key, modifier_list in container.items():
        if "modifiers" in key and isinstance(modifier_list, List):
Member


If a user accidentally misspells a group, we'll silently ignore it here. We should at least warn if a key doesn't have "modifiers" in it and won't be used, and could potentially throw an exception here to be explicit, since we aren't supporting anything else currently.

Contributor Author

@bfineran bfineran Feb 5, 2021


Agreed, this was definitely a concern. The issue is that not every top-level dict element will necessarily be a modifier list. We could have, for instance, a list of parameter names stored as a variable to reference elsewhere. If this weren't the case, I would not have required "modifier" to be in the name at all.

One alternative is that we could search all lists for BaseModifier objects and flatten them that way, letting group names be purely organizational.

Contributor Author


Actually, this will raise an error when a Modifier object is found outside of a named "modifiers" list. That way we keep the YAML flexibility but enforce some simple structure in how modifiers are defined.
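
A rough sketch of that validation, assuming the same dict-based loop quoted above; the names and error message are illustrative, not the code merged in this PR:

    # Hypothetical validation sketch: raise if a Modifier-like object appears
    # in a list whose key does not contain "modifiers", otherwise flatten the
    # groups into one list.
    from typing import Any, Dict, List


    class BaseModifier:  # placeholder for the real SparseML modifier base class
        pass


    def load_modifiers(container: Dict[str, Any]) -> List[BaseModifier]:
        modifiers = []
        for key, value in container.items():
            if not isinstance(value, list):
                continue
            if "modifiers" in key:
                modifiers.extend(value)
            elif any(isinstance(item, BaseModifier) for item in value):
                raise ValueError(
                    f"Modifier objects found under recipe key '{key}'; "
                    "lists of modifiers must have 'modifiers' in their key name"
                )
        return modifiers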

@bfineran
Contributor Author

bfineran commented Feb 5, 2021

@markurtz @natuan updated with unit tests and validation for modifier list names.

@bfineran bfineran merged commit 55b3e23 into main Feb 5, 2021
@bfineran bfineran deleted the modifier-groups branch February 5, 2021 23:54
markurtz added a commit that referenced this pull request Feb 25, 2021
* GA code, toctree links (#61)

- added tracking for docs output
- added help links for docs output

* Update README.md (#64)

removed placeholder reference to comingsoon repo in favor of active repo

* add decorator for flaky tf sparsity tests (#65)

* enable modifier groups in SparseML recipes (#66)

* enable modifier groups in SparseML recipes
* unit tests for YAML modifier list loading

* make build argument for nightly builds (#63)

* rst url syntax correction (#67)

correcting double slash at the end of URLs with updates to index.rst pre-compilation

* match types explicitly in torch qat quant observer wrappers (#68)

* docs updates (#71)

enhancing left nav for Help; after this merge, the docs need to be rebuilt for this repo so docs.neuralmagic.com can be refreshed. cc @markurtz

* Rename KSModifier to PruningModifier (#76)

* Rename ConstantKSModifier to ConstantPruningModifier

* Rename GradualKSModifier to GMPruningModifier

* Fix broken link for Optimization Recipes (#75)

* Serialize/deserialize MaskedLayer (#69)

* Serialize/deserialize MaskedLayer

* Remove unused symbols

* Register pruning scheduler classes for serialization

* removed ScheduledOptimizer, moved logic to ScheduledModifierManager (#77)

* load recipes directly from SparseZoo (#72)

* load recipes into Managers from sparsezoo stubs
* moving recipe_type handling to SparseZoo only, supporting loading SparseZoo recipe objects

* Revert "removed ScheduledOptimizer, moved logic to ScheduledModifierManager (#77)" (#80)

This reverts commit 6073abb.

* Update for 0.1.1 release (#82)

* Update for 0.1.1 release
- update python version to 0.1.1
- setup.py add in version parts and _VERSION_MAJOR_MINOR for more flexibility with dependencies between neural magic packages
- add in deepsparse optional install pathway

* missed updating version to 0.1.1

* rename examples directory to integrations (#78)

* rwightman/pytorch-image-models integration (#70)

* load checkpoint file based on sparsezoo recipe in pytorch_vision script (#83)

* ultralytics/yolov5 integration (#73)

* pytorch sparse quantized transfer learning notebook (#81)

* load qat onnx models for conversion from file path (#86)

* Sparsification update (#84)

* Sparsification update
- update sparsification descriptions and move to preferred verbiage

* update from comments on deepsparse for sparsification

* Update README.md

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update README.md

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update README.md

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update README.md

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update docs/source/index.rst

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update docs/source/index.rst

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update docs/source/index.rst

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update docs/source/index.rst

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update docs/source/recipes.md

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* fix links in index.rst from reviewed content

* update overviews and taglines from doc

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* blog style readme for torch sparse-quant TL notebook (#85)

Co-authored-by: Jeannie Finks (NM) <74554921+jeanniefinks@users.noreply.github.com>
Co-authored-by: Benjamin Fineran <bfineran@users.noreply.github.com>
Co-authored-by: Eldar Kurtic <eldar.ciki@gmail.com>
Co-authored-by: Tuan Nguyen <tuan@neuralmagic.com>
markurtz pushed a commit that referenced this pull request Sep 1, 2021
* Update faqs.md

* Update getstarted.md

Inclusion of NLP support

Removing a question that could use a better response until GTM alignment is made.
Sign up for free to subscribe to this conversation on GitHub. Already have an account? Sign in.

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

4 participants