
Conversation

@bfineran (Contributor) commented Feb 13, 2021

i.e. manager = ScheduledModifierManager.from_yaml("zoo:/model/stub/path?recipe_type=<recipe-type>")
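For example, with the recipe type filled in (the "transfer" value and the import path below are illustrative assumptions, not part of this PR's diff):

from sparseml.pytorch.optim import ScheduledModifierManager  # import path assumed

# load a transfer-learning recipe for a model directly from its SparseZoo stub
manager = ScheduledModifierManager.from_yaml(
    "zoo:/model/stub/path?recipe_type=transfer"
)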

@bfineran requested review from a team, markurtz and natuan February 13, 2021 20:20
@bfineran self-assigned this Feb 13, 2021
@mgoin previously approved these changes Feb 14, 2021
@mgoin (Member) left a comment

Haha, this explains what I discovered in the yolov5 PR. I will echo that I think this is a great idea.

@markurtz (Member) left a comment

Small complaint that I noticed afterwards


-def load_recipe_yaml_str(file_path: str) -> str:
+def load_recipe_yaml_str(
+    file_path: str, zoo_recipe_type: Union[str, None] = None
@markurtz (Member):

I'm generally not a fan of arguments whose use is conditional on the value of another arg; it can lead to hard-to-find bugs in other pipelines, since there is effectively a planned, non-tested, non-supported pathway. Is there another way we can simplify this so that the conditional logic is handled elsewhere, such as making zoo_recipe_type a conditional piece of the stub?

Also, we should accept a SparseZoo Recipe object here as well. That could be another route for supplying the zoo_recipe_type without needing to call it out explicitly here.

@bfineran (Contributor, Author):

My original plan was to embed it in the stub with something like zoo:path/to/stub.transfer or zoo:path/to/stub?recipe_type=transfer, but I decided against it because it would be extra syntax for users to discover. Additionally, I didn't want to add extras to stubs that would cause errors if the exact same stub were used in a different stub-accepting function (i.e. load_model).

Agreed that having a conditional argument based on another arg is not great.

What do you think about following an HTTP query-string-like pattern that we can parse on the SparseZoo side?

my_stub = "zoo:path/to/stub?recipe_type=transfer"

def parse_zoo_stub(stub) -> Tuple[str, Dict[str, str]]:
    ...

stub, params = parse_zoo_stub(my_stub)  # "path/to/stub", {"recipe_type": "transfer"}

We could also add more structure to the params dict, if we want, with a StubArgs class.
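As a concrete illustration of the proposal above, a minimal sketch of such a parser could look like the following (the implementation details here are assumed for illustration, not the code that was merged):

from typing import Dict, Tuple
from urllib.parse import parse_qsl

# Minimal sketch of the proposed helper; assumes stubs of the form
# "zoo:path/to/stub?recipe_type=transfer". Not the merged SparseZoo implementation.
def parse_zoo_stub(stub: str) -> Tuple[str, Dict[str, str]]:
    # strip the optional "zoo:" prefix
    if stub.startswith("zoo:"):
        stub = stub[len("zoo:"):]
    # split the path from the query string and parse any params into a dict
    path, _, query = stub.partition("?")
    params = dict(parse_qsl(query))
    return path, params

stub, params = parse_zoo_stub("zoo:path/to/stub?recipe_type=transfer")
# stub == "path/to/stub", params == {"recipe_type": "transfer"}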

@markurtz (Member):

I do like the query-string approach, and it definitely seems very flexible with something like a sparse_zoo_stub function. Is there any reason not to have a Recipe constructor act as the sparse_zoo_stub function, or have the Recipe constructor call into sparse_zoo_stub? I think those, combined with allowing a Recipe class to be passed in, would cover all bases and give us enough flexibility to document an easy flow. It would also let us identify each recipe by its stub through the UIs with a simple copy and paste, and additionally document flows for specific use cases in the recipe markdowns themselves.

@bfineran (Contributor, Author):

Updated these changes so Managers can accept stubs or sparsezoo.objects.OptimizationRecipe objects. Additionally, all logic for recipe types has been moved to SparseZoo by supporting query strings, as we discussed.
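A rough usage sketch of the two entry points described above (import paths and the recipe-fetching step are assumptions for illustration):

from sparseml.pytorch.optim import ScheduledModifierManager  # import path assumed
from sparsezoo.objects import OptimizationRecipe

# 1) load directly from a SparseZoo stub, with the recipe type as a query param
manager = ScheduledModifierManager.from_yaml(
    "zoo:path/to/stub?recipe_type=transfer"
)

# 2) or pass an already-resolved SparseZoo recipe object
recipe: OptimizationRecipe = ...  # fetched via the sparsezoo APIs (step assumed)
manager = ScheduledModifierManager.from_yaml(recipe)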

@bfineran requested a review from a team February 17, 2021 20:35
@bfineran merged commit a32a853 into main Feb 18, 2021
@bfineran deleted the zoo-recipes branch February 18, 2021 22:45
markurtz added a commit that referenced this pull request Feb 25, 2021
* GA code, toctree links (#61)

- added tracking for docs output
- added help links for docs output

* Update README.md (#64)

removed placeholder reference to comingsoon repo in favor of active repo

* add decorator for flaky tf sparsity tests (#65)

* enable modifier groups in SparseML recipes (#66)

* enable modifier groups in SparseML recipes
* unit tests for YAML modifier list loading

* make build argument for nightly builds (#63)

* rst url syntax correction (#67)

correcting double slash at the end of URLs with updates to index.rst pre-compilation

* match types explicitly in torch qat quant observer wrappers (#68)

* docs updates (#71)

enhancing left nav for Help; after this merge, the docs need to be rebuilt for this repo so docs.neuralmagic.com can be refreshed. cc @markurtz

* Rename KSModifier to PruningModifier (#76)

* Rename ConstantKSModifier to ConstantPruningModifier

* Rename GradualKSModifier to GMPruningModifier

* Fix broken link for Optimization Recipes (#75)

* Serialize/deserialize MaskedLayer (#69)

* Serialize/deserialize MaskedLayer

* Remove unused symbols

* Register pruning scheduler classes for serialization

* removed ScheduledOptimizer, moved logic to ScheduledModifierManager (#77)

* load recipes directly from SparseZoo (#72)

* load recipes into Managers from sparsezoo stubs
* moving recipe_type handling to SparseZoo only, supporting loading SparseZoo recipe objects

* Revert "removed ScheduledOptimizer, moved logic to ScheduledModifierManager (#77)" (#80)

This reverts commit 6073abb.

* Update for 0.1.1 release (#82)

* Update for 0.1.1 release
- update python version to 0.1.1
- setup.py add in version parts and _VERSION_MAJOR_MINOR for more flexibility with dependencies between neural magic packages
- add in deepsparse optional install pathway

* missed updating version to 0.1.1

* rename examples directory to integrations (#78)

* rwightman/pytorch-image-models integration (#70)

* load checkpoint file based on sparsezoo recipe in pytorch_vision script (#83)

* ultralytics/yolov5 integration (#73)

* pytorch sparse quantized transfer learning notebook (#81)

* load qat onnx models for conversion from file path (#86)

* Sparsification update (#84)

* Sparsification update
- update sparsification descriptions and move to preferred verbiage

* update from comments on deepsparse for sparsification

* Update README.md

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update README.md

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update README.md

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update README.md

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update docs/source/index.rst

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update docs/source/index.rst

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update docs/source/index.rst

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update docs/source/index.rst

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* Update docs/source/recipes.md

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* fix links in index.rst from reviewed content

* update overviews and taglines from doc

Co-authored-by: Jeannie Finks <74554921+jeanniefinks@users.noreply.github.com>

* blog style readme for torch sparse-quant TL notebook (#85)

Co-authored-by: Jeannie Finks (NM) <74554921+jeanniefinks@users.noreply.github.com>
Co-authored-by: Benjamin Fineran <bfineran@users.noreply.github.com>
Co-authored-by: Eldar Kurtic <eldar.ciki@gmail.com>
Co-authored-by: Tuan Nguyen <tuan@neuralmagic.com>
markurtz pushed a commit that referenced this pull request Sep 1, 2021
