
Can we have example of utilizing one-shot NAS for DNN models? #3593

Closed
ZhuJohnson opened this issue Apr 29, 2021 · 2 comments
@ZhuJohnson

What would you like to be added:
More examples of one-shot NAS practice for DNN/MLP models.
Why is this needed:
I didn't find this kind of example in the repo.
Without this feature, how does current nni work?

Components that may involve changes:

Brief description of your proposal if any:
I'd like to apply NAS to search the model structure for an MLP (to search how many hidden layers, how many neurons in each layer, the activation function, the dropout rate, and the layer type: dense or dense block), but I cannot find any examples of this problem in the repo.
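The search space described above can be sketched in plain Python. This is a hypothetical illustration only, not the actual NNI API: the names `SEARCH_SPACE` and `sample_architecture` are made up for this example, and a real one-shot NAS would train a supernet rather than randomly sampling candidates.

```python
import random

# Hypothetical search space covering the dimensions asked about:
# number of hidden layers, neurons per layer, activation function,
# dropout rate, and layer type (dense vs. dense block).
SEARCH_SPACE = {
    "num_hidden_layers": [1, 2, 3, 4],
    "hidden_units": [64, 128, 256, 512],
    "activation": ["relu", "tanh", "sigmoid"],
    "dropout_rate": [0.0, 0.25, 0.5],
    "layer_type": ["dense", "dense_block"],
}

def sample_architecture(space, rng=random):
    """Draw one candidate MLP architecture from the search space."""
    num_layers = rng.choice(space["num_hidden_layers"])
    return {
        "layers": [
            {
                "type": rng.choice(space["layer_type"]),
                "units": rng.choice(space["hidden_units"]),
                "activation": rng.choice(space["activation"]),
                "dropout": rng.choice(space["dropout_rate"]),
            }
            for _ in range(num_layers)
        ]
    }

if __name__ == "__main__":
    # Each call yields one concrete MLP configuration to build and evaluate.
    print(sample_architecture(SEARCH_SPACE))
```

In NNI's actual mutation APIs, each of these choices would typically become a per-layer mutable (e.g., a layer choice among candidate modules), which is what allows a one-shot algorithm to share weights across candidates.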

@ultmaster
Contributor

Good suggestion. Some of what you want fits naturally into the current one-shot algorithms (e.g., the activation function), while for others we need to find papers or even improvise, since the current one-shot NAS might not work at all.

In summary, we have to investigate what is involved. For now, let's add it to our future plan.

@ultmaster ultmaster added this to the Backlog milestone Apr 30, 2021
@ultmaster ultmaster self-assigned this Apr 30, 2021
@scarlett2018
Member

This is already provided in a later release; closing the issue.
