What would you like to be added:
More examples of one-shot NAS practice for DNN/MLP models
Why is this needed:
I didn't find this kind of example in the repo.
Without this feature, how does current nni work:
Components that may involve changes:
Brief description of your proposal if any:
I'd like to apply NNI's NAS functionality to search the model structure of an MLP (how many hidden layers, how many neurons in each layer, the activation function, the dropout rate, and the layer type — dense or dense block), but I cannot find any example of this problem in the repo. A rough sketch of the kind of search space I have in mind is shown below.
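Something like the following is what I mean — a minimal sketch assuming NNI's Retiarii mutation primitives (LayerChoice / ValueChoice); the exact module paths and names may differ between NNI versions, and I'm not sure every piece is supported by current one-shot strategies:

```python
# Hypothetical sketch of an MLP search space with NNI Retiarii.
# Assumes nni.retiarii.nn.pytorch re-exports the torch.nn modules used here;
# names/paths may vary by NNI version.
import nni.retiarii.nn.pytorch as nn
from nni.retiarii import model_wrapper

@model_wrapper
class MLPSpace(nn.Module):
    def __init__(self, in_features=784, num_classes=10):
        super().__init__()
        # Width of each hidden layer is a searchable value.
        hidden1 = nn.ValueChoice([64, 128, 256], label='hidden1')
        hidden2 = nn.ValueChoice([64, 128, 256], label='hidden2')

        self.fc1 = nn.Linear(in_features, hidden1)
        # Activation function is a searchable layer choice.
        self.act1 = nn.LayerChoice([nn.ReLU(), nn.Tanh(), nn.Sigmoid()], label='act1')
        # Dropout rate is a searchable value.
        self.drop1 = nn.Dropout(nn.ValueChoice([0.0, 0.25, 0.5], label='drop1'))

        self.fc2 = nn.Linear(hidden1, hidden2)
        self.act2 = nn.LayerChoice([nn.ReLU(), nn.Tanh(), nn.Sigmoid()], label='act2')
        self.drop2 = nn.Dropout(nn.ValueChoice([0.0, 0.25, 0.5], label='drop2'))

        self.head = nn.Linear(hidden2, num_classes)

    def forward(self, x):
        x = self.drop1(self.act1(self.fc1(x)))
        x = self.drop2(self.act2(self.fc2(x)))
        return self.head(x)
```

The number of hidden layers could presumably be expressed with something like Retiarii's Repeat, and dense vs. dense-block with another LayerChoice, but I don't know whether current one-shot algorithms handle variable layer widths and depths well — hence the request for an official example.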
Good suggestion. Some of what you want can fit naturally into the current one-shot algorithms (e.g., the activation function), while for the rest we will need to find relevant papers and possibly improvise, as current one-shot NAS might not work at all.
In summary, we need to investigate this further. For now, let's put it into our future plan.