
NAS for (Variational) Autoencoders #573

Closed
tik0 opened this issue Mar 10, 2019 · 4 comments
tik0 commented Mar 10, 2019

Feature Description

Support training of predefined models with architectural constraints (e.g. bottleneck layers) and with additional losses.

Reason

It is unclear how to use AutoKeras on models with particular structural constraints, such as autoencoders or other bottleneck networks.
Furthermore, a Variational Autoencoder has an additional regularizer (the KL divergence term) and a sampling layer.
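For reference, the two pieces that make the VAE hard to express in a generic search space, the sampling (reparameterization) layer and the KL regularizer, can be sketched in plain NumPy. The function names here are illustrative only, not part of any AutoKeras API:

```python
import numpy as np

def sample_latent(mu, log_var, rng):
    """Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I).

    Sampling this way keeps z differentiable w.r.t. mu and log_var, which
    is exactly the layer a plain feed-forward search space cannot express.
    """
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_divergence(mu, log_var):
    """KL(q(z|x) || N(0, I)) -- the extra regularizer in the VAE loss."""
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var), axis=-1)

rng = np.random.default_rng(0)
mu = np.zeros((4, 2))        # batch of 4, latent dimension 2
log_var = np.zeros((4, 2))   # sigma = 1 everywhere
z = sample_latent(mu, log_var, rng)
print(z.shape)                      # (4, 2)
print(kl_divergence(mu, log_var))   # all zeros: q already matches the prior
```

The KL term is added to the reconstruction loss during training, which is the "additional loss" a generic NAS API would need to accept.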

Solution

An API along these lines would work:

# define model constraints first
ak_model = ak.GenericModel(my_keras_model)
ak_model.fit(x_train, y_train, time_limit=12 * 60 * 60)  # y_train may also be None (unsupervised)
ak_model.final_fit(x_train, y_train, x_test, y_test, retrain=True)
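The `y_train=None` case could be interpreted as "use the input as the reconstruction target", as in an autoencoder. A minimal sketch of that convention follows; `GenericModel` and its `fit` signature are the hypothetical API from this proposal, not an existing AutoKeras class, and `StubModel` merely stands in for a Keras model to show the call flow:

```python
class GenericModel:
    """Hypothetical wrapper sketching the proposed API (not real AutoKeras code)."""

    def __init__(self, base_model):
        self.base_model = base_model

    def fit(self, x_train, y_train=None, time_limit=None):
        # Autoencoder convention: with no labels, reconstruct the input itself.
        if y_train is None:
            y_train = x_train
        # A real implementation would run the NAS loop here under the given
        # time budget, preserving the constraints of the wrapped model.
        return self.base_model.fit(x_train, y_train)

class StubModel:
    """Placeholder for a Keras model; records the targets it was given."""
    def fit(self, x, y):
        self.last_targets = y
        return self

model = GenericModel(StubModel())
trained = model.fit([1, 2, 3])   # y_train omitted -> targets become the inputs
print(trained.last_targets)      # [1, 2, 3]
```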

Alternative Solutions

Additional Context

tik0 changed the title from "Example for NAS on (Variational) Autoencoders" to "NAS for (Variational) Autoencoders" on Mar 10, 2019

stale bot commented Aug 12, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the wontfix label Aug 12, 2019
stale bot closed this as completed Aug 19, 2019

maechler commented Mar 4, 2020

@tik0 Have you found out anything on this? Were you able to use AutoKeras to create an autoencoder, or did you end up using something else?


daviembrito commented Sep 2, 2022

I was trying to create an LSTM autoencoder but could not figure out how to do it. Have you found a solution in the meantime?


maechler commented Sep 5, 2022

@daviembrito I ended up using a generic hyperparameter optimizer with a simple, handcrafted, chain-structured search space. It worked reasonably well for my use case, but it introduces a large human bias, since the space of possible network architectures is quite limited.

You can have a look at it here:

https://github.com/maechler/a2e
https://github.com/maechler/a2e/blob/master/experiments/automl/deep_easing_feed_forward_dropout.py
https://github.com/maechler/a2e/blob/master/a2e/model/keras/_feed_forward.py#L111
https://github.com/maechler/a2e/blob/master/a2e/model/keras/_lstm.py#L6
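To illustrate what such a handcrafted, chain-structured search space can look like, here is a tiny sampler over symmetric autoencoder layer widths. This is a simplified sketch under my own assumptions, not code from the a2e repository:

```python
import random

def sample_architecture(rng, input_dim=64, max_depth=3, widths=(8, 16, 32)):
    """Sample a symmetric, chain-structured autoencoder architecture.

    Encoder widths shrink monotonically toward the bottleneck, and the
    decoder mirrors them -- the handcrafted constraint that a generic
    NAS tool has no obvious way to express.
    """
    depth = rng.randint(1, max_depth)
    encoder = sorted(rng.sample(widths, depth), reverse=True)  # shrinking widths
    decoder = encoder[-2::-1] + [input_dim]                    # mirror + output layer
    return [input_dim] + encoder + decoder

rng = random.Random(42)
for _ in range(3):
    # Each sample is a full layer-width chain, e.g. [64, 32, 16, 32, 64].
    print(sample_architecture(rng))
```

Enforcing the monotone shrink and the mirrored decoder keeps every candidate a valid autoencoder, but it is also exactly the human bias mentioned above: the optimizer only ever sees this narrow family of chains.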
