Study the val_loss of DCAE depending on its window and stride parameters #20
In GitLab by @lgs on Oct 29, 2020, 11:09

I have updated the 02_DCAE notebook to train it with a papermill experiment and a wandb sweep. I included the following code to read the parameters from _experiments_papermill_caler.ipynb: `ifnone(config.get('variable'), default_value)`. I have launched a training sweep to study the val_loss of the autoencoders with all possible combinations of the windows_size (range(24, 144, 12)) and stride (range(1, 12, 2)) parameters.
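A minimal sketch of the parameter-reading pattern described above. The `ifnone` helper mirrors the fastai-style utility (return a value unless it is `None`); the `config` dict, keys, and default values here are illustrative assumptions, not the project's actual code:

```python
def ifnone(value, default):
    """Return `value` if it is not None, else `default` (fastai-style helper)."""
    return default if value is None else value

# `config` stands in for the parameter dict injected by papermill / the wandb
# sweep; these keys and defaults are hypothetical placeholders.
config = {}  # e.g. a sweep run might inject {'windows_size': 48, 'stride': 3}

windows_size = ifnone(config.get('windows_size'), 48)
stride = ifnone(config.get('stride'), 1)
```

The point of the pattern is that the notebook runs unchanged both standalone (keys absent, defaults apply) and under papermill/wandb (injected values win).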
In GitLab by @lgs on Oct 29, 2020, 12:35

After the first sweep, we have 60 DCAE models, available here. Keeping only the results with a val_loss lower than 0.3 leaves a set of 14 models. Among these, the window size appears to matter more than the stride for obtaining a good-quality autoencoder: the graph shows that most models with val_loss <= 0.30 used windows below 60. For the stride there is no such clear pattern, but low strides (i.e., high redundancy in the windowing) seem to improve model quality. I will re-launch a sweep with window sizes in range(24, 72, 12) and strides in individual steps between 1 and 9, to check in more detail what is happening.
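The two sweep grids discussed above can be sketched with a Cartesian product; this is an illustrative reconstruction (the actual sweep is configured in wandb), but the combination counts follow directly from the stated ranges:

```python
from itertools import product

# First sweep: windows 24..132 step 12 (10 values) x strides 1,3,...,11 (6 values)
first_sweep = list(product(range(24, 144, 12), range(1, 12, 2)))

# Follow-up sweep: windows 24..60 step 12 (4 values) x strides 1..9 (9 values)
second_sweep = list(product(range(24, 72, 12), range(1, 10)))

print(len(first_sweep))   # 60 models, matching the 60 DCAE models from the first sweep
print(len(second_sweep))  # 36 models in the narrower follow-up
```

Note that the follow-up grid trades window-size coverage for finer stride resolution, which matches the observation that low strides, not large windows, correlate with val_loss <= 0.3.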
Sweeps should be redone; closing.
In GitLab by @lgs on Oct 29, 2020, 10:49