birdflow metadata: hyperparameters vs. length of loss_values #81

Closed

slager opened this issue May 8, 2023 · 4 comments · Fixed by #82
Comments

slager (Contributor) commented May 8, 2023

There seems to be a mismatch between training_steps and the length of loss_values. I specified 600 steps for the model fit, so I think the 1000 may be a bug.

> str(bf[['metadata']][c('hyperparameters', 'loss_values')])
List of 2
 $ hyperparameters:List of 7
  ..$ dist_pow      : num 0.4
  ..$ dist_weight   : num 1e-04
  ..$ ent_weight    : num 0.001
  ..$ learning_rate : num 0.1
  ..$ obs_weight    : num 1
  ..$ rng_seed      : int 17
  ..$ training_steps: int 1000
 $ loss_values    :'data.frame':	600 obs. of  4 variables:
  ..$ dist : num [1:600] 128 126 125 124 123 ...
  ..$ ent  : num [1:600] 331 331 332 332 333 ...
  ..$ obs  : num [1:600] 0.345 0.339 0.333 0.326 0.32 ...
  ..$ total: num [1:600] 1.063 1.037 1.011 0.985 0.961 ...
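
A quick way to reproduce the mismatch, as a minimal sketch assuming `bf` is the BirdFlow object imported above:

hp <- bf[['metadata']][['hyperparameters']]
lv <- bf[['metadata']][['loss_values']]
hp$training_steps               # reports 1000 for this fit
nrow(lv)                        # 600 rows, matching the 600 steps actually requested
hp$training_steps == nrow(lv)   # FALSE, i.e. the two metadata entries disagree
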
slager (Contributor, author) commented May 8, 2023

It's possible the whole hyperparameters section is hard-coded in the hdf5 files.

> i <- 1
> ll_df$model[i]
[1] "buwtea_2021_93km_obs1.0_ent0.004_dist0.016_pow0.3.hdf5"
> bf <- import_birdflow(file.path(my_dir, ll_df$model[i]))
> str(bf[['metadata']][c('hyperparameters', 'loss_values')])
List of 2
 $ hyperparameters:List of 7
  ..$ dist_pow      : num 0.4
  ..$ dist_weight   : num 1e-04
  ..$ ent_weight    : num 0.001
  ..$ learning_rate : num 0.1
  ..$ obs_weight    : num 1
  ..$ rng_seed      : int 17
  ..$ training_steps: int 1000
 $ loss_values    :'data.frame':	600 obs. of  4 variables:
  ..$ dist : num [1:600] 128 126 125 124 123 ...
  ..$ ent  : num [1:600] 331 331 332 332 333 ...
  ..$ obs  : num [1:600] 0.345 0.339 0.333 0.326 0.32 ...
  ..$ total: num [1:600] 1.063 1.037 1.011 0.985 0.961 ...
> i <- 2
> 
> ll_df$model[i]
[1] "buwtea_2021_93km_obs1.0_ent0.0035_dist0.014_pow0.3.hdf5"
> bf <- import_birdflow(file.path(my_dir, ll_df$model[i]))
> str(bf[['metadata']][c('hyperparameters', 'loss_values')])
List of 2
 $ hyperparameters:List of 7
  ..$ dist_pow      : num 0.4
  ..$ dist_weight   : num 1e-04
  ..$ ent_weight    : num 0.001
  ..$ learning_rate : num 0.1
  ..$ obs_weight    : num 1
  ..$ rng_seed      : int 17
  ..$ training_steps: int 1000
 $ loss_values    :'data.frame':	600 obs. of  4 variables:
  ..$ dist : num [1:600] 128 127 125 124 123 ...
  ..$ ent  : num [1:600] 331 331 332 332 333 ...
  ..$ obs  : num [1:600] 0.345 0.338 0.332 0.326 0.32 ...
  ..$ total: num [1:600] 0.973 0.95 0.927 0.905 0.884 ...
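
A hedged sketch that checks the hard-coding hypothesis directly, reusing `my_dir` and `ll_df` from the session above. The two files were fit with different ent/dist weights, so identical stored hyperparameters would indicate they are not being written per model:

hps <- lapply(ll_df$model[1:2], function(m) {
  bf <- import_birdflow(file.path(my_dir, m))
  bf[['metadata']][['hyperparameters']]
})
identical(hps[[1]], hps[[2]])   # TRUE here, despite the differing fit settings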

slager (Contributor, author) commented May 8, 2023

My Python PR fixes it on the Python end, I think. import_birdflow may want to handle normalized differently to create a logical instead of a factor; see here:

  ..$ hyperparameters         :List of 8
  .. ..$ dist_pow      : num 0.1
  .. ..$ dist_weight   : num 0.008
  .. ..$ ent_weight    : num 0.0015
  .. ..$ learning_rate : num 0.1
  .. ..$ normalized    : Factor w/ 2 levels "FALSE","TRUE": 2
  .. ..$ obs_weight    : num 1
  .. ..$ rng_seed      : int 17
  .. ..$ training_steps: int 600
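
As an illustration of the suggestion above, a minimal post-import workaround in R (assuming `bf` is an imported BirdFlow object whose `normalized` entry arrived as a two-level factor):

norm <- bf[['metadata']][['hyperparameters']][['normalized']]
# Convert via character so the factor labels ("TRUE"/"FALSE"), not the
# underlying integer level codes, determine the logical value.
bf[['metadata']][['hyperparameters']][['normalized']] <- as.logical(as.character(norm))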

ethanplunkett (Contributor) commented

Thanks. I'll convert the "normalized" hyper-parameter to logical while reading from the hdf5. I don't think any other changes are necessary on the R side.
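
A minimal sketch of what that read-time coercion could look like, assuming the hyperparameters are read from the hdf5 with rhdf5::h5read; the group path "metadata/hyperparameters" and the helper name are assumptions for illustration, not the actual BirdFlowR internals:

library(rhdf5)

# Hypothetical helper: read the hyperparameters group and coerce `normalized`
# to logical before it ends up as a factor in the metadata list.
read_hyperparameters <- function(hdf5_path) {
  hp <- h5read(hdf5_path, "metadata/hyperparameters")   # group path assumed
  if (!is.null(hp$normalized)) {
    hp$normalized <- as.logical(as.character(hp$normalized))
  }
  hp
}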

ethanplunkett added a commit that referenced this issue May 9, 2023
…erparameters-vs-length-of-loss_values

Addresses #81 birdflow metadata hyperparameters vs length of loss values.
Removes legacy importation code.