
v1.3.0

@stefan-m-lenz stefan-m-lenz released this 13 Oct 10:46

New features:

  • The batch size can now also be specified for fine-tuning in fitdbm and traindbm!. In fitdbm, it can be set separately for fine-tuning and pretraining via the arguments batchsizefinetuning (new) and batchsizepretraining.
  • Added the function top2latentdims, enabling convenient dimension reduction with DBMs
  • Added a new example for using DBMs for dimension reduction
  • Added the function blocksinnoise for simulating data sets with different subgroups and labels
  • Migrated continuous integration from Travis CI to GitHub Actions
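The new batch-size arguments and top2latentdims might be combined roughly as follows. This is a sketch, not verified against the released API: barsandstripes is the package's existing toy-data generator, and the keyword values, network size, and the assumption that top2latentdims maps each sample to two latent dimensions are illustrative guesses.

```julia
using BoltzmannMachines

# Toy binary data set: 100 samples, 16 variables (values assumed)
x = barsandstripes(100, 16)

# Separate batch sizes for pretraining and fine-tuning (new in v1.3.0)
dbm = fitdbm(x;
      nhiddens = [16, 8],
      batchsizepretraining = 10,
      batchsizefinetuning = 20)

# New convenience function for dimension reduction with DBMs:
# assumed here to return a two-dimensional representation per sample
latent = top2latentdims(dbm, x)
```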

Bug fix:

  • Argument optimizerpretraining in fitdbm is now respected. (Previously only the optimizer argument was used.)

Deprecation:

  • The argument learningrates in fitdbm has been renamed to learningratesfinetuning for clarity
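Migrating a call is a one-line rename (sketch; the vector of per-epoch learning rates shown here is an assumed example value):

```julia
# Deprecated since v1.3.0:
# fitdbm(x; learningrates = fill(0.05, 10))

# Preferred from v1.3.0 on:
fitdbm(x; learningratesfinetuning = fill(0.05, 10))
```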