umap/umap/parametric_umap.py
Line 54 in a142954
I believe the default specification of `loss_report_frequency` has a hidden impact on the pass-through to `history = model.fit()`:
```python
history = self.parametric_model.fit(
    edge_dataset,
    epochs=self.loss_report_frequency * self.n_training_epochs,
    steps_per_epoch=steps_per_epoch,
)
```
This leads to situations where a user specifies training for N epochs, but keras `fit()` reports training for N * 10 epochs.
Example:

```python
embedder = ParametricUMAP(
    encoder=encoder,
    decoder=decoder,
    dims=dims,
    n_training_epochs=1,  # explicit single training epoch
    parametric_reconstruction=True,
    reconstruction_validation=test_images,
    verbose=True,
    loss_report_frequency=2,
)
# 45 min per epoch, Epoch 1/2 <- specs 2 passes over data
# 193/3622 [>.............................] - ETA: 39:27
```
vs what happens with default params:
```python
embedder = ParametricUMAP(
    encoder=encoder,
    decoder=decoder,
    dims=dims,
    n_training_epochs=1,  # explicit single training epoch
    parametric_reconstruction=True,
    reconstruction_validation=test_images,
    verbose=True,
)
# 10 epochs (n_training_epochs * 10, the default for loss_report_frequency)
```
`loss_report_frequency` also directly impacts `steps_per_epoch` in the calculation at line 332:

umap/umap/parametric_umap.py
Line 337 in a142954

This makes sense (it obviously has to), but it's not clear that it will change what `fit()` reports back as the training epochs.
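To make the interaction concrete, here is an illustrative sketch (not the actual library code; the helper name is hypothetical) of how `loss_report_frequency` rescales the two arguments passed to keras `fit()`:

```python
# Hypothetical helper mirroring ParametricUMAP's parameter names, for
# illustration only: loss_report_frequency multiplies the epoch count
# while dividing the steps per epoch, so total work is unchanged but
# keras reports more, shorter "epochs".
def fit_args(n_training_epochs, total_steps_per_data_pass, loss_report_frequency=10):
    epochs = n_training_epochs * loss_report_frequency
    steps_per_epoch = total_steps_per_data_pass // loss_report_frequency
    return epochs, steps_per_epoch

# With the default loss_report_frequency=10, one requested training epoch
# is reported by fit() as 10 keras epochs:
print(fit_args(n_training_epochs=1, total_steps_per_data_pass=3620))
# -> (10, 362): same total step count, but the log reads "Epoch 1/10".
```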
There are discussions about other ways to deal with "printing loss updates during epochs" here: https://stackoverflow.com/questions/52205315/plot-loss-evolution-during-a-single-epoch-in-keras and here: keras-team/keras#2850.
Those might eventually be more elegant and obvious ways to report intra-epoch `fit()` updates.
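As a rough sketch of what those threads suggest, a keras callback can log loss every N batches without inflating the epoch count; `LogEveryNBatches` is a made-up name here, not part of umap or keras:

```python
import tensorflow as tf

class LogEveryNBatches(tf.keras.callbacks.Callback):
    """Print the running loss every `every_n` training batches."""

    def __init__(self, every_n=100):
        super().__init__()
        self.every_n = every_n

    def on_train_batch_end(self, batch, logs=None):
        # `logs` carries the running metrics for the current epoch.
        if logs and batch % self.every_n == 0:
            print(f"batch {batch}: loss={logs['loss']:.4f}")

# Usage sketch: keep epochs equal to the requested n_training_epochs and
# get intra-epoch loss updates from the callback instead:
# model.fit(edge_dataset, epochs=n_training_epochs,
#           callbacks=[LogEveryNBatches(100)], verbose=0)
```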
It's mostly a low-impact, low-priority API clarity issue.