
Remove self.trained_epochs (#134)
* Expose hyperparameters/change cuda logic

* Fix set_device/update documentation

* Remove self from discriminator

* Fix optimizers

* Remove self from discriminator

* Remove "_" from variables

* Remove self.trained_epochs variable

Co-authored-by: Carles Sala <carles@pythiac.com>
fealho and csala committed Mar 5, 2021
1 parent e93e00d commit 38f0d30
Showing 1 changed file with 0 additions and 2 deletions.
ctgan/synthesizers/ctgan.py (0 additions, 2 deletions)

@@ -152,7 +152,6 @@ def __init__(self, embedding_dim=128, generator_dim=(256, 256), discriminator_di
         self._log_frequency = log_frequency
         self._verbose = verbose
         self._epochs = epochs
-        self.trained_epochs = 0
         self.pac = pac

         if not cuda or not torch.cuda.is_available():

@@ -330,7 +329,6 @@ def fit(self, train_data, discrete_columns=tuple(), epochs=None):

         steps_per_epoch = max(len(train_data) // self._batch_size, 1)
         for i in range(epochs):
-            self.trained_epochs += 1
             for id_ in range(steps_per_epoch):

                 for n in range(self._discriminator_steps):
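The change removes a mutable instance counter (`self.trained_epochs`) that duplicated information the training loop already carries in its index. A minimal sketch of the resulting pattern, with hypothetical names (`Synthesizer`, a hard-coded batch size) that are not the actual CTGAN implementation:

```python
# Sketch only: illustrates dropping a redundant epoch counter, not real CTGAN code.
class Synthesizer:
    def __init__(self, epochs=300):
        self._epochs = epochs  # hyperparameter set once in __init__

    def fit(self, train_data, epochs=None):
        # Fall back to the configured number of epochs, as CTGAN's fit() does.
        epochs = self._epochs if epochs is None else epochs
        steps_per_epoch = max(len(train_data) // 500, 1)  # 500 = assumed batch size
        for i in range(epochs):            # the loop index already tracks progress;
            for _ in range(steps_per_epoch):  # no self.trained_epochs += 1 needed
                pass  # discriminator/generator update steps would run here
        return epochs  # total epochs trained, derivable without instance state


print(Synthesizer(epochs=3).fit(list(range(1000))))  # prints 3
```

Keeping the count out of instance state avoids stale values when `fit` is called more than once with different `epochs` arguments.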
