```python
for level in range(1, self._levels):
    for i_dl in range(self._dense_layers - 1):
        hidden = self.get('h{}_dense'.format(5 + (level - 1) * self._dense_layers + i_dl),
                          tfkl.Dense, self._embed_size, activation=tf.nn.relu)(hidden)
    if self._dense_layers > 0:
        hidden = self.get('h{}_dense'.format(4 + level * self._dense_layers),
                          tfkl.Dense, feat_size, activation=None)(hidden)
    layer = hidden
```
From line 39 onwards in the cnn.py Encoder(), the effective depth of these layers increases with the level, because the `hidden` variable is overwritten rather than reset at each level. At large n_levels and n_enc_dense_layers this results in a very deep network mapping from the observation embedding to the latent space. I'm not sure it's intentional, and it doesn't seem to serve a purpose — is there a reason the higher latent spaces need a deeper function to map from the embedding?
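To make the issue concrete, here is a minimal standalone sketch (not the repo's code; the function name and counting logic are my own) that tallies how many Dense layers have been applied to `hidden` by the time each level's features are produced, assuming `hidden` carries over between levels as in the snippet above:

```python
def layers_applied_per_level(n_levels, n_dense_layers):
    """Count the Dense layers applied to `hidden` at each level,
    given that `hidden` is never reset between levels."""
    counts = []
    total = 0
    for level in range(1, n_levels):
        total += max(n_dense_layers - 1, 0)  # ReLU layers from the inner loop
        if n_dense_layers > 0:
            total += 1  # final linear projection to feat_size
        counts.append(total)
    return counts

# With 4 levels and 3 dense layers, the top level's features pass
# through 9 Dense layers rather than the 3 one might expect:
print(layers_applied_per_level(4, 3))  # [3, 6, 9]
```

If the cumulative depth is unintended, resetting `hidden` to the observation embedding at the start of each level iteration would give every level the same n_enc_dense_layers-deep mapping.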