
[WIP] Mini-batch training in VAE #10

Merged 39 commits into master on Apr 20, 2018
Conversation

wehlutyk (Collaborator)
Work-in-progress pull request for #3, updated as work advances.

Feel free to comment or push other commits!

* Masking is disabled: it seems to require more implementation work,
  and we're not planning on using it.
* A GC layer that doesn't want to subsample can simply use a gather
  input equal to all node indices.
* The gather input is fed to the layer call; it is a 1-D tensor, so it is
  not compatible with Keras shape constraints (which use at least 2-D
  tensors to include batch and inner dimensions). We will instead feed it
  through native TF placeholders.

This lets us pass a `feed_dict` to session calls inside the predict and
fit functions. The updated notebook also shows how to use it.
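The gather mechanism above can be illustrated with a minimal NumPy sketch. The `gc_layer` function and its signature are purely illustrative (not the PR's actual API): it shows how a 1-D gather index subsamples a graph-convolution output, and how passing all node indices makes the gather a no-op.

```python
import numpy as np

def gc_layer(adj, features, weights, gather):
    """One illustrative GC propagation step: adj @ features @ weights,
    then keep only the rows selected by the 1-D gather index."""
    out = adj @ features @ weights   # full-graph propagation
    return out[gather]               # subsample nodes for this mini-batch

n_nodes, in_dim, out_dim = 4, 3, 2
rng = np.random.default_rng(0)
adj = np.eye(n_nodes)                      # trivial adjacency for the demo
feats = rng.normal(size=(n_nodes, in_dim))
w = rng.normal(size=(in_dim, out_dim))

# Gather over all node indices: the layer opts out of subsampling.
full = gc_layer(adj, feats, w, np.arange(n_nodes))
# Gather over a mini-batch of 2 nodes.
sub = gc_layer(adj, feats, w, np.array([0, 2]))

assert full.shape == (n_nodes, out_dim)
assert sub.shape == (2, out_dim)
```

In the PR itself the gather index travels through a native TF placeholder rather than a Keras input, precisely because Keras expects at least 2-D (batch, features) shapes.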

wehlutyk commented Apr 13, 2018

Things left to do:

  • update layers.Bilinear to not use batch_size
  • update ae.build_p, and simplify the gcn-ae notebook accordingly
  • implement batches() to infer what it needs from the model
  • adapt the normalization of A for non-symmetric batch adjacency matrices
  • implement ae.Model.fit_feeding_generator()
  • scale each loss properly
  • implement dag.restrict() and update the notebook to show predictions
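On the normalization item: a batch slice of A (batch rows x all columns) is rectangular, so the usual symmetric normalization no longer applies directly. One possible generalization, sketched here and not necessarily what the PR adopts, is to normalize with row and column degrees separately, D_row^{-1/2} A D_col^{-1/2}:

```python
import numpy as np

def normalize_batch_adj(a_batch):
    """Normalize a possibly non-square batch slice of an adjacency
    matrix as D_row^{-1/2} @ A @ D_col^{-1/2}, using row-sum and
    column-sum degrees separately (zero-degree rows/columns stay zero).
    Illustrative only; degrees here are computed from the slice itself,
    not from the full graph."""
    row_deg = a_batch.sum(axis=1)
    col_deg = a_batch.sum(axis=0)
    d_row = np.where(row_deg > 0, row_deg ** -0.5, 0.0)
    d_col = np.where(col_deg > 0, col_deg ** -0.5, 0.0)
    return d_row[:, None] * a_batch * d_col[None, :]

# On a symmetric square matrix this reduces to the classic
# D^{-1/2} A D^{-1/2} normalization.
a = np.ones((2, 2))
assert np.allclose(normalize_batch_adj(a), 0.5 * np.ones((2, 2)))
```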

…d_loss

is right for a non-tiled adjacency prediction
In particular, `Codec.logprobability()`, when called from
`Codec.estimated_pred_loss()`, should return something that has shape
(batch, samples), i.e. it should flatten and average over all the other
inner dimensions.
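The shape contract above can be sketched in NumPy (the function name is illustrative, not the codebase's actual helper): flatten every dimension after (batch, samples) and average over it.

```python
import numpy as np

def average_inner_dims(logp):
    """Collapse a (batch, samples, *inner) log-probability array to
    shape (batch, samples) by flattening and averaging over all the
    trailing inner dimensions, as described for Codec.logprobability()."""
    batch, samples = logp.shape[:2]
    return logp.reshape(batch, samples, -1).mean(axis=-1)

logp = np.ones((2, 3, 4, 5))          # (batch, samples, inner...)
assert average_inner_dims(logp).shape == (2, 3)
```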
@wehlutyk wehlutyk merged commit 3b6461c into master Apr 20, 2018
@wehlutyk wehlutyk deleted the issue-3-mini-batching branch April 20, 2018 13:43
@wehlutyk wehlutyk restored the issue-3-mini-batching branch April 20, 2018 13:43
@wehlutyk wehlutyk deleted the issue-3-mini-batching branch April 20, 2018 13:44
wehlutyk added a commit that referenced this pull request Jul 10, 2018
 
Merge pull request #10 from ixxi-dante/issue-3-mini-batching

[WIP] Mini-batch training in VAE. Closes #3.