Create unet_lstm.py #207

Merged
merged 21 commits into neuronets:master on Aug 12, 2022

Conversation

@Aakanksha-Rana (Member) commented on Jan 19, 2022

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

Summary

This model supports 3D time-series processing.
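For context, a minimal sketch of what 3D time-series processing can look like in Keras, assuming a channels-last (time, depth, height, width, channels) input and placeholder filter/kernel sizes; this is an illustration only, not the unet_lstm.py implementation in this PR (tf.keras.layers.ConvLSTM3D requires TF >= 2.6):

```python
# Minimal sketch (layer sizes and input shape are assumptions); not the PR's unet_lstm.py.
import tensorflow as tf

def tiny_convlstm3d(input_shape=(4, 32, 32, 32, 1), n_classes=1):
    """Toy model: one ConvLSTM3D layer over a sequence of 3D volumes."""
    inputs = tf.keras.Input(shape=input_shape)  # (time, depth, height, width, channels)
    x = tf.keras.layers.ConvLSTM3D(8, kernel_size=3, padding="same",
                                   return_sequences=False)(inputs)
    outputs = tf.keras.layers.Conv3D(n_classes, 1, activation="sigmoid")(x)
    return tf.keras.Model(inputs, outputs)

model = tiny_convlstm3d()
model.summary()
```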

Checklist

  • I have added tests to cover my changes
  • I have updated documentation (if necessary)

Acknowledgment

  • I acknowledge that this contribution will be available under the Apache 2 license.

@satra (Contributor) left a comment


There are some lines with TODOs related to a max pooling layer.

Also, it would be helpful to know why the input shapes are flipped in the creation of the data vs. the model, i.e., why can't they be the same? In the code there seem to be a lot of repeats; could those be compressed through some loops and parameters, or partial funcs?

Please add docstrings for the parameters and a reference to the paper/original model.

@Aakanksha-Rana (Member, Author) commented on May 13, 2022

> There are some lines with TODOs related to a max pooling layer.
>
> Also, it would be helpful to know why the input shapes are flipped in the creation of the data vs. the model, i.e., why can't they be the same? In the code there seem to be a lot of repeats; could those be compressed through some loops and parameters, or partial funcs?
>
> Please add docstrings for the parameters and a reference to the paper/original model.

  • TensorFlow doesn't provide 4D pooling/upsampling layers yet. We might need to wait until they add support for 4D layers, or it would be great if someone wants to add that to nobrainer (let's maybe open an issue for that?). A spatial-only workaround is sketched below.
  • It is not flipped; the input size is (32, 32, 32, 32). The other value is just showing the batch size or class during initialization.
  • Yeah, it can be compressed; see the partial-function sketch after this list.
  • I will add the docstring.
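A common spatial-only workaround for the missing 4D pooling, assuming a channels-last (batch, time, depth, height, width, channels) layout: wrap MaxPooling3D in TimeDistributed so each time step is pooled over its spatial dimensions. This is a sketch, not code from this PR, and it does not pool along the time axis:

```python
# Sketch: spatial-only pooling for sequences of 3D volumes; assumes
# channels-last layout (batch, time, depth, height, width, channels).
import tensorflow as tf

inputs = tf.keras.Input(shape=(4, 32, 32, 32, 1))
# Pool each 3D volume in the sequence independently; the time axis is untouched.
pooled = tf.keras.layers.TimeDistributed(
    tf.keras.layers.MaxPooling3D(pool_size=2))(inputs)
print(pooled.shape)  # (None, 4, 16, 16, 16, 1)
```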

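And a hedged sketch of the compression idea using functools.partial plus a loop; the layer type and filter counts here are placeholders for illustration, not the actual blocks in unet_lstm.py:

```python
# Illustrative only: compressing repeated layer definitions with functools.partial.
import functools
import tensorflow as tf

# Fix the arguments shared by every conv block once.
Conv = functools.partial(tf.keras.layers.Conv3D, kernel_size=3,
                         padding="same", activation="relu")

inputs = tf.keras.Input(shape=(32, 32, 32, 1))
x = inputs
# One loop replaces several near-identical blocks; only the filter count varies.
for filters in (8, 16, 32):
    x = Conv(filters)(x)
model = tf.keras.Model(inputs, x)
```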
@satra merged commit e3e7113 into neuronets:master on Aug 12, 2022