
Fix LMU swapping behaviour during training #28

Merged

merged 1 commit into master from fix-swap-fit on Nov 16, 2020

Conversation

drasmuss (Member)

In some situations (e.g. calling .fit with None for the time dimension) the layer is called with a mix of defined and undefined input timesteps, which breaks the automatic swapping logic in keras_lmu.LMU (between the RNN and FFT implementations). This change fixes the swap type at build time instead.

Fixes #27
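The fix described above can be sketched in plain Python. This is an illustrative model of the pattern ("decide the implementation once in build() from the static input shape, rather than on every call"), not the actual keras_lmu source; the class and attribute names here are hypothetical.

```python
# Hypothetical sketch of the fix: instead of choosing between the RNN and
# FFT implementations on every call (which breaks when some calls see a
# defined timestep count and others see None), the choice is made once in
# build() from the static input shape and then never re-inspected.

class LMUSketch:
    def __init__(self):
        self.impl = None  # swap type, fixed at build time rather than per call

    def build(self, input_shape):
        # input_shape = (batch, timesteps, features); timesteps may be None
        timesteps = input_shape[1]
        # the FFT implementation requires a known sequence length up front
        self.impl = "fft" if timesteps is not None else "rnn"

    def call(self, inputs):
        # no per-call shape inspection: the swap type is already fixed
        assert self.impl is not None, "layer must be built before calling"
        return "running %s implementation" % self.impl


layer = LMUSketch()
layer.build((None, 100, 8))   # known timesteps -> FFT path
print(layer.impl)             # fft

layer = LMUSketch()
layer.build((None, None, 8))  # unknown timesteps -> RNN path
print(layer.impl)             # rnn
```

Under this design, a later call that happens to see a different (known or unknown) timestep count cannot flip the implementation mid-training, which is the failure mode reported in #27.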

Base automatically changed from fix-hidden-memory-none to master November 16, 2020 15:15
In some situations (e.g. `.fit`) the layer gets called
with a mix of defined and undefined input shapes, which
messes up the autoswapping logic. This changes it so that
the swap type is fixed at build time.
@drasmuss drasmuss changed the title Fix swap fit Fix LMU swapping behaviour during training Nov 16, 2020
@gsmalik gsmalik self-requested a review November 16, 2020 18:05
@drasmuss drasmuss merged commit ebc9b08 into master Nov 16, 2020
@drasmuss drasmuss deleted the fix-swap-fit branch November 16, 2020 19:53
Successfully merging this pull request may close these issues.

AttributeError: 'LMUFFT' object has no attribute 'kernel'