
Fix MLP/Recurrent-based memory inference complications #512

Merged — 10 commits merged into main on Apr 11, 2023
Conversation

@kdgutier (Collaborator) commented Apr 7, 2023
Here I fix a minor bug affecting the MLP's output shape, introduced by the PL 2.0 main update.
The MLP fix is a simple reshape of the final forward predictions.
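A minimal sketch of what such a reshape looks like, assuming the post-PL-2.0 forward pass returns flattened predictions; the helper name and shapes are illustrative, not the actual neuralforecast code:

```python
import torch

def reshape_forecast(y_hat: torch.Tensor, horizon: int, n_outputs: int) -> torch.Tensor:
    # Hypothetical sketch: restore a flattened prediction tensor to the
    # expected [batch, horizon, n_outputs] layout before returning it.
    batch_size = y_hat.shape[0]
    return y_hat.reshape(batch_size, horizon, n_outputs)

# Example: a batch of 32 flattened 12-step univariate forecasts.
y_hat = torch.randn(32, 12 * 1)
out = reshape_forecast(y_hat, horizon=12, n_outputs=1)
print(out.shape)  # torch.Size([32, 12, 1])
```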

Additionally, I noticed a similar validation memory error, with an entirely different origin, affecting all Recurrent-based models.
The Recurrent-based fix adds a new inference_input_size parameter that lets models truncate the length of the input series at inference time, saving computation and memory.
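The truncation idea can be sketched as follows; the inference_input_size name comes from this PR, but the helper and tensor layout below are assumptions for illustration, not the library's internals:

```python
from typing import Optional

import torch

def truncate_for_inference(insample_y: torch.Tensor,
                           inference_input_size: Optional[int]) -> torch.Tensor:
    # Hypothetical sketch: keep only the most recent `inference_input_size`
    # time steps of each series before the recurrent forward pass, bounding
    # memory during validation/inference. Layout assumed: [batch, time, features].
    if inference_input_size is not None and insample_y.shape[1] > inference_input_size:
        return insample_y[:, -inference_input_size:]
    return insample_y

# Example: a long series is cut down to its last 200 steps.
series = torch.randn(4, 1000, 1)
short = truncate_for_inference(series, inference_input_size=200)
print(short.shape)  # torch.Size([4, 200, 1])
```

Setting inference_input_size=None would leave the series untouched, preserving the old behavior.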

@kdgutier kdgutier requested a review from cchallu April 7, 2023 21:20

@kdgutier kdgutier changed the title Fix output shapes after PL 2.0 Fix MLP/Recurrent-based memory inference complications Apr 10, 2023
@cchallu cchallu merged commit d7245c0 into main Apr 11, 2023
8 checks passed
@cchallu cchallu deleted the fix/validation branch April 11, 2023 15:04