[ENH] Loading ability in deep learning modules for classification/regression/clustering #1374
Conversation
I think you should create a temp directory for writing and loading the files, and possibly only do it in the overnight tests (although I'm not sure about that).
This is from test_data_writers.py:
import os
import tempfile

from aeon.datasets import load_from_tsfile, write_to_tsfile

# Write to a temp directory so the test leaves no files behind.
with tempfile.TemporaryDirectory() as tmp:
    write_to_tsfile(X=X, path=tmp, y=y, problem_name=problem_name)
    load_path = os.path.join(tmp, problem_name)
    newX, newy = load_from_tsfile(full_file_path_and_name=load_path)
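The same temp-directory pattern applies to model persistence tests: serialize inside the `with` block, read back, and assert the round trip. A minimal, self-contained sketch using `pickle` as a stand-in for the estimator's real serializer (the `save_model`/`load_model` helpers here are illustrative, not aeon's API):

```python
import os
import pickle
import tempfile


def save_model(model, path):
    # Stand-in serializer; a real test would call the estimator's own save.
    with open(path, "wb") as f:
        pickle.dump(model, f)


def load_model(path):
    with open(path, "rb") as f:
        return pickle.load(f)


# Write and read back inside a temp directory so nothing leaks into the repo.
model = {"weights": [0.1, 0.2, 0.3]}
with tempfile.TemporaryDirectory() as tmp:
    model_path = os.path.join(tmp, "model.pkl")
    save_model(model, model_path)
    restored = load_model(model_path)

assert restored == model  # round trip preserves the model
```

The directory (and the file in it) is deleted automatically when the `with` block exits, so the test is hermetic even if an assertion fails later.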
lgtm
Fix #385
Deep learning base models now have a loading function to load a saved pretrained model and skip the fitting phase. This required testing the functionality in the deep base class, which I added.
I added a demo of how to use this functionality in the deep learning notebook, examples/networks/deep_learning.
I also updated some of the figures, using the new ones from https://msd-irimas.github.io/pages/dl4tsc/
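The mechanism described above, in plain Python with a toy estimator standing in for the deep learning base class (all names here are illustrative, not aeon's actual API): loading a saved model restores the trained state and marks the estimator fitted, so `predict` works without re-running `fit`.

```python
import os
import pickle
import tempfile


class ToyEstimator:
    """Illustrative estimator: load_model restores state and skips fit."""

    def __init__(self):
        self.model_ = None
        self.is_fitted = False

    def fit(self, X):
        # Expensive training step that loading lets us skip.
        self.model_ = {"mean": sum(X) / len(X)}
        self.is_fitted = True
        return self

    def save(self, path):
        with open(path, "wb") as f:
            pickle.dump(self.model_, f)

    def load_model(self, path):
        # Restore a pretrained model and mark the estimator fitted,
        # so predict can be called without calling fit first.
        with open(path, "rb") as f:
            self.model_ = pickle.load(f)
        self.is_fitted = True
        return self

    def predict(self, X):
        if not self.is_fitted:
            raise RuntimeError("call fit or load_model first")
        return [x - self.model_["mean"] for x in X]


with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "model.pkl")
    trained = ToyEstimator().fit([1.0, 2.0, 3.0])
    trained.save(path)
    # Fresh instance: never fitted, only loaded.
    fresh = ToyEstimator().load_model(path)

assert fresh.predict([2.0]) == trained.predict([2.0])
```

The key design point is that the loaded estimator ends up in the same "fitted" state as one trained from scratch, which is what lets downstream code treat the two interchangeably.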