Automatically populate the signal-related parameters #517
Conversation
@sliwy I merged it. Could you help me update the tests you had in them?
@PierreGtch what do you need from me? Is there a specific test you have in mind?
@sliwy I broke the tests:
Taking a look!
… returns mocked values
@PierreGtch let's move back here: When it comes to
Hey @PierreGtch and @sliwy, please ping me if you need any help solving the design conflict. To be honest, I looked at the discussion in PierreGtch#3, and I still don't get the big picture. I am available to chat or meet if necessary.
restored previous tests behavior
As discussed in PierreGtch#3, we deprecate passing an initialized module to
Codecov Report
@@            Coverage Diff             @@
##           master     #517      +/-   ##
==========================================
+ Coverage   84.35%   84.48%   +0.12%
==========================================
  Files          63       63
  Lines        4578     4653      +75
==========================================
+ Hits         3862     3931      +69
- Misses        716      722       +6
@sliwy are you ok to merge this version?
Sorry @bruAristimunha, I missed your comment. The goal is to automatically populate all the signal-related parameters when we can, and not break things when it is not possible.
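The idea can be sketched in plain Python. This is a simplified stand-in, not braindecode's actual implementation: `SimpleNet`, `set_signal_params`, and the `(epochs, channels, times)` shape convention are hypothetical names chosen for illustration; the real code works on `torch.nn.Module` subclasses inside the `EEGClassifier`/`EEGRegressor` fit path.

```python
import inspect
import warnings


class SimpleNet:
    """Hypothetical stand-in for a torch.nn.Module with signal-related params."""

    def __init__(self, n_chans, n_times, n_outputs):
        self.n_chans = n_chans
        self.n_times = n_times
        self.n_outputs = n_outputs


def set_signal_params(module, X, y):
    """Populate signal-related parameters from the data when possible.

    If `module` is an uninitialized class, instantiate it with shapes
    inferred from X and y. If it is already an instance, warn and return
    it untouched (do not reinitialize it with default params).
    """
    if inspect.isclass(module):
        n_chans = len(X[0])       # assumed (epochs, channels, times) layout
        n_times = len(X[0][0])
        n_outputs = len(set(y))   # number of distinct classes
        return module(n_chans=n_chans, n_times=n_times, n_outputs=n_outputs)
    warnings.warn(
        "The module is already initialized; signal-related parameters "
        "will not be set from the data."
    )
    return module


# Usage: 8 epochs, 4 channels, 100 time samples, binary labels.
X = [[[0.0] * 100 for _ in range(4)] for _ in range(8)]
y = [0, 1] * 4
net = set_signal_params(SimpleNet, X, y)
assert (net.n_chans, net.n_times, net.n_outputs) == (4, 100, 2)
```

Passing the class instead of an instance is what lets the estimator fill in the shape-dependent arguments; an already-built instance keeps whatever the user chose.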
I would restore a test with an initialized nn.Module provided to the EEGClassifier, i.e. the one that was failing before (for example test_post_epoch_train_scoring). Let's check that the warning is shown and that the model is not reinitialized with default params.
Apart from that and some details (flake8, docstrings, and a test containing only pass), everything looks good to me! :)
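A minimal version of such a test could look like the following self-contained sketch. `FakeModule` and `FakeClassifier` are tiny stand-ins invented here, not the real braindecode classes; the real test would build an `EEGClassifier` around an initialized nn.Module and call `fit`.

```python
import warnings


class FakeModule:
    """Stand-in for an initialized nn.Module with user-chosen params."""

    def __init__(self, n_chans=8):
        self.n_chans = n_chans


class FakeClassifier:
    """Stand-in estimator: warns on an initialized module, keeps it as-is."""

    def __init__(self, module):
        self.module = module

    def fit(self, X, y):
        if not isinstance(self.module, type):  # instance, not a class
            warnings.warn("module is already initialized; "
                          "signal-related parameters will not be set")
        return self


def test_initialized_module_warns_and_is_not_reinitialized():
    module = FakeModule(n_chans=4)
    clf = FakeClassifier(module)
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        clf.fit(X=[[0.0]], y=[0])
    assert len(caught) == 1              # the warning was shown
    assert clf.module is module          # same object, not reinitialized
    assert clf.module.n_chans == 4       # user-chosen params preserved


test_initialized_module_warns_and_is_not_reinitialized()
```

The `is` check is the important part: it guarantees the estimator did not silently rebuild the module with default parameters.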
I missed this, but I would prefer we don't do that (deprecating passing an initialized module). I think there is a tradeoff between automating more stuff (like setting parameters according to the dataset) and being simpler/more transparent. So in my view it should be fine to also pass an initialized module and, in that case, as is done now, not set parameters according to the dataset... no reason to break this workflow I think.
@robintibor I can just turn the warning into an info and remove the deprecated part |
sounds sensible |
Then I'm ready to merge |
Hi @sliwy! I enabled auto-merge; if it's good for you, just approve and it will be merged.
Would be good to have some docs example for usage I think? To see how it all works? |
What's new checker is working (I hope) @PierreGtch |
* Set signal-related parameters in check_data
* Merge tests for EEGClassifier and EEGRegressor
* Fix coquille
* Update test to use module mixin
* Rename clf to eegneuralnet in test
* Add preds fixture
* Update eegneuralnet.py and subclasses
* Update test_eegneuralnet.py
* restored previous tests behavior where nn layer is ignored and always returns mocked values
* Update whats_new.rst
* Use two different mock modules for test
* Rename mock modules
* Use set_params instead of vars
* Deprecate passing an initialized module and skip setting signal args in that case
* Remove unnecessary f-strings
* Try fix python 3.8
* Fix case with non torch dataset
* Try fix python 3.8
* Add docstrings for fit and partial_fit (already including braindecode#529)
* Test initialized module
* Fix Flake8
* Add email
* Remove deprecation of initialized module

Co-authored-by: Maciej Sliwowski <macieksliwowski@gmail.com>
Co-authored-by: Bru <a.bruno@aluno.ufabc.edu.br>
cf #488, #457