[WIP] Starting point for batch-level data augmentation #254
Conversation
Hi, very interesting, thanks! This seems like a lot of work, so I will need to have a closer look. Does this work fine with what was done in https://github.com/braindecode/braindecode/pull/198/files ? Note the discussion there as well.
A first batch of comments
Hey @robintibor, thanks a lot! Glad it looks interesting. @sbbrandt will soon add his nice MixUp implementation as well, and we are still polishing things a little with him and @tomMoral. Concerning @hubertjb's PR, yes, we are aware. We discussed it when that PR was opened and have a few ideas on how to ensure both are compatible (since batch-level and sample-level data augmentation each have their own pros and cons). We will get to that as soon as we are done polishing :)
A few more comments on the transform.
Co-authored-by: Thomas Moreau <thomas.moreau.2010@gmail.com>
Last few comments for the day. I will do a second review tomorrow.
But this looks good overall.
Co-authored-by: Thomas Moreau <thomas.moreau.2010@gmail.com>
Co-authored-by: Thomas Moreau <thomas.moreau.2010@gmail.com>
Definitely looks like this is growing into something nice! Make sure to also create some examples, add it to the API docs, etc.
- added mixup functional
- adapted base to handle tuple as return from transform
- added mixup loss
- added mixup tests
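For context, here is a minimal sketch of what a functional MixUp and its matching loss could look like in PyTorch. This is not the actual braindecode implementation: the names `mixup` and `mixup_loss`, the assumed input shape `(batch, channels, time)`, and the assumption that the model outputs log-probabilities are all illustrative.

```python
import torch
from torch.nn.functional import nll_loss


def mixup(X, y, alpha=0.2):
    """Mix every window in the batch with another randomly drawn window.

    Returns the mixed inputs and a tuple (y, y_permuted, lam) so the loss
    can later be interpolated with the same coefficients.
    """
    # One mixing coefficient per example, drawn from a Beta distribution
    lam = torch.distributions.Beta(alpha, alpha).sample((X.shape[0],)).to(X.device)
    perm = torch.randperm(X.shape[0], device=X.device)
    X_mixed = lam.view(-1, 1, 1) * X + (1 - lam.view(-1, 1, 1)) * X[perm]
    return X_mixed, (y, y[perm], lam)


def mixup_loss(log_preds, y_a, y_b, lam):
    """Convex combination of the losses w.r.t. both label sets."""
    loss_a = nll_loss(log_preds, y_a, reduction="none")
    loss_b = nll_loss(log_preds, y_b, reduction="none")
    return (lam * loss_a + (1 - lam) * loss_b).mean()
```

The tuple return is what makes the "adapted base to handle tuple" change necessary: the training loop has to unpack `(y, y[perm], lam)` and apply the interpolated loss instead of the plain criterion.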
I would still argue for writing transforms that only transform X on a per-example basis using the existing https://github.com/braindecode/braindecode/pull/198/files solution, and keeping the dataloader way for those transformations that need access to both X and y, or to batch-level examples. What do you think? Otherwise it becomes confusing when to use which one. You could try to recreate your Transform superclass as an "ExampleTransform" superclass; it would make this transition easier.
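To make the proposed split concrete, here is a rough sketch of how the two kinds of transform could be separated. The class names `ExampleTransform` and `BatchTransform`, the `collate` hook, and the `windows_dataset` variable are hypothetical and not part of the braindecode API.

```python
import torch
from torch.utils.data import DataLoader


class ExampleTransform:
    """Hypothetical base for transforms that only need a single X."""

    def __call__(self, X):
        # e.g. perturb one window with small Gaussian noise
        return X + 0.01 * torch.randn_like(X)


class BatchTransform:
    """Hypothetical base for transforms that need the whole (X, y) batch."""

    def __call__(self, X, y):
        # e.g. mix each window with another one from the same batch
        perm = torch.randperm(X.shape[0])
        return 0.5 * X + 0.5 * X[perm], y

    def collate(self, batch):
        # Plug into the DataLoader so the transform runs once per batch
        X = torch.stack([torch.as_tensor(x) for x, _ in batch])
        y = torch.as_tensor([label for _, label in batch])
        return self(X, y)


# loader = DataLoader(windows_dataset, batch_size=64,
#                     collate_fn=BatchTransform().collate)
```

Under this split, the per-example class keeps being applied inside the dataset's `__getitem__`, while the batch-level one lives in the dataloader, matching the two access patterns described above.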
Also, @hubertjb, let me know what you think.
Co-authored-by: Alexandre Gramfort <alexandre.gramfort@m4x.org>
Co-authored-by: Alexandre Gramfort <alexandre.gramfort@m4x.org>
grrrr, ok!
@cedricrommel you are on the example now then?
…
Yes, then he can change it to [MRG] when the example is there, so I see it as well.
I've fixed things and will push soon. Just making a final check of the HTML :)
Ok, all good for me! 😃 I just need confirmation from @sbbrandt that he is OK with my commit where I put his email address in the file to add him as a contributor, and then I'm OK with merging 💪
@robintibor, the green button is yours!
Great work, merged!
🎉🎉🎉🎉🎉
Niiiice! 👏 👏 👏 👏
🎉🎉🎉 Thank you guys! Also for the critiques and corrections of errors. Thanks from me as well for the event, and thank you @cedricrommel for the nice augmentation implementation :)