
Question about DER: freezing of the old module is left out in der.py #40

Closed
fanyan0411 opened this issue Apr 29, 2023 · 4 comments

@fanyan0411

Why "freezing old network" of after_task is left out in file der.py, while other methods, such as podnet and WA execute this step as follows:
self._old_network = self._network.copy().freeze()?
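
(For context, a minimal sketch of what such a copy()/freeze() helper typically does in PyTorch; the class below is illustrative and is not the repository's exact code.)

```python
import copy
import torch.nn as nn

class IncrementalNet(nn.Module):
    """Illustrative stand-in for the repository's incremental network wrapper."""

    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(4, 4)  # placeholder for the real feature extractor

    def copy(self):
        # Snapshot the current parameters so later updates cannot change them.
        return copy.deepcopy(self)

    def freeze(self):
        # Detach the snapshot from optimization: no gradients, eval mode
        # (fixes BatchNorm/Dropout behaviour when it later serves as a teacher).
        for p in self.parameters():
            p.requires_grad = False
        self.eval()
        return self

# after_task (sketch): keep a frozen copy of the just-trained model.
# self._old_network = self._network.copy().freeze()
```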


G-U-N commented Apr 29, 2023

Hi, thanks for your interest. WA and PodNet are distillation-based methods, meaning they have to save the model from the last incremental session, while DER does not.
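
(A minimal sketch of why that saved model matters: the frozen old network supplies the soft targets for the distillation term. The loss below is a generic knowledge-distillation formulation, not necessarily the exact loss used by WA or PodNet.)

```python
import torch
import torch.nn.functional as F

def distillation_loss(new_logits, old_logits, T=2.0):
    # Soft targets come from the frozen old network, which is why
    # distillation-based methods must keep the previous model around.
    old_probs = F.softmax(old_logits / T, dim=1)
    new_log_probs = F.log_softmax(new_logits / T, dim=1)
    return -(old_probs * new_log_probs).sum(dim=1).mean()

# Toy usage: logits over the old classes from the current and the frozen network.
new_logits = torch.randn(8, 10)
with torch.no_grad():
    old_logits = torch.randn(8, 10)  # stand-in for self._old_network(inputs)
loss_kd = distillation_loss(new_logits, old_logits)
```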

G-U-N closed this as completed Apr 29, 2023
@fanyan0411 (Author)

Thanks for your timely response!

In the original paper, DER notes several times that "we freeze the previously learned representation". If the previous convs in self.convnets are not frozen, which also does not seem to happen in the code, does it mean that those previous "resnet" modules will be affected by back-propagation? How should "freeze the previously learned representation" be understood?


G-U-N commented Apr 29, 2023

I understand your confusion now. Refer to Line 51 and Line 84.
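
(For readers landing here later: those lines presumably perform the freezing inside der.py itself, so no separate _old_network copy is needed. A minimal sketch of that step, assuming the DER network keeps one backbone per task in self._network.convnets; this is illustrative, not a verbatim excerpt from der.py.)

```python
import torch.nn as nn

def freeze_previous_convnets(network: nn.Module, cur_task: int) -> None:
    # Freeze every backbone learned before the current task.
    for i in range(cur_task):
        for p in network.convnets[i].parameters():
            p.requires_grad = False   # old backbones receive no gradient updates
        network.convnets[i].eval()    # and their BatchNorm statistics stay fixed
```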

G-U-N added the invalid label Apr 29, 2023
@fanyan0411 (Author)

That's the key point I was wondering about. Thanks a lot!

I hope your work keeps getting better!
