
augmentation features or logits? #33

Closed
Bunnyqiqi opened this issue Jul 10, 2022 · 1 comment

Comments


Bunnyqiqi commented Jul 10, 2022

Hi, thanks a lot for your work. There are two questions that confuse me:

(1) I think CV_temp is the covariance matrix corresponding to each class, but the following augmentation is applied to y, and y is the logits. If the features have shape [N, A], then y has shape [N, C]. As your paper says, the augmentation should be performed on the features, so shouldn't we augment the features rather than y? Since sigma2 is [N, C] and the features are [N, A], if I wanted to augment the features directly, what should I change?

(2) And what is the meaning of these two steps in the sigma2 computation?

Thanks a lot!
[screenshot: the two lines of the sigma2 computation]

@Panxuran (Collaborator)

Thanks for your attention to our work.

For question (1), your understanding is correct: ISDA needs to augment the features rather than the logits. In the code, the last classifier (fc) is defined explicitly outside the model (see Image classification on CIFAR/train.py, line 347), so the output y actually represents the features.
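For readers unfamiliar with that layout, here is a minimal sketch of the pattern (illustrative names, not the repository's exact API): the backbone stops at the feature vector and the fully connected classifier is a separate module, so the tensor handed to the augmentation/loss is the feature, not the logits.

```python
import torch
import torch.nn as nn

# Illustrative sketch: the backbone returns features only, and the final
# classifier (fc) is kept as a separate module, as described above.
class Backbone(nn.Module):
    def __init__(self, feature_dim=64):
        super().__init__()
        self.body = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, feature_dim))

    def forward(self, x):
        return self.body(x)        # [N, A] features, no classification head

backbone = Backbone(feature_dim=64)
fc = nn.Linear(64, 10)             # last classifier defined outside the model

x = torch.randn(8, 3, 32, 32)
features = backbone(x)             # [N, A] -- this is what gets augmented
logits = fc(features)              # [N, C] -- computed from the (augmented) features
```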

For question (2), this is because we only approximate the diagonal entries of the covariance matrix, due to the GPU memory limitation.
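To make the memory argument concrete, below is a hedged sketch (a hypothetical helper, not the repository's code) of what keeping only the diagonal means: instead of estimating a full [A, A] covariance matrix per class, only the per-dimension variance is stored, so the statistic shrinks from [C, A, A] to [C, A].

```python
import torch

# Illustrative helper (not the repository's exact implementation): estimate
# only the diagonal of each class-conditional covariance matrix.
def diagonal_class_variance(features, labels, num_classes):
    """features: [N, A], labels: [N] -> per-class variances of shape [C, A]."""
    _, A = features.shape
    var = torch.zeros(num_classes, A)
    for c in range(num_classes):
        feats_c = features[labels == c]
        if feats_c.shape[0] > 1:
            # Keep only the diagonal entries of the class covariance matrix.
            var[c] = feats_c.var(dim=0, unbiased=False)
    return var

feats = torch.randn(128, 64)                          # [N, A]
labels = torch.randint(0, 10, (128,))                 # [N]
sigma2 = diagonal_class_variance(feats, labels, 10)   # [10, 64] instead of [10, 64, 64]
```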
