
question about magic_model #7

Closed
luolanfeixue opened this issue May 27, 2020 · 2 comments


@luolanfeixue

How should I understand magic_model?

Why can dev_loss.backward() (augmentation/classifier.py, line 158) update the Generator weights?

@luolanfeixue (Author)

Judging from your name, I can tell you are Chinese, so I'm taking the liberty of writing this issue in Chinese.

1. How does this magic model manage to do that? Why would dev_loss update the generator's parameters?

2. Is the classifier's gradient from the previous step being used here?

    deltas = _adam_delta(self._optimizer, self._model, grads)
    magic_model.update_params(deltas)

@tanyuqian (Owner)

Yes, you are right :) I'll reply in English so the answer may be useful to everyone.

"how to understand magic_model?" -- It's for running a specified model (e.g., BERT) whose parameters are the sum of multiple sets of parameters (e.g., \theta + \theta'(\phi)) while we don't need to re-write the original forward() function. Our implementation is a bit like a hack to PyTorch.Module. If you have better ways to do it, please let me know.

"why can dev_loss.backward() update the Generator weight?" -- In the code, the route of gradient propagation is dev_loss (classifier.p::Line154) -> deltas (Line 147) -> grads (Line 140) -> aug_probs (Line 119) -> generator parameters (via gumbel_softmax, generator.py::Line104). For the correspondence between our paper and code, please refer to this.

Sorry, I don't quite understand your question "Is the classifier's gradient from the previous step being used here?". Could you explain it in more detail?
