
pFedMe Optimizer Problem #12

Closed

adam4096 opened this issue Apr 22, 2021 · 6 comments


@adam4096

Hello,
I want to ask: why do the code and the algorithm differ?
In fedoptimizer.py, line 64:

p.data = p.data - group['lr'] * (p.grad.data + group['lamda'] * (p.data - localweight.data) + group['mu'] * p.data)

versus Algorithm 1, line 8:

[screenshot: Algorithm 1, line 8 from the paper]

@CharlieDinh
Owner

Hi.

Line 64 in the optimizer solves h_i (Equation 7 in our paper) to find the personalized model.

Algorithm 1, line 8 corresponds to userpFedMe.py, lines 53-54, which find the local model:

for new_param, localweight in zip(self.persionalized_model_bar, self.local_model):
    localweight.data = localweight.data - self.lamda * self.learning_rate * (localweight.data - new_param.data)
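
For context, here is a minimal sketch of how the two updates fit together in one local round. It uses NumPy and a toy quadratic loss; the names (grad_f, theta, w_local, eta, lamda, K, R) are illustrative, not the repo's exact variables:

import numpy as np

def grad_f(theta):
    # Toy stand-in for p.grad.data: gradient of f(theta) = ||theta - 1||^2
    return 2.0 * (theta - 1.0)

eta, lamda, K, R = 0.05, 15.0, 5, 20   # personalized lr, regularization weight, inner steps, local rounds
w_local = np.zeros(3)                   # local model w_i

for _ in range(R):
    # Inner loop: K gradient steps on h_i(theta; w) = f(theta) + (lamda/2)||theta - w||^2
    # (this mirrors fedoptimizer.py line 64, with mu = 0)
    theta = w_local.copy()
    for _ in range(K):
        theta = theta - eta * (grad_f(theta) + lamda * (theta - w_local))
    # Outer step: Algorithm 1, line 8 (userpFedMe.py lines 53-54)
    w_local = w_local - lamda * eta * (w_local - theta)

print(w_local)  # approaches the minimizer of f, here the all-ones vector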

@adam4096
Author

Thank you!

adam4096 reopened this Apr 23, 2021
@adam4096
Author

> Hi.
>
> Line 64 in the optimizer solves h_i (Equation 7 in our paper) to find the personalized model.
>
> Algorithm 1, line 8 corresponds to userpFedMe.py, lines 53-54, which find the local model:
>
>     for new_param, localweight in zip(self.persionalized_model_bar, self.local_model):
>         localweight.data = localweight.data - self.lamda * self.learning_rate * (localweight.data - new_param.data)

Hello,
Regarding line 64 in the optimizer that solves h_i: did you modify NAG to find the personalized model?
It doesn't look like a standard NAG update formula.

@CharlieDinh
Owner

No. I just applied gradient descent to h_i.
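
In symbols, a minimal sketch of that step, assuming h_i takes the Moreau-envelope form discussed in this thread (theta is the personalized model, w_i the local model, f̃_i the client loss; the mu term is the extra norm-2 regularizer from the code, zero by default):

h_i(\theta;\, w_i) = \tilde f_i(\theta) + \frac{\lambda}{2}\,\lVert \theta - w_i \rVert^2 + \frac{\mu}{2}\,\lVert \theta \rVert^2

\theta \leftarrow \theta - \eta \left( \nabla \tilde f_i(\theta) + \lambda\,(\theta - w_i) + \mu\,\theta \right)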

@adam4096
Author

> No. I just applied gradient descent to h_i.

But this update looks strange. Can you explain it? Thank you very much!

p.data = p.data - group['lr'] * (p.grad.data + group['lamda'] * (p.data - localweight.data) + group['mu'] * p.data)

@CharlieDinh
Owner

CharlieDinh commented Apr 24, 2021

p.grad.data is the gradient of f_bar.
group['lamda'] * (p.data - localweight.data) is the gradient of the Moreau-envelope regularization term.
group['mu'] * p.data is just norm-2 regularization; mu = 0 means we don't use it.
So p.grad.data + group['lamda'] * (p.data - localweight.data) is the gradient of h_i.
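
Putting the three terms together, here is a minimal sketch of that step written as a custom PyTorch optimizer. The class name PersonalizedStep and the local_weights argument are illustrative, not the repo's exact API:

import torch
from torch.optim import Optimizer

class PersonalizedStep(Optimizer):
    # One gradient-descent step on
    # h_i(theta) = f_bar(theta) + (lamda/2)*||theta - w||^2 + (mu/2)*||theta||^2
    def __init__(self, params, lr=0.01, lamda=15.0, mu=0.0):
        super().__init__(params, dict(lr=lr, lamda=lamda, mu=mu))

    @torch.no_grad()
    def step(self, local_weights):
        # local_weights: the frozen local model w_i, one tensor per parameter
        for group in self.param_groups:
            for p, w in zip(group['params'], local_weights):
                if p.grad is None:
                    continue
                # gradient of f_bar + Moreau-envelope term + optional norm-2 term
                d = p.grad + group['lamda'] * (p - w) + group['mu'] * p
                p.add_(d, alpha=-group['lr'])

# Usage sketch: a few inner steps on h_i, as in the personalized-model update
model = torch.nn.Linear(4, 1)
opt = PersonalizedStep(model.parameters(), lr=0.05, lamda=15.0)
w_local = [p.detach().clone() for p in model.parameters()]
for _ in range(5):
    loss = model(torch.randn(8, 4)).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step(w_local)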
