
The locking in FTRL's train seems problematic #15

Open
applefly opened this issue Aug 7, 2018 · 3 comments

Comments

@applefly

applefly commented Aug 7, 2018

In multithreaded training, some threads are updating the parameter g while others are reading the parameter s. With the lock in place the program runs without errors, but it seems a single sample's update to the parameters can still end up inconsistent:
280 mu.mtx.lock();
Personally I think the lock should be taken outside the loop body.

@CastellanZhang
Owner

You could certainly take the lock at an outer level, or even at per-feature granularity, but that would hurt throughput. So here I lock at the smallest possible granularity. Parameter servers do much the same thing: parameter updates are not strictly consistent, and in practice this has no effect on convergence at all. You will even find that, in most cases, not locking at all makes little difference.

@CasyWang

CasyWang commented Jan 4, 2019

Hi castellan, is there a proof of this anywhere? I have the same confusion.

@applefly
Author

applefly commented Jan 4, 2019

I wrote some multithreaded training code that takes a lock when updating parameters, but in testing the model's AUC always fluctuated a lot, and the learned weights also varied somewhat across runs, so I gave up on the multithreaded version; single-threaded has been consistently stable.
