This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Help: how can an unrolled LSTM share BatchNorm aux_states across its input layers? #3076

Closed
quanzongfeng opened this issue Aug 19, 2016 · 5 comments

Comments

@quanzongfeng

For an unrolled LSTM with sequence length m, unrolling the recurrent part produces m inputs and outputs. Applying BatchNorm to each input therefore creates m BatchNorm nodes.
In GraphExecutor::InitDataEntryInfo, op_nodes are created per graph node, and each op_node allocates aux_states according to op->ListAuxiliaryStates(), so m separate sets of aux_states are created.

If I want the m BatchNorm layers to be shared the same way the LSTM parameters are, how can I make the m aux_states share the same storage?
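For reference, a minimal pure-Python sketch of the two allocation strategies described above (this models the behavior in question, not actual MXNet source; the function names and the name-keyed pool are hypothetical): allocating aux states per op_node yields m independent copies, while keying the allocation by parameter name lets all m BatchNorm nodes resolve to one shared set.

```python
# Sketch: per-node vs name-keyed allocation of BatchNorm aux states.
# Hypothetical model of the behavior described above, not MXNet code.

def alloc_per_node(num_unrolled_steps):
    # Mirrors InitDataEntryInfo as described in the question: one
    # aux_states entry per op_node, so an unrolled LSTM of length m
    # gets m independent copies of moving_mean / moving_var.
    aux_states = []
    for _ in range(num_unrolled_steps):
        aux_states.append({"moving_mean": [0.0], "moving_var": [1.0]})
    return aux_states

def alloc_shared_by_name(num_unrolled_steps):
    # Desired behavior: key the allocation by parameter name, so every
    # unrolled BatchNorm node resolves to the same underlying buffers.
    pool = {}
    aux_states = []
    for _ in range(num_unrolled_steps):
        pool.setdefault("bn_moving_mean", [0.0])
        pool.setdefault("bn_moving_var", [1.0])
        aux_states.append({"moving_mean": pool["bn_moving_mean"],
                           "moving_var": pool["bn_moving_var"]})
    return aux_states

m = 4
per_node = alloc_per_node(m)
shared = alloc_shared_by_name(m)

# Per-node: updating step 0 leaves the other steps untouched.
per_node[0]["moving_mean"][0] = 5.0
assert per_node[1]["moving_mean"][0] == 0.0

# Name-keyed: all m nodes see the same update.
shared[0]["moving_mean"][0] = 5.0
assert all(s["moving_mean"][0] == 5.0 for s in shared)
```

In MXNet's symbolic API, weight sharing works by reusing variable names, so the question amounts to extending that same name-based tying to auxiliary states.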

@winstywang
Contributor

Could you restate your question in English? We still have many users who cannot read Chinese.

@mli
Member

mli commented Aug 20, 2016

@antinucleon
Contributor

@mli Not really; the LSTM BatchNorm problem needs to be fixed after NNVM.
On Fri, Aug 19, 2016 at 20:40 Mu Li notifications@github.com wrote:

does this link help?
https://github.com/dmlc/mxnet-notebooks/blob/master/python/rnn/lstm.ipynb



@horserma

horserma commented Dec 6, 2016

I have the same problem. How can the auxiliary states and weights of a BatchNorm layer be shared?

@yajiedesign
Contributor

This issue is closed due to lack of activity in the last 90 days. Feel free to reopen if this is still an active issue. Thanks!
