Added batchNormLayerNN to replace NN #52
Conversation
Looks good to me
I think that this separation is a very good idea
E
… On Feb 9, 2018, at 11:47 AM, Lars Ruthotto wrote: Looks good to me
    tmp[1,1] = Y0
end

Ydata::Array{T,2} = zeros(T,0,nex)
Instead of initializing with 0 rows, you could initialize it with the number of nonzero entries in this.outTimes; then the data is preallocated and you just need to store into it.
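A minimal sketch of that preallocation idea, assuming (as the surrounding diff suggests) that `outTimes` flags which layers' outputs are stored and that a per-layer feature count is available; the function name and arguments here are hypothetical:

```julia
# Hypothetical sketch: preallocate Ydata instead of growing it from 0 rows.
# `outTimes` marks which layers' outputs are kept; `featCounts` holds the
# number of output features per layer; `nex` is the number of examples.
function preallocYdata(::Type{T}, outTimes, featCounts, nex) where T
    # total rows = sum of feature counts at the stored time points
    nrows = sum(featCounts[i] for i in eachindex(outTimes) if outTimes[i] == 1; init=0)
    return zeros(T, nrows, nex)
end
```

The forward pass can then write each stored block into its slice of `Ydata` (tracked with a running row offset) instead of concatenating onto an empty array.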
nex = div(length(Y),nFeatIn(this))
nt  = length(this.layers)
cnt = 0
dYdata = zeros(T,0,nex)
Same comment as above: preallocate instead of starting with 0 rows.
    dY = 0*Y
end

dYdata = zeros(T,0,nex)
Same comment as above: preallocate instead of starting with 0 rows.
for i=nt:-1:1
    if this.outTimes[i]==1
        nn = nFeatOut(this.layers[i])
        W += this.Q'*Wdata[end-cnt2-nn+1:end-cnt2,:]
Depending on how big this subarray is, it might be worth using a @view to avoid the allocation.
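A small illustration of the @view suggestion (the array shape and the values of `nn` and `cnt2` here are made up for the example): a plain slice like `Wdata[end-cnt2-nn+1:end-cnt2,:]` allocates a copy, while `@view` returns a non-allocating window into the same memory.

```julia
# Hypothetical sizes, chosen only to demonstrate slice vs. view.
Wdata = rand(6, 4)
nn, cnt2 = 3, 1

slice = Wdata[end-cnt2-nn+1:end-cnt2, :]        # allocates a copy
win   = @view Wdata[end-cnt2-nn+1:end-cnt2, :]  # no copy, aliases Wdata

@assert slice == win                 # same values initially
Wdata[end-cnt2, 1] = 0.0             # mutate the parent array
@assert win[end, 1] == 0.0           # the view sees the change
@assert slice[end, 1] != 0.0 || slice[end, 1] == 0.0  # the copy is independent
```

Note that a view keeps the parent array alive and is only a win when the slice is read (not resized); for `W += this.Q'*...` the view feeds straight into the multiply, so no intermediate copy is needed.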
These are all things that are also in NN, so let's put this on the TODO list and make the changes together.
That makes it much easier to debug, and there are probably some optimizations we can do since we know exactly what is contained in batchNormLayerNN's layers.