training julia models #44

Merged: 3 commits into FluxML:master on Jul 3, 2017

Conversation

@ylxdzsw (Contributor) commented Jun 12, 2017

This PR fixes bugs in back! in core.jl (the fallback method) and in Chain, and adds support for Affine. With these fixes we can run the training process for some basic linear models, such as logistic regression, on our pure Julia implementation. This allows us to test the training process and optimizers before we can run them on backends.
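
For context, here is a minimal sketch of the kind of training loop this enables. It is plain Julia for illustration only, not the Flux Affine/back! API this PR touches; the toy data, learning rate, and manual update rule are assumptions.

σ(z) = 1 / (1 + exp(-z))               # logistic activation

W = randn(1, 2)                        # an "Affine"-style layer computing W * x .+ b
b = zeros(1)
η = 0.1                                # learning rate
data = [([1.0, 2.0], 1.0), ([-1.0, -2.0], 0.0)]   # toy dataset (assumed)

for epoch in 1:100
    for (x, y) in data
        ŷ = σ.(W * x .+ b)[1]          # forward pass
        δ = ŷ - y                      # logistic-loss gradient w.r.t. the pre-activation
        W .-= η .* δ .* x'             # gradient step on the weights
        b .-= η * δ                    # gradient step on the bias
    end
end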

@MikeInnes

Great. Eventually it would be nice to generate the backward pass for @net layers – we should at least do the update! method, which is trivial. But this is good for now.

update!(m.W, η)
update!(m.b, η)
m
end

@MikeInnes (Member) commented Jun 13, 2017

would be good to have the newline here

@ylxdzsw (Contributor) commented Jun 14, 2017

Yes, usually I use EditorConfig to trim trailing whitespace and insert a final newline automatically. It's simple and works on almost all editors and platforms. Would you mind if I added one for Flux?

@MikeInnes (Member) commented Jun 15, 2017

Better to just keep editor config locally, I think.

@MikeInnes (Member) commented Jul 3, 2017

By the way, this method is already output by @net, but may be incorrect. Could you fix that and remove this version?

for i in 1:N-1
xs = s.layers[i](xs...)
xs isa Tuple || (xs = (xs, ))

@MikeInnes (Member) commented Jun 13, 2017

We can assume that all layers in a chain are single input/output. Then it should be easy to do this as a fold as well.
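
A minimal sketch of that fold (illustrative only, not code from this PR; the forward name is hypothetical), assuming every layer takes a single input and returns a single output:

# With single-input/single-output layers, the forward pass is a left fold
# of the layers over the input.
forward(layers, x) = foldl((acc, layer) -> layer(acc), layers; init = x)

forward([x -> x + 1, x -> 2x], 3)      # == 8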

@ylxdzsw (Contributor) commented Jun 14, 2017

OK, that would make things easier.

@MikeInnes (Member) commented Jun 15, 2017

Great, can you add a test?

@MikeInnes (Member) commented Jul 3, 2017

Thanks!

@MikeInnes merged commit 7e48018 into FluxML:master on Jul 3, 2017

1 check passed

continuous-integration/travis-ci/pr: The Travis CI build passed