This repository has been archived by the owner on May 24, 2018. It is now read-only.

Is it easy to implement a RNN? #46

Open
byzhang opened this issue Mar 7, 2015 · 5 comments

Comments

@byzhang

byzhang commented Mar 7, 2015

Could you please add a simple example if possible?

@tqchen
Member

tqchen commented Mar 7, 2015

An RNN is not yet implemented in cxxnet. However, it could be implemented easily with mshadow's matrix operations.

A notable feature of V2 is that it enables natural layer weight sharing by distinguishing layers from connections; this property could be helpful when building an RNN.
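For reference, the recurrence this describes comes down to a couple of matrix products per time step, with one shared set of weights reused across all steps. A minimal NumPy sketch of a vanilla RNN forward pass (names and sizes are illustrative only, not cxxnet/mshadow API):

```python
import numpy as np

# Hypothetical sizes, chosen for illustration.
input_dim, hidden_dim, seq_len = 4, 8, 5
rng = np.random.default_rng(0)

# One shared set of weights is reused at every time step --
# the weight sharing that a layer/connection split makes natural.
W_xh = rng.standard_normal((input_dim, hidden_dim)) * 0.1   # input -> hidden
W_hh = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1  # hidden -> hidden
b_h = np.zeros(hidden_dim)

def rnn_forward(xs):
    """Vanilla RNN: h_t = tanh(x_t @ W_xh + h_{t-1} @ W_hh + b)."""
    h = np.zeros(hidden_dim)
    hs = []
    for x_t in xs:                       # unroll over time
        h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
        hs.append(h)
    return np.stack(hs)                  # (seq_len, hidden_dim)

xs = rng.standard_normal((seq_len, input_dim))
hs = rnn_forward(xs)
print(hs.shape)  # (5, 8)
```

In mshadow the same step would be written with tensor expressions instead of NumPy arrays, but the structure (two matrix products plus a nonlinearity, unrolled over time) is the same.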

@byzhang
Author

byzhang commented Mar 7, 2015

Do you have an estimate of when V2 will be public?

@tqchen
Member

tqchen commented Mar 7, 2015

We are in the final stages of testing to make sure things are stable before merging into master. You can already use the code in the V2-refactor branch.

@byzhang
Author

byzhang commented Mar 8, 2015

Would you mind adding an RNN example to the V2-refactor branch? Thanks!

@huashiyiqike

I have an RNN library much like CXXNET, also built on top of mshadow, which implements LSTM and RTRBM layers. It is at https://github.com/huashiyiqike/NETLAB.
