
Use Chainer v2 #100

Merged
merged 34 commits into chainer:master from toslunar:chainerv2 on Jun 7, 2017

Conversation

toslunar (Member)

No description provided.

@coveralls

Coverage Status

Changes Unknown when pulling a855613 on toslunar:chainerv2 into ** on pfnet:master**.

@coveralls

Coverage Status

Changes Unknown when pulling f7e6a99 on toslunar:chainerv2 into ** on pfnet:master**.

@coveralls

coveralls commented May 30, 2017

Coverage Status

Changes Unknown when pulling 04ede2d on toslunar:chainerv2 into ** on pfnet:master**.

@@ -107,7 +119,7 @@ def save(self, dirname):
     def load(self, dirname):
         """Load internal states."""
         for attr in self.saved_attributes:
-            serializers.load_npz(
+            load_npz_no_strict(
Member

What is the reason behind this change? It's better to add comments if this is a fix for some problem.
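(For context: in Chainer v2, serializers.load_npz raises KeyError when the file lacks entries for some of the target's attributes. A non-strict variant can be built on NpzDeserializer's strict flag; a minimal sketch under that assumption, not necessarily this PR's exact implementation:)

import numpy
from chainer import serializers

def load_npz_no_strict(filename, obj):
    """Load parameters from an npz file, skipping entries that are
    missing from the file instead of raising KeyError."""
    with numpy.load(filename) as f:
        d = serializers.NpzDeserializer(f, strict=False)
        d.load(obj)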

@@ -226,7 +226,9 @@ def update(self, statevar):
             target_link=self.shared_model, source_link=self.model)
         # Update the globally shared model
         if self.process_idx == 0:
-            norm = self.optimizer.compute_grads_norm()
+            # norm = self.optimizer.compute_grads_norm()
Member

Remove commented out code.
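(Side note: Optimizer.compute_grads_norm was removed in Chainer v2, which is presumably why this call, and the identical ones in the hunks below, are disabled. If the norm is still wanted for logging, a hand-rolled helper would do; a sketch, not part of this PR:)

import numpy as np

def compute_grads_norm(link):
    """Global L2 norm over all gradients of a link's parameters."""
    sq_sum = 0.0
    for param in link.params():
        if param.grad is not None:
            sq_sum += float((param.grad * param.grad).sum())
    return np.sqrt(sq_sum)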

@@ -523,7 +523,9 @@ def update(self, t_start, t_stop, R, states, actions, rewards, values,
             target_link=self.shared_model, source_link=self.model)
         # Update the globally shared model
         if self.process_idx == 0:
-            norm = self.optimizer.compute_grads_norm()
+            # norm = self.optimizer.compute_grads_norm()
Member

Remove commented out code.

@@ -248,7 +248,10 @@ def update(self, loss):
         copy_param.copy_grad(
             target_link=self.shared_model, source_link=self.model)
         if self.process_idx == 0:
-            norm = self.optimizer.compute_grads_norm()
+            # norm = self.optimizer.compute_grads_norm()
Member

Remove commented out code.

@@ -137,7 +137,7 @@ def __init__(self, model, optimizer,
         self.pi_loss_coef = pi_loss_coef
         self.v_loss_coef = v_loss_coef
         self.rollout_len = rollout_len
-        self.batchsize = batchsize
+        self.batchsize = self.xp.int32(batchsize)
Member

I guess this is a fix for newint, but it's not clear from the code. Can you add a comment?
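(Background, stated as an assumption since the diff does not say: with the python-future library, "from builtins import *" on Python 2 makes integer values future.types.newint instances, which some NumPy/CuPy code paths mishandle. Casting to a concrete integer scalar sidesteps that:)

import numpy as np

batchsize = 32                   # may be future's newint on Python 2
batchsize = np.int32(batchsize)  # force a native NumPy integer scalar
assert np.arange(64)[:batchsize].shape == (32,)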

class LeCunNormal(chainer.initializers.HeNormal):
    """sorry

    """
muupan (Member) commented Jun 2, 2017

Elaborate the docstring.
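(One way the docstring could read; a sketch, since HeNormal samples from N(0, 2 * scale^2 / fan_in), so dividing the scale by sqrt(2) yields LeCun's N(0, scale^2 / fan_in):)

import numpy
import chainer

class LeCunNormal(chainer.initializers.HeNormal):
    """Initializes array with a LeCun-normal distribution.

    Each element is sampled from N(0, scale^2 / fan_in).  HeNormal
    samples from N(0, 2 * scale^2 / fan_in), so the scale is divided
    by sqrt(2) before delegating to it.
    """

    def __init__(self, scale=1.0):
        super(LeCunNormal, self).__init__(scale / numpy.sqrt(2))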

def ensure_initialized_update_rule(param):
    u = param.update_rule
    if u.state is None:
        u._state = {}  # Sorry!
Member

Elaborate the comment.
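(A possible elaboration, assuming the workaround exists so optimizer states can be put into shared memory before the first update; the actual rationale should come from the author:)

def ensure_initialized_update_rule(param):
    u = param.update_rule
    if u.state is None:
        # Chainer v2 allocates UpdateRule's state dict lazily on the
        # first update.  It must exist before the states can be moved
        # to shared memory, so create it here through the private
        # _state attribute (a workaround, not a public API).
        u._state = {}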

.travis.yml Outdated
- cd chainer
- git checkout _v2
- python setup.py install
- cd ..
Member

Now that chainer v2 is released, can you undo these changes?

@@ -67,42 +68,56 @@ def test_share_states(self):
         opt_a.setup(model)
         arrays = async.share_states_as_shared_arrays(opt_a)
         opt_b = optimizers.RMSprop()
-        opt_b.setup(model)
+        opt_b.setup(copy.deepcopy(model))
+        opt_b.update()  # sorry
Member

Elaborate the comment.
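(The dummy update presumably works around Chainer v2's lazy state allocation: per-parameter optimizer states do not exist until the first update. A minimal sketch of that behavior, assuming Chainer v2:)

import numpy as np
import chainer.links as L
from chainer import optimizers

model = L.Linear(2, 2)
opt = optimizers.RMSprop()
opt.setup(model)
param = next(model.params())
assert param.update_rule.state is None   # no state right after setup()

for p in model.params():
    p.grad = np.zeros_like(p.data)       # give update() something to apply
opt.update()                             # first update allocates the states
assert 'ms' in param.update_rule.state   # RMSprop's mean-square buffer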

@@ -67,42 +68,56 @@ def test_share_states(self):
         opt_a.setup(model)
         arrays = async.share_states_as_shared_arrays(opt_a)
         opt_b = optimizers.RMSprop()
-        opt_b.setup(model)
+        opt_b.setup(copy.deepcopy(model))
Member

Why is copy.deepcopy required here?
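(Presumably because in Chainer v2 the update rule is stored on the parameter itself, so a second setup() on the same link replaces the first optimizer's update rules. A minimal sketch of that effect, assuming Chainer v2:)

import copy
import chainer.links as L
from chainer import optimizers

model = L.Linear(2, 2)
param = next(model.params())

opt_a = optimizers.RMSprop()
opt_a.setup(model)
rule_a = param.update_rule

opt_b = optimizers.RMSprop()
opt_b.setup(model)                  # clobbers opt_a's update rules
assert param.update_rule is not rule_a

opt_b.setup(copy.deepcopy(model))   # a fresh copy leaves opt_a's model intact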

muupan (Member) commented Jun 2, 2017

Great work! Have you checked tests (including gpu and slow tests) in your environment?

toslunar (Member, Author) commented Jun 2, 2017

Some slow tests fail because of BLAS errors that seem to be chainer/chainer#2744.
I'm running the GPU tests now.

muupan (Member) commented Jun 7, 2017

LGTM

muupan merged commit 6ccc61d into chainer:master on Jun 7, 2017
toslunar deleted the chainerv2 branch on June 7, 2017 08:06