
Keyword-only link argument in GradientMethod #3940

Closed

Conversation

niboshi (Member) commented on Nov 22, 2017

Improves #3488 by making the link argument keyword-only.

optimizer = chainer.optimizers.SGD(model)  # does not work

In #3488, this code does not work and is dangerous, because model is silently treated as a hyperparameter. With this PR, the user must explicitly specify the link= keyword:

optimizer = chainer.optimizers.SGD(link=model)

Also, a type check on hyperparameter assignment is implemented, so the former code now raises TypeError.

I wondered which keyword is better, link or model, but since the Optimizer.setup() method has a link argument, I chose link.
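
For illustration, here is a minimal standalone sketch of the keyword-only behavior described above (my own code using Python 3 keyword-only syntax; it is not the PR's actual diff and may differ from how the PR enforces it):

class GradientMethod:
    # Toy stand-in for chainer's GradientMethod, illustration only.

    def __init__(self, *, link=None):
        # `link` can only be passed by keyword, so a positional call
        # such as SGD(model) raises TypeError instead of silently
        # treating the model as a hyperparameter.
        if link is not None:
            self.setup(link)

    def setup(self, link):
        # Simplified stand-in for Optimizer.setup().
        self.target = link

# GradientMethod(model)       -> TypeError (positional argument rejected)
# GradientMethod(link=model)  -> OK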

niboshi added the cat:enhancement (Implementation that does not break interfaces.) label on Nov 22, 2017
niboshi force-pushed the improve-optimizer-model-argument branch from fde37e4 to 8e9d41e on November 22, 2017
niboshi added this to the v4.0.0b2 milestone on Nov 22, 2017
# If the attribute is not defined as the class attribute of
# `Hyperparameter`, it's assumed to be a hyperparameter.
if not hasattr(Hyperparameter, name):
    if not (isinstance(value, (numpy.ndarray, cuda.ndarray))
            # (truncated in the original; plausible completion follows)
            or isinstance(value, (int, float))):
        raise TypeError('hyperparameter must be a scalar or an ndarray')
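
For illustration, a self-contained sketch of this guard in action (a hypothetical standalone class, not the PR's diff; cuda.ndarray is omitted for brevity):

import numpy

class Hyperparameter:
    # Toy stand-in for chainer's Hyperparameter, illustration only.

    def __setattr__(self, name, value):
        # Names not predefined on the class are hyperparameters and
        # must be numeric scalars or ndarrays.
        if not hasattr(Hyperparameter, name):
            if not isinstance(value, (numpy.ndarray, int, float)):
                raise TypeError(
                    '%s must be a scalar or an ndarray' % name)
        super().__setattr__(name, value)

hp = Hyperparameter()
hp.lr = 0.01             # OK: numeric scalar
try:
    hp.model = object()  # e.g. a Link passed by mistake
except TypeError as e:
    print(e)             # the assignment is rejected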
Member commented:
I suppose this change makes Hyperparameter less flexible.

niboshi (Member Author) commented:

Do you mean you may want to set a hyperparameter that is neither an array nor a scalar?

Member commented:

Yes.
I've encountered a case where I wanted to pass a string argument to an optimizer that syncs parameters either 'soft' or 'hard'. To be honest, my final code didn't pass the string, and I've never set a string value to a hyperparameter (because that optimizer was not an instance of GradientMethod).

niboshi (Member Author) commented:

So, we need to decide what range of values Hyperparameter should accept.

The most conservative option is to accept anything but Link, but in that case we would need to keep adding such rules in the future depending on how it is used. I don't think that's a good choice.

I think the minimum is scalars and ndarrays (as in the current code).
Maybe str is reasonable, too.
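
For illustration, a sketch of the widened acceptance rule being discussed (my own code, not from the PR; the helper name is hypothetical):

import numpy

def is_acceptable_hyperparameter(value):
    # Scalars and ndarrays as in the current code, plus str as
    # floated above; anything Link-like stays rejected.
    return isinstance(value, (numpy.ndarray, int, float, str))

assert is_acceptable_hyperparameter(0.9)      # momentum-like scalar
assert is_acceptable_hyperparameter('soft')   # string mode, e.g. 'soft'/'hard' sync
assert not is_acceptable_hyperparameter(object())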

beam2d removed this from the v4.0.0b2 milestone on Dec 12, 2017
niboshi (Member Author) commented on Jan 25, 2018

Now that this argument has been removed (see #4141), I'm closing this PR.

niboshi closed this on Jan 25, 2018
niboshi added this to the Closed issues and PRs milestone on Jan 25, 2018
niboshi deleted the improve-optimizer-model-argument branch on June 12, 2019