Adding a random seed parameter for tensorflow.layer.dense #9744
Short answer to the seed request: it is unlikely this will be accepted, since TensorFlow's universal convention has been to use "initializers" rather than a bare seed parameter. For example, see `init_ops.py`. That said, we should perhaps document the default initializers used in `tf.layers`.
The way to get deterministic weight initializations is to pass an initializer that was constructed with an explicit seed, rather than relying on the default.
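To make the principle concrete, here is a NumPy sketch (not TensorFlow's actual implementation; the function name `glorot_uniform` is ours) of how seeding the initializer, rather than the layer, yields reproducible weights. In TensorFlow itself you would pass, e.g., `glorot_uniform_initializer(seed=123)` as the `kernel_initializer` argument.

```python
import numpy as np

def glorot_uniform(fan_in, fan_out, seed=None):
    """Sample from U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)),
    the Glorot/Xavier uniform scheme TensorFlow uses by default for dense layers.
    Hypothetical helper for illustration only."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    rng = np.random.RandomState(seed)
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# Seeding the initializer makes the draw reproducible across runs:
w1 = glorot_uniform(64, 32, seed=123)
w2 = glorot_uniform(64, 32, seed=123)
assert np.allclose(w1, w2)  # identical weights every time
```

An unseeded call (`seed=None`) would instead produce different weights on each run, which is exactly the non-determinism the original question is about.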
Thanks for the clarification. So the following would be the current default behavior, based on the global default weight initializer?

```python
import tensorflow
from tensorflow.python.ops.init_ops import glorot_uniform_initializer

tensorflow.layers.dense(..., kernel_initializer=glorot_uniform_initializer())
```

I am happy to put together a PR to add a note to the docstrings and to handle the default explicitly, e.g.:

```python
if kernel_initializer is None:
    self.kernel_initializer = glorot_uniform_initializer()
else:
    self.kernel_initializer = kernel_initializer
```

Not sure what the current TensorFlow convention is; would it be okay to do these checks & assignments in the class constructor?
This code snippet in the constructor sounds fine to me.
Okay, will open a PR tomorrow.
I saw that the current signature is:

```python
def dense(
    inputs, units,
    activation=None,
    use_bias=True,
    kernel_initializer=None,
    bias_initializer=init_ops.zeros_initializer(),
    kernel_regularizer=None,
    bias_regularizer=None,
    activity_regularizer=None,
    trainable=True,
    name=None,
    reuse=None):
```

I am curious why `bias_initializer` is instantiated directly in the signature while `kernel_initializer` defaults to `None`.
We should avoid instantiating anything in a class constructor signature, because the default is evaluated once and the instance becomes global to all instances of the class. There are no immediate side effects in this case, but it is bad practice and potentially dangerous for the user. For the same reason, we should instantiate the initializers in the body rather than in the signature.
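The shared-default pitfall described above is easy to demonstrate in plain Python: a default argument is evaluated once, at definition time, so a mutable default object is shared by every call. (The `ZerosInitializer` class below is a hypothetical stand-in, not TensorFlow code.)

```python
class ZerosInitializer:
    """Stand-in for an initializer object (hypothetical, for illustration)."""
    def __init__(self):
        self.call_count = 0  # mutable state that leaks wherever this instance is reused

def dense_bad(units, bias_initializer=ZerosInitializer()):
    # The default ZerosInitializer() is created ONCE, when `def` executes,
    # and is shared by every call that relies on the default.
    return bias_initializer

def dense_good(units, bias_initializer=None):
    # The None-check defers construction, giving each call a fresh instance.
    if bias_initializer is None:
        bias_initializer = ZerosInitializer()
    return bias_initializer

assert dense_bad(10) is dense_bad(20)       # same object shared across calls
assert dense_good(10) is not dense_good(20) # independent objects per call
```

This is why the `kernel_initializer=None` plus in-body check is the safer pattern, even when the initializer happens to be stateless today.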
Ah thanks, makes sense now!
Yeah, I think I misinterpreted this previously. I thought you meant that setting something in the signature wouldn't have an effect at all.
Closing, since specifying the initializer is the right solution.
Hi,

I noticed that the `tensorflow.layer.dense` wrapper does not have a `seed` parameter, and I was wondering what you think about adding one. E.g., this seed would be used to initialize the random weights and bias units (if they are not initialized to zero).

Related to the points above, it is currently also not clear how the weights are initialized, i.e., what distribution the random numbers are drawn from (if not zero). Does someone have some info on this? Maybe this could be added to the API docs.

What do you think? (I am happy to contribute a random seed implementation and a doc update if that's desirable.)