[TF 2.0] tf.summary should be easier to use with graphs #26409
Comments
@nfelt Can you please guide me? I want to work on this.
Say I have a top-level function:

```python
def build_bnn(weights_list, biases_list, activation=tf.nn.relu):
    def model(X):
        net = X
        for (weights, biases) in zip(weights_list[:-1], biases_list[:-1]):
            net = dense(net, weights, biases, activation)
        # final linear layer
        net = tf.matmul(net, weights_list[-1]) + biases_list[-1]
        pred = net[:, 0]
        std_dev = net[:, 1]
        scale = tf.nn.softplus(std_dev) + 1e-6  # ensure scale is positive
        return tfd.Normal(loc=pred, scale=scale)
    return model
```

If I try to insert
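A minimal runnable sketch of the scenario above, assuming TF 2.x, with the undefined `dense` helper and the `tfd` distribution replaced by plain TF ops (those names are not defined in the snippet): a `tf.summary` call inserted inside such a model closure records data only when a default writer is active at call time.

```python
# Hedged sketch (TF 2.x assumed): a model built as a closure, like
# build_bnn above, with a summary op inserted inside it. The summary
# call is silently a no-op unless a default writer is set by the caller.
import tempfile
import tensorflow as tf

def build_model(weights, biases):
    def model(x, step):
        net = tf.matmul(x, weights) + biases
        # only recorded when a default writer is active
        tf.summary.histogram("activations", net, step=step)
        return net
    return model

w = tf.ones([3, 2])
b = tf.zeros([2])
model = build_model(w, b)

logdir = tempfile.mkdtemp()
writer = tf.summary.create_file_writer(logdir)
with writer.as_default():
    out = model(tf.ones([1, 3]), step=0)
writer.flush()
```

Calling `model(...)` outside the `writer.as_default()` block would run the matmul but drop the histogram on the floor, which is exactly the silent-no-op behavior this issue is about.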
Any progress on this?
@nfelt We are checking to see if you still need help on this issue. Could you please check with the latest release?
This is pretty much all still relevant in current TF, yes.
Update: this is still relevant in current TF (tensorflow/tensorboard#5866). |
This feature request tracks improving the usability of tf.summary in TF 2.0 when used with graphs - specifically with `tf.function` and legacy graph mode. Currently there are a number of interrelated limitations that make using tf.summary somewhat awkward and error-prone outside of eager mode:

- In legacy graph mode, the writer must be configured in advance via `create_file_writer()` before any graph construction happens, or all summary-writing functions become no-ops (the writer itself can be initialized with `writer.init()` later, but all options to initialization, in particular the logdir, still must be passed earlier to `create_file_writer()`).
- In tf.functions it's a similar story. `tf.function` has the additional complication that it can't own any state, so the caller must enter `with writer.as_default()` around each call, and make sure the writer object exists as long as the tf.function is being used.
- The step and recording condition (`tf.summary.record_if()`) have milder but similar issues: to set them dynamically you must use a `tf.Variable` or a placeholder (`tf.compat.v1.placeholder` for legacy graph mode, or a function argument for `tf.function`) and then set that value when executing the graph.