Conversation

@MarkDaoust MarkDaoust commented Jul 10, 2018

From tensorflow/tensorflow/blob/8a6ef2cb4f98bacc1f821f60c21914b4bd5faaef
@MarkDaoust MarkDaoust requested a review from lamberta as a code owner July 10, 2018 16:34
@MarkDaoust MarkDaoust requested review from mdanatg and wolffg July 10, 2018 16:34
@lamberta
Member

This is meant to be the guide page, yes? (Not a tutorial)
If so, can you rename autograph_control_flow.ipynb to autograph.ipynb?
Also, save notebook with open TOC.
Some of the heading hierarchy looks off.

},
"cell_type": "markdown",
"source": [
"Into graph-compatible functions like this:"

or graph-building, if you prefer (it feels difficult to explain what a graph-compatible function is)

Member


Agree with Dan for using "graph-building".
Also, should avoid using "eager mode" and "graph mode" in this doc. Prefer "eager execution" and "graph execution".

Member Author


Done.


@mdanatg mdanatg left a comment


Very nice! A few cosmetic details to double check.

},
"cell_type": "markdown",
"source": [
"You can take code written for eager execution and run it in graph mode. You get the same results, but with all the benefits of graphs:"

This block might be confusing, because we're not in eager mode in this cell. The code does run with the input 9.0, but it doesn't run any ops - it just does computations in Python. It might be simpler to just omit the cell and only leave the text note.

Member Author


Good point.

Given that we're using Graph and Session contexts everywhere... it looks like we can just switch on eager_execution, and wrap these in tf.constants.

Is that a reasonable approach?


SGTM
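A minimal sketch of the approach being agreed on here (hedged: written against `tf.compat.v1` so it runs under both TF 1.x and 2.x; `square` is a hypothetical stand-in for an AutoGraph-converted function, not the notebook's actual code):

```python
import tensorflow as tf

tf1 = tf.compat.v1  # TF 1.x-style graph/session API

def square(x):
    # Hypothetical stand-in for an AutoGraph-converted function:
    # takes tensors in, returns tensors.
    return x * x

with tf1.Graph().as_default():
    # Wrapping the Python value in a tf.constant makes the call build
    # graph ops instead of computing the result in plain Python.
    result = square(tf1.constant(9.0))
    with tf1.Session() as sess:
        value = sess.run(result)

print('Graph result: %2.2f' % value)
```

Calling `square(9.0)` with a bare float would instead run entirely in Python, which is exactly the confusion the comment above points out.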

" # You can inspect the graph using tf.get_default_graph().as_graph_def()\n",
" g_ops = tf_g(tf.constant(9.0))\n",
" with tf.Session() as sess:\n",
" print('Autograph value: %2.2f\\n' % sess.run(g_ops)) "

Alternatively, this could also be "Tensor value", if we want to avoid overbranding (at any rate, it should be consistent with the subsequent cells).

Member Author


Switched these to "Eager result" and "Graph result".

"with tf.Graph().as_default(): \n",
" # The result works like a regular op: takes tensors in, returns tensors.\n",
" # You can inspect the graph using tf.get_default_graph().as_graph_def()\n",
" input = tf.placeholder(tf.int32)\n",

Nit: "input" is a Python built-in, perhaps name the symbol "input_"?

Member Author


Switched to "num" to match the argument name.

"source": [
"### Lists\n",
"\n",
"Appending to lists in loops also works (we create a `TensorArray` for you behind the scenes)"

This is no longer always the case. Rephrasing to this would be ok: "(we create tensor list ops for you behind the scenes)"

Member Author


Done

"def f(n):\n",
" z = []\n",
" # We ask you to tell us the element dtype of the list\n",
" z = autograph.utils.set_element_type(z, tf.int32)\n",

Bit of a last-minute update: we no longer require assigning the result back to z. So this would be better:

autograph.utils.set_element_type(z, tf.int32)


Side note: we're rolling out a change to allow the shorter form:

autograph.set_element_type(z, tf.int32)

Member Author


With the current tf-nightly this works:

z = []
autograph.utils.set_element_type(z, tf.int32)

but this fails:

z=[]
autograph.set_element_type(z, tf.int32)

Member Author


b/111372661

"\n",
"with tf.Graph().as_default(): \n",
" with tf.Session():\n",
" print(tf_f(tf.constant(3)).eval())"

Any thought about using eval() / sess.run() consistently? I'm ok if you feel both should be exercised.

Member Author


> Any thought about using eval() / sess.run() consistently? I'm ok if you feel both should be exercised.

Consistency, sure! Do you have a preferred style? I have no strong feelings on this.


I generally use sess.run(), because eval() is not as idiomatic.
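For reference, the two styles being compared, as a hedged editorial sketch via `tf.compat.v1` (`Tensor.eval()` is shorthand that runs the tensor in the session made default by the with-block):

```python
import tensorflow as tf

tf1 = tf.compat.v1

with tf1.Graph().as_default():
    x = tf1.constant(3)
    y = x + 1
    with tf1.Session() as sess:
        via_run = sess.run(y)  # explicit: hand the tensor to the session
        via_eval = y.eval()    # shorthand: uses the default session

print(via_run, via_eval)
```

Both fetch the same value; `sess.run()` also scales to fetching several tensors in one call, which may be part of why it reads as more idiomatic.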

},
"cell_type": "markdown",
"source": [
"### Nested If statement"

"Nested if statements"? (plural, not capitalized)

Member Author


done

},
"cell_type": "markdown",
"source": [
"## Advanced example: A training, loop in-graph\n",

Unnecessary comma?

Member Author


done

" while i < hp.max_steps:\n",
" train_x, train_y = get_next_batch(train_ds)\n",
" test_x, test_y = get_next_batch(test_ds)\n",
" # add get next\n",

This comment feels unnecessary.

Member Author


done

"\n",
"Writing control flow in AutoGraph is easy, so running a training loop in a TensorFlow graph should be easy as well! \n",
"\n",
"Here, we show an example of training a simple Keras model on MNIST, where the entire training process -- loading batches, calculating gradients, updating parameters, calculating validation accuracy, and repeating until convergence -- is done in-graph."

WDYT about a sentence somewhere ahead of this that mentions something along the lines of "oh, and you can use Keras libraries in AutoGraph as well"? Not sure if it's something the user would expect or not.

Member Author


Done. I'll link to the other notebook here when it's ready.
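As a hedged illustration of what "in-graph" means in this discussion (an editorial sketch via `tf.compat.v1`, not the notebook's actual training code): a `tf.while_loop` runs the loop as a single graph op, so iteration happens inside the TensorFlow runtime rather than in Python.

```python
import tensorflow as tf

tf1 = tf.compat.v1

with tf1.Graph().as_default():
    max_steps = tf1.constant(5)

    def cond(i, total):
        return i < max_steps

    def body(i, total):
        # A real training loop would fetch a batch and apply gradients here;
        # this toy body just counts steps and accumulates the step index.
        return i + 1, total + i

    # The loop itself is a graph op; Python never iterates.
    i_final, total_final = tf1.while_loop(
        cond, body, [tf1.constant(0), tf1.constant(0)])
    with tf1.Session() as sess:
        steps, total = sess.run([i_final, total_final])

print('ran %d in-graph steps, total=%d' % (steps, total))
```

AutoGraph's point is that a plain Python `while` loop of this shape gets converted to such a graph loop automatically.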

@MarkDaoust
Member Author

> This is meant to be the guide page, yes? (Not a tutorial)

I'm open to suggestions. I was thinking this notebook could go in the "Low Level APIs" section of the guide, since that's the only place where we expect users to understand the details of graph mode.

I'm (still) working on a keras-model example that could go in tutorials.

> If so, can you rename autograph_control_flow.ipynb to autograph.ipynb.

Done.

> Also, save notebook with open TOC.

Done.

> Some of the heading hierarchy looks off.

Fixed.

@mdanatg

mdanatg commented Jul 13, 2018

Latest draft looks good. A note about the import lines - the reader might be confused into thinking AutoGraph requires eager mode to be enabled. Could we add a note that we only enable it for comparison purposes?

@lamberta
Member

I was kinda wondering about that ...
Done, added note about eager and graph in my PR: MarkDaoust#7

Stage: https://colab.sandbox.google.com/github/lamberta/models/blob/mark-autograph/samples/core/guide/autograph.ipynb

@mdanatg

mdanatg commented Jul 13, 2018

Looks great! I'll be able to assist with debugging the convergence TODO later today if you wish.

"source": [
"We'll enable [eager execution](https://www.tensorflow.org/guide/eager) for demonstration purposes, but AutoGraph works in both eager and [graph execution](https://www.tensorflow.org/guide/graphs) environments:",
"",
"Important: The converted code will _work_ directly in eager mode. But it runs as an eager-python function. To run in-graph use explicit graphs, as in this doc, or use `tf.contrib.eager.defun`."
Member


I think we added similar notes; can you remove this one?

@lamberta
Member

Updated note in PR: MarkDaoust#8

@MarkDaoust
Member Author

MarkDaoust commented Jul 13, 2018 via email

@MarkDaoust MarkDaoust merged commit 78e871c into tensorflow:master Jul 13, 2018