FYI, I am working with Example protos for model input, and I am finding that using tf.parse_example to parse a (shuffled) batch of serialized examples is much faster than using tf.parse_single_example prior to batching. For my particular dataset, parse_single_example lets me create feed_dicts of batch size 128 at about 100/min; batching the serialized Example protos and then using parse_example runs at around 3000/min.
You may want to update the documentation to suggest using tf.parse_example everywhere, as is already suggested for sparse input data.
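For illustration, here is a minimal sketch of the two parsing paths being compared. This uses the current `tf.io.parse_example` / `tf.io.parse_single_example` names (the ops the older `tf.parse_example` / `tf.parse_single_example` symbols point to), and the feature name `"x"` and its spec are made up for the example:

```python
import tensorflow as tf

# Feature spec shared by both parsing paths (the name "x" is illustrative).
feature_spec = {"x": tf.io.FixedLenFeature([2], tf.float32)}

def make_serialized_example(values):
    """Build one serialized tf.train.Example with a float feature 'x'."""
    ex = tf.train.Example(features=tf.train.Features(feature={
        "x": tf.train.Feature(float_list=tf.train.FloatList(value=values)),
    }))
    return ex.SerializeToString()

protos = [make_serialized_example([1.0, 2.0]),
          make_serialized_example([3.0, 4.0])]

# Slow path: parse each serialized proto individually, then stack into a batch.
singles = tf.stack(
    [tf.io.parse_single_example(p, feature_spec)["x"] for p in protos])

# Fast path: batch the *serialized* protos first, then parse them all at once.
batched = tf.io.parse_example(tf.stack(protos), feature_spec)["x"]
```

Both paths produce the same `[batch_size, 2]` tensor; the difference reported above is throughput, since the batched op parses all protos in a single kernel invocation instead of one op call per example.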