
Conversation

@luomai (Member) commented on Aug 1, 2018

Checklist

  • I've tested that my changes are compatible with the latest version of TensorFlow.
  • I've read the Contribution Guidelines
  • I've updated the documentation if necessary.

Motivation and Context

When data preparation is heavy, we observe low GPU utilisation because the GPU waits for the CPU to finish pre-processing the input data. Enabling dataset prefetch overlaps CPU pre-processing with GPU training.

Description

This PR adds input dataset prefetch to the distributed trainer, which is useful when the data preparation step is heavy.
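
For context, the sketch below shows how prefetch is typically wired into a `tf.data` input pipeline so that the CPU prepares upcoming batches while the GPU trains on the current one. The helper name `make_dataset`, the buffer sizes, and the pre-processing step are illustrative assumptions, not the exact code in this PR.

```python
# Minimal sketch (assumed example, not the PR's actual code) of overlapping
# CPU pre-processing and GPU training with tf.data prefetch.
import tensorflow as tf

def make_dataset(images, labels, batch_size=32):
    ds = tf.data.Dataset.from_tensor_slices((images, labels))
    # Heavy CPU-side pre-processing, parallelised across cores.
    ds = ds.map(
        lambda img, label: (tf.image.per_image_standardization(img), label),
        num_parallel_calls=4)
    ds = ds.shuffle(buffer_size=1024).batch(batch_size)
    # Prefetch lets the CPU prepare the next batches while the GPU consumes
    # the current one, hiding input-pipeline latency.
    ds = ds.prefetch(buffer_size=2)
    return ds
```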

@luomai changed the title add prefetch to distributed trainer. → Support dataset prefetch to distributed trainer. Aug 1, 2018
@luomai changed the title Support dataset prefetch to distributed trainer. → Support input dataset prefetch in distributed trainer. Aug 1, 2018
@luomai merged commit ec095bd into master Aug 1, 2018
@luomai deleted the distributed2 branch August 1, 2018 23:51
luomai added a commit that referenced this pull request Nov 21, 2018
* add prefetch to distributed trainer.

* update changelog

* yapf