
Add Keras XLA Tests #6286

Merged: tfboyd merged 2 commits into master from haoyuzhang-xla-test on Mar 1, 2019
Conversation

haoyuz (Contributor) commented on Feb 28, 2019:

Add Keras XLA benchmarks, and monkey-patch the assert_broadcastable op to avoid OOM. The monkey patch should be reverted once the OOM issue is fixed.

Tested with PerfZero on Google Cloud.
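The patch itself is not shown in this thread. As a rough sketch of the general pattern (the helper names and the stand-in module below are hypothetical, not the actual TensorFlow internals touched by this PR), a monkey patch of this kind replaces a module-level function with a no-op while keeping the original around so the patch can be reverted later:

```python
import types

# Hypothetical stand-in for the module that owns the op being patched;
# in the real PR this would be a TensorFlow internals module.
fake_ops = types.SimpleNamespace(
    assert_broadcastable=lambda weights, values: "expensive-check"
)

_original_assert_broadcastable = None

def monkey_patch_assert_broadcastable(module):
    """Replace module.assert_broadcastable with a no-op, saving the original."""
    global _original_assert_broadcastable
    _original_assert_broadcastable = module.assert_broadcastable
    module.assert_broadcastable = lambda weights, values: None  # skip the check

def undo_monkey_patch_assert_broadcastable(module):
    """Restore the original op saved by the patch above."""
    module.assert_broadcastable = _original_assert_broadcastable

monkey_patch_assert_broadcastable(fake_ops)
print(fake_ops.assert_broadcastable(None, None))  # prints None: check skipped
undo_monkey_patch_assert_broadcastable(fake_ops)
print(fake_ops.assert_broadcastable(None, None))  # prints expensive-check: restored
```

Keeping the original function in a module-level variable is what makes the later "revert once the OOM issue is fixed" step possible without restarting the process.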

haoyuz requested review from a team and karmel as code owners on February 28, 2019 18:58
haoyuz force-pushed the haoyuzhang-xla-test branch from a0e3d81 to bdbdac4 on February 28, 2019 19:02
haoyuz removed the request for review from karmel on February 28, 2019 19:02
haoyuz changed the title from "[WIP PLEASE DO NOT MERGE] Add Keras XLA Tests" to "Add Keras XLA Tests" on Feb 28, 2019
haoyuz requested a review from tfboyd on February 28, 2019 19:35
tfboyd removed the request for review from a team on February 28, 2019 23:44
tfboyd (Member) commented on Feb 28, 2019:

I will look at this tomorrow (Friday). I want to make sure the monkey patch does not mess up other tests, or else that we run these in isolation. I want to get this in ASAP.

haoyuz (Contributor, author) commented on Mar 1, 2019:

@tfboyd Sure! It may be good to isolate the XLA tests, or we can call _undo_monkey_patch_... at the boundary of tests if necessary.
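The `_undo_monkey_patch_...` helper is elided in the thread, so the following is only a hypothetical sketch of "calling undo at the boundary of tests" (the module, helper names, and test class here are all illustrative stand-ins, not the PR's actual code): apply the patch in setUp and revert it in tearDown, so tests outside the fixture see the original op.

```python
import types
import unittest

# Hypothetical module and patch helpers standing in for the real
# TensorFlow internals and the PR's monkey-patch utilities.
ops = types.SimpleNamespace(assert_broadcastable=lambda weights, values: "checked")
_saved = {}

def apply_monkey_patch():
    """Replace the op with a no-op, saving the original for later."""
    _saved["orig"] = ops.assert_broadcastable
    ops.assert_broadcastable = lambda weights, values: None

def undo_monkey_patch():
    """Restore the op saved by apply_monkey_patch."""
    ops.assert_broadcastable = _saved["orig"]

class XlaBenchmarkBoundaryTest(unittest.TestCase):
    def setUp(self):
        apply_monkey_patch()    # patch only while an XLA test runs

    def tearDown(self):
        undo_monkey_patch()     # restore at the test boundary

    def test_patched_op_is_noop(self):
        self.assertIsNone(ops.assert_broadcastable(None, None))

suite = unittest.TestLoader().loadTestsFromTestCase(XlaBenchmarkBoundaryTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
# After the run, tests outside this fixture see the original op again.
```

tearDown runs even when the test body fails, which is what makes it a safe place to guarantee the patch does not leak into other tests.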

tfboyd (Member) left a comment:

LGTM. These will run with all the other tests at first and I will segregate them if I have time next week.

tfboyd (Member) left a comment:

Some lint fixes

      self._run_and_report_benchmark()

    def benchmark_xla_8_gpu(self):
      self._setup()

tfboyd (Member): Need a doc string to fix the LINT issue.

      self._run_and_report_benchmark()

    def benchmark_graph_xla_8_gpu(self):
      self._setup()

tfboyd (Member): Need a doc string to fix the LINT issue.

      FLAGS.batch_size = 128
      self._run_and_report_benchmark()

    def benchmark_graph_xla_1_gpu(self):

tfboyd (Member): Add a doc string to avoid a future LINT issue.

      self._run_and_report_benchmark()

    def benchmark_xla_1_gpu(self):
      self._setup()

tfboyd (Member): Add a doc string to avoid a future LINT issue.
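The fix requested in these review comments amounts to a one-line docstring per benchmark method. A minimal sketch, with the class and helper methods stubbed out (this is not the real benchmark harness, just the shape of the fix):

```python
class KerasXlaBenchmarkSketch:
    """Stub mirroring the shape of the benchmark methods under review."""

    def _setup(self):
        pass  # stub: the real method resets flags and state

    def _run_and_report_benchmark(self):
        pass  # stub: the real method runs the model and reports metrics

    def benchmark_xla_1_gpu(self):
        """Benchmark 1 gpu with XLA enabled."""  # docstring satisfies the lint check
        self._setup()
        self._run_and_report_benchmark()
```

pylint's missing-docstring check is satisfied by any non-empty docstring as the first statement of the method, which is why a single descriptive line per benchmark is enough here.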

haoyuz force-pushed the haoyuzhang-xla-test branch from bdbdac4 to 53dcca7 on March 1, 2019 18:31
haoyuz requested a review from tfboyd on March 1, 2019 18:57
haoyuz (Contributor, author) commented on Mar 1, 2019:

@tfboyd Added doc strings to fix lint errors, PTAL. Thanks!

tfboyd (Member) left a comment:

LGTM. Will wait for tests to run, then merge.

tfboyd merged commit fa9ed45 into master on Mar 1, 2019
haoyuz deleted the haoyuzhang-xla-test branch on March 23, 2019 04:48
3 participants