This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Backport #17002, #17068 and #17114 to 1.6 branch #17137

Merged

merged 3 commits into apache:v1.6.x on Dec 20, 2019

Conversation

ptrendx
Member

@ptrendx ptrendx commented Dec 20, 2019

Description

Backport #17002, #17068 and #17114 to 1.6 branch.

@eric-haibin-lin @MoisesHer FYI

ptrendx and others added 3 commits December 20, 2019 10:50
* Debug the long startup time

* Optimize backward fusion

* Figure out why the fusion pass is called twice

* Cleaning

* Small optimization
* fix trainer param order

* Update trainer.py

* Update trainer.py

* Update trainer.py
* Update multi_sum_sq to avoid AtomicAdd

* Add specific test for multi_sum_sq

* Add a determinism test and fix lint issues

* Better test for checking that the op is deterministic

* Follow MXNet letter-case format

* Reduce dimensions of tensors in the test
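The commits above replace an AtomicAdd-based reduction in `multi_sum_sq` and add a determinism test. The reason such a test is needed: atomic floating-point adds on a GPU can execute in a different order on each run, and since float addition is not associative, the results can differ bitwise between runs. A minimal sketch of the test pattern (run the op repeatedly on identical inputs and require identical outputs) is below; it uses a hypothetical NumPy reference `multi_sum_sq_ref` as a stand-in, since the actual MXNet operator and test harness are not shown here.

```python
import numpy as np

def multi_sum_sq_ref(arrays):
    # Hypothetical reference implementation: for each input tensor,
    # compute the sum of squares of its elements (what multi_sum_sq
    # computes per-array, e.g. for gradient-norm calculations).
    return [float(np.sum(a.astype(np.float32) ** 2)) for a in arrays]

def is_deterministic(fn, inputs, runs=5):
    # Run the op several times on identical inputs and require
    # bitwise-identical outputs. An AtomicAdd-based GPU reduction can
    # reorder float additions between runs and fail this check.
    baseline = fn(inputs)
    return all(fn(inputs) == baseline for _ in range(runs - 1))

rng = np.random.default_rng(0)
tensors = [rng.standard_normal((64, 32)).astype(np.float32) for _ in range(4)]
assert is_deterministic(multi_sum_sq_ref, tensors)
```

The CPU reference here is trivially deterministic; the value of the pattern is in applying the same repeated-run, exact-equality check to the GPU kernel under test.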
@ptrendx ptrendx added the R1.6.0 label Dec 20, 2019
@ptrendx ptrendx merged commit 0c6f49f into apache:v1.6.x Dec 20, 2019

3 participants