
Conversation

@miladm (Collaborator) commented Sep 8, 2021

Issue #3115

@miladm self-assigned this Sep 8, 2021
@miladm linked an issue Sep 8, 2021 that may be closed by this pull request
@miladm requested a review from @JackCaoG September 8, 2021 03:28
  AllClose(b, xla_b);

- if (DebugUtil::ExperimentEnabled("nonzero")) {
+ if (DebugUtil::ExperimentEnabled("nonzero") &&
Collaborator commented:

Please rebase.

@miladm (Author) replied:

Done.

Collaborator commented:

The diff is still here — can you rebase again? bridge::AtenDeviceToXlaDevice(device).hw_type == DeviceType::TPU has been removed from master. You might need to pull first.

@miladm (Author) replied:

For some odd reason, the rebase and merge operations do not work as intended: they report that everything is up to date, and the latest master commit is present in this branch. Clearly something is wrong, given this delta is still present. I ran git checkout --patch on this file to resolve the problem.

- auto element_type = TensorTypeToRawXlaType(self.scalar_type());
+ XLATensor input_tensor = bridge::GetXlaTensor(self);
+ const Device& device = input_tensor.GetDevice();
+ auto element_type = GetDevicePrimitiveType(
Collaborator commented:

You can use MakeXlaPrimitiveType, which takes the scalar_type directly.

@miladm (Author) replied Sep 16, 2021:

Good shortcut. Done.

@miladm requested a review from @JackCaoG September 16, 2021 04:06
@miladm added the "bug" label (Something isn't working) Sep 16, 2021
@JackCaoG (Collaborator) left a comment:

Mostly LGTM; please pull and rebase.

@miladm requested a review from @JackCaoG September 17, 2021 17:13
@JackCaoG (Collaborator) left a comment:

Did you make sure that the tests pass on TPU?

@miladm (Author) commented Sep 17, 2021:

Confirming this code passes the TPU tests. CC @JackCaoG

@JackCaoG merged commit 088607f into master Sep 17, 2021
@JackCaoG deleted the fix_nantonum_tests branch September 17, 2021 19:30

Labels: bug (Something isn't working)

Linked issue (may be closed by this pull request): NanToNum fails on TPU tests

3 participants