
Conversation

@ZolotukhinM commented Apr 30, 2021

Stack from ghstack:

`BufferArg` is used to describe a parameter passed to the codegen: it
indicates whether the parameter is a var or a buf and holds a pointer to
the corresponding var/buf. Both vars and bufs already carry a dtype, so
duplicating it in `BufferArg` is unnecessary; we can always get it from
the var/buf. Hence this PR removes the `dtype_` field from `BufferArg`.
We are also adding a `buf_` field here, so that `BufferArg` truly has
all the information about the parameter.

Differential Revision: D28128329

…rArg`.

[ghstack-poisoned]
@facebook-github-bot (Contributor) commented Apr 30, 2021

💊 CI failures summary and remediations

As of commit f81f2a4 (more details on the Dr. CI page):


  • 3/3 failures possibly* introduced in this PR
    • 1/3 non-scanned failure(s)

🕵️ 1 new failure recognized by patterns

The following CI failures do not appear to be due to upstream breakages:

See CircleCI build pytorch_xla_linux_bionic_py3_6_clang9_test (1/1)

Step: "Run tests"

May 03 23:20:26 ======================================================================
May 03 23:20:26 ERROR [1.705s]: test_clamp_xla_int64 (__main__.TestDevicePrecisionXLA)
May 03 23:20:26 ----------------------------------------------------------------------
May 03 23:20:26 Traceback (most recent call last):
May 03 23:20:26   File "/opt/conda/lib/python3.6/site-packages/torch/testing/_internal/common_device_type.py", line 297, in instantiated_test
May 03 23:20:26     raise rte
May 03 23:20:26   File "/opt/conda/lib/python3.6/site-packages/torch/testing/_internal/common_device_type.py", line 292, in instantiated_test
May 03 23:20:26     result = test_fn(self, *args)
May 03 23:20:26   File "/var/lib/jenkins/workspace/xla/test/../../test/test_torch.py", line 7500, in test_clamp
May 03 23:20:26     actual = x[..., :1].clamp(lb, ub)
May 03 23:20:26 RuntimeError: /var/lib/jenkins/workspace/xla/third_party/tensorflow/bazel-tensorflow/tensorflow/compiler/xla/xla_client/debug_macros.h:27 : Check failed: status.status() == ::tensorflow::Status::OK() (Invalid argument: Input dimension should be either 1 or equal to the output dimension it is broadcasting into; the 1th operand dimension is 50, the 1th output dimension is 1. vs. OK)
May 03 23:20:26 *** Begin stack trace ***
May 03 23:20:26 	PyRun_FileExFlags
May 03 23:20:26 	PyRun_SimpleFileExFlags
May 03 23:20:26 	Py_Main
May 03 23:20:26 	main
May 03 23:20:26 	__libc_start_main
May 03 23:20:26 *** End stack trace ***

1 failure not recognized by patterns:

Job: CircleCI pytorch_linux_xenial_py3_6_gcc5_4_test
Step: Run tests

This comment was automatically generated by Dr. CI.

dgl-intel pushed a commit to dgl-intel/pytorch that referenced this pull request Apr 30, 2021
…rArg`.

ghstack-source-id: e9e9ff2
Pull Request resolved: pytorch#57382
…eGen::BufferArg`."

Differential Revision: [D28128329](https://our.internmc.facebook.com/intern/diff/D28128329)

[ghstack-poisoned]
@ZolotukhinM merged this pull request in 030692c.

@facebook-github-bot deleted the gh/ZolotukhinM/446/head branch May 7, 2021 14:15
krshrimali pushed a commit to krshrimali/pytorch that referenced this pull request May 19, 2021
…rArg`. (pytorch#57382)

Summary:
Pull Request resolved: pytorch#57382

Test Plan: Imported from OSS

Reviewed By: bertmaher

Differential Revision: D28128329

Pulled By: ZolotukhinM

fbshipit-source-id: c03bff54bc6860f7ac6edfcb42ce6a82d8309589
Labels: cla signed, Merged, oncall: jit
Projects: none yet
3 participants