
tf.raw_ops.ResourceScatterDiv crash with abortion #60123

Closed

trickiwoo opened this issue Mar 27, 2023 · 6 comments
Assignees
Labels
comp:ops OPs related issues stale This label marks the issue/pr stale - to be closed automatically if no activity stat:awaiting response Status - Awaiting response from author TF 2.12 For issues related to Tensorflow 2.12 type:bug Bug

Comments

@trickiwoo

trickiwoo commented Mar 27, 2023


Issue Type

Bug

Have you reproduced the bug with TF nightly?

Yes

Source

source

Tensorflow Version

2.13.0-dev20230208

Custom Code

Yes

OS Platform and Distribution

No response

Mobile device

No response

Python version

No response

Bazel version

No response

GCC/Compiler version

No response

CUDA/cuDNN version

8.2.4

GPU model and memory

No response

Current Behaviour?

2.13.0-dev20230208

Standalone code to reproduce the issue

import tensorflow as tf
from tensorflow.python.eager import context

# The variable handle is int32, while the (empty) updates tensor is
# float32; the dtype mismatch triggers the check failure below.
input1 = tf.raw_ops.VarHandleOp(dtype=tf.int32, shape=[2, 3], shared_name=context.anonymous_name())
input2 = tf.constant([], dtype=tf.float32)
output = tf.raw_ops.ResourceScatterDiv(resource=input1, indices=[0], updates=input2)

Relevant log output

2023-03-26 22:11:23.135344: F tensorflow/core/framework/tensor.cc:770] Check failed: dtype() == expected_dtype (3 vs. 1) float expected, got int32
Aborted (core dumped)
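For contrast, a dtype-consistent variant of the repro runs cleanly. This is a minimal sketch (not from the original report), using the same raw ops plus `AssignVariableOp`/`ReadVariableOp` to initialize and read back the variable; here the variable, its initial value, and the updates all use float32, so the kernel's dtype check passes:

```python
import tensorflow as tf
from tensorflow.python.eager import context

# Create a float32 resource variable and initialize it to all 8.0.
handle = tf.raw_ops.VarHandleOp(
    dtype=tf.float32, shape=[2, 3], shared_name=context.anonymous_name())
tf.raw_ops.AssignVariableOp(resource=handle, value=tf.fill([2, 3], 8.0))

# Divide row 0 element-wise by the updates; dtypes and shapes match,
# so no abort: row 0 becomes 4.0, row 1 stays 8.0.
tf.raw_ops.ResourceScatterDiv(
    resource=handle,
    indices=[0],
    updates=tf.constant([[2.0, 2.0, 2.0]], dtype=tf.float32))

print(tf.raw_ops.ReadVariableOp(resource=handle, dtype=tf.float32))
```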
@google-ml-butler google-ml-butler bot added the type:bug Bug label Mar 27, 2023
@synandi synandi added the comp:ops OPs related issues label Mar 27, 2023
@synandi
Contributor

synandi commented Mar 27, 2023

Hi @trickiwoo
Thank you for reporting the issue!
I was able to replicate the issue using the latest tf-nightly (2.13.0.dev20230327). Please find the screenshot below.
[screenshot omitted]
We are investigating this issue and will update here soon. Thank you!

@synandi synandi assigned tilakrayal and unassigned synandi Mar 29, 2023
@tilakrayal tilakrayal added the TF 2.12 For issues related to Tensorflow 2.12 label Apr 10, 2023
@tilakrayal tilakrayal added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Apr 17, 2023
@tilakrayal
Contributor

tilakrayal commented Apr 17, 2023

@trickiwoo,
A PR was raised internally, and the developer is working on the issue. Once it is resolved, we will update the result here. Thank you!

@tilakrayal
Contributor

@trickiwoo,
I tried to execute the mentioned code on tf-nightly; it now fails with an error and the crash no longer happens. The corresponding check has been added in the respective files, linked below for reference.

https://github.com/tensorflow/tensorflow/blob/master/tensorflow/core/kernels/quantize_and_dequantize_op.cc#L22

https://github.com/tensorflow/tensorflow/blob/master/tensorflow/core/kernels/resource_variable_ops.cc#L1120

// Check data type of update and resource to scatter.
const DataType update_dtype = c->input(2).dtype();
OP_REQUIRES(c, v->tensor()->dtype() == update_dtype,
            errors::InvalidArgument(
                "DType of scatter resource and updates does not match."));

Thank you!

@tilakrayal tilakrayal added the stat:awaiting response Status - Awaiting response from author label Jan 25, 2024
@sachinprasadhs sachinprasadhs removed the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Jan 25, 2024

github-actions bot commented Feb 2, 2024

This issue is stale because it has been open for 7 days with no activity. It will be closed if no further activity occurs. Thank you.

@github-actions github-actions bot added the stale This label marks the issue/pr stale - to be closed automatically if no activity label Feb 2, 2024

This issue was closed because it has been inactive for 7 days since being marked as stale. Please reopen if you'd like to work on this further.



4 participants