Scatter_nd doc not clear about concurrent updates #8102
Comments
@vrv the documentation for this function is still confusing.
Even if the order is nondeterministic, the summation is deterministic. Why not just promise that duplicates will be summed?
I didn't write the code or the doc, so I'm not the best person to clarify this, but it may be because the order of operations matters for floating-point values, so the result is still technically nondeterministic. See "catastrophic cancellation" for cases where addition order matters.
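The order-dependence mentioned above can be seen without TensorFlow at all: floating-point addition is not associative, so the order in which duplicate updates are accumulated can change the result. A minimal Python sketch:

```python
# Floating-point addition is not associative: grouping the same three
# numbers differently produces different results, which is why an
# unordered scatter-add is technically nondeterministic.
left = (0.1 + 0.2) + 0.3
right = 0.1 + (0.2 + 0.3)

print(left)   # 0.6000000000000001
print(right)  # 0.6
print(left == right)  # False
```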
Also, I think the language implies that all updates will be attempted in some order, but feel free to send a PR to update the doc to clarify that point if you think it's useful.
OK. Thanks.
Is there a way to change add to update? |
If you are in the unpooling business: @teramototoya, what I did as a hack: with https://www.tensorflow.org/api_docs/python/tf/unique_with_counts I counted the multiplicity of each index, and I divided the tensor holding the values (the tensor named 'updates' in the first comment) by this count. You can check this simple script: https://github.com/csnemes2/conv-net-viz/blob/master/ut_unpool.py If not, i.e. your problem is more general, then you have to somehow flatten your indices and then use tf.unique; see this post:
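Since the exact TensorFlow code depends on the unpooling setup, here is a minimal NumPy sketch of the count-and-divide idea: `np.unique` with `return_counts` stands in for `tf.unique_with_counts`, and `np.add.at` mimics `scatter_nd`'s accumulate-on-duplicates behavior. All values are illustrative.

```python
import numpy as np

# Duplicate indices: position 1 receives two updates (2.0 and 4.0).
indices = np.array([0, 1, 1, 3])
updates = np.array([1.0, 2.0, 4.0, 8.0])

# Count how many times each index occurs (tf.unique_with_counts analogue).
uniq, inverse, counts = np.unique(indices, return_inverse=True,
                                  return_counts=True)

# Divide each update by the multiplicity of its index, so the
# scatter-add below averages duplicates instead of summing them.
scaled = updates / counts[inverse]

out = np.zeros(4)
np.add.at(out, indices, scaled)  # scatter-add, like tf.scatter_nd
print(out)  # [1. 3. 0. 8.]  -> position 1 holds the mean of 2.0 and 4.0
```

With plain summing, position 1 would hold 6.0; the pre-division turns the sum into a mean, which is what the unpooling hack needs.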
@csnemes2 I had tried the same kind of implementation as your hack, but I could not get it to work. It worked well with your implementation! Thank you!
The documentation for scatter_nd does not specify what happens when multiple updates reference the same location.
I've tested this using the following code:
The resulting tensor is:
So it seems the updates are summed. Is this the intended behavior? If so, it would be great to clarify this in the docs.
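The test code itself is not shown in the thread, but the distinction at issue (summing vs. overwriting duplicates) can be sketched in plain Python; the function names here are illustrative, not TensorFlow API:

```python
def scatter_sum(size, indices, updates):
    # What tf.scatter_nd appears to do: duplicate indices accumulate.
    out = [0.0] * size
    for i, u in zip(indices, updates):
        out[i] += u
    return out

def scatter_overwrite(size, indices, updates):
    # The alternative reading of the docs: the last update wins.
    out = [0.0] * size
    for i, u in zip(indices, updates):
        out[i] = u
    return out

indices = [1, 1, 3]
updates = [2.0, 4.0, 8.0]
print(scatter_sum(4, indices, updates))        # [0.0, 6.0, 0.0, 8.0]
print(scatter_overwrite(4, indices, updates))  # [0.0, 4.0, 0.0, 8.0]
```

The observed result matches the first (summing) semantics, which is the behavior the issue asks the docs to state explicitly.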
Thanks!