
Printing lines without logging prefix using tf.Print() #12537

Closed
danijar opened this issue Aug 23, 2017 · 1 comment · Fixed by #69149
Labels
stat:contribution welcome (Status - Contributions welcome), type:feature (Feature requests)

Comments

danijar (Contributor) commented Aug 23, 2017

Right now, all messages printed via tf.Print() are prefixed by:

<timestamp>: I tensorflow/core/kernels/logging_ops.cc:79]

It would be useful to have a way to print user-friendly output from within the graph. I therefore suggest adding a parameter that controls the logging format of tf.Print(), or at least a flag that disables the logging prefix.
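For context, a minimal sketch of one possible workaround in TF 1.x graph mode: route the message through tf.py_func so Python's own print handles the output instead of the C++ logging machinery (the function name print_clean below is illustrative, not a TensorFlow API):

```python
import tensorflow as tf

x = tf.constant([1.0, 2.0, 3.0])

# tf.Print routes through LOG(INFO), so its output carries the
# "<timestamp>: I tensorflow/core/kernels/logging_ops.cc:79]" prefix.
# Workaround: wrap Python's print in tf.py_func, so the message goes
# straight to stdout without the C++ logging prefix.
def print_clean(values):
    print('values:', values)  # plain stdout, no log prefix
    return values

x = tf.py_func(print_clean, [x], x.dtype)

with tf.Session() as sess:
    sess.run(x)  # prints "values: [1. 2. 3.]"
```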

danijar added the type:feature (Feature requests) label Aug 23, 2017
yoon-hyoung added a commit to yoon-hyoung/tensorflow that referenced this issue Aug 24, 2017
*print prefix message by flag value
vrv commented Aug 25, 2017

The prefix comes from LOG(INFO) inside the logging_ops kernel. The fix needs to be made at the C++ level, though I don't have a good recommendation as to how.
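For readers landing here later: this thread predates TF 2.x, where tf.print (lowercase) writes directly to a configurable output stream instead of going through LOG(INFO), so no prefix is added. A minimal sketch:

```python
import sys
import tensorflow as tf

# TF 2.x: tf.print bypasses the C++ LOG(INFO) path entirely.
# output_stream defaults to sys.stderr; sys.stdout also works.
x = tf.constant([1.0, 2.0, 3.0])
tf.print('values:', x, output_stream=sys.stdout)
# prints the values with no timestamp/source-file prefix
```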

aselle added the stat:contribution welcome (Status - Contributions welcome) label Nov 29, 2017
jzju added a commit to jzju/tensorflow that referenced this issue Dec 7, 2017
caisq added a commit that referenced this issue Dec 14, 2017
copybara-service bot pushed a commit that referenced this issue Jun 4, 2024
…ax rewriter

Imported from GitHub PR openxla/xla#12537

This PR fixes:
XLA issue: openxla/xla#11772
Corresponding JAX issue: google/jax#20856

Accuracy fix:
This PR adds support for softmax axes other than -1; previously, oneDNN softmax was always executed with axis = -1 (the last dimension).
The accuracy issue is observed from JAX 0.4.22 through the latest release. On the current main branch the issue no longer appears, because the JAX softmax HLO pattern changed (google/jax#20643) and the HLO pattern is not rewritten to oneDNN softmax. Hence, this PR also adjusts the oneDNN softmax pattern matcher to recognize the new HLO pattern and rewrite it into a oneDNN softmax custom call.
Copybara import of the project:

--
b983bd61044bf52838661057f570e9e218803549 by Sachin Muradi <sachin.muradi@intel.com>:

Add axis support

--
07f7d14aa1f00676b21e8c1f48719fe7e56bcfd2 by Sachin Muradi <sachin.muradi@intel.com>:

address more comments + optional broadcast

Merging this change closes #12537

FUTURE_COPYBARA_INTEGRATE_REVIEW=openxla/xla#12537 from Intel-tensorflow:sachin/softmax-axis de9d6bd36272962faf76077f594649d229bb727a
PiperOrigin-RevId: 640140990
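To make the axis issue concrete, here is a plain NumPy sketch (illustrative only, not the XLA/oneDNN rewriter code): a backend that hard-wires axis = -1 normalizes the wrong dimension whenever the caller requests a different axis.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the requested axis.
    shifted = x - x.max(axis=axis, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=axis, keepdims=True)

x = np.arange(6.0).reshape(2, 3)

# Caller asks for axis=0, but a backend fixed to axis=-1
# silently normalizes the wrong dimension:
wrong = softmax(x, axis=-1)
right = softmax(x, axis=0)
print(np.allclose(wrong, right))  # False
print(right.sum(axis=0))          # [1. 1. 1.]  (columns normalize)
print(wrong.sum(axis=-1))         # [1. 1.]     (rows normalize)
```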
copybara-service bot pushed several further commits referencing this issue on Jun 4, 2024, each re-importing openxla/xla#12537 with the same description (PiperOrigin-RevIds 639049433, 640140990, and 640246750; one push also reverts dbf3cd3).