Revert "[SPARK-47708][CONNECT] Do not log gRPC exception to stderr in…
Browse files Browse the repository at this point in the history
… PySpark

This reverts commit d87ac8e.

It turns out the sparkconnect logger is disabled by default. Removing this log call therefore loses the ability to log the error message when needed (by explicitly turning on the logger).
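For context, a minimal sketch of how that disabled logger could be explicitly turned on so the restored `logger.exception("GRPC Error received")` call becomes visible. The logger name below is an assumption for illustration only; the actual name is defined in the PySpark Connect client code.

```python
import logging

# Assumption: the Spark Connect client logger is a standard `logging` logger
# that ships disabled by default; the name used here is illustrative only.
connect_logger = logging.getLogger("pyspark.sql.connect.client")
connect_logger.disabled = False                      # undo the default disabled state
connect_logger.setLevel(logging.DEBUG)               # let ERROR-level exception logs through
connect_logger.addHandler(logging.StreamHandler())   # StreamHandler defaults to stderr
```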

### What changes were proposed in this pull request?

### Why are the changes needed?

### Does this PR introduce _any_ user-facing change?

### How was this patch tested?

### Was this patch authored or co-authored using generative AI tooling?

Closes #45878 from nemanja-boric-databricks/revert-logger.

Authored-by: Nemanja Boric <nemanja.boric@databricks.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
nemanja-boric-databricks authored and cloud-fan committed Apr 4, 2024
1 parent bffb02d commit 3b8aea3
Showing 1 changed file with 1 addition and 0 deletions.
python/pyspark/sql/connect/client/core.py (1 addition, 0 deletions)

@@ -1719,6 +1719,7 @@ def _handle_rpc_error(self, rpc_error: grpc.RpcError) -> NoReturn:
         -------
         Throws the appropriate internal Python exception.
         """
+        logger.exception("GRPC Error received")
         # We have to cast the value here because, a RpcError is a Call as well.
         # https://grpc.github.io/grpc/python/grpc.html#grpc.UnaryUnaryMultiCallable.__call__
         status = rpc_status.from_call(cast(grpc.Call, rpc_error))
