Statement execution (PgStatement.execute()) may hang forever in PgStatement.killTimerTask() if the connection pool tries to close the statement before it finishes executing #1022
I used com.mchange.v2.c3p0.ComboPooledDataSource with "unreturnedConnectionTimeout" enabled.
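For reference, the pool was configured roughly like this (a minimal sketch; the URL, credentials, and the 30-second timeout are illustrative placeholders, not my exact values):

```java
import com.mchange.v2.c3p0.ComboPooledDataSource;

public class PoolSetup {
  public static ComboPooledDataSource newPool() throws Exception {
    ComboPooledDataSource ds = new ComboPooledDataSource();
    ds.setDriverClass("org.postgresql.Driver");
    ds.setJdbcUrl("jdbc:postgresql://localhost:5432/mydb"); // placeholder URL
    ds.setUser("user");       // placeholder
    ds.setPassword("secret"); // placeholder
    // After this many seconds the pool forcibly reclaims a checked-out
    // connection, closing its statements even if a query is still running.
    ds.setUnreturnedConnectionTimeout(30);
    ds.setDebugUnreturnedConnectionStackTraces(true);
    return ds;
  }
}
```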
As a result, the connection pool called close() on my prepared statement before the query execution had finished, and before the statement's cancelTimerTask had called cancel() on it.
After that, the connection to Postgres was never closed, because the connection pool thread responsible for closing it hung on the QueryExecutor lock while trying to reset the connection's autoCommit flag to its default state.
Meanwhile, the thread that ran my query held the QueryExecutor lock and hung on the socket.
By itself this looks like a deadlock.
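Reduced to its essentials, the shape of the hang is the pattern below (an illustration with hypothetical names, not the driver's actual code): one thread holds a monitor while blocked on I/O, and the reclaiming thread waits on that same monitor.

```java
// Illustration only: hypothetical stand-ins for the QueryExecutor monitor
// and the blocking socket read.
public class HangIllustration {
  private final Object queryExecutorLock = new Object();

  // Thread A (runs the query): holds the lock, then blocks on the socket.
  void executeQuery() throws InterruptedException {
    synchronized (queryExecutorLock) {
      blockOnSocketRead(); // never returns while the server stays silent
    }
  }

  // Thread B (the pool's reclaim thread): needs the same lock to reset
  // autoCommit, so it waits forever while Thread A is blocked.
  void reclaimConnection() {
    synchronized (queryExecutorLock) {
      // reset autoCommit to its default state here
    }
  }

  private void blockOnSocketRead() throws InterruptedException {
    Thread.sleep(Long.MAX_VALUE); // stands in for an indefinite socket read
  }
}
```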
Of course, this is the connection pool's behavior, and maybe it breaks the API contract: arguably it mustn't try to call close() on a statement that is still executing.
I don't think this is a bug in itself, because similar behavior exists in other JDBC drivers.
But PgStatement really is in an invalid state now: cancelTimerTask is null while timeout > 0.
Stack trace of the thread:
This happens because the cleanupTimer() method returns false here:
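(A condensed paraphrase of that logic; field names follow the driver source of that era but should be treated as approximate.)

```java
import java.util.TimerTask;
import java.util.concurrent.atomic.AtomicReferenceFieldUpdater;

// Simplified sketch of the PgStatement fields and check under discussion.
abstract class PgStatementSketch {
  static final AtomicReferenceFieldUpdater<PgStatementSketch, TimerTask> CANCEL_TIMER_UPDATER =
      AtomicReferenceFieldUpdater.newUpdater(PgStatementSketch.class, TimerTask.class, "cancelTimerTask");

  volatile TimerTask cancelTimerTask;
  int timeout; // query timeout in seconds; > 0 after setQueryTimeout()

  boolean cleanupTimer() {
    TimerTask timerTask = CANCEL_TIMER_UPDATER.get(this);
    if (timerTask == null) {
      // The problematic branch: if another thread (e.g. the pool's close())
      // already nulled the task while timeout > 0, this returns false even
      // though no cancel task can ever fire for this statement.
      return timeout == 0;
    }
    if (!CANCEL_TIMER_UPDATER.compareAndSet(this, timerTask, null)) {
      // Lost the race: the timer just fired; the caller must wait for it.
      return false;
    }
    timerTask.cancel();
    return true;
  }
}
```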
and then the waiting code inside killTimerTask(), sketched below, cannot tell that the timer was already canceled: the statement state is still IN_QUERY, so it goes into an infinite loop.
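The loop in question, continuing the sketch above (again simplified and approximate):

```java
import java.util.concurrent.atomic.AtomicInteger;

// Condensed form of the killTimerTask() wait loop. The IDLE / IN_QUERY /
// CANCELLED states mirror the driver's statement-state machine.
abstract class KillTimerTaskSketch extends PgStatementSketch {
  static final int STATE_IDLE = 0;
  static final int STATE_IN_QUERY = 1;
  static final int STATE_CANCELLED = 2;

  final AtomicInteger state = new AtomicInteger(STATE_IN_QUERY);

  void killTimerTask() {
    boolean timerTaskIsClear = cleanupTimer();
    // Fast path: the timer is clear, so flip IN_QUERY -> IDLE and return.
    if (timerTaskIsClear && state.compareAndSet(STATE_IN_QUERY, STATE_IDLE)) {
      return;
    }
    // Slow path: wait for a cancel task to finish, i.e. for the state to
    // become CANCELLED. When cleanupTimer() wrongly reports "not clear"
    // (cancelTimerTask == null but timeout > 0), no cancel task exists,
    // the state never leaves IN_QUERY, and this loop spins forever.
    while (!state.compareAndSet(STATE_CANCELLED, STATE_IDLE)) {
      Thread.yield(); // the real code waits on a monitor here
    }
  }
}
```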
It looks like PgStatement.cleanupTimer() must return false only if a task is really pending and may still invoke cancel().
In general, several threads can call cleanupTimer(): the thread that runs the query, the timer's thread, and any external thread, for example a service thread of the connection pool.
So the timerTask == null branch in the first sketch should be replaced by something like this:
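(My sketch of the proposed direction, not an exact patch:)

```java
boolean cleanupTimer() {
  TimerTask timerTask = CANCEL_TIMER_UPDATER.get(this);
  if (timerTask == null) {
    // Changed: no task is pending, so either no timeout was set or another
    // thread has already removed and cancelled the task. In both cases no
    // cancel task will ever run for this statement, so report "all clear"
    // regardless of the timeout value.
    return true;
  }
  if (!CANCEL_TIMER_UPDATER.compareAndSet(this, timerTask, null)) {
    // A task exists and we failed to remove it: the timer has just fired,
    // so the caller really must wait for the cancel to complete.
    return false;
  }
  timerTask.cancel();
  return true;
}
```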
@alextomaili, great analysis; unfortunately it is not flawless.
The proposed change is not sufficient.
Thanks, I see.
I think the main purpose of the waiting loop in PgStatement.killTimerTask() is to block the thread that ran the query until an already-started cancel task has finished.
But the hang problem really does exist.
Of course, this is expected behavior.