
Support binary streams with JdbcClient #32161

Closed
dopsun opened this issue Jan 30, 2024 · 11 comments
Labels: in: data (Issues in data modules: jdbc, orm, oxm, tx), status: feedback-provided, type: enhancement
Assignee: jhoeller
Milestone: 6.1.4

dopsun commented Jan 30, 2024

Postgres has a bytea column type. According to the Storing Binary Data documentation, this is the code to write a file into a bytea column with PreparedStatement:

File file = new File("myimage.gif");
try (FileInputStream fis = new FileInputStream(file);
        PreparedStatement ps = conn.prepareStatement("INSERT INTO images VALUES (?, ?)"); ) {
    ps.setString(1, file.getName());
    ps.setBinaryStream(2, fis, (int) file.length());  // <-- setBinaryStream needed here
    ps.executeUpdate();
}

With the current version of JdbcClient (6.1.1), StatementSpec.param() does not support InputStream. I have looked through the underlying code and cannot find any stream-related handling.

Without this feature, JdbcClient cannot be used for tables that have a bytea column.
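
For illustration, this is roughly the call I would like to be able to make (the table and parameter names here are just examples):

try (InputStream in = new FileInputStream(file)) {
    jdbcClient.sql("INSERT INTO images (name, data) VALUES (:name, :data)")
        .param("name", file.getName())
        .param("data", in)   // no binary-stream handling (or stream length) available for InputStream params
        .update();
}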

spring-projects-issues added the "status: waiting-for-triage" label on Jan 30, 2024
dopsun changed the title from "Binary stream parameter support is required for JdbcClient to handle Postgresql bytea column type" to "Binary stream parameter support is needed for JdbcClient to handle Postgresql bytea column type" on Jan 30, 2024
sbrannen added the "in: data" label on Jan 30, 2024
sbrannen changed the title from "Binary stream parameter support is needed for JdbcClient to handle Postgresql bytea column type" to "Binary stream parameter support is needed for JdbcClient to handle Postgres bytea column type" on Jan 30, 2024
sbrannen changed the title from "Binary stream parameter support is needed for JdbcClient to handle Postgres bytea column type" to "Support binary streams with JdbcClient" on Jan 30, 2024
sbrannen (Member) commented Jan 30, 2024

Have you tried using Spring JDBC's support for BLOBs (Binary Large OBjects) via the LobHandler abstraction -- for example, DefaultLobHandler?

sbrannen added the "status: waiting-for-feedback" label on Jan 30, 2024
dopsun (Author) commented Jan 30, 2024

I have only tried using SqlLobValue with the default LobHandler. Below is what it looks like:

InputStream inputStream = new FileInputStream(file);
SqlLobValue lobValue = new SqlLobValue(inputStream, (int) file.length());

this.jdbcClient.sql(sql)
    .param("data", lobValue)
    .update();

It throws an exception with the message "SqlLobValue only supports SQL types BLOB and CLOB".

spring-projects-issues added the "status: feedback-provided" label and removed the "status: waiting-for-feedback" label on Jan 30, 2024
quaff (Contributor) commented Jan 31, 2024

This should be supported at the JdbcOperations level rather than at the JdbcClient level; I can confirm it is not currently supported.

package com.example.demo;

import static org.assertj.core.api.Assertions.assertThat;

import java.io.ByteArrayInputStream;
import java.io.InputStream;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.ImportAutoConfiguration;
import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase;
import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase.Replace;
import org.springframework.boot.test.autoconfigure.jdbc.JdbcTest;
import org.springframework.boot.testcontainers.context.ImportTestcontainers;
import org.springframework.boot.testcontainers.service.connection.ServiceConnection;
import org.springframework.boot.testcontainers.service.connection.ServiceConnectionAutoConfiguration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.test.context.jdbc.Sql;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;

@JdbcTest
@ImportAutoConfiguration(classes = ServiceConnectionAutoConfiguration.class)
@AutoConfigureTestDatabase(replace = Replace.NONE)
@Sql(statements = "create table test(value bytea)")
@ImportTestcontainers
public class PostgreSQLByteaTests {

	@Container
	@ServiceConnection
	static PostgreSQLContainer<?> container = new PostgreSQLContainer<>("postgres");

	@Autowired
	private JdbcTemplate jdbcTemplate;

	@Test
	void test() {
		InputStream binaryStream = new ByteArrayInputStream("test".getBytes());
		assertThat(jdbcTemplate.update("insert into test values(?)", binaryStream)).isEqualTo(1);
	}

}

The following exception is thrown:

org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [insert into test values(?)]
	at org.springframework.jdbc.support.SQLStateSQLExceptionTranslator.doTranslate(SQLStateSQLExceptionTranslator.java:112)
	at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:107)
	at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:116)
	at org.springframework.jdbc.core.JdbcTemplate.translateException(JdbcTemplate.java:1548)
	at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:677)
	at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:970)
	at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:1014)
	at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:1024)
	at com.example.demo.PostgreSQLByteaTests.test(PostgreSQLByteaTests.java:35)
	at java.base/java.lang.reflect.Method.invoke(Method.java:568)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
Caused by: org.postgresql.util.PSQLException: Can't infer the SQL type to use for an instance of java.io.ByteArrayInputStream. Use setObject() with an explicit Types value to specify the type to use.
	at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:1051)
	at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.setObject(HikariProxyPreparedStatement.java)
	at org.springframework.jdbc.core.StatementCreatorUtils.setValue(StatementCreatorUtils.java:452)
	at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValueInternal(StatementCreatorUtils.java:249)
	at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValue(StatementCreatorUtils.java:181)
	at org.springframework.jdbc.core.ArgumentPreparedStatementSetter.doSetValue(ArgumentPreparedStatementSetter.java:72)
	at org.springframework.jdbc.core.ArgumentPreparedStatementSetter.setValues(ArgumentPreparedStatementSetter.java:51)
	at org.springframework.jdbc.core.JdbcTemplate.lambda$update$2(JdbcTemplate.java:973)
	at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:658)
	... 7 more


dopsun (Author) commented Jan 31, 2024

Thanks @quaff. I just changed my code to JdbcTemplate and got the same exception as yours.

For the time being, I have changed my code to use PreparedStatement directly.

quaff (Contributor) commented Jan 31, 2024

@dopsun I've fixed this in #32163; please apply the patched StatementCreatorUtils to your project to verify.

dopsun (Author) commented Jan 31, 2024

Great, @quaff!

I tried your new patch and built it locally. It works for my case with JdbcClient.

This is my first time building this repository from source and testing it locally; it seems an IDE local cache caused my initial false errors (hence I deleted my previous comment with the error report). After restarting the IDE with a clean state, your patch works for me.

quaff (Contributor) commented Jan 31, 2024

> Great, @quaff!
>
> I tried your new patch and built it locally. It works for my case with JdbcClient.
>
> This is my first time building this repository from source and testing it locally; it seems an IDE local cache caused my initial false errors (hence I deleted my previous comment with the error report). After restarting the IDE with a clean state, your patch works for me.

You can use the trick of pasting the changed files into your project's src directory to override the classes in the jars.

jhoeller (Contributor) commented Jan 31, 2024

@dopsun As mentioned, for a BLOB type we have SqlLobValue... but that is indeed restricted to BLOB handling and does not cover direct byte array types such as Postgres bytea here. For the Postgres bytea type, you could implement a custom SqlTypeValue where you call setBinaryStream or setBytes in your setTypeValue implementation, and pass that as an argument into JdbcClient. That's probably the best you can do with existing Spring versions.

The plain handling of an InputStream argument in StatementCreatorUtils (as in the PR) might work for simple scenarios, as long as the caller makes sure to close the InputStream afterwards, e.g. when the InputStream never reaches the PreparedStatement. However, we generally do not intend to support such plain "hot" streams as arguments. Also, JDBC drivers usually insist on knowing the length of the stream upfront in order to allocate accordingly, so a plain InputStream object won't be sufficient.

We intend to address the file use case with support for special value types, e.g. Resource handles such as FileSystemResource, that lazily retrieve the InputStream and also provide the content length upfront. Alternatively, we may also support byte[] arguments directly. I'm going to turn this issue into a corresponding enhancement request for 6.1.4.
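
As a rough illustration of that direction (just a sketch at the user level, not actual Spring API), a Resource-based value type could resolve the stream and its length lazily along these lines:

import java.io.IOException;
import java.sql.PreparedStatement;
import java.sql.SQLException;

import org.springframework.core.io.Resource;
import org.springframework.jdbc.core.SqlTypeValue;

// Sketch only: resolves the InputStream and the content length lazily from a Resource,
// so the driver knows the length upfront and the stream is only opened when needed.
class ResourceBinaryValue implements SqlTypeValue {

	private final Resource resource;

	ResourceBinaryValue(Resource resource) {
		this.resource = resource;
	}

	@Override
	public void setTypeValue(PreparedStatement ps, int paramIndex, int sqlType, String typeName) throws SQLException {
		try {
			ps.setBinaryStream(paramIndex, this.resource.getInputStream(), this.resource.contentLength());
		}
		catch (IOException ex) {
			throw new SQLException("Failed to read binary resource: " + this.resource, ex);
		}
	}
}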

For that reason, I'm going to close the PR. Thanks for your efforts there in any case, @quaff!

jhoeller added the "type: enhancement" label and removed the "status: waiting-for-triage" label on Jan 31, 2024
jhoeller self-assigned this on Jan 31, 2024
jhoeller added this to the 6.1.4 milestone on Jan 31, 2024
dopsun (Author) commented Jan 31, 2024

@jhoeller Thanks.

I tried again with your suggestion of overriding setTypeValue, and it works with code like the following.

static class InputStreamSqlLobValue extends SqlLobValue {
  private final InputStream stream;
  private final int length;

  public InputStreamSqlLobValue(InputStream stream, int length) {
    super(stream, length);
    this.stream = stream;
    this.length = length;
  }

  @Override
  public void setTypeValue(PreparedStatement ps, int paramIndex, int sqlType, @Nullable String typeName) throws SQLException {
    ps.setBinaryStream(paramIndex, this.stream, this.length);
  }
}

jhoeller (Contributor) commented Jan 31, 2024

For the above, you could simply declare InputStreamValue implements SqlTypeValue. There is no need to extend SqlLobValue there; you can just implement the same callback interface that SqlLobValue itself implements.

As for standard value types, it looks like we will introduce a pair of SqlBinaryValue and SqlCharacterValue classes, similar to SqlLobValue in design but delegating to the corresponding PreparedStatement methods for any target SQL type (no LobHandler involved). Those value types can even have BLOB/CLOB/NCLOB support when the corresponding SQL type is specified, e.g. through a SqlParameterValue that combines Types.BLOB with a SqlBinaryValue as value object.

As a side note, a parameter of type byte[] might work as-is already (without the need for a custom value type) since most database drivers explicitly handle that when passed into PreparedStatement.setObject (which Spring's StatementCreatorUtils uses by default). We can make this more explicit through dedicated setBytes support for a given byte array in StatementCreatorUtils but that's just for consistent behavior across databases.
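
To illustrate the direction (class name and constructor shape as described above; not final API at this point), usage with JdbcClient might then look roughly like this:

// Sketch only: a binary value object wrapping a stream plus its length,
// passed like any other parameter value.
jdbcClient.sql("INSERT INTO images (name, data) VALUES (:name, :data)")
		.param("name", file.getName())
		.param("data", new SqlBinaryValue(new FileInputStream(file), file.length()))
		.update();

// And the plain byte[] variant that most drivers already accept via setObject:
jdbcClient.sql("INSERT INTO images (name, data) VALUES (:name, :data)")
		.param("name", file.getName())
		.param("data", Files.readAllBytes(file.toPath()))
		.update();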

dopsun (Author) commented Jan 31, 2024

In my case, the reason for choosing InputStream over byte[] is my assumption that the underlying driver will have better memory control/optimization with an InputStream, especially for relatively large files. I should admit that I have not validated this assumption with proper benchmarks.

Thanks for the suggested improvements. Here is the latest version, which is working:

static class InputStreamValue implements SqlTypeValue {
  private final InputStream stream;
  private final int length;

  public InputStreamValue(InputStream stream, int length) {
    this.stream = stream;
    this.length = length;
  }

  @Override
  public void setTypeValue(PreparedStatement ps, int paramIndex, int sqlType, @Nullable String typeName) throws SQLException {
    ps.setBinaryStream(paramIndex, this.stream, this.length);
  }
}
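
And this is how I pass it into JdbcClient (sql and the parameter name come from my own schema, shown here just as an example):

try (InputStream inputStream = new FileInputStream(file)) {
  this.jdbcClient.sql(sql)
      .param("data", new InputStreamValue(inputStream, (int) file.length()))
      .update();
}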
