BigDecimal has mismatching scale value for given Decimal schema #44

Closed
pradnyaalc opened this Issue Feb 16, 2016 · 10 comments


I used the latest kafka-connect-jdbc-2.1.0-SNAPSHOT jar, but I am unable to parse the NUMBER format of an Oracle DB. It throws the following exception:

org.apache.kafka.connect.errors.DataException: BigDecimal has mismatching scale value for given Decimal schema
at org.apache.kafka.connect.data.Decimal.fromLogical(Decimal.java:69)
at io.confluent.connect.avro.AvroData$5.convert(AvroData.java:218)
at io.confluent.connect.avro.AvroData.fromConnectData(AvroData.java:318)
at io.confluent.connect.avro.AvroData.fromConnectData(AvroData.java:450)
at io.confluent.connect.avro.AvroData.fromConnectData(AvroData.java:267)
at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:90)
at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:142)
at org.apache.kafka.connect.runtime.WorkerSourceTask.access$600(WorkerSourceTask.java:50)
at org.apache.kafka.connect.runtime.WorkerSourceTask$WorkerSourceTaskThread.execute(WorkerSourceTask.java:356)
at org.apache.kafka.connect.util.ShutdownableThread.run(ShutdownableThread.java:82)

It seems that a later version was supposed to fix this issue, but even with the latest jar I am still hitting it.

Please help.
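For reference, the failing check can be reproduced without the connector at all. A minimal sketch against the Kafka Connect data API (the scale of 3 here is just an example):

import java.math.BigDecimal;
import org.apache.kafka.connect.data.Decimal;
import org.apache.kafka.connect.data.Schema;

public class FromLogicalRepro {
  public static void main(String[] args) {
    // The schema declares scale 3, but the value carries scale 0.
    Schema schema = Decimal.schema(3);
    BigDecimal value = new BigDecimal("10"); // value.scale() == 0

    // Throws DataException:
    // "BigDecimal has mismatching scale value for given Decimal schema"
    Decimal.fromLogical(schema, value);
  }
}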

erhwenkuo commented May 11, 2016

I also encountered this error message when trying to use the JDBC connector with an Oracle DB. According to the Stack Overflow question "Difference between number and integer datatype in oracle dictionary views", I found that the NUMBER type in the Oracle table causes this issue. In my case, I have a field "ShipAmt" (type: NUMBER, scale: 3) defined in Oracle, and one record with ShipAmt = 10. The Oracle JDBC ResultSet metadata reports the correct scale (3), but the BigDecimal returned by resultSet.getBigDecimal() has scale 0, which triggers the value/schema mismatch validation error.
Below is a quick fix: modify the DataConverter.convertFieldValue method to check the scale and force the right scale onto the BigDecimal object.

public class DataConverter {
  private static void convertFieldValue(ResultSet resultSet, int col, int colType,
                                        Struct struct, String fieldName)
      throws SQLException, IOException {
    // .....
    switch (colType) {
      // .....
      case Types.NUMERIC:
      case Types.DECIMAL: {
        // Check the scale from the metadata and reflect it on the BigDecimal object.
        int scale = resultSet.getMetaData().getScale(col);
        colValue = resultSet.getBigDecimal(col).setScale(scale);
        break;
      }
      // ....

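A quick way to see the effect of this fix with plain BigDecimal, using the ShipAmt example above (no database required):

import java.math.BigDecimal;

public class SetScaleFixDemo {
  public static void main(String[] args) {
    // What the driver hands back for ShipAmt = 10: a BigDecimal with scale 0.
    BigDecimal fromDriver = new BigDecimal("10");
    System.out.println(fromDriver.scale());            // 0 -> schema mismatch

    // Forcing the metadata scale onto the value, as in the patch above.
    BigDecimal fixed = fromDriver.setScale(3);
    System.out.println(fixed + " / " + fixed.scale()); // 10.000 / 3 -> matches
  }
}
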
cddr commented Jun 2, 2016

We get this on postgres too.


ryananguiano commented Jul 5, 2016

This is preventing us from using Kafka Connect JDBC as a source. We are running into incorrect Avro typing for decimal and boolean fields. Is a fix in the pipeline, or should we look at running the PR code or at other solutions?

Thanks.


ewencp (Member) commented Jul 8, 2016

@ryananguiano There's a PR outstanding but it still needs some fixes since the current version can cause ArithmeticExceptions by reducing the precision of the BigDecimal.

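A minimal sketch of the failure mode ewencp describes, again with plain BigDecimal (the values here are made up):

import java.math.BigDecimal;
import java.math.RoundingMode;

public class SetScaleFailureDemo {
  public static void main(String[] args) {
    // A value whose fraction does not fit the target scale.
    BigDecimal value = new BigDecimal("10.1234");

    // Reducing the scale without a rounding mode requires rounding,
    // so BigDecimal throws ArithmeticException ("Rounding necessary").
    try {
      value.setScale(3);
    } catch (ArithmeticException e) {
      System.out.println("setScale(3) failed: " + e.getMessage());
    }

    // An explicit rounding mode avoids the exception, at the cost of
    // silently losing precision.
    System.out.println(value.setScale(3, RoundingMode.HALF_UP)); // 10.123
  }
}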

shikhar closed this in bcb0b0b Jul 26, 2016

shikhar reopened this Jul 26, 2016

clumsy commented Jul 26, 2016

I had a similar issue, and as far as I was able to investigate, setting the scale won't work for some JDBC sources.
One problem I had in particular was that Oracle DB will not provide proper metadata for numeric columns, especially if the column was constructed via aggregates (SUM, MIN, etc.): it reports a scale of 0 even though the result is a floating-point value.

The way I was able to address some of the issues was to use an explicit CAST in my queries, where I could specify the precision and scale, or switch to another data type (DOUBLE instead of NUMERIC).

Hope that helps.

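A sketch of this kind of CAST workaround over plain JDBC; the connection details, table, and column names are hypothetical:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CastWorkaroundSketch {
  public static void main(String[] args) throws Exception {
    try (Connection conn = DriverManager.getConnection(
             "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "pass");
         Statement stmt = conn.createStatement()) {

      // Without a CAST, Oracle reports scale 0 for aggregate columns even
      // when the result is fractional. Pinning the precision/scale makes
      // the metadata the connector sees match the actual values:
      String sql = "SELECT CAST(SUM(ship_amt) AS NUMBER(18,3)) AS total_amt FROM orders";
      // Alternative: avoid BigDecimal entirely with a double type, e.g.
      //   SELECT CAST(SUM(ship_amt) AS BINARY_DOUBLE) AS total_amt FROM orders

      try (ResultSet rs = stmt.executeQuery(sql)) {
        System.out.println("declared scale: " + rs.getMetaData().getScale(1));
      }
    }
  }
}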

ynaguib commented Jul 27, 2016

@clumsy: how did you fix the Oracle cast?
I tried SELECT CAST(... AS INTEGER) FROM tab, and nothing seems to work!


clumsy commented Jul 27, 2016

@ynaguib Using the current kafka-connect-jdbc, I was only able to get rid of all BigDecimals and their scale problems for the Oracle JDBC driver by casting to FLOAT and DOUBLE (casting to INTEGER still results in a NUMERIC type).
Later I came up with a more refined converter for the NUMERIC type, which I've described in #101.
So currently I'm using a custom build with my patch, and it works just fine.

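For context, a rough sketch of this kind of NUMERIC mapping; this is an illustration of the idea, not the actual patch from #101:

import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import org.apache.kafka.connect.data.Decimal;
import org.apache.kafka.connect.data.Schema;

public class NumericSchemaSketch {
  // Pick a Connect schema for a NUMERIC/DECIMAL column from its metadata.
  static Schema schemaForNumeric(ResultSetMetaData meta, int col) throws SQLException {
    int precision = meta.getPrecision(col);
    int scale = meta.getScale(col);
    if (scale == 0 && precision > 0) {
      // Whole numbers: pick the smallest integer type that can hold them.
      if (precision <= 9)  return Schema.INT32_SCHEMA;
      if (precision <= 18) return Schema.INT64_SCHEMA;
    } else if (scale > 0) {
      // Fractional values with a trustworthy declared scale keep Decimal.
      return Decimal.schema(scale);
    }
    // Unknown or suspicious metadata (e.g. Oracle aggregates reporting
    // scale 0 with precision 0): fall back to FLOAT64 rather than risk
    // a scale mismatch at runtime.
    return Schema.FLOAT64_SCHEMA;
  }
}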

hanshenrik commented Jan 4, 2017

Any progress on pull request https://github.com/confluentinc/kafka-connect-jdbc/pull/89? I encountered the same issue for the NUMBER data type in an Oracle DB using kafka-connect-jdbc-3.1.1.jar.

ERROR Task test-booking-source-0 threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:142)
org.apache.kafka.connect.errors.DataException: BigDecimal has mismatching scale value for given Decimal schema
	at org.apache.kafka.connect.data.Decimal.fromLogical(Decimal.java:69)
	at org.apache.kafka.connect.json.JsonConverter$17.convert(JsonConverter.java:232)
[...]

rnpridgeon (Contributor) commented Jan 26, 2017

As @shikhar has pointed out, PR #177 should remedy this. Since it has recently been merged, I'll mark this as closed for now.


rnpridgeon closed this Jan 26, 2017
