
Map Oracle type NUMBER(19,0) to long #393

Open
marcushalonen opened this issue Apr 1, 2018 · 7 comments

@marcushalonen

I have several Oracle tables with columns defined as NUMBER(19,0). I configured numeric.precision.mapping=true and was expecting long, but I'm getting bytes.

At DataConverter.java#L208 and DataConverter.java#L399 there is a check for "precision < 19". Shouldn't this be "precision <= 19"?

In the proposal for #101 we had "precision < 20".

If I make the above changes, I can ingest all my Oracle tables without any problems.
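For context, the mapping logic in question looks roughly like the following. This is a simplified sketch based on the description above, not the exact DataConverter.java code; the class and method names here are illustrative only:

```java
import org.apache.kafka.connect.data.Decimal;
import org.apache.kafka.connect.data.Schema;

public class NumericMappingSketch {
    // Simplified sketch of the precision check discussed above (illustrative,
    // not the actual DataConverter.java implementation).
    static Schema mapNumeric(int precision, int scale) {
        if (scale == 0 && precision < 19) { // the contested check: "<" vs "<="
            if (precision > 9) {
                return Schema.INT64_SCHEMA;  // Java long
            } else if (precision > 4) {
                return Schema.INT32_SCHEMA;
            } else {
                return Schema.INT16_SCHEMA;
            }
        }
        // Everything else falls through to the Decimal logical type, which is
        // serialized as bytes -- the behavior reported in this issue.
        return Decimal.schema(scale);
    }
}
```

One possible reason for the conservative "< 19": a signed 64-bit long tops out at 9,223,372,036,854,775,807, so every 18-digit value fits but only some 19-digit values do. With "<= 19", a NUMBER(19,0) column holding a value above Long.MAX_VALUE would overflow.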


@szalapski

szalapski commented Nov 20, 2020

This is a glaring issue that in my opinion needs to be addressed: when an Oracle database is the source for this JdbcSourceConnector, it should work for any Oracle numeric type: not just NUMBER(19,0), but NUMBER(28,0), NUMBER(19,2), NUMBER(29,9), NUMBER(38,0), and of course NUMBER with no specified precision.

This is throwing us off over and over again, and the maintenance cost is quite high to figure out what is wrong and divine the right kludge to work around it.

See also: https://docs.oracle.com/cd/B28359_01/server.111/b28318/datatype.htm#CNCPT313

I would suggest these requirements:

  • No NUMBER type should serialize to a raw byte string.
  • Higher-precision NUMBER(x) and NUMBER(x,y) types should work without needing a CAST.
  • The "any precision" NUMBER type should "just work": it should serialize properly as a decimal (see the sketch below).

related: #101

I realize that making it work for higher-precision types might not be so good for existing lower-precision usages. I'd love to be able to just supply a type mapping property alongside my SQL if it is really necessary, and I'd gladly learn a bit more of the guts of how the types serialize just to make it explicit how the types will be regarded. I shouldn't have to modify my SQL or discard needed precision just to read a big number, though.

This is throwing us off everywhere. Really, as it is now, "best_fit" should be called "halfhearted_fit". Can you prioritize a fix for this?

@guija

guija commented Jul 8, 2021

I completely agree with @szalapski. We're also stumbling over this every few weeks. Please prioritize this.

@jonariver

@Confluent-KG you guys get paid. What are you waiting for? Christmas in 2030?

@tkhasimbasha

Is this issue fixed?

@szalapski

No. They are ignoring it.

@kinghuang

Very annoying that this still isn't addressed after all these years. My team worked around this bug back in 2019 and still has to do so. Confluent just isn't serious about Kafka Connect, it seems.

@YeonghyeonKO

YeonghyeonKO commented Oct 25, 2023

Please notify me when you fix this issue; I'm eager to use Kafka Connect correctly.
Even when I try to cast the bytes value (originally an Oracle NUMBER(19) column) to int64 using the Cast SMT, the message values keep getting rounded (e.g. 6231025164948372027 -> 6231025164948372030).
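For what it's worth, that corruption pattern is consistent with the value passing through a double somewhere in the cast path (an assumption; I haven't traced the SMT source): a double has a 53-bit mantissa, roughly 15-16 decimal digits, so a 19-digit long cannot round-trip through it. A minimal Java demonstration:

```java
public class DoubleRoundTrip {
    public static void main(String[] args) {
        long original = 6231025164948372027L; // the value from the comment above
        double viaDouble = original;          // implicit long -> double widening
        long roundTripped = (long) viaDouble;
        System.out.println(original);         // 6231025164948372027
        System.out.println(roundTripped);     // low-order digits are lost
        System.out.println(original == roundTripped); // false
    }
}
```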
