add support for TIMESTAMPTZ #9220
Comments
Hi @djouallah, thanks for opening this issue. Can you give us more context on what you are trying to achieve? If possible, a minimal reproducible example? What's the original type of the column you are casting? You can create an example using a memtable; this way we can reproduce it and help you out.

```python
import ibis

t = ibis.memtable([{"a": 1}, {"a": 2}])
```
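For illustration, a memtable-based reproducer with a timestamp column might look like the sketch below; the column name and values are made up, not taken from the original report:

```python
from datetime import datetime

import ibis

# Hypothetical column name and values, standing in for the table in the report.
t = ibis.memtable(
    [
        {"a": 1, "SETTLEMENTDATE": datetime(2024, 1, 1, 0, 0)},
        {"a": 2, "SETTLEMENTDATE": datetime(2024, 1, 2, 0, 0)},
    ]
)

# The starting schema shows a timezone-naive timestamp for SETTLEMENTDATE.
print(t.schema())
```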
Alright, I am trying ibis for the first time. My approach: load some data, do the transformations, and write back to a Delta table. This works fine, except that there is a subtle difference in how Arrow and PySpark write timestamps; I end up with two data types, one with a time zone and the other without. See the example here: https://colab.research.google.com/drive/18JGPcj3VaJ2tcxDZx1XB7UufTnEPjRfu#scrollTo=51jIBhgTXzrS
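A minimal sketch of the workflow being described, assuming the DuckDB backend and the `to_delta` helper available in recent ibis versions (which relies on the `deltalake` package); the file paths are hypothetical:

```python
import ibis

con = ibis.duckdb.connect()

# Hypothetical source file, standing in for the data downloaded in the notebook.
dunit = con.read_csv("dispatch_unit.csv")

# ... transformations on `dunit` ...

# Write back to a Delta table. Depending on the backend, the timestamp column
# may come out with or without a time zone, which is the inconsistency
# described above.
dunit.to_delta("./delta/dispatch_unit", mode="overwrite")
```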
@djouallah I see there are two different issues here. The one initially reported, about the TIMESTAMPTZ cast, can be handled with the ibis datatype directly:

```python
import ibis.expr.datatypes as dt

# then you can do something like
DUNIT = DUNIT.cast({"SETTLEMENTDATE": dt.Timestamp(timezone="UTC")})
```

The other issue, which you are reporting in the second comment, might be a bug. What seems to be happening is that, at the moment, the on-disk type we write with is backend specific, and there appear to be backend inconsistencies when writing to disk. But the code provided is not a minimal reproducible example, as it requires downloading some data from the internet. Would you be able to provide a minimal reproducible example?
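A quick check, sketched with a hypothetical memtable standing in for DUNIT, showing that the cast produces a timezone-aware timestamp in the resulting schema:

```python
from datetime import datetime

import ibis
import ibis.expr.datatypes as dt

# Hypothetical memtable standing in for DUNIT.
DUNIT = ibis.memtable([{"SETTLEMENTDATE": datetime(2024, 1, 1)}])
DUNIT = DUNIT.cast({"SETTLEMENTDATE": dt.Timestamp(timezone="UTC")})

# The schema should now report a timezone-aware timestamp, e.g. timestamp('UTC').
print(DUNIT.schema())
```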
@ncclementi thanks a lot, it was just me not reading the documentation. Problem solved, all good. Thanks for your help!
What happened?
DUNIT = DUNIT.cast({"SETTLEMENTDATE": "TIMESTAMPTZ"})
generates an error
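A minimal, self-contained reproduction of the report, sketched with a hypothetical memtable in place of DUNIT; the exact error message depends on the ibis version:

```python
from datetime import datetime

import ibis

# Hypothetical memtable standing in for DUNIT.
DUNIT = ibis.memtable([{"SETTLEMENTDATE": datetime(2024, 1, 1)}])

try:
    DUNIT = DUNIT.cast({"SETTLEMENTDATE": "TIMESTAMPTZ"})
except Exception as exc:  # the exact exception type depends on the ibis version
    print(f"cast failed: {exc}")
```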
What version of ibis are you using?
10.0.0.dev71
What backend(s) are you using, if any?
DuckDB and PySpark
Relevant log output
Code of Conduct