ValueError: Can't infer object conversion type: 0 (6.0, 1.0, 1.0, 1.0, 1.0) #458
Comments
You should use the `object_encoding` keyword to `write`.
This kind of issue makes it hard to use this library in situations where the incoming data is not known in advance. For instance, in this real-world example, the error appended below is raised when writing a DataFrame whose object columns contain `decimal.Decimal` values.
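A minimal sketch of that failure mode (the column name and values here are hypothetical, not the original code):

```python
from decimal import Decimal

import pandas as pd
import fastparquet

# An object column holding decimal.Decimal values: fastparquet's type
# inference has no rule for Decimal and gives up.
df = pd.DataFrame({"amount": [Decimal("6.0"), Decimal("1.0")]})
fastparquet.write("out.parquet", df)
# Expected to raise: ValueError: Can't infer object conversion type: ...
```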
I plan on looking into how PyArrow deals with this, and maybe try to port that behaviour to fastparquet if possible. I guess this file is where it's done in fastparquet, correct?

Appendix: the error
```
ValueError: Can't infer object conversion type: ...
```
Yes, you have exactly the right location. It seems we don't explicitly check for decimals. I don't know whether the right thing to do would be to convert to float or to make use of Parquet's decimal type.
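For illustration, a sketch of both options mentioned here (column name, values, and the precision/scale are hypothetical). The first route casts to float and loses exact decimal semantics; the second uses Parquet's DECIMAL logical type, as PyArrow already exposes it:

```python
from decimal import Decimal

import pandas as pd
import pyarrow as pa
import pyarrow.parquet as pq

df = pd.DataFrame({"amount": [Decimal("6.00"), Decimal("1.00")]})

# Option 1: lossy conversion, Decimal -> float64.
df_float = df.assign(amount=df["amount"].astype(float))

# Option 2: Parquet's DECIMAL logical type via PyArrow.
# precision=10, scale=2 chosen arbitrarily for illustration.
arr = pa.array(df["amount"], type=pa.decimal128(10, 2))
pq.write_table(pa.table({"amount": arr}), "decimals.parquet")
```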
So if I understood correctly, the simplest approach would be to add decimal to that file (at least to solve my case), and later try to figure out the rest? If so, I could create a PR for the decimal case in short order. This would not solve the issue pointed out by @chrinide, though, and I think JSON-serializable objects should be serialized to JSON by default.
Yes, check for decimal in that list of types. If you convert to float, that's easy, but you'll have to do some digging to figure out true decimal storage. JSON would not be able to store decimal either.
We wouldn't want to, for example, JSON-encode nulled floats and strings, but it could be a reasonable fallback. On the other hand, you can always request JSON encoding anyway. I'm not sure what it would do to decimals - probably fail.
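A sketch of requesting JSON encoding explicitly via the `object_encoding` keyword, plus a check of the "probably fail" guess: the standard-library encoder does reject `Decimal` (whether fastparquet's JSON path relies on it is an assumption here):

```python
import json
from decimal import Decimal

import pandas as pd
import fastparquet

# A JSON-encodable object column, with JSON encoding requested explicitly.
df = pd.DataFrame({"payload": [{"a": 1}, {"b": 2}]})
fastparquet.write("out.parquet", df, object_encoding="json")

# Decimals are not serializable with the stdlib encoder:
json.dumps(Decimal("6.0"))
# TypeError: Object of type Decimal is not JSON serializable
```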
Hi all,
I have tried to write a pandas DataFrame as a Parquet file.
My DataFrame has some columns whose values are lists or tuples (dtype object).
When I try to do this, the error below is raised:
```
ValueError: Can't infer object conversion type: 0 (6.0, 1.0, 1.0, 1.0, 1.0)
```
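A hypothetical snippet that reproduces an error of this shape, using the tuple shown in the title (the column name is illustrative):

```python
import pandas as pd
import fastparquet

# Tuples in an object column: fastparquet cannot infer an encoding for them.
df = pd.DataFrame({"values": [(6.0, 1.0, 1.0, 1.0, 1.0)]})
fastparquet.write("out.parquet", df)
# ValueError: Can't infer object conversion type: 0 (6.0, 1.0, 1.0, 1.0, 1.0)
```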