random error with postgresql data source #1191
Looks like it hasn't been added to the parser.
I'm on it!
On 31/07/15 16:19, Phillip Cloud wrote:
Why does it happen randomly though?
@wavexx Can you show the output of Data('postgresql://your uri here').dshape?
On 31/07/15 16:22, Phillip Cloud wrote:
One example:
What is the exact expression you are trying to run? I don't need to see the database connection string, just the actual line of code that is randomly giving the error.
@wavexx when it doesn't fail, does it show you the correct data?
On 31/07/15 16:33, Phillip Cloud wrote:
It seems to, yes.
On 31/07/15 16:33, Phillip Cloud wrote:
It's a slice on both columns and rows: list(data[data.fields[a:b]][c:d])
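A hedged sketch of what that two-step slice does, in plain Python with no blaze dependency (the field names and values here are made up for illustration): first a slice of the column list selects a subset of columns, then a row slice limits the result.

```python
# Toy table standing in for a database-backed blaze Data object.
rows = [
    {"id": 1, "name": "a", "score": 0.5},
    {"id": 2, "name": "b", "score": 0.7},
    {"id": 3, "name": "c", "score": 0.9},
]
fields = ["id", "name", "score"]   # analogous to data.fields

cols = fields[0:2]                 # column slice: ["id", "name"]
subset = [[r[c] for c in cols] for r in rows][0:2]  # then the row slice
print(subset)  # [[1, 'a'], [2, 'b']]
```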
On 31/07/15 16:39, Phillip Cloud wrote:
You can try the full code yourself if you want: https://github.com/wavexx/gtabview

PYTHONPATH=$PWD ./bin/gtabview postgresql://something/db::table

The slicing occurs in gtabview/models.py:157. For the first query (which often means always), it is:

list(data[data.fields[0:len(data.fields)]][0:min(16384, int(data.nrows))])

So the expression is constant as long as int(data.nrows) and len(data.fields) return the same values. Despite being constant, it still fails randomly.
On 31/07/15 16:46, Yuri D'Elia wrote:
However, do I infer correctly that an empty slice wouldn't be valid? I would expect data[...][0:0] to return an empty list.
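As the next reply confirms, an empty slice is valid. A plain-Python sanity check (no blaze involved) of the expected semantics:

```python
# Empty slices on rows, columns, or both are valid and simply
# yield an empty result rather than raising an error.
table = [[1, "a"], [2, "b"], [3, "c"]]

empty_rows = table[0:0]                    # no rows
empty_cols = [row[0:0] for row in table]   # no columns per row
print(empty_rows, empty_cols)  # [] [[], [], []]
```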
Yes, this is valid. @wavexx Can you show:

import blaze, odo, datashape
blaze.__version__
odo.__version__
datashape.__version__
FWIW, the only "random" element here is that the order of your result set is undefined without an ORDER BY clause.
@wavexx What do you get when you run the following code? I'm using IPython, but you can use vanilla Python as well:

In [30]: from blaze import Data
In [31]: d = Data('postgresql://localhost::table')
In [32]: d.head()
On 31/07/15 17:17, Phillip Cloud wrote:
If I run it in IPython, right now it works all the time. If I stick it in a file:

from blaze import Data, odo, compute

and run it with python test.py, it fails with float16 not being recognized (same error as reported before). Now it's interesting: if I import os, sys first, it works. I did that only because that's what I do in the IPython startup.
In both cases, can you show the versions of blaze, odo, and datashape?
@wavexx Also, in the first case you're seeing something like this:

print(repr(odo(compute(d.head()).execute().fetchall(), pd.DataFrame)))

which works. In the second case something else is being executed. I have no idea why importing os, sys would make a difference.
I have no explanation for the randomness. I'd need you to set a breakpoint in a debugger right before the expression is converted to see what the issue is.
I'm pretty sure that blaze/datashape#163 will fix all of these errors. |
On 31/07/15 17:47, Phillip Cloud wrote:
Sure, I just noticed it now because of the IPython startup I had.
Which is why I explicitly do list().
The source I'm using is:

import blaze, odo, datashape

When run with python:

$ python test.py
File , line 1

You mentioning iterators made me think: it seems that adding from __future__ import generators is sufficient:

$ python test.py

Could it be that you expect some builtins to emit generators somewhere?
On 31/07/15 17:50, Phillip Cloud wrote:
I need some guidance. I need at least some function names and/or things to look for.
@wavexx closing. pls reopen if this is still an issue |
I'm new to blaze, so pardon my ignorance here. I have no idea if I have to report this to odo/datashape or something else.
I'm using blaze.Data on a postgresql table ("postgresql://"). When I try to get some data off the table with list(head(10)), in 50% of the cases (without any change on the db) I get this error:
I actually wonder why this error is not reproducible. Looks like odo is randomly choosing a different conversion/coercion route? In fact, it's so random I cannot even determine whether there's a specific column type that could cause the issue.
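One plausible (and entirely hypothetical) mechanism for "randomly choosing a route": odo finds conversion paths by searching a graph of type-to-type edges, and if neighbours are stored in an unordered container, the first path found can differ between interpreter runs (string hash randomization changes set iteration order). A toy sketch with invented type names, not odo's actual graph:

```python
from collections import deque

# Hypothetical conversion graph: each type maps to the set of types
# it can be converted into. Set iteration order is not stable
# across runs, so a search may explore edges in a different order.
edges = {
    "sqlalchemy": {"iterator", "DataFrame"},
    "iterator": {"list"},
    "DataFrame": {"list"},
}

def first_path(src, dst):
    """Breadth-first search returning the first conversion path found."""
    queue = deque([[src]])
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in edges.get(path[-1], ()):  # unordered neighbour set
            queue.append(path + [nxt])
    return None

path = first_path("sqlalchemy", "list")
print(path[0], "->", path[-1])
```

Here both candidate paths have the same length, so either may win depending on which neighbour the search visits first; if one intermediate type triggers a parser bug (e.g. an unrecognized float16) and the other doesn't, the failure looks random.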