Query Length Limit? #1486
Hi @Luxxii, the RedisGraph query parser is supposed to use a resizable array, but there is currently a flaw in it, so at the moment we are resorting to a fixed-length buffer. If you'd like, you can modify this buffer length by changing the
For creating large graphs, however, you'll see much better performance using the bulk loader utility. Hope this helps!
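As a rough illustration of the bulk-loader route suggested above: the loader consumes CSV files whose header row names the node properties. The sketch below (schema and file name are made up for the example, not taken from this thread) writes such a file with the standard library:

```python
import csv

def write_node_csv(path, rows):
    """Write nodes as a CSV the bulk loader can consume:
    a header row of property names, then one row per node.
    (The "id"/"name" schema here is an illustrative assumption.)"""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "name"])
        writer.writerows(rows)

write_node_csv("Entry.csv", [(0, "root"), (1, "child")])
```

Such a file would then be loaded with something along the lines of `redisgraph-bulk-insert mygraph --nodes Entry.csv` (see the redisgraph-bulk-loader project for the exact command and CSV layout).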
I increased the limit and it worked without further problems, thanks! The bulk loader would add an extra step: converting a large database into CSV, which would then be read again to be added to Redis. Since I am not testing the performance of adding graphs to Redis, this would only add a step, and might even take longer than adding entries directly in parallel. I am more interested in the performance of finding special paths between nodes (not the shortest paths, since those can be retrieved easily thanks to the properties of a DAG). Is there an issue somewhere pointing to the
Resolved by #1572
Hi everyone!
Recently I discovered RedisGraph while looking for alternatives to neo4j. It seems to be much faster, and I therefore want to load absurdly large DAGs and test its performance. Since I am new to Redis, I am not sure whether this issue belongs here or to redisgraph-py.
I generate directed, acyclic property graphs per entry from a large database (currently only using a subset, ~5.5 GB). Each entry can itself be very large.
Generating a graph using redisgraph-py can be done quickly. Here is a small snippet:
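The snippet itself was not preserved in this capture. As a rough pure-Python sketch of the pattern involved (the label, property names, and query shape below are illustrative, not the reporter's actual code): redisgraph-py accumulates nodes and edges and emits them as a single `CREATE` query on `commit()`, roughly like this:

```python
# Build one CREATE clause for a tiny DAG, mirroring the idea that
# all nodes and edges end up in a single query string.
# (The "Entry" label and "name" property are assumptions.)
nodes = [("n0", {"name": "root"}), ("n1", {"name": "child"})]
edges = [("n0", "n1", "CHILD_OF")]

parts = []
for alias, props in nodes:
    prop_str = ",".join(f"{k}:'{v}'" for k, v in props.items())
    parts.append(f"({alias}:Entry{{{prop_str}}})")
for src, dst, rel in edges:
    parts.append(f"({src})-[:{rel}]->({dst})")

query = "CREATE " + ",".join(parts)
print(query)
```

Because every node and edge contributes to this one string, a graph with hundreds of thousands of elements produces a correspondingly huge query, which is where the length limit below comes into play.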
However, those generated graphs can explode in size: they can contain tens or even hundreds of thousands of nodes and edges. When executing `redis_graph.commit()` I stumbled upon the same error multiple times, on many entries. Reducing the number of properties added to the graph resulted in fewer such errors. Interestingly, the same line, column, and offset were displayed in every error: `line: 1, column: 1048584, offset: 1048583`, which seems to be the limit of the query length. Is there a limit on the query length in redis/RedisGraph/redisgraph-py? If so, can it be increased or set to unlimited? It seems that only queries which do not exceed 1 MByte are accepted... I have not found such a limitation in the documentation of redis/RedisGraph/redisgraph-py.
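Given that the reported offset sits just past 2^20 bytes (1 MiB = 1,048,576), one workaround until the parser buffer is made resizable is to keep each individual query under that size by splitting the `CREATE` fragments into batches. A hedged sketch of that idea (the limit constant and packing logic here are an illustration, not RedisGraph code):

```python
MAX_QUERY_BYTES = 1 << 20  # 1 MiB, the apparent parser buffer size

def batched(parts, prefix="CREATE ", limit=MAX_QUERY_BYTES):
    """Greedily pack CREATE fragments into queries under the limit.
    (Illustrative only; real batching must also keep an edge in the
    same query as the aliases of its endpoint nodes.)"""
    batch, size = [], len(prefix)
    for p in parts:
        extra = len(p) + (1 if batch else 0)  # +1 for the joining comma
        if batch and size + extra > limit:
            yield prefix + ",".join(batch)
            batch, size = [], len(prefix)
            extra = len(p)
        batch.append(p)
        size += extra
    if batch:
        yield prefix + ",".join(batch)

# ~100k node fragments comfortably exceed 1 MiB in total,
# so they get split across several sub-limit queries.
fragments = [f"(:Entry{{id:{i}}})" for i in range(100_000)]
queries = list(batched(fragments))
```

Each resulting query can then be sent separately, trading one oversized `commit()` for several smaller round trips.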
Versions:
I used redis-server 6.0.9 and RedisGraph at version 999999 (?? I built it from source: 0ead11e). The redisgraph-py version was 2.2.3.
The query sent to Redis was captured using `redis-cli monitor` and saved here. However, I can provide further examples (with properties) if needed. Greetings!