
Query errors out when the returned data is too large; LIMIT has no effect #2477

Closed
wx-123456 opened this issue May 21, 2021 · 2 comments
Labels
type/question Type: question about the product

Comments

@wx-123456

In our business queries, we start from a vertex and traverse edges to find the connected vertices. The number of vertices returned is unpredictable: when the result set is too large, the query takes a long time and eventually fails with an error. It is not practical to test in advance, for every feature, whether the result set will be large enough to trigger the error. Also, from our testing, LIMIT is applied only after all the data has been fetched, so it has the same problem: a large result set still makes the query slow and causes the error.
Request: we would like a way to cap the data a query reads, i.e. push the limit down so that only part of the data is returned, similar to LIMIT in a relational database.

If this is on the roadmap, when can we expect this feature to be supported?

@wey-gu
Contributor

wey-gu commented May 25, 2021

Dear @wx-123456 ,

The LIMIT pushdown optimization is still in progress; we are actively working on it.

In the meantime, could you try the following configuration option in the storage service configuration?

max_edge_returned_per_vertex
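A minimal sketch of how this might be set, assuming a GFlags-style storaged configuration file; the file name nebula-storaged.conf and the value 1000 are placeholders for illustration, not recommendations:

```
# nebula-storaged.conf (assumed file name; adjust to your deployment)
# Cap the number of edges returned per vertex at the storage layer.
# 1000 is only an example value; tune it for your workload.
--max_edge_returned_per_vertex=1000
```

Since the option lives in the storaged configuration, it generally takes effect after the storage service is restarted with the updated file.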

Also, besides max_edge_returned_per_vertex, until the LIMIT pushdown optimization is finished, for piped queries it may help (if acceptable for your use case) to add a LIMIT to every intermediate subquery, as sketched below. This blog post covers some application-layer optimizations to avoid slow queries caused by super nodes: https://discuss.nebula-graph.com.cn/t/topic/3933
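A rough nGQL sketch of that idea; the edge type follow, the vertex id "player100", and the value 100 are assumptions for illustration only:

```
# Cap each intermediate step so one hop over a super node does not blow up
# the input of the next hop. Names and the LIMIT value are examples only.
GO FROM "player100" OVER follow YIELD follow._dst AS dst |
  LIMIT 100 |
  GO FROM $-.dst OVER follow YIELD follow._dst AS dst2 |
  LIMIT 100;
```

The trade-off is that intermediate LIMITs truncate results arbitrarily, so the final output is a partial sample of the neighborhood rather than the complete answer.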

Thanks!

@HarrisChu added the type/question (Type: question about the product) label on Jun 15, 2021
@wey-gu
Contributor

wey-gu commented Jul 16, 2021

Closing this as it has been inactive for a while. Feel free to reopen it.

@wey-gu closed this as completed on Jul 16, 2021