Hangs when the data volume is too large #38

Closed
Chilfish opened this issue Mar 24, 2024 · Discussed in #37 · 2 comments
Labels: bug (Something isn't working)

Comments

@Chilfish (Owner)

Discussed in #37

Originally posted by copymonopoly March 24, 2024
It retrieved 10,000 posts, but the download stalled at just over 1,500. What should I do?

Version: v0.3.7

Chilfish added the bug label Mar 24, 2024
Chilfish (Owner, Author) commented Mar 24, 2024

I think that during export we could stage the data in IndexedDB (idb) under weibo.com instead of holding it all in memory. At the same time, we could persist the fetch state, such as page and since_id, which would make resumable downloads possible: even after a refresh, the export could pick up where it left off.
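For reference, a minimal sketch of that idea using the native IndexedDB API and localStorage, not the extension's actual code; the database name `weibo-export` and helpers like `openPostStore`, `savePosts`, and `FetchState` are hypothetical:

```ts
// Sketch only: stage fetched posts in IndexedDB instead of memory,
// and persist the pagination cursor so a refresh can resume.

interface FetchState {
  page: number
  sinceId: string // the since_id cursor from the last fetched page (assumed shape)
}

const STATE_KEY = 'weibo-export-state'

// Open (or create) a database under the weibo.com origin,
// with an object store keyed by post id.
function openPostStore(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open('weibo-export', 1)
    req.onupgradeneeded = () => {
      req.result.createObjectStore('posts', { keyPath: 'id' })
    }
    req.onsuccess = () => resolve(req.result)
    req.onerror = () => reject(req.error)
  })
}

// Write one fetched batch to disk, so memory stays flat as the export grows.
function savePosts(db: IDBDatabase, posts: { id: string }[]): Promise<void> {
  return new Promise((resolve, reject) => {
    const tx = db.transaction('posts', 'readwrite')
    const store = tx.objectStore('posts')
    for (const post of posts) store.put(post)
    tx.oncomplete = () => resolve()
    tx.onerror = () => reject(tx.error)
  })
}

// Persist the fetch state after every batch; load it on startup to resume.
function saveState(state: FetchState) {
  localStorage.setItem(STATE_KEY, JSON.stringify(state))
}

function loadState(): FetchState {
  const raw = localStorage.getItem(STATE_KEY)
  return raw ? (JSON.parse(raw) as FetchState) : { page: 1, sinceId: '' }
}
```

The fetch loop would then call `loadState()` once at startup, and after each page call `savePosts` followed by `saveState`, so a crash or refresh loses at most one batch.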

Chilfish (Owner, Author) commented

Now, at over 2,000 items, memory usage is still acceptable 🥳

[image: memory-usage screenshot]
