BGSAVE problem #399

Closed
halur opened this Issue Mar 22, 2012 · 2 comments


halur commented Mar 22, 2012

Hi,

I have more than 43 GB of Redis data in memory. I need some safety, i.e. if my server goes down, how do I bring it back up with minimal data loss?
Whenever I try to run BGSAVE, the server dies with a maximum-CPU-utilization error. Should I use AOF, or is there any way I can optimise BGSAVE?
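For reference, switching from RDB snapshots (what BGSAVE produces) to AOF is a configuration change. A minimal redis.conf sketch; the values here are illustrative defaults, not a tuned recommendation for this workload:

```conf
# Disable periodic RDB snapshots (an empty "save" directive)
save ""

# Enable the append-only file for durability
appendonly yes

# fsync once per second: at most ~1s of writes lost on a crash
appendfsync everysec

# Let Redis compact the AOF in the background once it doubles in size
auto-aof-rewrite-percentage 100
auto-aof-rewrite-min-size 64mb
```

Note that the background AOF rewrite, like BGSAVE, uses fork(), so it has similar copy-on-write memory behaviour on a large dataset.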

I also need some information: what is the maximum data size Redis supports?

I need to mine some 900 million log lines every day. To churn through them, I am trying to use Redis. Redis works great for the data churning itself, but it fails when I try to add data-safety measures.

Regards

Owner

antirez commented Mar 22, 2012

Hello, why on earth should a server die because it is using too much CPU? Could you report the exact error, please? How much RAM is installed in the system? Thanks.

halur commented Mar 22, 2012

Hi,

I am benchmarking on an 8-core, 64 GB RAM server.
I have some 900 million log lines and 150+ million keys. Everything works well as long as I don't think about data safety.
As soon as I try to benchmark backup and recovery, it fails.
Can you suggest the best process to ensure 99.99% data assurance in case of a crash?
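A common cause of BGSAVE failing on a dataset this large is the fork() behind it: with 43 GB resident and strict kernel overcommit settings, the fork can fail or thrash the machine through copy-on-write. A quick hedged diagnostic sketch, assuming redis-cli can reach the server locally:

```shell
# Show the current kernel overcommit policy; the Redis docs
# recommend setting it to 1 so fork() of a large process succeeds
sysctl vm.overcommit_memory
sudo sysctl -w vm.overcommit_memory=1

# Ask Redis whether the last background save actually succeeded,
# and whether AOF is enabled
redis-cli INFO persistence | grep -E 'rdb_last_bgsave_status|aof_enabled'
```

If rdb_last_bgsave_status reports err, the server log around the fork attempt usually shows the real failure (e.g. "Cannot allocate memory") rather than a CPU problem.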

@mattsta mattsta closed this May 30, 2014
