I am trying to benchmark Dynomite pipelining. I have set up two Dynomite nodes in different data centers (cross-data-center replication). Without pipelining, everything works well.
Benchmarking pipelining at the dc1 Dynomite node: redis-benchmark -p 8102 -n 1000000 -t set,get -P 10 -q
Output: Error: Server closed the connection
dynomite(dc1) logs:
[2017-09-25 08:11:20.025] _msg_get:307 allocated #msgs 300000 hit max allowable limit
[2017-09-25 08:11:20.025] server_ack_err:203 Could not allocate msg.
[2017-09-25 08:11:20.025] _msg_get:307 allocated #msgs 300000 hit max allowable limit
[2017-09-25 08:11:20.025] server_ack_err:203 Could not allocate msg.
[2017-09-25 08:11:20.025] _msg_get:307 allocated #msgs 300000 hit max allowable limit
[2017-09-25 08:11:20.025] server_ack_err:203 Could not allocate msg.
[2017-09-25 08:11:20.025] server_close:269 close <CONN_SERVER 0x562a9b611e10 14 to '127.0.0.1:6379:1'> Dropped 9534 outqueue & 0 inqueue requests
[2017-09-25 08:11:20.025] event_del_conn:234 epoll ctl on e 10 sd 14 failed: No such file or directory
[2017-09-25 08:11:20.025] conn_pool_notify_conn_close:170 <CONN_POOL 0x562a9b611370 active_conn:2 in array 3 max 3> Removing <CONN_SERVER 0x562a9b611e10 -1 to '127.0.0.1:6379:1'>
[2017-09-25 08:11:20.025] _msg_get:307 allocated #msgs 300000 hit max allowable limit
[2017-09-25 08:11:20.025] dn_stacktrace:326 [0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x110c0) [0x7f8006c240c0]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [1] src/dynomite(req_forward_error+0xb4) [0x562a9a950bc0]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [2] src/dynomite(req_forward_all_local_racks+0x178) [0x562a9a95163d]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [3] src/dynomite(+0x1d46d) [0x562a9a95f46d]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [4] src/dynomite(+0x1d660) [0x562a9a95f660]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [5] src/dynomite(+0x2b169) [0x562a9a96d169]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [6] src/dynomite(+0x2b293) [0x562a9a96d293]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [7] src/dynomite(+0x2b848) [0x562a9a96d848]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [8] src/dynomite(msg_recv+0x9d) [0x562a9a96d93a]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [9] src/dynomite(+0x19a52) [0x562a9a95ba52]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [10] src/dynomite(core_core+0x10a) [0x562a9a95c040]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [11] src/dynomite(event_wait+0x197) [0x562a9a9988bd]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [12] src/dynomite(core_loop+0x7c) [0x562a9a95c902]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [13] src/dynomite(+0x43820) [0x562a9a985820]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [14] src/dynomite(main+0x162) [0x562a9a9859cd]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [15] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf1) [0x7f80065922e1]
[2017-09-25 08:11:20.026] dn_stacktrace:331 system command did not succeed to print filename
[2017-09-25 08:11:20.026] dn_stacktrace:326 [16] src/dynomite(_start+0x2a) [0x562a
I have set mbuf_size to its maximum limit of 64000 and max_msgs to 300000.
I also tried setting max_msgs to 1000000 (the max limit), but the same thing happens.
Can you explain what exactly max_msgs is here?
Similarly to #487, I think you should decrease the mbuf_size. The max_msgs setting is how many messages Dynomite will buffer; once that upper limit has been reached, Dynomite stops accepting further writes. This is done so that Dynomite is capped on how much memory it is using. Our production systems are usually set to 500000, but it depends on the instance type. Check the launch_dynomite script to get some ideas.
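As a rough back-of-the-envelope check (my own arithmetic, not from the Dynomite docs): the worst-case buffer memory is bounded by roughly max_msgs × mbuf_size, which is why shrinking mbuf_size lowers the cap far more effectively than raising max_msgs:

```python
# Rough upper bound on Dynomite's message-buffer memory: each in-flight
# message can hold at least one mbuf, so the cap is approximately
# max_msgs * mbuf_size (a simplification for intuition only).

def buffer_cap_gb(max_msgs: int, mbuf_size: int) -> float:
    """Approximate worst-case buffer memory in GB."""
    return max_msgs * mbuf_size / 1e9

# Settings from this issue: 300000 msgs at the 64000-byte mbuf limit.
print(buffer_cap_gb(300_000, 64_000))   # ~19.2 GB worst case
# Same message count with a 16384-byte mbuf instead:
print(buffer_cap_gb(300_000, 16_384))   # ~4.9 GB worst case
```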
Other than what @ipapapa mentioned: as I told you in the other issue #487, redis-benchmark will hammer Dynomite with requests at full speed while the replication lags behind, and Dynomite then starts failing requests as a means of admission control. Hope this helps. Please look at https://github.com/Netflix/ndbench, the benchmarking project that we use internally to do load testing on Dynomite and many other projects.
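For anyone landing here later, a sketch of how the tuning advice above might look in dynomite.yml. The key names and values below are assumptions on my part (based on the defaults discussed in this thread), so verify them against your build's `dynomite --help` and the sample configs in the repo:

```yaml
# Hypothetical dynomite.yml excerpt -- key names/values are assumptions,
# not taken from this thread; check your version's docs before using.
dyn_o_mite:
  listen: 127.0.0.1:8102
  servers:
    - 127.0.0.1:6379:1
  # Smaller mbufs shrink the worst-case memory cap (~ max_msgs * mbuf_size),
  # which is the "hit max allowable limit" bound seen in the logs above.
  mbuf_size: 16384
  max_msgs: 500000
```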