
Dynomite POC question #815

Open

WenningQiu opened this issue Nov 9, 2022 · 0 comments

WenningQiu commented Nov 9, 2022

OS: RHEL7
Redis: 6.2.6
Dynomite: https://github.com/Netflix/dynomite/tree/v0.8, with the changes from https://github.com/reimannf/dynomite/tree/redis-auth applied to support the AUTH command.

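For context, I applied the redis-auth branch on top of v0.8 and built from source. Roughly (branch names come from the URLs above, and the build steps are the standard autotools build from the Dynomite README, so treat this as a sketch rather than my exact commands):

git clone https://github.com/Netflix/dynomite.git
cd dynomite
git checkout v0.8
git remote add reimannf https://github.com/reimannf/dynomite.git
git fetch reimannf
git merge reimannf/redis-auth   # pulls in the AUTH command support
autoreconf -fvi
./configure
make
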
My application currently uses Redis Open Source as a distributed cache. I am trying to use Dynomite to replicate data between Redis clusters running in different data centers, so that the application can fail over to the other data center if a disaster happens.

I started with one Redis cluster on a RHEL7 host to study how the Dynomite proxy works and behaves, and I plan to set up another cluster for data replication. The first cluster contains 6 nodes running on ports 5000-5005, with 1 replica per master. Below is the redis.conf for port 5000, followed by a rough redis-cli command for creating a cluster with this layout.

port 5000
cluster-enabled yes
cluster-config-file cluster.conf
cluster-node-timeout 5000
pidfile pid
appendonly yes
maxclients 500
protected-mode no
masterauth password
requirepass password
tcp-backlog 100

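The cluster itself can be created roughly like this (not necessarily my exact invocation):

redis-cli -a password --cluster create \
    127.0.0.1:5000 127.0.0.1:5001 127.0.0.1:5002 \
    127.0.0.1:5003 127.0.0.1:5004 127.0.0.1:5005 \
    --cluster-replicas 1
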
For Dynomite, I started with the sample configuration at http://www.dynomitedb.com/docs/dynomite/v0.5.6/quick-start/. Below is the Dynomite configuration for node 5000:

dyn_o_mite:
    datacenter: omaha
    rack: rack5000
    dyn_listen: 0.0.0.0:5100
    dyn_seed_provider: simple_provider
    dyn_seeds:
    - 127.0.0.1:5101:rack5000:omaha:99990001
    - 127.0.0.1:5102:rack5000:omaha:99990002
    - 127.0.0.1:5103:rack5000:omaha:99990003
    - 127.0.0.1:5104:rack5000:omaha:99990004
    - 127.0.0.1:5105:rack5000:omaha:99990005
    listen: 0.0.0.0:5010
    stats_listen: 0.0.0.0:5090
    servers:
    - 127.0.0.1:5000:1
    timeout: 30000
    tokens: '99990000'
    data_store: 0
    requirepass: password

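The other nodes use the same settings, with only the ports, token, and seed list changed. For example, the configuration for node 5001 (whose client port is 5011) should look roughly like this; I am reconstructing it here from the seed list above, and the stats port is my assumption of the same pattern:

dyn_o_mite:
    datacenter: omaha
    rack: rack5000
    dyn_listen: 0.0.0.0:5101
    dyn_seed_provider: simple_provider
    dyn_seeds:
    - 127.0.0.1:5100:rack5000:omaha:99990000
    - 127.0.0.1:5102:rack5000:omaha:99990002
    - 127.0.0.1:5103:rack5000:omaha:99990003
    - 127.0.0.1:5104:rack5000:omaha:99990004
    - 127.0.0.1:5105:rack5000:omaha:99990005
    listen: 0.0.0.0:5011
    stats_listen: 0.0.0.0:5091
    servers:
    - 127.0.0.1:5001:1
    timeout: 30000
    tokens: '99990001'
    data_store: 0
    requirepass: password
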
Complete Redis and Dynomite configuration files can be found in the attached files.

My simple test uses the Get and Set commands. The key "Name" is apparently owned by node 5001; below is the interaction when I connect directly to port 5001.

qiuwen01-pa@accdevxacpx0001> redis-cli -p 5001
127.0.0.1:5001> AUTH password
OK
127.0.0.1:5001> get Name
(nil)
127.0.0.1:5001>
[09:39:56][/tmp]

I expected to get the same result when connecting to the Dynomite proxy for node 5001 (port 5011), since I assumed the proxy would simply forward the commands to port 5001, but instead I got a different response:

qiuwen01-pa@accdevxacpx0001> redis-cli -p 5011
127.0.0.1:5011> AUTH password
OK
127.0.0.1:5011> Get Name
(error) MOVED 8680 127.0.0.1:5001
127.0.0.1:5011> Set Name Alice
(error) MOVED 8680 127.0.0.1:5001
127.0.0.1:5011>

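As a sanity check, the slot in the MOVED reply does match the key: asking the cluster for the slot of "Name" should return 8680, the same slot the proxy reports, and that slot is served by the master on port 5001 (which is why the direct connection above answered without redirecting):

redis-cli -p 5001 -a password CLUSTER KEYSLOT Name
8680
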
The trace (snippet below; the complete trace is attached) indicates that the Dynomite proxy at port 5011 is forwarding the commands to port 5100, which is the dyn_listen port for node 5000.

[2022-11-09 10:24:19.733] core_core:505 event FF00 on <CONN_LOCAL_PEER_SERVER 0xefdbe0 9 to '127.0.0.1:5100:rack5000:omaha:99990000'>
[2022-11-09 10:24:19.733] req_send_next:879 send next req 8 len 34 type 55 on s 9
[2022-11-09 10:24:19.733] msg_send_chain:1293 About to dump out the content of msg
[2022-11-09 10:24:19.733] msg_dump:741 msg dump id 8 request 1 len 34 type 55 done 0 error 0 (err 0)
[2022-11-09 10:24:19.733] mbuf_dump:143 mbuf 0xf1a830 with 30 bytes of data
00000000  20 20 20 24 32 30 31 34  24 20 38 20 33 20 30 20   |   $2014$ 8 3 0 |
00000010  31 20 31 20 2a 31 20 64  20 2a 33 34 0d 0a         |1 1 *1 d *34..|
[2022-11-09 10:24:19.733] mbuf_dump:143 mbuf 0xf155b0 with 34 bytes of data
00000000  2a 33 0d 0a 24 33 0d 0a  53 65 74 0d 0a 24 34 0d   |*3..$3..Set..$4.|
00000010  0a 4e 61 6d 65 0d 0a 24  35 0d 0a 41 6c 69 63 65   |.Name..$5..Alice|
00000020  0d 0a                                              |..|
[2022-11-09 10:24:19.733] msg_dump:744 =================================================
[2022-11-09 10:24:19.733] conn_sendv_data:407 sendv on sd 9 64 of 64 in 2 buffers
[2022-11-09 10:24:19.733] req_send_done:896 send done req 8 len 34 type 55 on s 9
[2022-11-09 10:24:19.733] dnode_req_peer_dequeue_imsgq:1301 conn 0xefdbe0 dequeue inq 8:0
[2022-11-09 10:24:19.733] _stats_pool_decr:1639 decr field 'peer_in_queue' to 0
[2022-11-09 10:24:19.733] _stats_pool_decr_by:1661 decr by field 'peer_in_queue_bytes' to 0
[2022-11-09 10:24:19.733] dnode_req_peer_enqueue_omsgq:1321 conn 0xefdbe0 enqueue outq 8:0
[2022-11-09 10:24:19.733] _stats_pool_incr:1629 incr field 'peer_out_queue' to 1
[2022-11-09 10:24:19.733] _stats_pool_incr_by:1650 incr by field 'peer_out_queue_bytes' to 34
[2022-11-09 10:24:19.733] event_del_out:165 removing conn <CONN_LOCAL_PEER_SERVER 0xefdbe0 9 to '127.0.0.1:5100:rack5000:omaha:99990000'> from active
[2022-11-09 10:24:19.733] stats_swap:1539 skip swap of current 0xef2d80 shadow 0xef2de0 as aggregator is busy
[2022-11-09 10:24:19.733] core_process_messages:618 length of C2G_OutQ : 0
[2022-11-09 10:24:19.733] event_wait:239 epoll 0001 triggered on conn 0xefdbe0
[2022-11-09 10:24:19.733] core_core:505 event 00FF on <CONN_LOCAL_PEER_SERVER 0xefdbe0 9 to '127.0.0.1:5100:rack5000:omaha:99990000'>
[2022-11-09 10:24:19.733] _dn_alloc:229 malloc(32) at 0xf1a900 @ ../dyn_array.c:35
[2022-11-09 10:24:19.733] _dn_alloc:229 malloc(32) at 0xf03020 @ ../dyn_array.c:40
[2022-11-09 10:24:19.733] _dn_alloc:229 malloc(32) at 0xf1a8d0 @ ../dyn_array.c:35
[2022-11-09 10:24:19.733] _dn_alloc:229 malloc(16) at 0xf16840 @ ../dyn_array.c:40
[2022-11-09 10:24:19.733] msg_get:500 get msg 0xf16440 id 9 request 0 owner sd 9
[2022-11-09 10:24:19.733] mbuf_get:116 get mbuf 0xf1e900
[2022-11-09 10:24:19.733] mbuf_insert:196 insert mbuf 0xf1e900 len 0
[2022-11-09 10:24:19.733] conn_recv_data:358 <CONN_LOCAL_PEER_SERVER 0xefdbe0 9 to '127.0.0.1:5100:rack5000:omaha:99990000'> recv 58 of 16320
[2022-11-09 10:24:19.733] msg_dump:741 msg dump id 9 request 0 len 58 type 0 done 0 error 0 (err 0)
[2022-11-09 10:24:19.733] mbuf_dump:143 mbuf 0xf1e900 with 58 bytes of data
00000000  20 20 20 24 32 30 31 34  24 20 38 20 35 20 30 20   |   $2014$ 8 5 0 |
00000010  31 20 31 20 2a 31 20 64  20 2a 32 38 0d 0a 2d 4d   |1 1 *1 d *28..-M|
00000020  4f 56 45 44 20 38 36 38  30 20 31 32 37 2e 30 2e   |OVED 8680 127.0.|
00000030  30 2e 31 3a 35 30 30 31  0d 0a                     |0.1:5001..|
[2022-11-09 10:24:19.733] msg_dump:744 =================================================
[2022-11-09 10:24:19.733] redis_parse_rsp:2990 parsed rsp 9 res 0 type 163 state 0 rpos 28 of 28
00000000  2d 4d 4f 56 45 44 20 38  36 38 30 20 31 32 37 2e   |-MOVED 8680 127.|
00000010  30 2e 30 2e 31 3a 35 30  30 31 0d 0a               |0.0.1:5001..|
[2022-11-09 10:24:19.733] dnode_rsp_recv_done:1133 dnode_rsp_recv_done entering ...
[2022-11-09 10:24:19.734] dnode_rsp_recv_done:1142 Dumping content for rsp:
[2022-11-09 10:24:19.734] msg_dump:741 msg dump id 9 request 0 len 28 type 163 done 0 error 1 (err 0)
[2022-11-09 10:24:19.734] mbuf_dump:143 mbuf 0xf1e900 with 28 bytes of data
00000000  2d 4d 4f 56 45 44 20 38  36 38 30 20 31 32 37 2e   |-MOVED 8680 127.|
00000010  30 2e 30 2e 31 3a 35 30  30 31 0d 0a               |0.0.1:5001..|
[2022-11-09 10:24:19.734] msg_dump:744 =================================================

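Decoding the two payloads in the dump: the 34 bytes Dynomite sends to the peer are the RESP-encoded command exactly as typed, and the 28 bytes that come back are the raw MOVED error from Redis, which is what my client ultimately sees:

*3\r\n$3\r\nSet\r\n$4\r\nName\r\n$5\r\nAlice\r\n     (request: Set Name Alice, 34 bytes)
-MOVED 8680 127.0.0.1:5001\r\n                    (response: MOVED error, 28 bytes)
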
I must have misconfigured Dynomite, since I don't fully understand all of the configuration entries. I would appreciate it if someone could point out the misconfiguration in my dynomite.yml files.

cluster.conf.tar.gz
dynomite.trace.txt.gz
