RedisConnectionException in High Concurrency #355

Closed
wx-chevalier opened this issue Jan 5, 2016 · 20 comments

Comments

@wx-chevalier

I use Redis as a required data store, but under high concurrency almost 50% of the connections fail. I want to know whether I should optimize my Redis server (use a cluster) or my client. Is there any good suggestion, e.g. retrying infinitely until success? My test code is listed below:

RedissonFactory redissonFactory = new RedissonFactory();
final RedissonClient redissonClient = redissonFactory.getRedissonClientInstance();

// counts failed requests
final AtomicInteger atomicInteger = new AtomicInteger(0);

ArrayList<Thread> threads = new ArrayList<Thread>();

// use 1000 concurrent threads to connect and fetch data
for (int i = 0; i < 1000; i++) {
    final int i_inner = i;

    Thread thread = new Thread(new Runnable() {
        public void run() {
            RBucket<String> rBucket = null;
            try {
                rBucket = redissonClient.getBucket("aaa");
                rBucket.set("1", 2, TimeUnit.SECONDS);
                System.out.println(rBucket.get() + ":" + i_inner);
            } catch (Exception e) {
                e.printStackTrace();
                atomicInteger.addAndGet(1);
                System.out.println("Failed:" + i_inner);
            }
        }
    });

    threads.add(thread);

    // start the thread
    thread.start();
}

// wait for all threads to finish
threads.stream().forEach(thread -> {
    try {
        thread.join();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
});

System.out.println("Final:" + atomicInteger.get());

I use 1000 threads, and I find that if I set the connection pool size to 500, the success rate reaches its maximum.
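
For context, a minimal sketch of how the connection pool size can be set on a single-server Redisson config. This is not the reporter's actual setup: the address is a placeholder and the imports assume the Redisson 2.x package layout.

    // Sketch only: assumes Redisson 2.x package layout and a single Redis server.
    import org.redisson.Config;
    import org.redisson.Redisson;
    import org.redisson.RedissonClient;

    public class PoolSizeSketch {
        public static void main(String[] args) {
            Config config = new Config();
            config.useSingleServer()
                  .setAddress("127.0.0.1:6379")   // placeholder address
                  .setConnectionPoolSize(500);    // pool size mentioned above
            RedissonClient client = Redisson.create(config);
            // ... run the load test against this client ...
            client.shutdown();
        }
    }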

@mrniko
Member

mrniko commented Jan 5, 2016

How many cluster master nodes do you use?

@wx-chevalier
Author

Only one on my ECS, but I use the Redis service on Aliyun; according to its documentation there are about 3 master nodes. I wonder whether this is caused by Redis itself or limited by Redisson?

@wx-chevalier
Author

=。= I thought you were from the USA and was surprised that you are still awake at this time =。=

@mrniko
Member

mrniko commented Jan 5, 2016

no, I'm from Russia :)

@mrniko
Member

mrniko commented Jan 5, 2016

You can play with the Redisson timeout settings, such as retryAttempts and retryInterval.
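
For illustration, a fragment showing where these knobs sit on a single-server config (the values and address are placeholders, assuming Redisson 2.x):

    // Illustrative values only; tune them for your own workload (assumes Redisson 2.x).
    Config config = new Config();
    config.useSingleServer()
          .setAddress("127.0.0.1:6379")  // placeholder address
          .setRetryAttempts(3)           // how many times a failed command is retried
          .setRetryInterval(1500)        // delay between retries, in milliseconds
          .setTimeout(3000);             // command response timeout, in milliseconds
    RedissonClient redisson = Redisson.create(config);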

@mrniko
Member

mrniko commented Jan 5, 2016

And which Redisson version do you use?

@mrniko
Member

mrniko commented Jan 5, 2016

How many cores does the Redisson client machine have? 1000 threads is too much for a single 8-core CPU, for example.

@wx-chevalier
Author

I use 2.2.3 and 16 cores in the client machine. Er... if I set the pool size to 500, does that mean at most 500 threads will be used to connect? Or do you have some mechanism to schedule requests when the number of callers exceeds the working threads?

@wx-chevalier
Author

Sorry to ask you so many questions. I find that if I set the pool size to 300 and retryAttempts to 1, there are 0 failed connections; and if I set the pool size to 30 and retryAttempts to 10, it also works. Maybe the latter is the better solution?

@mrniko
Member

mrniko commented Jan 5, 2016

Please upgrade your version to 2.2.4 because of a connection leak issue.

@mrniko
Member

mrniko commented Jan 5, 2016

if I set the pool size to 500, does that mean at most 500 threads will be used to connect?

No, by default Redisson uses current_processors_amount * 2 threads to handle any number of connections. You can change this setting via Config.threads.
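
For reference, a small fragment showing where that setting lives (the value 32 is just an example, assuming Redisson 2.x):

    // Fragment only; 32 is an arbitrary example value (the default is CPU cores * 2).
    Config config = new Config();
    config.setThreads(32);
    config.useSingleServer().setAddress("127.0.0.1:6379");  // placeholder address
    RedissonClient redisson = Redisson.create(config);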

@mrniko
Member

mrniko commented Jan 5, 2016

Have you tried 2.2.4 version?

@wx-chevalier
Author

Yes, I have upgraded to 2.2.4, and I set the connection pool to 300 and retryAttempts to 10. I use webbench: webbench -c 300 -t 20 http://localhost:10086/Social/Share/getRecommendShareList?requestData={"user_token":"qnZ5awrOszYd1iFb3iLF9DJN2kCJ2B02FgzmX3a2F8gVM83D","pageNum":0,"pageSize":20}
I find that there are still many Redis connection errors. Because Redis is a required store, if the Redis connection fails, the request fails or times out.

@lefay1982
Contributor

I think this issue is similar to my issue #338 under high concurrency.

@mrniko
Member

mrniko commented Jan 6, 2016

@wxyyxc1992 have you tried increasing Config.threads? Also try increasing the timeout setting.

@mrniko
Member

mrniko commented Jan 6, 2016

@wxyyxc1992 and what exception do you get now, Can't aquire connection from pool!?

@wx-chevalier
Author

Yes, Can't aquire connection from pool! threads is set to 30, but I think it may be a network bottleneck?

@mrniko
Member

mrniko commented Jan 6, 2016

Maybe it is a bottleneck. You can use redis-benchmark to check that. Command line example:

redis-benchmark.exe -h myredisserver.com -p 6735 -c 300 -r 10000 -n 100000 PING
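
Here -h and -p point at the Redis host and port, -c 300 simulates 300 parallel clients, -r 10000 spreads requests over a random key space of 10000 keys, -n 100000 sends 100000 requests in total, and PING is the command being benchmarked.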

@wx-chevalier
Author

Got it, thank you very much.

@mrniko mrniko closed this as completed Jan 22, 2016
@mrniko
Member

mrniko commented Jan 25, 2016

@wxyyxc1992 any news?
