Am I using it right? #2

Closed
noppanit opened this issue Mar 14, 2016 · 2 comments

Comments

@noppanit

First of all, I love your work. But I have a question, as I'm new to Redis and to how this actually works.

I'm cross posting from Stackoverflow.

I'm trying to use redlock for concurrent requests because I don't want the same request to fetch data from the database multiple times. Once the data is available, it's saved to Redis so subsequent requests can retrieve it from there.

Here's my code:

var AWS = require('aws-sdk');
var redis = require('redis');
var bluebird = require('bluebird');
var s3 = new AWS.S3();
var Lock = require('yarnl');
var request = require('request');

bluebird.promisifyAll(redis.RedisClient.prototype);
bluebird.promisifyAll(redis.Multi.prototype);

var client = redis.createClient({
  host: 'localhost',
  port: 6379
});

module.exports = {
  getFromRedis: function(req, res) {
    client.getAsync('hash').then(function(data) {
      if (data) {
        // Cache hit: serve straight from Redis.
        console.log('fetch from cache');
        return res.send(data);
      } else {
        // Cache miss: take the lock before hitting the database.
        new Lock('url').lock(function (err, unlock) {
          if (err) {
            throw err;
          }

          if (!unlock) {
            // No unlock function was returned, so the lock wasn't acquired here.
            console.log('first lock');
            return res.send('first lock');
          } else {
            // Lock acquired: fetch from the database, cache it, then release.
            console.log('fetch crawl');
            request('/fetch_data_from_db', function (err, response, body) {
              client.set('hash', body);
              unlock(function (err) {
                if (err) {
                  throw err;
                }

                console.log('releasing lock');
                return res.send(body);
              });
            });
          }
        });
      }
    });
  }
};

I'm using this library to do the locking:

https://github.com/gabegorelick/yarnl

And I test this by running ab -n 10 -c 5 http://localhost:8081/fetch

This is the result I get

fetch crawl
releasing lock
fetch from cache
fetch from cache
fetch from cache
fetch from cache
fetch crawl
releasing lock
fetch crawl
releasing lock
fetch crawl
releasing lock
...

What I get from that result is that the first request locks the process and the subsequent requests fetch from what's in Redis. However, there are 4-5 concurrent requests trying to acquire the lock and fetch the data from the database. My question is: is there anything I could do to reduce the concurrent database fetches to just one request? If the next request sees the lock, it should just return an empty response until the data is ready in Redis.

@gabegorelick
Owner

Not sure I fully understand your question, but if your problem is that you want to prevent a lot of subscribers from attempting to acquire the lock at the same time, the .lock function takes a retryDelay argument that can be a function. See here [1] for how you can use that to implement custom backoff strategies, perhaps with some random intervals, to prevent thundering herds from trying to acquire a lock at the same time.

Let me know if I misunderstood your issue and if this is an actual bug in the library.

[1] https://github.com/gabegorelick/yarnl#custom-backoff-strategies
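
For illustration, here's a rough sketch of the jittered-backoff idea described above. The retryDelay name comes from the comment; whether the option is passed to the Lock constructor or to .lock, and whether the delay function receives an attempt count, are assumptions to verify against the README section linked in [1].

var Lock = require('yarnl');

// Exponential backoff with random jitter so concurrent requests
// don't all retry the lock at the same instant.
function jitteredDelay(attempt) {
  var base = Math.min(1000, 50 * Math.pow(2, attempt || 0));
  return base / 2 + Math.floor(Math.random() * (base / 2));
}

// Assumption: retryDelay is accepted as a lock option; see [1] for the
// actual placement and signature.
new Lock('url', { retryDelay: jitteredDelay }).lock(function (err, unlock) {
  if (err) { throw err; }
  // ... same acquire / unlock flow as in the snippet above
});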

@noppanit
Author

Thanks for your reply. I don't think it's a bug. You're right about preventing the herd of requests. I'll take a look at the custom backoff strategies.
