Using Redis


Sidekiq uses Redis to store all of its job and operational data.

By default, Sidekiq tries to connect to Redis at localhost:6379. This typically works great during development but needs tuning in production.

Using an ENV variable

You can set the Redis URL using environment variables. This makes configuring Sidekiq on Heroku dead simple.

Set the REDIS_PROVIDER env var to the name of the env var containing the Redis server URL. (Example with RedisGreen: set REDIS_PROVIDER=REDISGREEN_URL and Sidekiq will use the value of the REDISGREEN_URL env var when connecting to Redis.)

heroku config:set REDIS_PROVIDER=REDISTOGO_URL

You may also use the generic REDIS_URL, which can point to your own private Redis server.
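For reference, the same lookup written explicitly in an initializer (a minimal sketch; the REDIS_URL variable and the localhost fallback shown here are assumptions, not requirements):

Sidekiq.configure_server do |config|
  # Assumes REDIS_URL is set in the environment; falls back to the local default.
  config.redis = { url: ENV.fetch('REDIS_URL', 'redis://localhost:6379/0') }
end

Sidekiq.configure_client do |config|
  config.redis = { url: ENV.fetch('REDIS_URL', 'redis://localhost:6379/0') }
end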

Using an initializer

To configure the location of Redis, you must define both a Sidekiq.configure_server and a Sidekiq.configure_client block. Put the following code into config/initializers/sidekiq.rb:

Sidekiq.configure_server do |config|
  config.redis = { url: 'redis://redis.example.com:7372/12' }
end

Sidekiq.configure_client do |config|
  config.redis = { url: 'redis://redis.example.com:7372/12' }
end

NOTE: Unknown parameters are passed to the underlying Redis client so any parameters supported by the driver can go in the Hash.

NOTE: The configuration hash must have symbolized keys.
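For example, a connection hash that also passes driver-level options (a sketch; :password and :connect_timeout are options of the underlying Redis client, and REDIS_PASSWORD is a hypothetical environment variable):

Sidekiq.configure_server do |config|
  config.redis = {
    url: 'redis://redis.example.com:7372/12',
    password: ENV['REDIS_PASSWORD'],  # passed straight through to the Redis driver
    connect_timeout: 2                # seconds; also passed through unchanged
  }
end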

Complete Control

If you need complete control when creating the Redis connection, e.g. if you are using redis-failover or Redis Sentinel, you can provide Sidekiq with a pre-built connection pool:

redis_conn = proc {
  Redis.new # do anything you want here
}

Sidekiq.configure_client do |config|
  config.redis = ConnectionPool.new(size: 5, &redis_conn)
end

Sidekiq.configure_server do |config|
  config.redis = ConnectionPool.new(size: 25, &redis_conn)
end

Note the size tuning. You'll want to ensure you have plenty of connections for the threads running in each process. Connections are created on demand so it's ok to specify a larger size (e.g. 20-30) if you aren't sure. A Sidekiq server process requires at least (concurrency + 5) connections.
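For instance, a pool built on redis-rb's Sentinel support might look like the following (a sketch; the sentinel hostnames and the mymaster name are placeholders for your own setup):

redis_conn = proc {
  Redis.new(
    url: 'redis://mymaster',   # the Sentinel master name, not a real hostname
    sentinels: [
      { host: 'sentinel1.example.com', port: 26379 },
      { host: 'sentinel2.example.com', port: 26379 },
      { host: 'sentinel3.example.com', port: 26379 }
    ],
    role: :master
  )
}

Sidekiq.configure_server do |config|
  # e.g. concurrency 25 -> at least 30 connections
  config.redis = ConnectionPool.new(size: 30, &redis_conn)
end

Sidekiq.configure_client do |config|
  config.redis = ConnectionPool.new(size: 5, &redis_conn)
end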

Life in the Cloud

One problem with cloud-based systems like EC2 and Heroku is unpredictable network performance. If you are seeing occasional timeout errors, tune your network timeout to be a little more lenient; it defaults to 1 second.

config.redis = { url: 'redis://...', network_timeout: 5 }

REMEMBER: THIS IS A BANDAID. You are not solving the actual cause of the slow performance.

If you are seeing Redis timeout errors, you should check your Redis latency by using the redis-cli --latency and --latency-history flags:

$ redis-cli --latency-history localhost
min: 0, max: 1, avg: 0.20 (1359 samples) -- 15.01 seconds range
min: 0, max: 1, avg: 0.18 (1356 samples) -- 15.01 seconds range
min: 0, max: 1, avg: 0.18 (1355 samples) -- 15.01 seconds range
min: 0, max: 1, avg: 0.17 (1359 samples) -- 15.01 seconds range
min: 0, max: 1, avg: 0.19 (1358 samples) -- 15.00 seconds range
min: 0, max: 1, avg: 0.18 (1353 samples) -- 15.01 seconds range
min: 0, max: 1, avg: 0.19 (1357 samples) -- 15.01 seconds range

This says my average latency to localhost is 0.2ms, or 200 microseconds: excellent. When users report odd Redis behavior, I regularly see setups with latency over 5 seconds: terrible. You can move to a different Redis provider or run your own Redis server on a dedicated machine, but there's nothing Sidekiq can do if the network performance is terrible. Contact your Redis provider and ask about your available options.

Disabled CLIENT command

Some Redis servers have all CLIENT commands disabled for security purposes (to avoid accidental CLIENT KILLs, etc). In this case, you can disable Sidekiq's CLIENT SETNAME command by setting the id option to nil:

config.redis = { url: 'redis://...', id: nil }

Architecture

Redis offers many different topologies:

  • Single node -- offers no fault tolerance
  • Redis Sentinel -- offers fault tolerance, failing over to a replica if the primary fails
  • Redis Cluster -- multi-master keyspace spread across many instances

Cluster is designed for large-scale datasets, like caches, that can be spread evenly across machines. Cluster is NOT appropriate for Sidekiq because Sidekiq has a few very hot keys which are constantly changing (aka queues). I recommend using Sentinel or a Redis SaaS with built-in failover support.

Tuning

You can see Redis's config variables with the command redis-cli info.
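The same information is available from Ruby through Sidekiq's connection pool (a sketch; conn.info is the standard Redis client INFO call and returns a hash of fields):

Sidekiq.redis do |conn|
  stats = conn.info
  puts stats['redis_version']
  puts stats['used_memory_human']
end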

Memory

Redis runs best when all data fits in memory. You should set maxmemory-policy noeviction in redis.conf so Redis doesn't drop Sidekiq's data silently.
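To verify the live setting, a sketch from a Rails console using the standard CONFIG GET command through Sidekiq's pool:

Sidekiq.redis do |conn|
  # Should report "noeviction"; any lru/ttl policy means Sidekiq data can be evicted.
  puts conn.config(:get, 'maxmemory-policy')
end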

Multiple Redis instances

Many people use Redis as a cache (it works great as a Rails cache store) but it's important that Sidekiq be run against a Redis instance that is not configured as a cache but as a persistent store. I recommend using two separate Redis instances, each configured appropriately, if you wish to use Redis for caching and Sidekiq. Redis namespaces do not allow for this configuration and come with many other problems, so using discrete Redis instances is always preferred.
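A sketch of that split, assuming two hypothetical environment variables (CACHE_REDIS_URL and SIDEKIQ_REDIS_URL) and Rails 5.2+'s built-in redis_cache_store:

# config/environments/production.rb -- cache traffic goes to one instance...
config.cache_store = :redis_cache_store, { url: ENV['CACHE_REDIS_URL'] }

# config/initializers/sidekiq.rb -- ...while Sidekiq's persistent data goes to another.
Sidekiq.configure_server do |config|
  config.redis = { url: ENV['SIDEKIQ_REDIS_URL'] }
end

Sidekiq.configure_client do |config|
  config.redis = { url: ENV['SIDEKIQ_REDIS_URL'] }
end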

Timeouts

The most common reasons for Redis networking timeouts are:

  1. swapping - you are running out of RAM and disk swapping is causing massive latency spikes. Using Redis to hold cache data can take a lot of RAM; configure Sidekiq to use a separate Redis instance if necessary.
  2. command latency - you are running a Redis command which is taking a large amount of time. Read the Monitoring blog post below.
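One way to spot slow commands is Redis's slow log (a sketch; the threshold it records against is Redis's slowlog-log-slower-than setting):

Sidekiq.redis do |conn|
  # Each entry holds an id, timestamp, duration in microseconds, and the command.
  conn.slowlog('get', 10).each do |entry|
    p entry
  end
end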

Notes

Previous: Best Practices Next: Error Handling
