Delayed::Job

Delayed_job (or DJ) encapsulates the common pattern of asynchronously executing longer tasks in the background.

It is a direct extraction from Shopify where the job table is responsible for a multitude of core tasks. Amongst those tasks are:

  • sending massive newsletters
  • image resizing
  • http downloads
  • updating smart collections
  • updating solr, our search server, after product changes
  • batch imports
  • spam checks

Changes

  • 1.5 Job runners can now be run in parallel. Two new database columns are needed: locked_until and locked_by. This allows us to use pessimistic locking, which enables us to run as many worker processes as we need to speed up queue processing (see the locking sketch after this list).
  • 1.0 Initial release
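
The locking code itself lives in lib/delayed, but the idea is that a worker claims a job by writing its own name and a lock expiry into those two columns before running it. A minimal sketch of that idea, assuming a Rails 2-style update_all and a hypothetical four-hour lock window (illustrative only, not the library's actual API):

# Illustrative: how a worker might claim a job atomically using the
# locked_until / locked_by columns so two workers never run the same job.
def lock_job_exclusively!(job, worker_name, max_run_time = 4.hours)
  now = Time.now.utc
  # The WHERE clause only matches while the job is unlocked or its lock has
  # expired, so at most one concurrent worker wins the row.
  rows = Delayed::Job.update_all(
    ["locked_until = ?, locked_by = ?", now + max_run_time, worker_name],
    ["id = ? AND (locked_until IS NULL OR locked_until < ?)", job.id, now]
  )
  rows == 1   # true means this worker owns the job until locked_until passes
end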

Setup

The library revolves around a delayed_jobs table, which looks as follows:

create_table :delayed_jobs, :force => true do |table|
  table.integer  :priority, :default => 0
  table.integer  :attempts, :default => 0
  table.text     :handler
  table.string   :last_error
  table.datetime :run_at
  table.datetime :locked_until
  table.string   :locked_by
  table.timestamps
end
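
In a Rails application the snippet above can be wrapped in an ordinary migration. A minimal sketch (the migration class name and the column comments are assumptions, not part of the library):

# Hypothetical db/migrate/xxx_create_delayed_jobs.rb wrapping the table
# definition from this README (Rails 2-era migration syntax).
class CreateDelayedJobs < ActiveRecord::Migration
  def self.up
    create_table :delayed_jobs, :force => true do |table|
      table.integer  :priority, :default => 0   # relative job priority
      table.integer  :attempts, :default => 0   # how many times the job has been tried
      table.text     :handler                   # YAML-serialized job object
      table.string   :last_error                # message from the most recent failure
      table.datetime :run_at                    # earliest time the job may run
      table.datetime :locked_until              # lock expiry used by parallel workers (1.5+)
      table.string   :locked_by                 # name of the worker holding the lock
      table.timestamps
    end
  end

  def self.down
    drop_table :delayed_jobs
  end
end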

Usage

Jobs are simple Ruby objects with a method called perform. Any object which responds to perform can be stuffed into the jobs table.
Job objects are serialized to YAML so that they can later be resurrected by the job runner.

class NewsletterJob < Struct.new(:text, :emails)
  def perform
    emails.each { |e| NewsletterMailer.deliver_text_to_email(text, e) }
  end
end

Delayed::Job.enqueue NewsletterJob.new('lorem ipsum...', Customers.find(:all).collect(&:email))
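
Because the handler column stores the job as YAML, a job object has to survive a dump/load round trip. A quick, plain-Ruby way to sanity-check that (the example addresses are made up):

require 'yaml'

job = NewsletterJob.new('lorem ipsum...', ['a@example.com', 'b@example.com'])

dumped   = YAML.dump(job)      # roughly what ends up in the handler column
restored = YAML.load(dumped)   # what the worker resurrects later
                               # (on newer Rubies use YAML.unsafe_load)

restored.text == job.text      # => true; the worker then calls restored.perform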

There is also a second way to get jobs in the queue: send_later.

BatchImporter.new(Shop.find(1)).send_later(:import_massive_csv, massive_csv)

This will simply create a Delayed::PerformableMethod job in the jobs table, which serializes all the parameters you pass to it. There are some special smarts for Active Record objects,
which are stored as their text representation and loaded fresh from the database when the job is actually run later.
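
Conceptually, such a job is just another object with a perform method that remembers the receiver, the method name and the arguments. A simplified sketch of the idea (not the actual implementation in lib/delayed, which additionally handles the Active Record reloading mentioned above):

# Simplified illustration of the idea behind Delayed::PerformableMethod:
# capture receiver, method and arguments now, replay them with send later.
class PerformableMethodSketch < Struct.new(:object, :method_name, :args)
  def perform
    object.send(method_name, *args)
  end
end

# So the send_later call above is roughly equivalent to:
#   importer = BatchImporter.new(Shop.find(1))
#   Delayed::Job.enqueue PerformableMethodSketch.new(importer, :import_massive_csv, [massive_csv])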

Running the tasks

You can invoke rake jobs:work, which will start working off jobs. You can cancel the rake task with CTRL-C.
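
The rake task ships in the tasks directory; one plausible shape for it, assuming the same Delayed::Job.work_off loop used in the runner script below (this body is a sketch, not necessarily what tasks/ contains):

# Hypothetical jobs:work task; work_off is assumed to return a
# [successes, failures] pair, matching how the script below uses it.
namespace :jobs do
  desc "Work off jobs from the delayed_jobs table until interrupted"
  task :work => :environment do
    trap('INT') { $exit = true }

    loop do
      success, failure = Delayed::Job.work_off
      break if $exit
      sleep(5) if (success + failure).zero?   # idle wait when the queue is empty
    end
  end
end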

At Shopify we run the tasks from a simple script/job_runner, which is invoked by runit:

#!/usr/bin/env ruby
require File.dirname(__FILE__) + '/../config/environment'

SLEEP = 5

trap('TERM') { puts 'Exiting...'; $exit = true }
trap('INT')  { puts 'Exiting...'; $exit = true }

puts "* Starting job worker #{Delayed::Job.worker_name}"

begin
  loop do
    result = nil

    realtime = Benchmark.realtime do
      result = Delayed::Job.work_off
    end

    count = result.sum

    break if $exit

    if count.zero?
      sleep(SLEEP)
      puts 'Waiting for more jobs...'
    else
      status = "#{count} jobs processed at %.4f j/s, %d failed ..." % [count / realtime, result.last]
      RAILS_DEFAULT_LOGGER.info status
      puts status
    end

    break if $exit
  end
ensure
  Delayed::Job.clear_locks!
end