User's Guide 04: Defining and Sending Jobs

Ed Ropple edited this page Jun 25, 2018 · 3 revisions

Creating a Job

Jobs are TypeScript/ES6 classes (or constructor functions, I guess). Let's consider NoDeps from the example project:

// example/src/NoDeps.ts
import Bunyan from "bunyan";
import { IDependencies, ClientRoot } from "@taskbotjs/client";

export class NoDeps implements IDependencies {
  readonly baseLogger: Bunyan;
  readonly taskbot: ClientRoot;

  constructor(baseLogger: Bunyan, taskbot: ClientRoot) {
    this.baseLogger = baseLogger;
    this.taskbot = taskbot;
  }
}

(Aside: ClientRoot is a class that exists mostly to erase the types of our Client object. I'd recommend passing around ClientRoot references instead of Client, as future work might provide other implementations.)

The name NoDeps refers to having no user-defined dependencies; it is an invariant that every job is passed a Bunyan logger as well as a TaskBotJS client. Our configuration then creates a NoDeps from that logger and client:

// example/cfg/example.config.js
const { Config } = require("@taskbotjs/service");
const { NoDeps } = require("../dist/NoDeps");

// ...

const config = new Config();
config.dependencies = (baseLogger, taskbot) => new NoDeps(baseLogger, taskbot);

And then a straightforward job that just logs its argument:

// example/src/jobs/ArgJob.ts
import Chance from "chance";
import sleepAsync from "sleep-promise";
import { DateTime } from "luxon";

import { Job, IDependencies, constantBackoff, RetryFunctionTimingFunction } from "@taskbotjs/client";

import { NoDeps } from "../NoDeps";

const chance = new Chance();

export class ArgJob extends Job<NoDeps> {
  static readonly jobName: string = "taskbot.arg";
  static readonly maxRetries = 5;
  static readonly calculateNextRetry: RetryFunctionTimingFunction = constantBackoff(3);

  async perform(arg: number): Promise<void> {
    const interval = Math.max(25, Math.round(chance.normal({ mean: 300, dev: 250 })));
    await sleepAsync(interval);
    this.logger.info({ arg }, `I have an arg: ${arg}`);
  }
}
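The static calculateNextRetry above is built with constantBackoff(3). As a rough mental model (an illustrative toy, not the real @taskbotjs/client implementation, whose units and exact signature may differ), a constant-backoff factory just returns a timing function that ignores the retry count:

```typescript
// Toy sketch of a constant-backoff timing function factory. This is an
// assumption about the shape of constantBackoff, not the library's code.
type RetryTimingFunction = (retryCount: number) => number;

function constantBackoff(delay: number): RetryTimingFunction {
  // Whatever retry we're on, wait the same fixed delay.
  return (_retryCount: number) => delay;
}

const backoff = constantBackoff(3);
console.log(backoff(1)); // 3
console.log(backoff(4)); // 3
```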

Nothing super surprising, but now we've got a job--and so we'd better go back to our config file and register it:

const { ArgJob } = require("../dist/jobs/ArgJob");

// instantiate and otherwise configure, then:
config.register(ArgJob);

Done and done--the server, once started, knows how to resolve a job name of taskbot.arg to ArgJob. (The client will use the ArgJob class object/constructor function object to figure out what name to send!)
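That name-to-class resolution can be pictured as a simple registry keyed by each job's static jobName. A toy illustration (not TaskBotJS's actual internals):

```typescript
// Toy sketch of name-based job resolution; not the real TaskBotJS internals.
type JobCtorLike = { jobName: string };

class JobRegistry {
  private byName = new Map<string, JobCtorLike>();

  register(ctor: JobCtorLike): void {
    // The client sends ctor.jobName; the server looks the class back up by it.
    this.byName.set(ctor.jobName, ctor);
  }

  resolve(name: string): JobCtorLike | undefined {
    return this.byName.get(name);
  }
}

const registry = new JobRegistry();
const ArgJobLike = { jobName: "taskbot.arg" };
registry.register(ArgJobLike);
console.log(registry.resolve("taskbot.arg") === ArgJobLike); // true
```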

Actually Sending a Job

Interacting with the TaskBotJS data store is done through the Client class out of @taskbotjs/client. First, you need a pool of clients (hereafter unsurprisingly referred to as ClientPool) instantiated with Client.withRedisOptions or one of the other builders:

const clientPool = Client.withRedisOptions(
  logger,
  { url: "redis://", prefix: "my-prefix/" }
);
Easy as falling off a log (cribbed mostly from our example):

import { ArgJob } from "../jobs/ArgJob";

const clientPool = Client.withRedisOptions(logger, { url: "redis://", prefix: "ex/" });

// this will yield until there's an available client (backed by
// a Redis client) to queue jobs.
await clientPool.use(async (taskbot) => {
  await taskbot.perform(ArgJob, 35);
});

The client uses ArgJob to determine the settings for stuff like job retries, whether to store a backtrace (and how much of it) when TaskBotJS catches an error thrown from the job, and other similar settings. (If you'd like to override the defaults defined in the job itself, you can use taskbot.performWithOptions, and the type JobDescriptorOptions includes the valid options that the method accepts.)
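Conceptually, overriding a job's defaults is a layering of your per-call options on top of the job's static settings. A toy sketch of that merge (the field names here are assumptions for illustration; consult JobDescriptorOptions for the real ones):

```typescript
// Toy sketch of defaults-plus-overrides merging; the field names are assumed
// for illustration and are not necessarily JobDescriptorOptions's real fields.
interface ToyJobOptions {
  maxRetries: number;
  storeBacktrace: boolean;
}

const jobDefaults: ToyJobOptions = { maxRetries: 5, storeBacktrace: true };

function withOverrides(
  defaults: ToyJobOptions,
  overrides: Partial<ToyJobOptions>
): ToyJobOptions {
  // Spread order matters: later properties win, so overrides beat defaults.
  return { ...defaults, ...overrides };
}

const effective = withOverrides(jobDefaults, { maxRetries: 0 });
console.log(effective); // { maxRetries: 0, storeBacktrace: true }
```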

The Default Client Pool

An early and consistent bit of feedback about TaskBotJS is that there's a lot of boilerplate around running a job. You have to pass a ClientPool down, then you have to await clientPool.use to get an instance, and so on. That's kind of a bummer, and for many applications it can be distracting.
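For intuition, clientPool.use boils down to an acquire/run/release pattern. Here's a self-contained toy pool showing that shape (deliberately naive; the real ClientPool manages Redis-backed clients and is more sophisticated):

```typescript
// Toy acquire/run/release pool; not the real @taskbotjs/client ClientPool.
class SimplePool<T> {
  private available: T[];

  constructor(items: T[]) {
    this.available = [...items];
  }

  async use<R>(fn: (item: T) => Promise<R>): Promise<R> {
    // Naive polling until an item is free; real pools queue waiters instead.
    while (this.available.length === 0) {
      await new Promise((resolve) => setTimeout(resolve, 10));
    }
    const item = this.available.pop()!;
    try {
      return await fn(item);
    } finally {
      this.available.push(item); // always hand the client back, even on error
    }
  }
}

const pool = new SimplePool(["client-1"]);
pool.use(async (client) => `used ${client}`).then(console.log); // "used client-1"
```

The finally block is the important part: the client goes back to the pool whether the callback succeeded or threw.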

TaskBotJS's codebase avoids singletons and unlike its inspiration Sidekiq it's quite possible to have multiple differently-configured TaskBotJS clients in the same process. (I don't envision most use cases to need it, but doing so encouraged better encapsulation and a cleaner design.) However, there is one place where you as a developer can opt into singleton invocation of jobs. By calling Job.setDefaultClientPool (a static method on the Job class), you enable the use of the methods Job.perform, Job.scheduleIn, Job.scheduleAt, and the related WithOptions methods. The below is cropped from the example project:

const clientPool = Client.withRedisOptions(/* ... */);
Job.setDefaultClientPool(clientPool);

(async () => {
  while (true) {
    // ...

    if (chance.integer({ min: 0, max: 100 }) < 5) {
      logger.info("Queueing fail job.");
      await FailJob.perform();
    }

    if (chance.integer({ min: 0, max: 100 }) < 5) {
      logger.info("Queueing long job.");
      await LongJob.perform();
    }
  }
})();
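The mechanism behind those static perform calls can be pictured as a base class holding one shared pool that inherited statics use implicitly. A toy illustration only (TaskBotJS's real Job class does much more):

```typescript
// Toy sketch of the static default-pool mechanism; not TaskBotJS's real code.
class ToyPool {
  async use<R>(fn: (client: string) => Promise<R>): Promise<R> {
    return fn("client-1"); // pretend to acquire and release a single client
  }
}

class ToyJob {
  private static defaultPool: ToyPool | undefined;

  static setDefaultClientPool(pool: ToyPool): void {
    ToyJob.defaultPool = pool;
  }

  static async perform(): Promise<string> {
    if (!ToyJob.defaultPool) {
      throw new Error("call setDefaultClientPool before static perform");
    }
    // Subclasses inherit this static method, so FailJobLike.perform() works.
    return ToyJob.defaultPool.use(async (client) => `queued via ${client}`);
  }
}

class FailJobLike extends ToyJob {}

ToyJob.setDefaultClientPool(new ToyPool());
FailJobLike.perform().then(console.log); // "queued via client-1"
```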