I have a service written in Node.js, and I noticed that some requests took too long to process, so I decided to queue up the requests and process them one by one. I had used Celery in Python projects before and was looking for something similar in Node.js. Then I found Bull, a Redis-based queue for Node. Here are some essential notes that should be helpful if you want to use Bull in your project.

Concepts

  1. Worker. A worker is a process that runs jobs. It connects to a queue and waits for jobs to process. A worker can be deployed separately, but even if you deploy it on the same machine or container, it needs to run as a separate process from your main application.

  2. Job. A job is the task that needs to be processed. Without a queue, this would be the only thing you need to create. With a queue, you just add the job to the queue and get a response swiftly. If you need to monitor the job's status, you can register a listener for its events.

  3. Queue. A queue is a container of jobs waiting to be processed. With Bull it is persisted in Redis. If you want a worker to process the jobs in a queue, you need to make sure the queue and the worker use the same name parameter. A minimal sketch tying these three together follows.
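To make these concrete, here is a minimal sketch of how the three fit together. It uses BullMQ's API (matching the connection option that appears later in this post); the queue name, job data, and Redis address are placeholders:

const { Queue, Worker, QueueEvents } = require('bullmq');

const connection = { host: '127.0.0.1', port: 6379 };

// The queue and the worker must use the same name ('email'),
// otherwise the worker never picks up the jobs.
const queue = new Queue('email', { connection });

// The worker runs this processor function for every job it receives.
const worker = new Worker('email', async (job) => {
  console.log(`processing job ${job.id}`, job.data);
  // ... the heavy lifting goes here ...
}, { connection });

// Adding a job returns quickly, long before the job is processed.
queue.add('welcome-email', { to: 'user@example.com' });

// Optional: register listeners to monitor job status.
const queueEvents = new QueueEvents('email', { connection });
queueEvents.on('completed', ({ jobId }) => console.log(`job ${jobId} done`));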

Prerequisites

Bull depends on a running Redis server, and I don't think the documentation makes this clear enough: you will run into issues if there is no Redis server running in the background. The quickstart in the documentation says you can simply initialize a queue like this:

const { Queue } = require('bullmq');

const myQueue = new Queue('foo');

But if you run this, you will get a warning about Redis not being configured explicitly. It is better to create the queue like this:

const { Queue } = require('bullmq');

const redisOptions = {
  host: redisHost,
  port: redisPort,
  password: redisPassword,
};

const queue = new Queue("my-awesome-queue", { connection: redisOptions });

This clears the warning and lets you configure your development and production Redis instances separately. (A note on the option name: connection is BullMQ's API, the newer rewrite of Bull; the classic bull package takes a redis option instead. The snippets in this post follow BullMQ.)
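For example, one simple way to split dev and prod configuration is to read the options from environment variables. The variable names here are my own convention, not something Bull requires:

const redisOptions = {
  host: process.env.REDIS_HOST || '127.0.0.1', // local dev falls back to localhost
  port: Number(process.env.REDIS_PORT || 6379),
  password: process.env.REDIS_PASSWORD,        // leave unset for a local dev Redis
};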

Steps to develop a service endpoint backed by a queue

With all the concepts above, you can start to develop your queue-backed service. Here are the simplest steps I followed (a sketch putting them together comes after the list):

  1. Define the job that does the heavy lifting
  2. Initialize the queue in your main process
  3. In the controller, enqueue the job
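Putting the three steps together, a sketch might look like the following. Express, the file layout, and names like video-transcode and transcode() are my own illustration, not from Bull:

// queue.js (step 2): initialize the queue in your main process
const { Queue } = require('bullmq');
const connection = { host: process.env.REDIS_HOST, port: 6379 };
module.exports = new Queue('video-transcode', { connection });

// worker.js (step 1): define the job that does the heavy lifting
const { Worker } = require('bullmq');
new Worker('video-transcode', async (job) => {
  // transcode() is a hypothetical stand-in for the slow work
  await transcode(job.data.videoId);
}, { connection: { host: process.env.REDIS_HOST, port: 6379 } });

// app.js (step 3): the controller only enqueues and responds
const express = require('express');
const queue = require('./queue');
const app = express();

app.post('/videos/:id/transcode', async (req, res) => {
  const job = await queue.add('transcode', { videoId: req.params.id });
  res.status(202).json({ jobId: job.id }); // responds immediately
});

app.listen(3000);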

How to run the worker and the application concurrently

Suppose you have a worker file named worker.js and a queue file named queue.js. In a local environment or on a VPS, you can first run the worker with node worker.js and then start the application that contains the queuing logic. But if you deploy to a single container, you need to run both processes inside it. You can use a process manager like pm2 to run them concurrently. Here is an example pm2 config file, ecosystem.config.js:

module.exports = {
  apps: [
    {
      name: "app",
      script: "./src/app.js",
      watch: true,
    },
    {
      name: "worker",
      script: "./src/worker.js",
      watch: true,
    },
  ],
};

And in the Dockerfile:

# Start both the app and the worker via pm2
CMD ["/app/node_modules/pm2/bin/pm2", "start", "ecosystem.config.js", "--no-daemon"]

In my case, I needed to specify the full path to pm2 in the Dockerfile. Perhaps there are other ways to do it, but this is the one I found working. You will also want to run pm2 with --no-daemon so it stays in the foreground, which keeps the container alive and lets you check the log output.
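For context, here is a rough sketch of the full Dockerfile around that CMD. The base image, the paths, and the assumption that pm2 is installed as a regular npm dependency are mine; adjust them to your setup:

FROM node:18-alpine
WORKDIR /app

# Installing pm2 as a regular dependency is why it ends up
# under /app/node_modules/pm2/bin/pm2.
COPY package*.json ./
RUN npm ci

COPY . .

# --no-daemon keeps pm2 in the foreground so the container stays alive
CMD ["/app/node_modules/pm2/bin/pm2", "start", "ecosystem.config.js", "--no-daemon"]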

There are also other process management tools, like concurrently, that I haven't tried yet. I suppose it would work as well; a sketch of what that might look like is below.
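If you use concurrently, the setup would presumably be a package.json script along these lines. This is untested, and the file paths are assumptions:

{
  "scripts": {
    "start": "concurrently \"node src/app.js\" \"node src/worker.js\""
  }
}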

Performance

You might be worried that adding the queue consumes more resources, but in my experience it only increased memory usage by about 10%. And since the jobs queue up in Redis and the endpoint's response time dropped, the running CPU cost actually went down significantly. So I would say it's a good addition to your service if you have some requests that take too long to process.