Dealing with rate limits: Use Bottleneck and Promise.all to safely & efficiently send a large number of outgoing requests with JavaScript.

Michael Alan Cohen
3 min read · May 3, 2021


Every now and then you might need to send a large number of requests to a REST API that is rate-limited or, in some cases, unable to handle more than a certain number of requests per second. Often, when you ask an API for a specific set of data, you receive a list of ids back.

Imagine we are writing a script to download a list of ingredients from an external API and save them in our own database. When we call the ‘/ingredients’ endpoint, we get back a list of ids.
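A minimal sketch of that call, assuming a hypothetical base URL and a fetch-compatible HTTP client (Node 18+’s global fetch, or node-fetch on older versions):

```javascript
const BASE_URL = 'https://api.example.com'; // placeholder base URL

// Small helper: fetch a URL and parse the JSON body, throwing on non-2xx
// responses (fetch itself only rejects on network errors, not on 429/5xx).
async function fetchJson(url) {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
  return res.json();
}

// Fetch the list of ingredient ids.
async function getIngredientIds() {
  return fetchJson(`${BASE_URL}/ingredients`);
}

// Example response: [1, 2, 3, /* ... */, 500]
```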

The response doesn’t have the detail we need. There are a few ways you might consider handling this. You could loop over the ingredient ids with async/await, fetching each detail one after another (Method 1), or you might try using Promise.all() (Method 2). Both have their advantages and drawbacks.

Method 1: Async/Await in a for loop.
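A sketch of this approach, reusing the fetchJson helper above and assuming a hypothetical /ingredients/:id detail endpoint:

```javascript
async function getIngredientDetails(ids) {
  const details = [];
  for (const id of ids) {
    // Each request only starts once the previous one has resolved.
    const detail = await fetchJson(`${BASE_URL}/ingredients/${id}`);
    details.push(detail);
  }
  return details;
}
```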

This is clean, readable, and maintainable, but it is not very performant, as we only send a new request once the previous one has resolved. There is also a chance, albeit slim, that our requests would be rate-limited by the external API and we would end up with a lot of 429 errors, or 5xx errors (for badly written APIs), when sending large numbers of requests.

Method 2: Using Promise.all()
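A sketch of the same fetch with Promise.all, again assuming the fetchJson helper and detail endpoint from above:

```javascript
async function getIngredientDetails(ids) {
  // Kick off every request at once and wait for all of them to resolve.
  const requests = ids.map((id) => fetchJson(`${BASE_URL}/ingredients/${id}`));
  // Rejects as soon as any single request fails.
  return Promise.all(requests);
}
```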

This method is simple, clean, and fast. However, it has two major drawbacks. First, if any of the promises are rejected, an error is thrown inside our function and we lose access to all of the data we had successfully fetched (all or nothing). Second, the requests are dispatched extremely quickly, so many of them are likely to fail due to rate-limiting on the external API, or you may get ECONNRESET errors from the external API’s server.

Solving problem number 1:

Here we wrap Promise.all in a helper function, promiseAll, so that a single rejected promise no longer rejects the whole batch and we keep whatever data was successfully fetched.
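Here is one possible shape for that wrapper (the { results, errors } return shape and the names are illustrative, not the only way to do it):

```javascript
// Each promise catches its own error, so a single failure can no longer
// reject the whole batch; successes and failures are collected separately.
async function promiseAll(promises) {
  const results = [];
  const errors = [];
  await Promise.all(
    promises.map((p) =>
      p.then((value) => results.push(value)).catch((err) => errors.push(err))
    )
  );
  return { results, errors };
}

async function getIngredientDetails(ids) {
  const requests = ids.map((id) => fetchJson(`${BASE_URL}/ingredients/${id}`));
  return promiseAll(requests);
}
```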

This is awesome: if we run this code, we get all of our results and all of the API errors back. But we can still expect a lot of errors because of the speed at which this code dispatches requests.

Solving problem number 2: Introducing the bottleneck library

Bottleneck is a lightweight and zero-dependency Task Scheduler and Rate Limiter for Node.js and the browser. — Simon Grondin, author of bottleneck.

We can use a bottleneck ‘limiter’ to schedule requests in a queue, and we can tweak how many we allow per minute, amongst many other things. I encourage you to read the bottleneck docs to get a sense of how powerful the package is. Here is how our code might look once we add a limiter.
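A sketch using bottleneck’s reservoir options to cap how many requests run per minute; the exact numbers below are placeholders to tune against the external API’s limits:

```javascript
const Bottleneck = require('bottleneck');

const limiter = new Bottleneck({
  maxConcurrent: 5,                    // at most 5 requests in flight at once
  reservoir: 100,                      // allow 100 requests...
  reservoirRefreshAmount: 100,         // ...refilled to 100...
  reservoirRefreshInterval: 60 * 1000, // ...every minute
});

async function getIngredientDetails(ids) {
  const requests = ids.map((id) =>
    // limiter.schedule queues the request and runs it when a slot is free.
    limiter.schedule(() => fetchJson(`${BASE_URL}/ingredients/${id}`))
  );
  return promiseAll(requests);
}
```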

Now this is epic: we can dispatch requests much faster than with plain async/await, we handle and detail all errors, and we can tweak the limiter to get the best performance possible while ensuring that we retrieve all the data we need.

We could even go further and add an error-handling callback to our promiseAll function, so the caller can decide what to do with each error.
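One way this could look: the callback receives each error, and whatever it returns is what ends up in the errors array (again, names and shape are illustrative):

```javascript
async function promiseAll(promises, onError = (err) => err) {
  const results = [];
  const errors = [];
  await Promise.all(
    promises.map((p) =>
      p
        .then((value) => results.push(value))
        .catch((err) => errors.push(onError(err)))
    )
  );
  return { results, errors };
}

async function getIngredientDetails(ids) {
  const requests = ids.map((id) =>
    limiter.schedule(() => fetchJson(`${BASE_URL}/ingredients/${id}`))
  );
  // Decide per error what to keep: here we log it and store the message.
  return promiseAll(requests, (err) => {
    console.error('Request failed:', err.message);
    return err.message;
  });
}
```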

Thanks for reading!
