```sh
npm install @teamawesome/tiny-batch
```
tiny-batch is a utility for creating functions whose execution is batched. This can be very useful, for instance, to limit the number of queries or HTTP requests while still having a single, easy-to-use function.

Call `tinybatch` to create an async function that adds to the batch. The first argument is a callback that will handle the batching.
```js
import tinybatch from "@teamawesome/tiny-batch";

const batchedFunc = tinybatch((batchedArgs) => {
  // code
});
```
For example, fetch users from different components with a single request:
```ts
import tinybatch from "@teamawesome/tiny-batch";

const getUserById = tinybatch((batchedArgs: [number][]): Promise<User[]> => {
  // batchedArgs equals [[1], [2]]
  const userIds = batchedArgs.flat();

  return fetch(`api/${userIds}`)
    .then((response) => response.json())
    .then((json) => json.users);
});

// Request both users in parallel, so the calls end up in the same batch.
const [user1, user2] = await Promise.all([getUserById(1), getUserById(2)]);
```
Each call of the batched function adds its arguments to the queue as-is. The callback then receives an array of all these arguments. The callback must return an array or a promise of an array. The return value is used to resolve the batched function calls, in the same order. If an entry is an instance of `Error`, that call will be rejected.
```ts
import tinybatch from "@teamawesome/tiny-batch";

const batchedFunc = tinybatch((batchedArgs: unknown[][]): string[] => {
  // batchedArgs equals
  // [
  //   [1, 2, 3],
  //   ["a", "b", "c"]
  // ]
  return batchedArgs.map((_, index) => `${index} done!`);
});

const [first, second] = await Promise.all([
  batchedFunc(1, 2, 3),
  batchedFunc("a", "b", "c"),
]);
// first === "0 done!"
// second === "1 done!"
```
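As noted above, returning an `Error` at a given position rejects the corresponding call. A minimal sketch (the "no arguments" rule here is made up for illustration):

```js
import tinybatch from "@teamawesome/tiny-batch";

const batchedFunc = tinybatch((batchedArgs) => {
  // Resolve each call with its own arguments, but reject calls
  // that were made without any arguments.
  return batchedArgs.map((args) =>
    args.length > 0 ? args : new Error("No arguments given")
  );
});

const ok = batchedFunc(1, 2);
const bad = batchedFunc();

console.log(await ok);                          // [1, 2]
console.log(await bad.catch((e) => e.message)); // "No arguments given"
```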
`tinybatch` has a second argument to specify a scheduler. A scheduler determines when to execute the callback. The scheduler is called each time an entry is added to the batch. `tinybatch` comes with some scheduler factories out of the box:
| name | description |
|---|---|
| `microtaskScheduler()` | (default) Queues a flush in the microtask queue at the first call. |
| `intervalScheduler(ms)` | Flushes every given `ms`, regardless of the queue. The timer can be cleared with the `stop()` method. |
| `timeoutScheduler(ms)` | Waits the given `ms` after the first call before flushing. The timer can be cleared with the `stop()` method. |
| `amountScheduler(amount)` | Flushes after the given number of calls. |
```js
import { tinybatch, amountScheduler } from "@teamawesome/tiny-batch";

// Get users in batches of 10.
const getUserById = tinybatch((batchedArgs) => {
  // code
}, amountScheduler(10));
```
The queue can be manually flushed. This will execute the callback regardless of the scheduler. Note that the callback is never called if the queue is empty.
```js
batchedFunc.flush();
```
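For example, `amountScheduler` can leave a partial batch behind; a manual `flush()` processes whatever is still queued (a small sketch):

```js
import { tinybatch, amountScheduler } from "@teamawesome/tiny-batch";

const batched = tinybatch((batchedArgs) => {
  // Resolve each call with its own arguments.
  return batchedArgs.map((args) => args);
}, amountScheduler(10));

// Only 3 of the 10 required calls were made, so nothing has flushed yet.
batched(1);
batched(2);
batched(3);

// Process the remaining entries right away.
batched.flush();
```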
The queue can also be inspected.
```js
console.log(batchedFunc.queue);
```
The scheduler is also available on the batched function. Some schedulers have extra methods, for instance to clear timers.

```js
console.log(batchedFunc.scheduler);
```
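For instance, `intervalScheduler` keeps a timer running; it can be stopped through the exposed scheduler (a sketch based on the `stop()` method listed in the table above):

```js
import { tinybatch, intervalScheduler } from "@teamawesome/tiny-batch";

// Flush the queue every 100 ms.
const batched = tinybatch((batchedArgs) => {
  return batchedArgs.map((args) => args);
}, intervalScheduler(100));

// Later, when the batched function is no longer needed,
// clear the interval so the timer does not keep running.
batched.scheduler.stop();
```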
To reduce overhead even further, caching can be introduced. While this is not supported directly by tiny-batch, it is very simple to achieve with any of the available memoization libraries. For example, with memoizee:
```js
import tinybatch from "@teamawesome/tiny-batch";
import memoizee from "memoizee";

const batched = tinybatch((args) => {
  // code
});

const batchedAndCached = memoizee(batched, {
  // Set the number of arguments that "batchedAndCached" will receive.
  length: 1,
});

await batchedAndCached("once");
await batchedAndCached("once");
```
The second call is not added to the queue but will resolve with the same value.