Adding concurrent function that works like Promise.allSettled #182
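For context, a rough sketch of what the requested behavior could look like using combinators that already exist in @fxts/core; this is not the proposed API, and fetchUser and the ids are made up for illustration only.

import { concurrent, map, pipe, toArray, toAsync } from "@fxts/core";

// Hypothetical task: fails for even ids, just to produce a mix of outcomes.
const fetchUser = async (id: number) =>
  id % 2 === 0 ? Promise.reject(new Error(`no user ${id}`)) : { id };

// Wrap a promise so it settles instead of rejecting, mimicking Promise.allSettled.
const settle = <T>(promise: Promise<T>) =>
  promise.then(
    (value) => ({ status: "fulfilled" as const, value }),
    (reason) => ({ status: "rejected" as const, reason })
  );

async function main() {
  const results = await pipe(
    toAsync([1, 2, 3, 4, 5]),
    map((id) => settle(fetchUser(id))),
    concurrent(3), // evaluate up to 3 tasks at a time
    toArray
  );
  console.log(results); // a mix of fulfilled and rejected entries, nothing thrown
}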
Comments
Good suggestion! How about
Love the concurrent function. Is there a design reason why it batches/chunks instead of operating like a buffer? For long-running tasks, the time for a chunk to resolve is determined by the slowest item in the chunk. Is there a use case in mind for concurrent where chunking is preferred? Would it be more dynamic if it parallelized, or would it make sense to create another function that allows parallel operations up to a limit?
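For illustration, a small sketch outside of FxTS that contrasts the two strategies in question; work and the example durations are hypothetical.

const work = (ms: number) =>
  new Promise<number>((resolve) => setTimeout(() => resolve(ms), ms));

// Chunked: run `size` tasks, wait for the whole batch, then start the next batch.
async function runChunked(durations: number[], size: number) {
  const results: number[] = [];
  for (let i = 0; i < durations.length; i += size) {
    const batch = durations.slice(i, i + size).map(work);
    results.push(...(await Promise.all(batch))); // waits for the slowest task in the batch
  }
  return results;
}

// Pooled: keep at most `limit` tasks in flight; start the next one as soon as a slot frees.
async function runPooled(durations: number[], limit: number) {
  const results: number[] = new Array(durations.length);
  let next = 0;
  const worker = async () => {
    while (next < durations.length) {
      const i = next++; // single-threaded, so claiming an index this way is safe
      results[i] = await work(durations[i]);
    }
  };
  await Promise.all(
    Array.from({ length: Math.min(limit, durations.length) }, worker)
  );
  return results;
}

// With durations [1000, 100, 100, 100] and a limit of 2, the chunked run takes
// about 1100ms (slowest task of the first batch, then the second batch), while
// the pooled run takes about 1000ms, because the short tasks keep flowing
// through the free slot.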
@puppybits Thank you for your interest in FxTS. As you mentioned, specifying the maximum number of requests and adjusting backpressure might be a better fit for speed and concurrency (like buffer). This function has a use case where it can apply effects simultaneously: it waits for all file I/O requests to complete, and once all the requests are finished, it applies the effect at once. If you were to visualize this on the screen, it would look like the video below. It would be useful if functions working with buffers were added as well!
default.mov
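For illustration, a rough sketch of that use case with the existing @fxts/core API; the file paths and applyEffect are hypothetical stand-ins for the real I/O and the on-screen effect.

import { readFile } from "node:fs/promises";
import { concurrent, map, pipe, toArray, toAsync } from "@fxts/core";

// Hypothetical effect: in the video this would be updating the screen.
const applyEffect = (contents: string[]) =>
  console.log(`${contents.length} files loaded`);

async function readThenApply(paths: string[]) {
  const contents = await pipe(
    toAsync(paths),
    map((path) => readFile(path, "utf8")),
    concurrent(5), // issue up to 5 reads at a time
    toArray // resolves only after every read has completed
  );
  applyEffect(contents); // the effect is applied once, for all results together
}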
@puppybits It seems that this will be useful when making requests to a service with a rate limit.
import {
  append, chunk, concurrent, delay, dropRight, flat,
  map, peek, pipe, range, toArray, toAsync,
} from "@fxts/core";

async function executeTask<T>(val: T) {
  return delay(100, val); // simulate a 100ms async task
}

// Without concurrent: batches are built by hand and awaited with Promise.all
async function withoutConcurrent() {
const tasks = range(Infinity);
return await pipe(
toAsync(tasks),
map((task) => () => executeTask(task)),
chunk(10),
map((tasks) => append(() => delay(10_000, 0), tasks)), // for rate limiting
map((tasks) =>
Promise.all(Array.from(tasks).map((f) => f())).then((tasks) =>
dropRight(1, tasks) // filter `delay`
)
),
flat,
peek((task) => console.log(task)), // print task
toArray
);
}

// With concurrent: each task carries its own delay and concurrent(10) handles the batching
async function withConcurrent() {
const tasks = range(Infinity);
return await pipe(
toAsync(tasks),
map((task) =>
Promise.all([
executeTask(task),
delay(10_000), // for rate limiting
])
),
map(([task]) => task),
peek((task) => console.log(task)), // print task
concurrent(10),
toArray
);
}
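In both versions the effective limit is roughly ten tasks per ten seconds. The difference is that withoutConcurrent wires the batching by hand with chunk, a sentinel delay appended to each batch, and dropRight to strip it again, while withConcurrent pairs every task with its own delay and leaves the batching to concurrent(10), which keeps the pipeline flat.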