Send 1 mln requests - strategy

https://lemmy.world/post/8957850


One needs to send 1 mln HTTP requests concurrently, in batches, and read the responses, with no more than 100 requests in flight at a time. Which way is better, recommended, idiomatic?

* 1) Send 100 requests, wait for all of them to finish, send another 100, wait for them to finish… and so on.
* 2) Send 100 requests, then keep adding new ones to the pool as requests in the pool finish: "one done, add a new one".
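Option 1 (fixed batches) can be sketched with asyncio. This is a minimal illustration, not the poster's actual code: `fake_request` is a hypothetical stand-in for a real HTTP call (e.g. via aiohttp), and the counts are scaled down so the sketch runs quickly.

```python
import asyncio
import random

async def fake_request(i):
    # Hypothetical stand-in for an HTTP request; sleeps to simulate latency.
    await asyncio.sleep(random.uniform(0.001, 0.005))
    return i

async def run_in_batches(n, batch_size=100):
    # Option 1: launch a batch, wait for the whole batch, then launch the next.
    results = []
    for start in range(0, n, batch_size):
        batch = [fake_request(i) for i in range(start, min(start + batch_size, n))]
        results.extend(await asyncio.gather(*batch))
    return results

results = asyncio.run(run_in_batches(1000))
print(len(results))
```

Note that each batch takes as long as its slowest request, which is the crux of the performance debate below in the thread.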

This is, to me, a question lacking very precise requirements. But you say you need batches, so it sounds like option one (batches) is the only option that satisfies the "batches" requirement.
Then this question isn't for you.
Nor is the answer for you it would seem.
Given we know very little about the problem and runtime constraints, the second approach has the potential to perform better: with option 1, a batch takes as long as its slowest request, so every request effectively pays the batch's worst-case duration, while with option 2 the pool stays full and overall time is governed by the average request duration rather than the per-batch maximum.
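The sliding-window behaviour of option 2 is commonly implemented with a semaphore that caps concurrency: every request is scheduled up front, but at most `limit` run at once, and a finishing request immediately frees a slot for the next. A minimal sketch, again with a simulated request (`fake_request` and the timings are assumptions, not the poster's code):

```python
import asyncio
import random

async def fake_request(i):
    # Hypothetical stand-in for an HTTP request; sleeps to simulate latency.
    await asyncio.sleep(random.uniform(0.001, 0.005))
    return i

async def bounded(sem, i):
    # Acquire a slot, run the request, release the slot on completion.
    async with sem:
        return await fake_request(i)

async def run_sliding_window(n, limit=100):
    # Option 2: at most `limit` requests in flight; a slot refills as soon
    # as any request finishes, so slow requests don't stall the others.
    sem = asyncio.Semaphore(limit)
    return await asyncio.gather(*(bounded(sem, i) for i in range(n)))

results = asyncio.run(run_sliding_window(1000))
print(len(results))
```

With real HTTP you would also want a shared client session and error handling, but the semaphore pattern itself is unchanged.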
Who “we”? Voices in your head?
Aaand… blocked
Aaaaaaaa… blocked

Given you seem to be against adding extra info, based on the other comments, the only correct answer is “whichever one works best for you”.

The answer is quite useless, but I can't be more specific. "Better" is vague: faster, smoother, less taxing on system resources? There are options, and they may conflict. Good luck.