The easiest approach is to process everything concurrently. The Promise API includes methods for working with multiple promises at once; for this we'll want Promise.all.
app.use('/convert', async (req, res) => {
  const files = await getFiles();
  // Kick off the full pipeline for every file at once.
  const promises = files.map(async (file) => {
    await download(file);
    await convert(file);
    await upload(file);
  });
  await Promise.all(promises);
  res.send('finished');
});
Although doing everything at once is relatively simple, it can be very resource heavy. It's unclear how download, convert, and upload work internally, but it's entirely possible to exhaust the machine's resources this way. To avoid things like hitting the open file limit or running out of memory, there should be a limit on the number of items being processed concurrently.
One way is to process items in batches: split the array of files into chunks and combine the solution above with your iterative solution.
app.use('/convert', async (req, res) => {
  const files = await getFiles();
  const chunkSize = 5;

  // Split the files into chunks of chunkSize (note: splice empties files).
  const chunks = [];
  while (files.length) {
    chunks.push(files.splice(0, chunkSize));
  }

  // Process one chunk at a time; items within a chunk run in parallel.
  for (const chunk of chunks) {
    const promises = chunk.map(async (file) => {
      await download(file);
      await convert(file);
      await upload(file);
    });
    await Promise.all(promises);
  }
  res.send('finished');
});
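As an aside, splice empties the original files array while building the chunks. If you need files intact afterwards, a non-mutating helper based on slice does the same job (a small sketch; chunk is not part of the code above):

```javascript
// Split an array into chunks of `size` without mutating the input.
function chunk(array, size) {
  const chunks = [];
  for (let i = 0; i < array.length; i += size) {
    chunks.push(array.slice(i, i + size));
  }
  return chunks;
}
```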
The implementation above waits for chunkSize items to finish processing before queuing up the next chunkSize items. Because it waits for the whole chunk, some items may finish very quickly while others take much longer, leaving your resources underutilized. Ideally you would always be processing chunkSize items at a time. To do this you can start chunkSize "threads", each of which processes one item at a time until there is nothing left to process.
async function process(file) {
  await download(file);
  await convert(file);
  await upload(file);
}

// Each "thread" keeps pulling files off the shared array until it's empty.
async function thread(files) {
  while (files.length) {
    await process(files.pop());
  }
}
app.use('/convert', async (req, res) => {
  const files = await getFiles();
  const maxConcurrency = 5;
  const threads = [];

  // Start maxConcurrency "threads" that all consume from the same array.
  // (A `while (--maxConcurrency)` loop would start one thread too few.)
  for (let i = 0; i < maxConcurrency; i++) {
    threads.push(thread(files));
  }
  await Promise.all(threads);
  res.send('finished');
});
To summarize: use Array.prototype.map along with Promise.all to process everything concurrently. Doing everything concurrently can be very resource heavy, so you may want to process in batches or use some kind of queue-based processing, but that depends on your requirements. If download, convert, and upload are truly asynchronous you should not need additional Node processes. If any of the code you're running blocks (e.g. is synchronous and takes a long time to run), then you may want to consider forking node.
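One caveat on error handling: Promise.all rejects as soon as any promise rejects, so a single failed file ends the wait even though the other pipelines keep running in the background. If you want every file attempted and a per-file outcome, Promise.allSettled works as a drop-in replacement (a sketch with a made-up processFile that fails on one input):

```javascript
// Hypothetical per-file work that fails for one specific input.
async function processFile(file) {
  if (file === 'bad.mp4') throw new Error(`failed: ${file}`);
  return `converted: ${file}`;
}

async function main() {
  const files = ['a.mp4', 'bad.mp4', 'c.mp4'];
  // allSettled never rejects; every entry reports fulfilled or rejected.
  const results = await Promise.allSettled(files.map(processFile));
  for (const result of results) {
    if (result.status === 'fulfilled') {
      console.log(result.value);
    } else {
      console.error(result.reason.message);
    }
  }
}

main();
```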