Idea 1 just seems too inefficient and cumbersome.
Idea 2 is deprecated, as per the axios GitHub page.
Idea 3 could be considered somewhat viable. However, since you have to do this for 100 users concurrently (100 login calls + 100 getData calls = at least 200 API calls), I would not recommend it: firing them all at once puts a lot of pressure on the server hosting the API, and keeping that many requests in flight allocates a lot of memory on the client, which can cause failures and slowdowns.
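To make the problem concrete, here is a rough sketch of what Idea 3 amounts to. This assumes usersForLogin is your array of credential objects, the endpoint URLs are placeholders, and the login response body carries a token field; none of these are confirmed by your question:

const axios = require("axios");

// Idea 3, naively: fire all 100 login requests at once, then all 100
// getData requests at once. Two unthrottled bursts of 100 calls each.
Promise.all(usersForLogin.map(function (userData) {
  return axios.post("http://api.url.here/api/login", {
    id: userData.id,
    password: userData.password
  });
})).then(function (loginResponses) {
  return Promise.all(usersForLogin.map(function (userData, index) {
    return axios.post("http://api.url.here/api/getData", {
      id: userData.id,
      token: loginResponses[index].data.token
    });
  }));
});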
What I would do instead is use a library called Bluebird.js. Its .map() function takes an array, a callback, and an options object as params; one of those options is concurrency, the maximum number of promises allowed to run at once. It executes the promises (requests, in this case) at most that many at a time, rather than all at once.
Your code could look something like this:
const axios = require("axios");
const BluebirdPromise = require("bluebird");

BluebirdPromise.map(usersForLogin, function (userData) {
  // Bluebird's .map() waits for the promise each callback returns.
  return axios.post("http://api.url.here/api/login", {
    id: userData.id,
    password: userData.password
  });
}, {
  concurrency: 10
}).then(function (loginResponseData) {
  console.log("Logins done");
  // Second pass: fetch each user's data with the token from the login
  // response. Axios puts the response body on the .data property.
  return BluebirdPromise.map(usersForLogin, function (userData, index) {
    return axios.post("http://api.url.here/api/getData", {
      id: userData.id,
      token: loginResponseData[index].data.token
    });
  }, {
    concurrency: 10
  });
}).then(function (resultsArray) {
  console.log(`All results fetched: ${resultsArray.length}`);
});
This will hit the login API for the 100 users in the usersForLogin array, at most 10 at a time, so it won't cause one big spike on either the server or the client.
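If you'd rather not add a dependency, the same throttling can be hand-rolled with a small worker pool. Here is a minimal sketch; mapWithConcurrency is a hypothetical helper name of my own, not a standard API:

// Bounded-concurrency mapping without Bluebird: a fixed number of
// workers pull the next unprocessed index until the array is exhausted.
function mapWithConcurrency(items, mapper, concurrency) {
  let nextIndex = 0;
  const results = new Array(items.length);

  function worker() {
    if (nextIndex >= items.length) return Promise.resolve();
    const index = nextIndex++;
    return Promise.resolve(mapper(items[index], index))
      .then(function (result) {
        results[index] = result;
        return worker(); // claim the next item once this one finishes
      });
  }

  const workers = [];
  for (let i = 0; i < Math.min(concurrency, items.length); i++) {
    workers.push(worker());
  }
  return Promise.all(workers).then(function () {
    return results;
  });
}

Calling mapWithConcurrency(usersForLogin, loginCallback, 10) would then behave like the BluebirdPromise.map calls above, keeping at most 10 requests in flight at any moment.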