I am new to node.js (and to request.js). I'd like to fetch the response body of a site at several paths under the same URL (in the example below `http://www.example.com/path1`, `http://www.example.com/path2`, etc.) and collect the results in an object keyed by path (`siteData[path]` below).
```javascript
var request = require('request'),
    paths = ['path1', 'path2', 'path3'],
    siteData = {},
    pathLength = paths.length,
    pathIndex = 0;

paths.forEach((path) => {
  var url = 'http://www.example.com/' + path;
  request(url, function (error, response, html) {
    if (!error) {
      siteData[path] = response.body;
      pathIndex++;
      if (pathIndex === pathLength) {
        someFunction(siteData);
      }
    }
  });
});

function someFunction(data) {
  // manipulate data
}
```
My questions are:
- The counter check (`pathIndex === pathLength`) doesn't look like the right way to determine whether all of the asynchronous requests have finished (and it never fires if any request errors). How should I properly detect that all the requests are done?
- When I execute the code above I get a warning:

  ```
  (node) warning: possible EventEmitter memory leak detected. 11 unpipe listeners added. Use emitter.setMaxListeners() to increase limit.
  ```

  I tried chaining `request(url, function(...){}).setMaxListeners(100);` but that didn't work.
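To make the first question concrete, here is a minimal, self-contained sketch of the counter pattern I mean. `fakeRequest` is a hypothetical stand-in for `request()` (it invokes its callback synchronously with a dummy body, so no network is involved); the counter is decremented whether or not a request succeeded, which is one difference from my code above:

```javascript
// Hypothetical stand-in for request(url, callback): invokes the callback
// synchronously with a dummy body so the sketch runs without a network.
function fakeRequest(url, callback) {
  callback(null, { body: 'body of ' + url });
}

var paths = ['path1', 'path2', 'path3'],
    siteData = {},
    pending = paths.length,   // number of requests still outstanding
    result = null;

paths.forEach(function (path) {
  fakeRequest('http://www.example.com/' + path, function (error, response) {
    if (!error) {
      siteData[path] = response.body;
    }
    pending--;                // count the request as settled either way
    if (pending === 0) {
      result = siteData;      // all requests settled; hand off the data
    }
  });
});
```

Is a shared counter like this actually the idiomatic approach, or should this be done with promises instead?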
Thanks for your help!