In my Node.js app, I am trying to read an HTTP file URL and then download the contents of that file in a streaming manner.
What I want to do is:
- make a request to the file resource (using the node request module)
- when the response starts to become available, read the data in chunks, rather than have the whole file downloaded to disk.
I understand that the request module does support streaming, and I have verified that the below code works:
var request = require('request');

var fileUrl = "http://172.19.2.48:8080/files/1472628282006/valid.txt";

request(fileUrl, function(err, response, body) {})
  .on('response', function(response) {
    /*
    // what I actually want: pull the data myself, one char at a time
    response.on('readable', function() {
      console.log("now readable");
      var ch;
      while ((ch = response.read(1)) !== null) {
        console.log("char:", ch);
      }
    });
    */
    // this works, but hands me whatever chunk happens to be available
    response.on('data', function(data) {
      console.log('data: ', data.toString());
    });
  });
But the only problem is, I don't have control over how much I read at a time, since the 'data' event gives me whatever is available at that point in time. Instead, what I wanted was to drive the read operation myself, as in the commented-out code in the snippet above.
Such code generally works for Node.js streams2, but I cannot get it working here. The 'readable' event is fired, but the read operation returns null. My use case is that I am going to receive some sort of structured data, and I am going to parse it by reading a character at a time with some finite state machine.
So, is there any way to read on demand, rather than be notified by a 'data' event?
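For reference, this is the pull-based streams2 pattern I mean, which works fine against an ordinary streams2 readable such as a file stream (a minimal sketch; 'valid.txt' here is assumed to be a local copy of the same test file):

var fs = require('fs');

// A plain streams2 readable: pull exactly one byte per read() call.
var stream = fs.createReadStream('valid.txt');

stream.on('readable', function() {
  var ch;
  // read(1) returns a 1-byte Buffer, or null once the internal buffer
  // is drained; 'readable' fires again when more data arrives.
  while ((ch = stream.read(1)) !== null) {
    console.log('char:', ch.toString());
  }
});

stream.on('end', function() {
  console.log('done');
});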
Output when trying to read; only the 'readable' event is received, and subsequent reads return null:
rvnath@admin ~/downloader $ node download.js
now readable
Edit: In a nutshell, I want to consume the incoming response stream in a streams2 (pull-based) manner, rather than in the streams1 (push-based) way.
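For illustration, one idea I have (a sketch only, not verified against my server) is to pipe the response into a stream.PassThrough, which is a standard streams2 stream, and pull from that instead of reading the response directly:

var request = require('request');
var stream = require('stream');

var fileUrl = "http://172.19.2.48:8080/files/1472628282006/valid.txt";

// PassThrough is a built-in streams2 duplex; piping the response
// into it should give me a pull-based readable I can drive myself.
var pass = new stream.PassThrough();
request(fileUrl).pipe(pass);

pass.on('readable', function() {
  var ch;
  while ((ch = pass.read(1)) !== null) {
    console.log('char:', ch.toString());
  }
});

pass.on('end', function() {
  console.log('done');
});

But I am not sure whether this is the right approach, or whether there is a way to pull-read the request response directly.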