
I'm using Play Framework to generate a chunked response. The code is:

class Test extends Controller {
    public static void chunk() throws InterruptedException {
        for (int i = 0; i < 10; i++) {
            // repeat(s, n): returns s concatenated n times (helper omitted here)
            String data = repeat("" + i, 1000);
            response.writeChunk(data);
            Thread.sleep(1000);
        }
    }
}

When I visit http://localhost:9000/test/chunk in a browser, I can see the displayed data grow every second. But when I write a JavaScript function to receive and handle the data, I found that it blocks until all the data has been received.

The code is:

$(function(){
    $.ajax(
        "/test/chunk", 
        {
            "success": function(data, textStatus, xhr) {
                alert(textStatus);
            }
        }
    );
});

I see a message box pop up after 10 seconds, once all the data has been received.

How can I get the stream and handle the data as it arrives?

3 Answers


jQuery doesn't support that, but you can do that with plain XHR:

var xhr = new XMLHttpRequest();
xhr.open("GET", "/test/chunk", true);
xhr.onprogress = function () {
  // Fires repeatedly as data arrives; responseText holds everything received so far.
  console.log("PROGRESS:", xhr.responseText);
};
xhr.send();

This works in all modern browsers, including IE 10; see the W3C XMLHttpRequest specification.

The downside here is that xhr.responseText contains the accumulated response. You can use substring on it, but a better idea is to set the responseType attribute and use slice on the resulting ArrayBuffer.
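Since responseText accumulates, one common workaround is to remember how many characters have already been processed and take only the new suffix on each progress event. A minimal sketch of that bookkeeping (the helper name and the simulated responseText snapshots are made up for illustration):

```javascript
// Hypothetical helper: given the accumulated responseText and how many
// characters were already processed, return only the newly arrived data.
function extractNewChunk(responseText, seenChars) {
  const chunk = responseText.substring(seenChars);
  return { chunk, seenChars: responseText.length };
}

// Simulated onprogress events: responseText grows with each event.
let state = { chunk: "", seenChars: 0 };
const snapshots = ["000", "000111", "000111222"];
const received = [];
for (const responseText of snapshots) {
  state = extractNewChunk(responseText, state.seenChars);
  received.push(state.chunk);
}
console.log(received); // ["000", "111", "222"]
```

In a real onprogress handler you would call the helper with xhr.responseText and keep seenChars across events.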


11 Comments

why is using of ArrayBuffer better?
@4esn0k: for example, if you decide to use a binary protocol, which is a good idea. Another point is that the browser then doesn't have to keep the entire response in memory, and as far as I understand, that's easier to achieve with an ArrayBuffer than with a String. slice should also work better, since it can use plain array indexes instead of dealing with a UTF-8 String the way substring does.
is there a more structured approach / better library to do that?
@Edmondo1984, I ended up creating this package, which allows more efficient transfer than jsonpipe: npmjs.com/package/chunked-request
I am trying to do the same thing the OP asked. I followed Phil's directions and I am getting the response in chunks. But what should I do to NOT accumulate the response? I tried setting responseType to arraybuffer, but then I do not get any response.

Soon we should be able to use the ReadableStream API (see the MDN docs). The code below seems to work with Chrome 62.0.3202.94:

fetch(url).then(function (response) {
    let reader = response.body.getReader();
    let decoder = new TextDecoder();
    return readData();
    function readData() {
        return reader.read().then(function ({value, done}) {
            // When done is true, value is undefined, so check it first.
            if (done) {
                console.log('Stream complete');
                return;
            }
            // {stream: true} buffers multi-byte characters split across chunks.
            let newData = decoder.decode(value, {stream: true});
            console.log(newData);
            return readData();
        });
    }
});

1 Comment

The above code worked nicely. I had to add a check, before 'let newData = decoder...', that 'value' was not undefined.

The success event fires when the complete data transmission is done and the connection is closed with a 200 response code. I believe you should be able to implement a native onreadystatechange handler and see the data packets as they arrive.

2 Comments

onreadystatechange will only fire on timeout, abort, and successful request completion. It won't fire on chunks.
onreadystatechange is called each time the readyState property changes.
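A toy simulation of the pattern this answer describes (no real network involved): inspect readyState inside an onreadystatechange handler and read responseText at readyState 3 (LOADING) as well as at 4 (DONE). The fake request objects below stand in for a real XMLHttpRequest; whether state 3 actually fires per chunk depends on the browser, as the comments above note:

```javascript
function makeHandler(log) {
  return function onreadystatechange() {
    // `this` mimics an XMLHttpRequest instance.
    if (this.readyState === 3) log.push("chunk so far: " + this.responseText);
    if (this.readyState === 4) log.push("done: " + this.responseText);
  };
}

const log = [];
const handler = makeHandler(log);
// Simulate the browser invoking the handler as data arrives.
handler.call({ readyState: 3, responseText: "000" });
handler.call({ readyState: 3, responseText: "000111" });
handler.call({ readyState: 4, responseText: "000111222" });
console.log(log);
// ["chunk so far: 000", "chunk so far: 000111", "done: 000111222"]
```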
