
I'm building a simple UI that communicates with my backend server. The UI sends the server a URL to process, and the server returns a list of results.

Each result takes time to calculate, but I would still like to show results in the UI as they come in, perhaps even using ng-repeat on a set of results that is updated as they are received from the server.

The problem is, Angular's $http service returns a promise that can only handle success and failure states.

I was wondering if there's some kind of "ongoing" state that I can run a callback function on.

For example, the response is JSON containing an array of results:

{
  "results": [
    {"prop1": "val1", "prop2": "val2"},  // result 1
    {"prop1": "val1", "prop2": "val2"},  // result 2
    {"prop1": "val1", "prop2": "val2"},  // result 3
    {"prop1": "val1", "prop2": "val2"}   // result 4
  ]
}

Just imagine that it takes 5-6 seconds of calculation before each result is ready to be sent in the response.

So, the wanted flow and behavior is this:

UI:

  1. Sends a POST request containing all the details required by the server.
  2. On each line received, push the result contained in that line to $scope.results.

The rest will be done by the digest loop and ng-repeat will make sure I see them on screen.
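If the server does emit the response as newline-delimited JSON, the client side still has to split the incoming text into complete lines, since a network chunk can end mid-line. A minimal sketch of such a buffer in plain JavaScript (the function name and shape are my own, not any Angular API):

```javascript
// Buffers incoming text chunks and invokes onLine once per complete line.
// A chunk may end mid-line, so the trailing partial line is kept for later.
function makeLineParser(onLine) {
  let buffer = '';
  return function feed(chunk) {
    buffer += chunk;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // last element is a (possibly empty) partial line
    for (const line of lines) {
      if (line.trim() !== '') {
        onLine(JSON.parse(line)); // e.g. push into $scope.results
      }
    }
  };
}

// Example: two network chunks that split a JSON line in the middle.
const results = [];
const feed = makeLineParser(r => results.push(r));
feed('{"prop1":"val1"}\n{"pro');
feed('p2":"val2"}\n');
console.log(results.length); // 2
```

Inside Angular you would wrap the `onLine` push in `$scope.$apply()` so the digest loop picks it up.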

So the only problem I have, is how do I access the response data while it is being received?

Btw, if you have any suggestions for doing it another way, I'd be glad to hear them, as long as it's simple... it really is a simple UI; nothing advanced is needed here.

Thanks a lot.

2 Answers


This is not possible with plain $http, since its request/response model does not provide this functionality. You will either have to make multiple requests (implementing something like pagination) or try WebSockets (a bit more complicated).

But there is no "flow" like you describe. The client says "hey server, give me the data", the server calculates ALL the needed data, and then the server responds atomically with the whole payload, not step by step.
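The multiple-requests idea can be sketched as a loop that asks for one page at a time until the server reports there is nothing left. Here the transport is abstracted as a `getPage` function, a stand-in for an $http call; all names are illustrative, not part of any API:

```javascript
// Fetches page 0, 1, 2, ... until getPage resolves to an empty array.
// getPage(pageIndex) stands in for something like
//   $http.post('/results', {url: url, page: pageIndex})
async function fetchAllPages(getPage, onResult) {
  for (let page = 0; ; page++) {
    const batch = await getPage(page);
    if (batch.length === 0) return; // server says: no more entries
    batch.forEach(onResult);        // e.g. push into $scope.results
  }
}

// Usage with a stubbed "server" that has 3 results, 2 per page:
const data = [[{id: 1}, {id: 2}], [{id: 3}], []];
const seen = [];
fetchAllPages(p => Promise.resolve(data[p]), r => seen.push(r))
  .then(() => console.log(seen.length)); // logs 3
```

Each page resolves its own promise, so the UI updates after every batch rather than waiting for the full set.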


4 Comments

What if the server writes to the output stream line by line, each line being a result received from some other worker? That way I can send the response step by step... of course the response will take more time, as the lines are sent with gaps.
There is no such thing as an outputStream when we're talking about HTTP requests. There are request bodies, and those are transmitted atomically. If you use some technique other than HTTP (hence not the $http service), this is indeed possible, but that would be a completely different approach: no $http, no promises, but maybe a WebSocket connection, for example. That is up to you.
Actually, I just understood that when responding with a body, I have to tell the client the body's length, which is unknown if the calculations aren't done yet. Thanks a lot anyway. If this is possible, at least I know it's not with my implementation.
No problem, buddy. As said, you can still do a bunch of requests instead ("give me the first entry", "give me the second entry", and so on, until the server says there is no next entry). For a large set of data you might also do "give me the first 5 entries", "give me entries 6 to 10", etc., but you will have to implement this on both sides in detail.

You can only get multipart results if the back end supports that. Every JSON response I've seen arrives all at once; you don't get pieces at a time. But if you can modify the back end to send the data in chunks, then the client can consume pieces.
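If the back end can be changed to stream (for example with chunked transfer encoding), one browser-side option is an XMLHttpRequest progress handler, which exposes the partial responseText as data arrives; $http itself does not surface this. A rough sketch under that assumption (the endpoint and payload are made up):

```javascript
// Tracks how much of responseText has been consumed so each
// progress event only processes the newly received part.
function streamLines(xhr, onLine) {
  let seen = 0;
  return function onProgress() {
    const text = xhr.responseText;
    let newlineAt;
    while ((newlineAt = text.indexOf('\n', seen)) !== -1) {
      const line = text.slice(seen, newlineAt);
      seen = newlineAt + 1;
      if (line.trim() !== '') onLine(JSON.parse(line));
    }
  };
}

// Browser-only wiring; skipped where XMLHttpRequest is unavailable.
if (typeof XMLHttpRequest !== 'undefined') {
  const xhr = new XMLHttpRequest();
  xhr.open('POST', '/process'); // hypothetical endpoint
  xhr.onprogress = streamLines(xhr, function (result) {
    // In Angular: $scope.$apply(() => $scope.results.push(result));
    console.log(result);
  });
  xhr.send(JSON.stringify({url: 'http://example.com'}));
}
```

Note that `responseText` keeps growing for the whole request, so the offset bookkeeping is what makes each event cheap.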

2 Comments

But on the client side, how can I read those pieces as they arrive and do something with them (on the fly)?
@johni The $q service's then() takes a third callback for progress notification, and the CommonJS Promises proposal describes a promise as an interface for interacting with an object that represents the result of an action that is performed asynchronously, and may or may not be finished at any given point in time.
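For reference, the $q pattern that comment describes works like this: a deferred's notify() fires the third callback passed to then() once per intermediate value, while resolve() ends the sequence. Since $q only exists inside Angular, here is a stripped-down plain-JavaScript imitation of that notify mechanism (my own sketch of the shape, not Angular's implementation):

```javascript
// A minimal deferred supporting progress notifications, mimicking the
// shape of AngularJS's $q: promise.then(resolved, rejected, notified).
function deferred() {
  const onResolve = [];
  const onNotify = [];
  return {
    promise: {
      then(resolved, rejected, notified) {
        if (resolved) onResolve.push(resolved);
        if (notified) onNotify.push(notified);
      }
    },
    notify(value) { onNotify.forEach(cb => cb(value)); },
    resolve(value) { onResolve.forEach(cb => cb(value)); }
  };
}

// Usage: each partial result arrives via the progress callback.
const d = deferred();
const partial = [];
let finished = false;
d.promise.then(() => { finished = true; }, null, r => partial.push(r));
d.notify({prop1: 'val1'});
d.notify({prop1: 'val1'});
d.resolve();
console.log(partial.length, finished); // 2 true
```

The catch, as the thread already established, is that $http never calls notify() on its promise, so the server side would still need some transport that actually delivers intermediate results.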
