
This works great at retrieving the PHP data for, say, 15 passes, BUT when the JSON file has, say, 100 items it chokes the PHP script and creates random errors. My guess is that because the requests are all fired from this single AJAX loop (faster than the PHP script can keep up), the PHP script is getting confused?

    $(document).ready(function(){

        var ajax_load = "<div class='loadwrap'><img class='load' src='/img/load.gif' style='width:12px;' alt='' /> fetching list...</div>";

        $("#status").html(ajax_load);

        $.getJSON('/fsbo/get_urls_24_hours', function(data) {

            $("#alias").fadeOut('slow');

            // Swap the status message once the list of URLs arrives.
            var ajax_load = "<div class='loadwrap'><img class='load' src='/img/load.gif' style='width:12px;' alt='' /> fetching property...</div>";
            $("#status").html(ajax_load);

            $('#props').html('');

            // One POST per URL; all are fired at once, which is what chokes the server.
            $.each(data, function(key, val) {
                $.ajax({
                    type: "POST",
                    url: base_url + "/fsbo/get_property",
                    data: { url: val },   // object form, so jQuery URL-encodes the value
                    cache: false,
                    success: function(data) {
                        $("<div></div>").html(data).appendTo('#props');
                    }
                });
            });
        });
    });

As a side note, where do I put the code that hides the loading gif? Putting it at the end of the loop does no good; the gif just opens and closes without waiting for the data to return.

  • For the loading gif: show it just before starting the AJAX request, and hide it in the callback function of the AJAX request. Commented Feb 1, 2012 at 20:01
  • That does not work: it shows, then hides, and the items append later. Commented Feb 2, 2012 at 21:02
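A minimal sketch of the counter approach the first comment hints at, assuming jQuery 1.5+ (for the jqXHR .done()/.always() methods): track the number of outstanding requests and hide the gif only when that count reaches zero, rather than at the end of the loop.

    var pending = data.length;          // `data` is the array from getJSON
    $("#status").html(ajax_load);       // show the gif once, before the loop

    $.each(data, function(key, val) {
        $.ajax({
            type: "POST",
            url: base_url + "/fsbo/get_property",
            data: { url: val },
            cache: false
        }).done(function(html) {
            $("<div></div>").html(html).appendTo('#props');
        }).always(function() {
            // Fires on success or error; the last request to finish hides the gif.
            if (--pending === 0) {
                $("#status").empty();
            }
        });
    });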

3 Answers


It's generally a bad idea to make AJAX requests in a loop. Why not just modify the original call to return all of the data you want in your JSON, rather than making 100 calls?

If for some reason you can't avoid this, constrain the number of pending requests. Send, for example, the first 5 requests, then only send the 6th once you get a response from one of the first 5. This way only 5 requests are pending at any time and your server isn't hit with 100 all at once.
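A minimal sketch of such a queue, assuming jQuery 1.5+ (fetchWithLimit is just an illustrative name, and data stands in for the array returned by /fsbo/get_urls_24_hours):

    // Cap the number of simultaneous requests; each completion starts the next.
    function fetchWithLimit(urls, limit) {
        var queue = urls.slice();           // copy so shift() doesn't touch the original

        function next() {
            if (queue.length === 0) return; // nothing left to send
            var url = queue.shift();
            $.ajax({
                type: "POST",
                url: base_url + "/fsbo/get_property",
                data: { url: url },
                cache: false
            }).done(function(html) {
                $("<div></div>").html(html).appendTo('#props');
            }).always(next);                // success or error, start another
        }

        // Prime the pump with `limit` requests; the rest follow one-for-one.
        for (var i = 0; i < limit; i++) {
            next();
        }
    }

    fetchWithLimit(data, 5);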




This code snippet is the killer:

    $.each(data, function(key, val) {
        $.ajax({

If the length of data is 100, there will be 100 HTTP connections to your server. This will obviously choke your server. Besides, your browser will become slow; it's like opening 100 tabs in Firefox in one shot.

Pass all the data in a single AJAX request. If the size is huge, send it chunk by chunk: when you receive the first response, send the request for the next chunk. But don't send them simultaneously.
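A sketch of that idea, assuming a hypothetical /fsbo/get_properties endpoint on the PHP side that accepts a JSON array of URLs and returns the rendered HTML for the whole chunk:

    // Send the URLs in chunks of 10, each chunk only after the previous
    // response arrives, so at most one request is ever in flight.
    function sendChunks(urls, chunkSize) {
        if (urls.length === 0) {
            $("#status").empty();                      // all chunks done: hide the gif
            return;
        }
        $.ajax({
            type: "POST",
            url: base_url + "/fsbo/get_properties",    // hypothetical batch endpoint
            data: { urls: JSON.stringify(urls.slice(0, chunkSize)) },
            cache: false
        }).done(function(html) {
            $("<div></div>").html(html).appendTo('#props');
            sendChunks(urls.slice(chunkSize), chunkSize);  // recurse on the remainder
        });
    }

    sendChunks(data, 10);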



I do think you have the right answer: it is because of the calls.

The way PHP calls work is by creating a separate process on your system while also invoking various other libraries, producing a sort of memory leak that exponentially increases the time and resources required.

What I suggest is: pass all of your variables to PHP and let it do the work, then receive a JSON object back and parse it.

It may be a bit slower for the end user, but it should keep this from happening.
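For illustration, a sketch of that single round trip, again assuming a hypothetical /fsbo/get_properties endpoint that does the per-URL work in PHP and returns one JSON array of rendered snippets:

    $.getJSON('/fsbo/get_urls_24_hours', function(urls) {
        $.ajax({
            type: "POST",
            url: base_url + "/fsbo/get_properties",    // hypothetical batch endpoint
            data: { urls: JSON.stringify(urls) },
            dataType: "json",                          // expect one JSON array back
            cache: false
        }).done(function(properties) {
            $("#status").empty();                      // one place to hide the gif
            $.each(properties, function(i, html) {
                $("<div></div>").html(html).appendTo('#props');
            });
        });
    });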

P.S.

I've had similar issues, where these kinds of calls made so many requests for one user that the whole web server crashed.
