
I take a fat JSON array from the server via an AJAX call, then process it and render HTML with JavaScript. What I want is to make this as fast as humanly possible.

Chrome leads over FF in my tests, but it can still take 5-8 seconds for the browser to render ~300 records.

I considered lazy-loading, such as that implemented in Google Reader, but it conflicts with my other use cases: instantaneous search results (a simple search done client-side over all the records in the JSON array) and multiple filters.

One thing I have noticed is that neither FF nor Chrome renders anything until it has looped over all items in the JSON array, even though I explicitly insert the newly created elements into the DOM on every iteration (as soon as I have the HTML). What I'd like to achieve is exactly that: force the browser to render as soon as it can.

I tried deferring the calls (having every item in the array processed by a deferred function) but ran into additional issues there: the order of execution no longer seems guaranteed, so items further down the array were sometimes processed before earlier ones.
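
One common way to keep the order deterministic while still letting the browser paint between batches is to process the array in fixed-size chunks with `setTimeout(..., 0)`, advancing a single index so items are always handled in sequence. A minimal sketch (the function names are mine, not from the question):

```javascript
// Process `items` in order, `chunkSize` at a time, yielding to the
// browser between chunks so it can paint what has been appended so far.
function processInChunks(items, chunkSize, renderItem, done) {
    var index = 0;                      // single cursor => order is preserved
    function nextChunk() {
        var end = Math.min(index + chunkSize, items.length);
        for (; index < end; index++) {
            renderItem(items[index], index);
        }
        if (index < items.length) {
            setTimeout(nextChunk, 0);   // let the browser render, then continue
        } else if (done) {
            done();
        }
    }
    nextChunk();
}
```

Usage would be something like `processInChunks(records, 50, function (rec) { container.appendChild(buildRow(rec)); })`, where `buildRow` stands in for whatever builds one record's element.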

I'm looking for any hints and tips here.

  • Rendering 300+ DOM nodes is going to take time. I recommend checking out the dojo grid (dojotoolkit.org): it implements lazy rendering, but you can still keep all the data on the client. Commented Jun 29, 2010 at 13:50
  • It's probably a good idea not to insert each element into the DOM separately. Build a container element and keep it out of the DOM until you've filled it up. (300 records doesn't really seem like that much to me, and you should definitely be able to get it running faster than 5 to 8 seconds.) Commented Jun 29, 2010 at 13:55

5 Answers


try:

  • push rows into an array, then simply

     el.innerHTML = array.join("");
    
  • use document fragments

    var frag = document.createDocumentFragment();
    for (var i = 0; i < items.length; i++) {
        var el = buildRow(items[i]);  // however you create each row's element
        frag.appendChild(el);
    }
    parent.appendChild(frag);         // one DOM insertion instead of one per row
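
The first bullet can be sketched end-to-end like this; `rowsToHtml` and the record fields are illustrative names, not from the answer (and real data should be HTML-escaped before being concatenated):

```javascript
// Build all row markup in memory first, then touch the DOM once.
// NOTE: record fields are assumed/illustrative; escape untrusted data.
function rowsToHtml(records) {
    var rows = [];
    for (var i = 0; i < records.length; i++) {
        rows.push("<tr><td>" + records[i].name + "</td><td>" + records[i].value + "</td></tr>");
    }
    return rows.join("");
}

// One DOM write instead of 300:
// document.getElementById("tbody").innerHTML = rowsToHtml(records);
```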
    

If you don't need to display all 300 records at once, you could paginate them 30 or 50 records at a time and only unroll the JSON array as those subsets are needed for display, through a pager or a local search box. Once converted, you could cache the rendered content for subsequent display as users navigate up and down the pages.
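
The slicing-and-caching side of that idea might look like this (the page size, cache shape, and function names are assumptions, not from the answer):

```javascript
// Keep the full JSON array in memory; render only one page at a time.
var PAGE_SIZE = 50;
var pageCache = {};              // rendered HTML per page, filled lazily

// Return the records belonging to a given zero-based page.
function getPage(records, pageNumber) {
    var start = pageNumber * PAGE_SIZE;
    return records.slice(start, start + PAGE_SIZE);
}

// Convert a page to HTML only the first time it is requested,
// then serve it from the cache on subsequent visits.
function renderPage(records, pageNumber, buildHtml) {
    if (!(pageNumber in pageCache)) {
        pageCache[pageNumber] = buildHtml(getPage(records, pageNumber));
    }
    return pageCache[pageNumber]; // e.g. assign to container.innerHTML
}
```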


Try creating the elements in a detached DOM node or a document fragment, then attaching the whole thing in one go.

2 Comments

  • ... or in a document fragment
  • Interesting! I did just that, and it brought the rendering time in Chrome down to 1 second. Firefox 3.6, however, still takes 5 seconds. I suppose I'll have to optimize the rendering algorithms themselves.

300 isn't a lot. I managed to create a tree of over 500 elements from JSON data using jQuery in a fraction of a second on Chrome.

If they render that slowly, it's probably due to how you're doing it. Can you show us how?

The slowest way would be to build the HTML in a JavaScript string, then assign it via the innerHTML property. But even that would still be fast as hell for 300 rows.


Google Web Toolkit has BulkTableRenderers that are designed to render large tables quickly. Even if you choose not to use GWT, you might be able to pick up some techniques by looking through its source code, which is available under the Apache License, Version 2.0.
