I have found several different and conflicting answers on this topic.
I am building an application that works mostly with HTML generated dynamically by jQuery, based on JSON data returned from an underlying API.
Some of my colleagues told me (in person) that the best way would be to do something like this:
var ul = $("<ul>").addClass("some-ul");
$.each(results, function(index) {
    ul.append($("<li>").html(this).attr("id", index));
});
$("body").append($("<div>").attr("id", "div-id").addClass("some-div").append(ul));
etc. The reason I was given was that this "updates the DOM directly instead of parsing HTML to achieve it".
However, I see lots of code like this (same example):
var toAppend = '<div class="some-div" id="div-id"><ul>';
$.each(results, function(index) {
    toAppend += '<li id="' + index + '">' + this + '</li>';
});
toAppend += '</ul></div>';
$("body").append(toAppend);
I personally find this less elegant - but is it better? I googled the issue for a couple of minutes and found this article. Basically, it claims that performance can be increased drastically by using string concatenation - my "second way".
The main problem with that article is that it was published in 2009 and discusses jQuery 1.3. The current release is 1.6.4, which may behave quite differently. The same applies to most of the articles on the subject I have found, which makes me somewhat suspicious of their credibility.
That's why I have decided to post the question here and ask: which method of generating the DOM is actually the proper one in terms of performance?
IMPORTANT EDIT:
I have written a little benchmark to test which approach is better considering performance.
jsFiddle - concatenation version
jsFiddle - array join version
Code:
var text = "lorem ipsum";
var strings = $("#strings");
var objects = $("#objects");
var results = $("#results");
// string concatenation (array join)
var start = new Date().getTime();
var toAppend = ['<div class="div-class" id="div-id1"><ul class="ul-class" id="ul-id1">'];
for (var i = 1; i <= 20000; i++) {
    toAppend[i] = '<li class="li-class" id="li-id1-' + i + '">' + text + '</li>';
}
toAppend[i] = '</ul></div>';
results.append(toAppend.join(""));
strings.html(new Date().getTime() - start);
// jQuery objects
var start = new Date().getTime();
var ul = $("<ul>").attr("id", "ul-id2").addClass("ul-class");
for (var i = 0; i < 20000; i++) {
    ul.append($("<li>").attr("id", "li-id2-" + i).addClass("li-class"));
}
results.append($("<div>").attr("id", "div-id2").addClass("div-class").append(ul));
objects.html(new Date().getTime() - start);
It seems that operating on strings is faster (about 7 times in Firefox 7) than using jQuery objects and methods. But I could be wrong, especially if there are any mistakes or performance-degrading bugs in this benchmark code. Feel free to make any changes.
Note: because of the article mentioned earlier, I used Array join instead of actual concatenation.
EDIT: Based on a suggestion by @hradac, I used actual string concatenation in the benchmark, and it did in fact improve the times.
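For reference, the plain-concatenation variant of the loop looks like this (a small-scale sketch with 3 items instead of 20000, so the output is easy to inspect):

```javascript
// Build the same <li> markup with plain += concatenation
// instead of collecting pieces in an array and joining them.
var text = "lorem ipsum";
var html = '<ul>';
for (var i = 0; i < 3; i++) {
    html += '<li id="li-' + i + '">' + text + '</li>';
}
html += '</ul>';
console.log(html);
```

In modern engines, repeated `+=` on strings is well optimized, which is why it can beat the array-join idiom that older articles recommended.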
Adding .text(text) makes it significantly slower, but it does demonstrate an advantage of jQuery: it can close off an XSS vector.