Thomas 'PointedEars' Lahn
Lasse said:
Thomas 'PointedEars' Lahn said:
What would be the advantage of computing static keys and comparing them
as compared to comparing the live values?
[...]
In the case of the default compare function, it converts both arguments
to strings before comparing them lexicographically.
If that is an expensive operation (which it potentially is, if you
are comparing objects with an expensive toString function), it might
make sense to cache the results. In most cases, the values being compared
with the default compare function will be strings or numbers (and the
numbers really should use a custom compare function too),
Yes, it is a rather bad default.
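For illustration, the string conversion done by the default compare
function is easy to observe with numbers (a minimal sketch):

```javascript
// The default compare function converts elements to strings, so
// numbers sort lexicographically unless a comparator is supplied.
var a = [10, 9, 1];
a.sort();
// a is now [1, 10, 9], because "10" < "9" as strings

var b = [10, 9, 1];
b.sort(function (x, y) { return x - y; });
// b is now [1, 9, 10]
```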
The swapping of rows can be done by exchanging TR elements in a manner
which is essentially just a pointer exchange, without any object
creation or pseudo-destruction. One should be able to sort the rows
array, with a comparison function that just compares rows[J].key.
However, the row container doesn't have a swap operation, so you will
have to go through the normal DOM methods. That won't be nearly as
fast as doing things directly in JavaScript.
The object that the `rows' property of a table object refers to is not an
Array instance, but a NodeList implementation; that is why this approach
would not be considerably more efficient than the other one of reading
the table data into an array, sorting the array, and recreating the table
from the sort result. And are you actually suggesting to augment host
objects here?
My guess (it has been a long time since I did the test) is that working
directly on the DOM is significantly slower than creating an array in
JavaScript, sorting that, and then putting the DOM in the correct order
(e.g., by removing all rows and reinserting them in the correct order).
I had an idea this morning that could turn out to be the best of both
worlds: You could read all row object references into an array, sort the
array of references, and then iterate over the array to put the rows in
the correct order with insertBefore(). This would take advantage of the
faster array sort and would not require removing all rows and reinserting
them, at least not explicitly.
/* but note that this is specified as implementation-dependent */
var rows = Array.prototype.slice.call(tBody.rows, 0);
rows.sort(comparator);
/* iterate from the end, so that each row is placed directly before
   its already-positioned successor */
for (var i = rows.length - 1, p = (i > 0) && rows[i].parentNode;
     i-- > 0;)
{
  p.insertBefore(rows[i], rows[i + 1]);
}
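The `comparator' above is left unspecified in the post; as a hypothetical
example, one that compares the text of each row's first cell
case-insensitively could look like this (assuming every row has at least
one cell):

```javascript
// Hypothetical comparator for the sort call above: compares the
// text content of the first cell of each row, case-insensitively.
function comparator(rowA, rowB)
{
  var a = rowA.cells[0].textContent.toLowerCase();
  var b = rowB.cells[0].textContent.toLowerCase();
  return a < b ? -1 : (a > b ? 1 : 0);
}
```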
One problem with using the built-in sort is that it is not guaranteed to
be stable. When sorting a table, it is desirable that sorting first on one
column and then on another is stable with respect to the result of the
first sort.
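A standard workaround for an unstable built-in sort, sketched here, is to
tag each element with its original index and use that index as a
tiebreaker, which makes any comparator behave stably (`stableSort' is a
hypothetical helper, not part of the posted code):

```javascript
// Sketch: wrap Array.prototype.sort so that elements which compare
// equal keep their original relative order (index as tiebreaker).
function stableSort(arr, cmp)
{
  var tagged = [];
  for (var i = 0; i < arr.length; i++)
  {
    tagged[i] = { value: arr[i], index: i };
  }
  tagged.sort(function (a, b) {
    return cmp(a.value, b.value) || (a.index - b.index);
  });
  for (i = 0; i < tagged.length; i++)
  {
    arr[i] = tagged[i].value;
  }
  return arr;
}
```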
While what you describe is actually _not_ the definition of a stable
sorting algorithm (its property is instead that elements that compare
equal remain in their original order), the problem you describe can be
solved by computing and comparing row keys as John suggested.
In that case the first pass should probably be a loop in which both the
references are stored and the keys are computed. The key should then be the
property of a *native* object, though:
var rows = [];
for (var tRows = tBody.rows, i = tRows && tRows.length; i--;)
{
  rows[i] = {
    data: tRows[i],
    key: computeKey(tRows[i])
  };
}
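The array of records could then be sorted on the cached key alone, with
the stored reference used for the eventual insertBefore() calls. A
minimal, DOM-free sketch (the string values here merely stand in for row
references; `data' and `key' are the property names from the loop above):

```javascript
// Sketch: sort the {data, key} records by the precomputed key;
// rows[i].data keeps the reference for later DOM reinsertion.
var rows = [
  { data: "tr #1", key: "banana" },
  { data: "tr #2", key: "apple" }
];

rows.sort(function (a, b) {
  return a.key < b.key ? -1 : (a.key > b.key ? 1 : 0);
});
// rows[0] is now the record with key "apple"
```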
PointedEars