"The Good Enough Revolution" - As it applies to js

Tim Down

Crediting an obnoxious parroter for changes in jq could be a post-hoc
fallacy. In fact, some small problems have gone unfixed (change
string.match -> pattern.test, in many places, to improve perf). Who do
you blame for that?
Regardless, the few small changes in jq don't change the inherent design
of it much.
[snip]

The "event registry" (if it can be so called) still uses attachEvent or
addEventListener.

Have I missed something? Is there another way to register multiple
event listeners for a single event on a single object?

Tim

.... or is your point that jQuery should be using event delegation?

Tim
 
Thomas 'PointedEars' Lahn

Tim said:
Crediting an obnoxious parroter for changes in jq could be a post-hoc
fallacy. In fact, some small problems have gone unfixed (change
string.match -> pattern.test, in many places, to improve perf). Who do
you blame for that?
Regardless, the few small changes in jq don't change the inherent design
of it much. [snip]

The "event registry" (if it can be so called) still uses attachEvent or
addEventListener.
Have I missed something? Is there another way to register multiple
event listeners for a single event on a single object?

Yes and yes.
.... or is your point that jQuery should be using event delegation?

His point is probably that those two approaches are not equivalent:

<http://www.quirksmode.org/blog/archives/2005/08/addevent_consid.html>


PointedEars
 
beegee

I regularly reduce old code on the sites I work on by more than half
using JQ and I'm being conservative. It's probably been used to knock
out megabytes worth of crap-code on our sites in just the last couple
months. That's worth a one-time cache of 15K for the user. If you had
say... 3-5 static page sites that just needed some form validation in
mind, I could see your point.


I regularly reduce maintainable, easy to read code on the sites I work
on by more than half using JQ and I'm being fanatic. It's probably
been used to knock out megabytes of readable code on our sites in
just the last couple months. That's worth a one-time cache of 15k for
the user who will never know the difference. Here's some more words.


There, fixed it for ya.


Bob
 
Tim Down

Yes and yes.


His point is probably that those two approaches are not equivalent:

<http://www.quirksmode.org/blog/archives/2005/08/addevent_consid.html>

OK. All this I know. I haven't delved into the innards of jQuery's
event handling code, so I don't know if there's faulty logic to be
found. What I was trying to get at was whether there is something
wrong with the use of attachEvent and addEventListener in all cases. I
certainly don't think so.

Tim
 
Bart Lateur

Thomas said:
Given that an HTML tag has the form `<...>', what you describe is not going
to happen.

Um, $('a') works in jQuery; it returns an array of all links as
elements.

"a" is perfectly usable as an ID.
 
Eric Bednarz

Bart Lateur said:
Um, $('a') works in jQuery,

You are using the *generic identifier* ‘a’, not a *tag* (e.g. an HTML
start-tag of an A element might look like ‘<a>’, ‘<a/’, ‘<a’, ‘<>’, and
have an attribute specification, of course).

Of course the DOM has misnomers like tagName and getElementsByTagName(NS),
and incidentally, in Dutch ‘dom’ means stupid. :)
 
Garrett Smith

The links in the footer of that article have some incorrect advice,
particularly the advice to replace a method assignment in a loop with
assignment to an anonymous function.

Good because it explains things from a beginner's perspective, confusing
because it fails to explain context, and bad because it has a bug where
he loses the context when replacing the event handler and calls - old -
in the wrong context. Question what you read.

He shows in one of the linked "Documentation you should know by heart":
http://www.quirksmode.org/js/events_tradmod.html

| var old = (element.onclick) ? element.onclick : function () {};
| element.onclick = function () {old(); spyOnUser()};

The problem is that in - element.onclick - , function - old - is called
in global context, so when - old - executes, its - this - is the global
object where originally it would have been - element -.
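
For illustration only (this is a sketch, not from the article or from
jQuery), one way to avoid losing that context is to invoke the old
handler with Function.prototype.call:

var old = element.onclick || function () {};
element.onclick = function (e) {
  // Call the previous handler with - this - still referring to - element -.
  old.call(this, e);
  spyOnUser.call(this, e);
};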

| for (var i=0;i<x.length;i++) {
| x[i].onmouseover = over;
| x[i].onmouseout = out;
| }
|

Which is OK. It would be much more efficient to use bubbling and have
one handler for those, but what is worse is what he says next:-

| This code will work, no problem. But since the functions over() and
| out() are so simple, it is much more elegant to register them as
| anonymous functions:
|
| for (var i=0;i<x.length;i++) {
| x[i].onmouseover = function() {this.style.backgroundColor='#cc0000'}
| x[i].onmouseout = function () {this.style.backgroundColor='#ffffff'}
| }

Besides the missing semicolons, the code in the second example results in
the creation of (x.length * 2) functions, where the code in the first
example results in the creation of just 2 functions. So the example that
he says is "much more elegant" is less efficient (that should be fairly
obvious).
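
For comparison, the bubbling approach mentioned above needs only one
handler per event type, however many elements there are. A rough sketch
(the parent element and the tag check are illustrative, not from the
article):

var parent = document.getElementById('list'); // hypothetical common ancestor
parent.onmouseover = function (e) {
  e = e || window.event;
  var target = e.target || e.srcElement;
  if (target.tagName === 'LI') {
    target.style.backgroundColor = '#cc0000';
  }
};
parent.onmouseout = function (e) {
  e = e || window.event;
  var target = e.target || e.srcElement;
  if (target.tagName === 'LI') {
    target.style.backgroundColor = '#ffffff';
  }
};

Two functions in total, regardless of how many elements are in the list.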
Tim Down said:
OK. All this I know. I haven't delved into the innards of jQuery's
event handling code, so I don't know if there's faulty logic to be
found. What I was trying to get at was whether there is something
wrong with the use of attachEvent and addEventListener in all cases. I
certainly don't think so.

Please see lines 2433-2796 where - jQuery.event - is defined.

With attachEvent, the callback functions are called in random order,
immediately after the object's event handler is called. The - this -
value will be the global object, and - window.event - might not come from
the same window/frame that the event occurred in.

With addEventListener, callbacks fire in the order they were registered.

This is also a problem with Mootools:
https://mootools.lighthouseapp.com/...nt-ff-s-addeventlistener-and-event-fire-order

A program should not rely on callbacks firing in a particular order, but
using a registry that is known to be inconsistent, in contrast to one
that is not, seems unreasonable. DOM 0 event registration is much more
consistent:

// More consistent.
elem.onclick = callback;

An event registry that is designed and intended to register and
remove callbacks should not be concerning itself with the name of the
event, whether there is a potentially absent - srcElement - property, or
any of the plethora of oddities that may or may not occur in any given
situation or browser configuration.

An event registry that concerns itself with specific details of browser
bugs or peculiarities will ultimately miss some of those
peculiarities and, in the attempt, will add complexity and bloat.

Taken to the extreme, it becomes either A) a frightening mess that is not
extensible or B) magic (depending on your perspective). Adding expando
properties to the objects in the registry is risky. Errors caused by
setting pageY/X on an event in Gecko are well-known to this group's
regulars and such attempts also fail in Safari 2. It was recently
mentioned how the - delete - operator in MSIE results in an error when
used on a certain Host object, and so relying on the delete operator to
remove an expando property from a Host object is not a safe bet.

In contrast, an event registry that does not concern itself with such
issues is going to be a lot simpler. It will be more extensible and can
possibly even be extended via prototype inheritance so that any
user-defined "ADT" or "class" could itself be an Event Publisher, should
such a design be deemed desirable.

Where it is likely that multiple callbacks need to be registered, a
callback system that uses only properties for events (as DOM 0 does) can
work very well.

The strategy for using events-as-properties is to keep an array of callbacks:

elem[eventType] = callback;

- where - callback - calls each function in an array. The concept can be
abstracted so that - elem - and - eventType - are variable.
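
A rough sketch of that abstraction (the function name and the internal
details are illustrative, not taken from any particular library):

function addCallback(elem, eventType, callback) {
  var prop = "on" + eventType;
  var existing = elem[prop];
  if (!existing || !existing.callbacks) {
    // Install one DOM 0 handler that calls each registered callback in
    // order, with - this - referring to - elem -.
    var dispatcher = function (e) {
      var list = dispatcher.callbacks;
      for (var i = 0; i < list.length; i++) {
        list[i].call(this, e || window.event);
      }
    };
    // Keep any handler that was already assigned the DOM 0 way.
    dispatcher.callbacks = existing ? [existing] : [];
    elem[prop] = dispatcher;
  }
  elem[prop].callbacks.push(callback);
}

Callbacks then fire in the order they were added, in any browser, and a
matching removal function would only need to splice the array.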

When using attachEvent/addEventListener with that strategy, one must
keep the differences between them in mind. Libraries do not remove the
need for testing.

Garrett
 
Michael Wojcik

Matt said:
A typical barrier in these kinds of discussions, IMO, is that the
'experts' here don't seem to be able to imagine the world through
someone else's eyes and experience. Things look very different to
different people, and if they choose a different path it's not
necessarily because they are stupid or uninformed.

No doubt at least a few of the people who post here have some
experience with the fields of usability, user interaction studies,
human-computer interaction studies, etc.

In any case, I suspect it is not terribly useful to restate the basic
tenet of usability theory in a discussion like this. Your
interlocutors are either already familiar with the principle, or
disinclined to hear it. It would be difficult to participate in the
industry for long without encountering the idea in one form or another.
I am fascinated by "pop culture" and trying to understand the
"consensus" decisions that emerge from the swarm of people that all
act independently.

Surely *this* ground has by now been tiresomely mined over and over
again by casual observers who have never done a methodologically-sound
human-subjects study, so they can post their trite and redundant
insights to their blogs and suchlike. As an argument in this context I
can't see how it carries any weight - certainly not without a much
more substantial thesis and solid supporting evidence.

(Personally, I am fascinated that anyone would think that large groups
of people, in their participation in popular culture, act
independently. But perhaps I have an overly-cynical view of human agency.)
 
Michael Wojcik

Matt said:
Of COURSE some of the popular libraries were slow in adopting some
general good practices. Of COURSE they started out with some bad ideas
and questionable practices. Because their focus was on solving the
problem of how to write, distribute, debug, support, and maintain a
general-purpose library for the masses.

This argument might be more persuasive if good practices were
independent of writing, debugging, supporting, and so forth. They
are not, so this does nothing to excuse a failure to attend to them.

Experienced practitioners should know that good practices reduce
development and maintenance costs.
 
Jorge

Stefan Weiss said:



Nope, you're right. And FWIW, accessibilty would also be one of my first
concerns for a e-commerce website - people with certain disabilities
tend to do their shopping online whenever possible.


Lol. But yes, it's always good (and not necessarily that hard) to design
your web apps so they first work without javascript. I sometimes have to
work in text-mode only (oops, system update failed, no more
X-server...), and that's when you really start understanding what
accessibility is about.

The future handicapped because of the handicapped? Hmm, no no.

Could you tell me how your javascript-less theory would help in
making this site (below) accessible?

http://research.sun.com/projects/lively/
http://research.sun.com/projects/lively/index.xhtml
 
David Mark

I regularly reduce maintainable, easy to read code on the sites I work
on by more than half using JQ and I'm being fanatic.

And replace it with illegible code?
It's probably
been used to knock out megabytes of readable code on our sites in
just the last couple months. That's worth a one-time cache of 15k for
the user who will never know the difference.  Here's some more words.

For the last time (I hope). It is not 15K. It's silly to quote
compressed sizes as compression is not a constant. It sure isn't 15K
on the server or the user end, so why get hung up on the perceived
weight in transit? And it is hardly a one-time cache, even if it was
not patched and re-released constantly.
There, fixed it for ya.

Until it breaks (or is exposed as broken in environments you didn't
test) and the client must download a new jQuery (or sit on their hands
for years waiting for one that fixes the problem).

I wonder if you understand what you are doing. Do you use the attr
method? If so, you are in for a shock as your "easy to read" code is
a cipher that even the authors of jQuery can't unravel. That's one.
If you want more, there are plenty in the archive (and discussed all
over the place at this point).

You made a bad choice. Every day until you realize this fact will be
spent digging a deeper hole for your clients (and really fouling up
their Web properties). You can observe fouled-up Web sites
everywhere. This is how they got fouled up.

It's not because it is a general-purpose and publicly available
script. There were lots of those before jQuery. It's because it is
(and always has been) a very bad script. So bad that it must be
reevaluated and rewritten constantly. That's contrary to its stated
goal of making things easier for Web developers. Perhaps people just
skip testing and figure zero times anything is zero.

Not hard to see how this happened. Just imagine looking at browser
scripting through the eyes of developers who are several years
behind. I'd estimate jQuery has arrived at 2003 by now. Should catch
up long after it is irrelevant to do so (assuming technology moves on
from silly scripts that can't do anything properly to... anything at
all). Just because a lot of people have bought into it doesn't make
it a good idea. What you see is not the beginning of a movement, but
the (bitter) end. Hard to believe anyone would argue differently at
this point. The decade is almost over. ;)
 
RobG

Matt Kruse wrote: [...]
I am fascinated by "pop culture" and trying to understand the
"consensus" decisions that emerge from the swarm of people that all
act independently.

Surely *this* ground has by now been tiresomely mined over and over
again by casual observers who have never done a methodologically-sound
human-subjects study, so they can post their trite and redundant
insights to their blogs and suchlike. As an argument in this context I
can't see how it carries any weight - certainly not without a much
more substantial thesis and solid supporting evidence.

(Personally, I am fascinated that anyone would think that large groups
of people, in their participation in popular culture, act
independently. But perhaps I have an overly-cynical view of human agency.)

Cue the scene from The Life of Brian, where the masses chant in unison
"We're all individuals" and a lone voice cries "I'm not".

<URL:
>
 
Gregor Kofler

Jorge said:

[snip]
Could you tell me how your javascript-less theory would help in
making this site (below) accessible?

http://research.sun.com/projects/lively/
http://research.sun.com/projects/lively/index.xhtml

Well, it loads 619kB of scripts and then leaves a blank screen.
I could do that with a simple "<html></html>" (throw in a few more tags
to make it standards compliant). On some subsequent reloads it showed me
a 25% fraction of the "desktop". It put a 25% load on my cpu (i.e.
blocking a complete core), for just sitting there. And there is no
Safari for my OS...
What's this example gonna prove? With _the_ (not _a_) right browser, I
can see some glorious example of how to burn cpu performance for
virtually nothing. But it uses prototype.js - should that prove something?

Gregor
 
