Arved Sandstrom
Not usually, but of course, it's impossible to say with certainty.
If the user has been rigorous about keeping the older versions updated,
that definitely mitigates the risk. If the user has 1.2 or older, well,
they're SOL I guess, but it's their own fault. 1.3 and later get
security updates.
A malicious app would have to insist on using only an older version of
Java (doable with Java Web Start, AIUI - not sure if it's possible with
applets), hope the user has that version installed (which cannot be
guaranteed), and depend on the user not having kept it up to date. The
target population would comprise casual users - professionals would be
much more likely to have kept things up to date and purged obsolete
versions, and would be less likely to run exploit code in the first
place. But even casual users are more likely to have only current
versions of Java, and to be running the Java update service in the
background, keeping their Java patched. Plus there just aren't that
many casual users who fit that profile anyway.
Given how difficult it is to get Java to do anything harmful in the
first place - given the sandbox, and how few Java exploits there have
ever been - that would be an awful lot of malice invested in writing
code for very little chance of damage. It's so much easier for the
frakheads to write malware in languages other than Java.
The risk is negligible. It's a tempest in a teapot.
Has your research uncovered any such incidents?
I don't know whether it was previously possible to select the JRE
version for a Java applet, but it is now (I think as of Java SE 6
Update 10).
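For what it's worth, the Web Start side of this is just a line in the
JNLP descriptor. A rough sketch - the codebase, jar name, main class
and version string below are made up for illustration:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Hypothetical JNLP descriptor; codebase, jar and class are illustrative -->
    <jnlp spec="1.0+" codebase="http://example.com/app" href="app.jnlp">
      <information>
        <title>Example App</title>
        <vendor>Example Vendor</vendor>
      </information>
      <resources>
        <!-- Request a JRE: "1.4.2" asks for that version,
             "1.4+" would allow 1.4 or anything later -->
        <j2se version="1.4.2" href="http://java.sun.com/products/autodl/j2se"/>
        <jar href="app.jar" main="true"/>
      </resources>
      <application-desc main-class="com.example.Main"/>
    </jnlp>

As I understand it, the 6u10 plugin lets an applet point at a JNLP file
in much the same way, which is presumably how the version selection
works there too.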
AHS