"How arbitrary does a string need to be before the use of eval() is
required to execute it?"
If the string is known when you write the code (or if you build it
using code that is), then eval is *never* needed; all it does is
delay any syntax errors you might make until eval is called,
instead of reporting them when the program is loaded.
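A minimal illustration of that delay (a sketch; the names are made up
for the example):

```javascript
// A syntax error written directly in the page is reported as
// soon as the script is loaded:
//   var x = 1 +;        // SyntaxError at load time
// The same mistake inside a string passed to eval only shows
// up when eval is actually called:
function broken() {
  eval("var x = 1 +;"); // SyntaxError thrown here, at call time
}
var caught = null;
try {
  broken();
} catch (e) {
  caught = e; // the load-time error became a runtime exception
}
// caught now holds a SyntaxError
```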
So, I'm assuming that you have a string provided only at runtime.
Given the following code, I'm able to evaluate/execute most expressions
like: "a.b.c.d()"
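(For what it's worth, a string of that shape doesn't need eval either.
This is not the code referred to above; it's a sketch of one eval-free
way to handle a plain dot-separated path ending in a method call:)

```javascript
// Resolve a path like "a.b.c.d" against a root object and call
// the final property as a method. Only handles dot-separated
// identifiers, not arbitrary expressions.
function callPath(root, path) {
  var parts = path.split(".");
  var obj = root;
  // Walk down to the object that owns the final method.
  for (var i = 0; i < parts.length - 1; i++) {
    obj = obj[parts[i]];
  }
  // Call the last property on its owner, so `this` is bound
  // to the right object.
  return obj[parts[parts.length - 1]]();
}

var root = { a: { b: { c: { d: function () { return 42; } } } } };
var result = callPath(root, "a.b.c.d"); // 42
```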
However, that string looks like it is written specially for the
current page, since it knows about the runtime environment it is being
executed in. That, most likely, means that it was the author of the
page (or someone related) who wrote the string. In that case you can
trust the format and using "eval" on it isn't much different from
evaluating it as a <script src="thecode.js" ...> element.
If the string is *not* supplied by the same authority as the page,
then it is less likely to be based on the structure of the data
already available in the runtime environment, and more likely to be
expected in a restricted format.
If that format happens to be a subset of Javascript syntax and
has the same semantics as it does in Javascript, then you might
use eval. However, I would first check that the string does
have the expected format.
E.g., if the user must provide a fraction like "2/7", then I would
*check* the format first, most likely using a regular expression like
(/^\d+\/[1-9]\d*$/).
I might use eval after that, because I know what is going to happen and
that the eval will not do something unexpected, but it would be about as
simple to just capture the numbers with the regexp and do the evaluation
manually:
var match = /^(\d+)\/([1-9]\d*)$/.exec(string);
var frac = match && Number(match[1]) / Number(match[2]);
(Note the null check: exec returns null when the string doesn't
match, and the explicit Number conversion is clearer than relying
on "/" to coerce the captured strings.)
(This particular example actually showed eval being slightly faster,
but that's not something I'd worry about unless you are doing a *lot*
of conversions).
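For reference, the eval route after the format check might look like
this (a sketch; it is safe only because the regexp guarantees the
string is nothing but a simple division):

```javascript
var s = "2/7";
var frac = null;
// The pattern admits only "digits/nonzero-digits", so eval can
// only ever compute a division of two number literals here.
if (/^\d+\/[1-9]\d*$/.test(s)) {
  frac = eval(s);
}
// frac is now 2/7
```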
So, eval is never needed for strings provided when the script is
written, nor for strings generated by the program itself.
For trusted strings provided at runtime, eval can be reasonable.
For untrusted strings provided at runtime, you should first check
the format. The format is probably so simple that you don't need
eval if you can check it.
The danger of eval is that it hides errors by being more accepting than
what is really needed for a given job, turns syntax errors into
runtime errors, and increases complexity by adding an extra level to
the programming (code *and* values that become code, at the same
time). Complexity makes maintenance harder.
The average web page scripter is not trained in computer science, and
extra complexity is an error waiting to happen. He should not use eval
at all.
The competent developer should consider whether eval really is the
simplest, and most maintainable, way of doing what is needed. It might
be, but if you need to ask in this group, we'll assume you are not
able to judge it yourself, and then you shouldn't use it.
/L