Arne Vajhøj said:
I don't think SUN sees it as a problem.
And many Java developers neither.
We don't need this:
http://www.mono-project.com/MoMA
JNI is kind of lame when dealing with a largish mixed-language codebase (a big
mix of parts of the app written mostly in C and C++, some parts in Java, and
other parts in a custom ECMAScript/JavaScript/ActionScript-style
language...).
in my case, I have more recently gotten my own Java implementation mostly
into basic working order, but I am still building with Eclipse's compiler,
and due to minor technical issues am still left using JNI for this stuff (I
am also sort of using my own half-assed, kludged-together class library,
and this is the main place I am coming face-to-face with JNI).
it is usable, at least, but far from painless either...
Of course SUN could have done it, but it would not have been cheap.
this is back to the whole "vision" thing.
Java was seen as a framework for creating and distributing
architecture-neutral apps...
to this extent, it works well enough...
however:
totally left out of the mix are people who are mostly building C and C++
apps, but want something more capable and less terrible than the likes of
Python or similar.
I suspect Sun didn't really want people writing mixed apps, or using Java as
a scripting language, since this would hinder its use in the above category
(as an app-distribution platform).
JNI seems to be designed in such a way as to make this particular usage
pattern somewhat awkward (one has to use JNIEnv pointers, which are not
guaranteed to be valid if saved between calls, ...), and AFAIK there is no
external C-side API, ...
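FWIW, the usual workaround is something like the rough sketch below (not
anything from my codebase; get_env and call_into_java are just made-up
names): cache the JavaVM* pointer, which does stay valid, and re-fetch a
JNIEnv* per thread on every call instead of saving it.

#include <jni.h>

static JavaVM *saved_vm;   /* the JavaVM* is safe to keep around */

/* called by the VM when the native library is loaded */
JNIEXPORT jint JNICALL JNI_OnLoad(JavaVM *vm, void *reserved)
{
    (void)reserved;
    saved_vm = vm;
    return JNI_VERSION_1_6;
}

/* get a JNIEnv* valid for the current thread and the current call */
static JNIEnv *get_env(void)
{
    JNIEnv *env = NULL;
    if ((*saved_vm)->GetEnv(saved_vm, (void **)&env, JNI_VERSION_1_6) == JNI_EDETACHED)
        (*saved_vm)->AttachCurrentThread(saved_vm, (void **)&env, NULL);
    return env;
}

/* a C-side call into Java, fetching the env fresh rather than caching it */
static void call_into_java(void)
{
    JNIEnv *env = get_env();
    jclass cls = (*env)->FindClass(env, "java/lang/System");
    (void)cls;  /* ...look up a method ID and call it here... */
}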
nearly everything seems designed with the idea in mind of it being a
self-contained platform.
but... not everyone wants or needs such a "platform".
but, for scripting, I would much rather be using Java than Python at
least...
Python probably has done as well as it has because it sprang up within a
niche where relatively little strong competition existed at the time (apart
from Scheme). it wasn't until recently that JavaScript really became a
viable general-purpose option (for most of its life, JS was largely confined
to browsers, apart from those of us maintaining our own implementations,
...).
now, of course, this niche (in general) has a bit more competition.
back in the late '90s, I had been using Scheme, but due to frustrations with
the VM (Guile) went and wrote my own. by 2002/2003, this had become overly
complex, so I dropped the prior VM and language, and mostly started clean
with a JavaScript variant (2004).
in 2007, I started on a trek involving writing a C compiler for use as a
scripting engine.
in all of this, I developed quite a bit of dislike for nearly all forms of
writing boilerplate for the sake of plugging native code into VMs, and even
my own JS-like language was standing on perilous ground for a fairly long
time until I devised better ways to plug it into C land (whereas there is
SWIG for Python and similar...).
the main competition (for my JS variant): a dynamically-compiled C/C99
variant (my own implementation, mostly standards-conformant). however, it
had its own drawbacks (mostly poor reliability and slow compiler
performance), and once the interface between my JS variant and C improved
(early 2010), I switched most of my scripting efforts back to this JS
variant (a lot of the machinery used in my C-scripting and early JBC efforts
was reused in implementing this interface machinery).
finishing up my Java support (via running Java Bytecode) was more recent (I
started around 2009 but worked mostly on other stuff, and had been trying
for a compiler similar to my dynamic C compiler).
eventually, I gave in and finished up JBC support, mostly as my
Java-compiler effort wasn't going very fast, and also because I felt some
need for an external bytecode (both my JS variant and C compiler have their
own internal, but very different, bytecode and JIT-style systems, and in
both cases the internals are very tangled up with both the bytecode and
compiler machinery, preventing them from being used as external "canonical"
formats...).
basically, an externalized bytecode could offer an alternative to my current
heavy use of x86 and x86-64 assembler (the worry is that all of this ASM
poses a portability threat...).
and, JBC also offered a few possible capabilities that my other bytecodes
lack.
one was observed while getting the support working: code can be demand-loaded.
most of my prior stuff tended to have to be force-fed its input ("go load in
this big pile of crap which may or may not be useful"), with the JS variant
always recompiling everything, and the C compiler using trickery (dependency
checking, hash codes, and object caching) to skip recompiling modules it had
previously compiled (when compiling, it would save out the code, and reload
the previously compiled code if the module was still up-to-date). the above
trick (hashes+deps) was used because the C compiler was SLOW, and otherwise
killed app startup times...
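roughly, the hashes+deps trick amounts to something like this (a simplified
sketch, not the actual compiler's code; hash_file, cache_lookup, etc. are
made-up names and the cache is stubbed out, but the up-to-date check is the
gist of it):

#include <stdio.h>
#include <stdint.h>

/* FNV-1a hash folded over the bytes of a file (stands in for whatever
   hash/checksum the real compiler uses) */
static uint64_t hash_file(uint64_t h, const char *path)
{
    FILE *f = fopen(path, "rb");
    unsigned char buf[4096];
    size_t n;
    if (h == 0) h = 14695981039346656037ULL;
    if (!f) return h;
    while ((n = fread(buf, 1, sizeof(buf), f)) > 0)
        for (size_t i = 0; i < n; i++)
            h = (h ^ buf[i]) * 1099511628211ULL;
    fclose(f);
    return h;
}

/* stand-in cache operations (a real cache would hit disk) */
static int  cache_lookup(const char *module, uint64_t hash)
    { (void)module; (void)hash; return 0; }
static void cache_load(const char *module)
    { printf("reloading cached object for %s\n", module); }
static void compile_and_cache(const char *module, uint64_t hash)
    { printf("recompiling %s (hash %llx)\n", module, (unsigned long long)hash); }

/* skip recompiling a module if the hash of its source+deps matches the cache */
static void load_module(const char *module, const char **deps, int ndeps)
{
    uint64_t h = hash_file(0, module);
    for (int i = 0; i < ndeps; i++)      /* fold dependencies into the hash */
        h = hash_file(h, deps[i]);

    if (cache_lookup(module, h))
        cache_load(module);              /* still up-to-date: reuse saved object */
    else
        compile_and_cache(module, h);    /* changed: slow path, recompile + save */
}

int main(void)
{
    const char *deps[] = { "foo.h" };
    load_module("foo.c", deps, 1);
    return 0;
}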
demand-loading is a somewhat different possibility...
this makes JARs and class files more sensible (originally, I had just
assumed bulk-loading all the classes and forcing them through, which would
still be problematic), as the volume of code, the raw speed of the loader
and JIT backend, ... become much less important with demand-loading (since
app startup time is not killed by trying to load in and JIT the entire class
library...).
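something like the following toy sketch is roughly what demand-loading buys
(not my actual loader; VClass, read_classfile, jit_methods, etc. are made-up
names): classes are registered by name, but the classfile is only read and
JIT'ed the first time something actually references it.

#include <stdlib.h>
#include <string.h>

typedef struct VClass VClass;
struct VClass {
    char   *name;
    int     loaded;     /* 0 until the classfile has actually been read */
    void   *methods;    /* JIT'ed code, filled in lazily                */
    VClass *next;
};

static VClass *class_table;

/* stubs standing in for the classpath/JAR reader and the JIT backend */
static void *read_classfile(const char *name) { (void)name; return NULL; }
static void *jit_methods(void *classfile)     { (void)classfile; return NULL; }

/* register a class by name only; no loading or JIT'ing happens yet */
void register_class(const char *name)
{
    VClass *c = calloc(1, sizeof(*c));
    c->name = strdup(name);
    c->next = class_table;
    class_table = c;
}

/* resolve a class on first reference, doing the expensive work on demand */
VClass *get_class(const char *name)
{
    for (VClass *c = class_table; c; c = c->next) {
        if (strcmp(c->name, name) != 0)
            continue;
        if (!c->loaded) {
            void *cf = read_classfile(name);   /* only now pay the load cost */
            c->methods = jit_methods(cf);
            c->loaded = 1;
        }
        return c;
    }
    return NULL;   /* unknown class: NoClassDefFoundError territory */
}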
or such...