And yet you go on and on and on about how much larger than 4 bytes they
are, yourself.
No, I go on and on about how NOT very much larger than 4 bytes they are.
According to you, a lot longer.
I meant the actual character data.
In most of the programs I've seen they are. (Where "good" means
something large enough to notice.) String literals alone abound in all
the real-world Java code that I've seen. Dynamic string variables exist,
too, of course, and I'm not claiming that a majority are interned.
The stuff I see tends to have a lot of non-literal strings, acquired
from disk files, the database, or the network. And a lot of them are
short strings, like short table names and little fragments of XML and SQL.
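Just so we're talking about the same distinction, here's a throwaway
sketch of literal versus dynamically built strings (the class name and
the SQL fragment are invented for illustration; literals are pooled per
the JLS, dynamic strings are not unless you intern them):

public class InternDemo {
    public static void main(String[] args) {
        String literal1 = "SELECT *";   // compile-time constant: interned, shared
        String literal2 = "SELECT *";   // refers to the same pooled instance
        String dynamic =
                new StringBuilder("SELECT").append(" *").toString(); // fresh heap object

        System.out.println(literal1 == literal2);        // true: one shared object
        System.out.println(literal1 == dynamic);          // false: separate allocation
        System.out.println(literal1 == dynamic.intern()); // true: intern() yields the pooled copy
    }
}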
But the overhead of the monitor is still only 4 bytes, less than 100% of
the object size.
You keep harping on that "less than 100%" thing as if that were the part
in dispute. But I'd say that anything more than about 5% is certainly
still a significant overhead. How would you like it if they raised sales
taxes by 5% wherever it is that you live?
Ergo the claim that the monitor doubles the allocation size is bogus.
I never made or agreed with such a claim, so this is another straw man.
So that 4-byte overhead for a monitor is looking like less and less of a
problem by comparison.
In much the way a 5% tax hike is less of a problem than a 100% tax hike.
Aren't you proving my point that objects are much larger than 4 bytes?
No.
You're providing evidence for my point. Thanks.
Bull puckey.
Most of which are shared,
Where's your numbers? Where's your data? What's good for the goose is
good for the gander...
and best practice militates against autoboxing, so that scenario you
describe represents bad programming.
Who said it didn't? But it also represents common programming. The best
programmers don't grind away at reams and reams of Java for BigSoftInc;
the best programmers are hacking Lisp at Google's blue-sky division or
working on AI at MIT's Media Lab or shit like that. The whole Java
language is designed around the expectation that the stuff will be
written by masses of corporate monkeys with
adequate-to-somewhat-noteworthy programming skill and maintained by the
guys that got the C-s and D+s in college, but still need gainful
employment somewhere.
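And to be concrete about what "shared" and autoboxing mean here, a small
sketch, assuming the default Integer cache range of -128..127 (the class
and variable names are invented):

public class BoxingDemo {
    public static void main(String[] args) {
        Integer a = 100, b = 100;   // autoboxed via Integer.valueOf: small values share cached objects
        Integer c = 1000, d = 1000; // outside the default cache: typically two distinct allocations

        System.out.println(a == b); // true: same cached object, no extra header per use
        System.out.println(c == d); // false on stock JVMs: each box has its own header

        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += i;               // primitive arithmetic: no boxing, no per-iteration objects
        }
        System.out.println(sum);
    }
}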
You're bringing in the straw man.
Bull puckey.
The OP claimed that monitors doubled the memory need for objects.
What the OP claimed is not a point against me, because I cannot be held
responsible for something someone else said. So that's irrelevant, i.e.
it's a straw man, in this branch of the thread. I only argued that it's
a significant percentage increase in the memory need, and I only did so
when you made the blatantly false claim that the non-header size of
objects tends to be much larger.
This is the point I addressed,
Obviously not, since it is not a point I ever made, and you are
following up to one of my posts to argue with me.
You have, in fact, provided substantial evidence for my claim that the
monitor presents far less than 100% overhead.
A claim I didn't dispute. My claim was only that objects aren't
typically as large as you claimed they were, and that the overhead is
still significant, even if nowhere near as large as the OP claimed.
How is directly addressing the main point remotely classifiable as a
straw-man argument?
Define "the main point"? I'd define it as "whatever my opponent asserted
in the immediately preceding post", but obviously you're not using that
definition. Instead you seem to be misattributing to me the more extreme
position of the thread's OP, and then misguidedly attacking me for that.
It's called "enregistration", and it's one of the optimizations
available to HotSpot, as is instantiating an object on the stack instead
of the heap.
More details, please, and references. Or, put more succinctly: [citation
needed].
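To be clear about what I'd expect such an optimization to apply to,
here's a hypothetical sketch: code where the temporaries never escape
the method. Whether HotSpot actually eliminates these allocations is
precisely the claim I'd like a reference for.

public class EscapeDemo {
    private static final class Point {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    // The two Points are never stored in a field, never returned, and
    // never passed to code the JIT can't see, so in principle a JIT
    // could prove they don't escape and avoid the heap allocations.
    static int manhattan(int x1, int y1, int x2, int y2) {
        Point a = new Point(x1, y1);
        Point b = new Point(x2, y2);
        return Math.abs(a.x - b.x) + Math.abs(a.y - b.y);
    }

    public static void main(String[] args) {
        System.out.println(manhattan(1, 2, 4, 6)); // prints 7
    }
}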
Yet you don't show the numbers.
Neither do you.
What other conclusion can I draw?
There are lots of other explanations; jumping instantly to the least
charitable one, namely that your opponent is being outright dishonest,
says something about your character that I find disturbing.
Tell verifiable truth if you don't want to be called to account for
fantasy facts.
Tell that to the man in the mirror.
Don't get mad at me for pointing out your failure.
???
I have no failures, so the above sentence is merely a philosophical
thought-experiment, at least for now.
yadda yadda yadda yadda yadda yadda
So,
1. You claimed my reason for not giving numbers earlier had to be
dishonesty, but here you suggest another reason, which is that a) it
would be effort and b) you'd just invent some long-winded excuse for
ignoring them and sticking to your theory anyway, so it would be
*wasted* effort.
2. You went ahead and accused me (again!) of not having numbers and of
being dishonest in a post that is subsequent to, and indeed in reply
to, a post where I *did* include some numbers -- so *you* were
dishonest.
3. This means that going to the effort of digging up some numbers and
posting them just to *shut you up* in your harping about my lack of
numbers was also wasted effort!
4. Which of course makes it even less likely that others will bother in
the future when you demand hard data, having seen you react like this
once already.
Michal Kleczek had written:
"It is (almost) twice as much memory as it could be and twice as much GC
cycles."
Michal Kleczek does not speak for me, so it does not matter what he had
written.
I said that was "nonsense", to which you replied "Bullpuckey"
No, you specifically claimed "most objects are much larger than 4 bytes"
to which I replied "bullpuckey".
then proceeded to demonstrate that I was correct.
Horsefeathers.
And how is that a straw-man argument on my part, again?
Because you are misattributing Kleczek's position to me, when my
position is actually less extreme.
Given that I directly addressed that claim and you yourself provided
evidence for my point? Hm?
I may have provided evidence for your claim that object overhead is less
than 100% for a typical object, but not for your claim that most objects
are "much larger than 4 bytes". A very large number of them are not. In
fact, almost all non-array, non-AWT, non-Swing objects tend not to be
much larger than 4 bytes (not counting the object header), and most
arrays are wrapped in a containing class instance (ArrayList, HashMap,
String), so they take a triple whammy: two object headers, a pointer
from the containing object to the array, and the array length field,
which comes to 24 bytes of overhead rather than just 8 on a typical
32-bit system. Such an array needs to get fairly large -- 30 normal
references, 60 characters, 120 bytes -- before the monitor's share of
that overhead gets below 5% of the thing's total memory consumption.
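To keep the arithmetic in one place, here's a throwaway sketch of that
estimate. The layout constants are the assumptions stated above for a
typical 32-bit JVM (8-byte headers, 4 bytes of which is the monitor
slot, 4-byte references and array length), not measurements of any
particular VM:

public class OverheadEstimate {
    static final int HEADER  = 8; // per object: 4-byte monitor slot + 4-byte class pointer (assumed)
    static final int MONITOR = 4; // the disputed part of each header
    static final int REF     = 4; // pointer from the wrapper object to its array
    static final int LENGTH  = 4; // array length field

    public static void main(String[] args) {
        // wrapper object (e.g. a String) plus its backing array
        int fixed = HEADER + REF + HEADER + LENGTH; // 24 bytes before any payload
        for (int payload : new int[] {8, 40, 120, 136, 240}) {
            int total = fixed + payload;
            double monitorShare = 100.0 * (2 * MONITOR) / total; // two monitor slots
            System.out.printf("payload %3d B -> total %3d B, monitor share %.1f%%%n",
                              payload, total, monitorShare);
        }
        // At 120 payload bytes (a 60-char string) the two monitor slots are
        // still about 5.6% of the total; they reach 5% at around 136 bytes.
    }
}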
I'm not defending the decision to make every object a monitor,
Really? It sure as hell looks like you are, given that you argue
vehemently against, and border on flaming, anyone who suggests that it
may have been a mistake. (I consider repeated insinuations that your
opponents in a debate may be intentionally lying to border on flaming.)
other than to point out that it contributed mightily to Java's utility
and popularity.
[citation needed]
But I am refuting the claim that the
monitor adds 100% to the object's memory footprint.
If that were *all* you were doing I'd take no issue with it. But you
also claimed:
1. that "most objects are much larger than 4 bytes"; and
2. that you think I might be being intentionally dishonest;
and both of those speculations are pure and utter hogwash.
Meanwhile no one is showing me the numbers
Utter hooey. That might have been true a couple of days ago but it
hasn't been since 2:52 yesterday afternoon.
The addition of monitors to Java has a benefit.
The addition of monitors to Java is not at issue here. No-one has
claimed it should have shipped with no locking mechanisms at all.
I will assume for the rest of that paragraph that you really meant "the
making of every object a monitor".
Is it worth the cost? That depends on the actual cost, and the actual
benefit, quantification of which is not in evidence in this
discussion.
That's a comparison of apples and oranges: design-time benefits on the
one hand and run-time costs on the other.
Of course, the design-time benefits are reaped, for a given piece of
code, only once, while any run-time costs are incurred every time that
code is run.
So the design-time benefits have to be huge, really, to outweigh
run-time costs for any piece of code that will be run very frequently
and will still be in use for decades.
This clearly is a criterion that includes a lot of Java's own older
library code, which has been in use since the 90s and some of which is
very frequently executed by JVMs all over the world.
A point that neither you nor anyone else has yet addressed, choosing
instead to divert with side issues and straw men.
Horse manure.
That'd be you bringing in the straw man, not me, dude.
Balderdash.
Show me the numbers.
Been there, done that, got the T-shirt.