No more primitive data types in Java (JDK 10+). What do you think?


Silvio Bierman

According to the presentation

"To Java SE 8, and Beyond!" by Simon Ritter, Technology Evangelist, Oracle

(Google the string "To Java SE 8 and Beyond!" and click on
the PDF file, about the 5th link down the page)

On page 42, it says:

"Unified type system (JDK 10+)
No more primitives, make everything objects"

I've seen very little discussion on this very important
subject.

What do the experts here think of the idea?

For me, and I am no expert, I think it will be good to have
a consistent type model (everything is an object), but I am
worried that performance will take a hit (computational finite
element methods, large meshes, etc.), unless PCs and computers
become 1000 times faster by the time JDK 10+ arrives a few years
from now, which might be possible.

Does anyone know more about this?
Is there any truth to it? Do you think it will really happen?

--Nasser

Scala already works this way. There is a common supertype Any with
subclasses AnyRef (akin to Object) and AnyVal. The "primitives" reside
under AnyVal.
 

glen herrmannsfeldt

Robert Klemme said:
On 04/20/2012 02:31 AM, Lew wrote:
(snip)
As an additional data point: Ruby MRI works like that. Basically
integers (instances of class Fixnum) look like ordinary objects but
under the hood the value is encoded in the reference and there is no
object on the heap. You get a nice consistent model for the language
user but avoid the overhead of GC.

I have used arrays dimensioned [1] in Java where I needed a
primitive type as an object. I believed at the time that it was
faster than the other ways to do it.
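A minimal sketch of the trick, for anyone who hasn't seen it (class and
variable names are only illustrative):

public class ArrayHolderDemo {
    public static void main(String[] args) {
        // A one-element array serves as a mutable holder for a primitive,
        // so inner classes or callees can update the value in this scope.
        final int[] counter = new int[1];

        Runnable bump = new Runnable() {
            @Override
            public void run() {
                counter[0]++;   // mutate the int through the array reference
            }
        };

        bump.run();
        System.out.println(counter[0]);   // prints 1
    }
}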

-- glen
 

Gene Wirchenko

Yeah, but you have to take into account the kind of people who insisted
that the new millennium started on Jan 1, 2000. :) The concept of "teens"
may be more, um...flexible to some people than to others.

An amusing thing just occurred to me.

We are the sort of people that insist on getting things right,
things like the new millennium and new century starting 2001-01-01
(and "millennium" being spelled with two N's, but that is another
battle). To us, neither time period started until a year later than
the 2000ers think.

And yet, and yet, we often start counting at zero which is one
earlier than most do!
(For the record, I'm with you, but I hardly ever try to explain this sort
of mistake to people who make them any more :) )

It is worth it to weed out people w.r.t. use of logic. Explain
it once. If the person gets it, fine. If not and especially if the
person argues with you on it, *WEED OUT*.

Sincerely,

Gene Wirchenko
 

Lew

Yeah, but you have to take into account the kind of people who insisted
that the new millennium started on Jan 1, 2000. :) The concept of "teens"
may be more, um...flexible to some people than to others.

But it did, by popular acclaim. There is no "real" millennium other than the
day after whenever it was hardest to get New Year's Eve hotel reservations at
Times Square. Prince didn't write "party like it's 2000" or entitle his album
"2000". There weren't mass panics at the end of 1000, but at the end of 999.

I thumb my nose at those who pedantically insist that the millennium must
begin in 2001 because there was no "year zero" and remind them that there was
no "year one" either until about what, three or four centuries later, and no
agreement on that for millennia after.

Nor did those early years begin on Jan. 1. So really the pedants should claim
April 2, 2001, as Millennium Day, accounting for the Gregorian calendar shift.

And if thirteen years starts the "teens", then they start April 2, 2014.

Half-assed pedants; don't even follow through.

I'm'a go where the party at while y'all argue over when the millennium begins.
(For the record, I'm with you, but I hardly ever try to explain this sort
of mistake to people who make them any more :) )

I enjoy it.
 

Lew

BGB said:
I think for many, "teens" starts at 10 (rather than 13), so 2010-2019 would be
the "teens" of the new millennium.

If many thought the world were flat, would that make them right?

No.

"Ten". "Eleven". "Twelve". "Thir_*teen*_". "Four_*teen*_". "Fif_*teen*_". ...
 

glen herrmannsfeldt

Gene Wirchenko said:
On Thu, 19 Apr 2012 23:15:35 -0700, Peter Duniho wrote:
(snip)
(snip)
We are the sort of people that insist on getting things right,
things like the new millennium and new century starting 2001-01-01
(and "millennium" being spelled with two N's, but that is another
battle). To us, neither time period started until a year later than
the 2000ers think.
And yet, and yet, we often start counting at zero which is one
earlier than most do!

While we are in the third millennium and the 21st century, I never
hear anyone say that we are in the 202nd decade. I don't ever
remember anyone saying we were in the 199th, 200th, or 201st
decade, either.

It seems to me that decades don't count the same way as centuries.

In addition, with a few year uncertainty in the actual date
that christ was born, worrying about the difference in millennia
seems a little strange.

-- glen
 

Robert Klemme

I have used arrays dimensioned [1] in Java where I needed a
primitive type as an object.

That was often the approach if one wanted to modify a value in the
caller's or some other scope. I'd rather have a mutable integer object.
I believed at the time that it was
faster than the other ways to do it.

And, was it?
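For comparison, a minimal sketch of the mutable-integer-object alternative
(java.util.concurrent.atomic.AtomicInteger already serves this purpose; the
class name below is only illustrative):

import java.util.concurrent.atomic.AtomicInteger;

public class MutableHolderDemo {
    // The callee updates the caller's value through a mutable holder object
    // rather than through a one-element array.
    static void increment(AtomicInteger holder) {
        holder.incrementAndGet();
    }

    public static void main(String[] args) {
        AtomicInteger value = new AtomicInteger(41);
        increment(value);
        System.out.println(value.get());   // prints 42
    }
}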

Cheers

robert
 

Bernd Nawothnig

This is the way Eiffel works,

The same for Python.
but under the covers there are still primitives. Perhaps what they
have in mind for Java, more intelligent boxing. At least at the low
levels of the JVM you need primitives.

These implementation details are better hidden and invisible in
most cases. Let the compiler automatically detect and generate
possible optimisations.

A programming language should be as simple and orthogonal as possible.




Bernd
 

Lew

Peter said:
Peter said:
Lew wrote:

Arved Sandstrom wrote:
This is the teens of the 21st century after all.

Quibble: Not until next year.

Yeah, but you have to take into account the kind of people who insisted
that the new millennium started on Jan 1, 2000. :) The concept of "teens"
may be more, um...flexible to some people than to others.

But it did, by popular acclaim. There is no "real" millennium other than the
day after whenever it was hardest to get New Year's Eve hotel reservations at
Times Square. [...]

You would be correct, except you're not. If I thought the people making
the mistake I'm talking about actually understood the point you're making,
and were just arbitrarily reassigning the term "millennium", you'd have a
point.

But they don't. They are specifically looking at the count of years and
falsely imagine that on Jan 1, 2000, two sets of 1000-year intervals have
passed.

Two sets of 1000-year intervals *have* passed.

Since the year zero. Defined as 1000 years prior to when people first reacted to "the millennium".

You go on and miss the party, Peter. I'll have fun there without you.
 

Lew

glen said:
While we are in the third millennium and the 21st century, I never
hear anyone say that we are in the 202nd decade. I don't ever
remember anyone saying we were in the 199th, 200th, or 201st
decade, either.

I had an argument in 1980 with someone who claimed that 1980 belonged to the 70s, not the 80s. I was responding to an argument that ten belonged in the teens.

These are *linguistic* terms, not scientific ones. They mean what society has them mean, and by the party metric, society deemed 2000 as the beginning of the millennium. I agree.

It's all by convention. I follow that convention, and I am far from alone.
It seems to me that decades don't count the same way as centuries.

In addition, with a few year uncertainty in the actual date
that christ [sic] was born, worrying about the difference in millennia
seems a little strange.

And unnecessary. The "where's the party?" rule completely resolves the problem.

If there's one thing pretty well established, it's that Jesus couldn't have been born on January 1, which was New Year's Day in the pre-Christian era anyway, nor even on December 25. So Christ's birthday has absolutely nothing to do with the discussion.

Zero, like the year that began the millennium count (a.k.a. 1 BC).
 

Lew

rossum said:
abstract class Peano { }

class 0 extends Peano { }

class 1 extends 0 { }

class 2 extends 1 { }

...

And that's relevant because ... ?

Do you think they'll suddenly allow leading digits in class identifiers for Java code? I think not.

It's all a tempest in a teapot anyway.
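For what it's worth, the Peano-style encoding itself is legal today if the
digits are spelled out; only the leading-digit identifiers are the problem.
A sketch with invented class names:

// Each "number" is a type extending its predecessor; purely illustrative.
abstract class Peano { }

class Zero extends Peano { }

class One extends Zero { }

class Two extends One { }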
 

Gene Wirchenko

On Fri, 20 Apr 2012 20:53:46 +0200, Bernd Nawothnig wrote:

[snip]
These implementation details are better hidden and invisible in
most cases. Let the compiler automatically detect and generate
possible optimisations.

If you complicate things, the compiler then has to work to
decomplicate (optimise). Why not just keep it simple?
A programming language should be as simple and orthogonal as possible.

One application of keeping it simple would be to use primitives
where possible -- since they are simpler than objects -- and only use
objects where they are needed.

Sincerely,

Gene Wirchenko
 

glen herrmannsfeldt

Lew said:
(snip)
And that's relevant because ... ?
Do you think they'll suddenly allow leading digits in class
identifiers for Java code? I think not.

As I remember, all Unicode letters are allowed. There are plenty
that could be confusing to readers. Maybe there aren't any that
look like Roman digits, though. There are many that look like,
but aren't the same character as, Roman alphabet letters.
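A small illustration of how confusing that can get (the second class name
below starts with the Cyrillic capital A, U+0410, not the Latin A, U+0041;
both declarations are legal and distinct):

class Apple { }   // Latin 'A' (U+0041)
class Аpple { }   // Cyrillic 'А' (U+0410): looks the same, different identifier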

-- glen
 

Arne Vajhøj

Scala already works this way. There is a common supertype Any with
subclasses AnyRef (akin to Object) and AnyVal. The "primitives" reside
under AnyVal.

Which is also the C#/.NET way.

Arne
 

Arne Vajhøj

It's impossible.

Since several languages have already implemented this feature, it is
obviously not impossible.
Whatever they mean when they say "remove primitives"
cannot possibly be what those words actually mean. Just think, how would
it be possible to state a = a + 1 without the number 1? Ok, so you can
use .add(Integer x). But how precisely do you call it? .add(1)? There's
still a 1. And what's worse is if numbers act like objects, which
introduces its own dangerous problem. Is the number 5 really 5, or is
it something else? Treating primitives like objects, without them
actually being objects, is unnecessary and confusing.

5.length() or 5.size()?

I am not sure about those, but toString() should obviously
be there.
Well if 5 is an object I should be able to
over-ride it.

Class 6 Extends 14 {}

????

6 and 14 are instances of int, not types.

And not all types are extendable.
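Today's autoboxing already gives a taste of that; a minimal sketch (nothing
here depends on the proposed change):

public class BoxedLiteralDemo {
    public static void main(String[] args) {
        Integer five = 5;                    // the literal is boxed via Integer.valueOf(5)
        System.out.println(five.toString()); // toString() is already there: prints "5"

        Integer six = five + 1;              // unbox, add the literal 1, re-box
        System.out.println(six);             // prints 6
    }
}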

Arne
 

Arne Vajhøj

However they do things there will be problems and concerns. What you
talk about is not likely to be one of them. In a system where all things
are objects, numeric literals are objects: they are syntactic sugar.

a = 2;

really means

a = new Integer(2);

and

a = a + 1;

means that a is some Number and you're adding Integer(1) to it. Who
cares that underneath the hood the compiler translates that to (int)2 +
(int)1?

Just because you've got literals doesn't mean that you need primitives.

I would expect a split into ref and val types, with both deriving from
Object.
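As an aside, "a = 2" for a boxed type in today's Java already compiles to
Integer.valueOf(2) rather than new Integer(2), which is observable through
the small-value cache (guaranteed for -128..127 by the Integer.valueOf
documentation):

public class LiteralSugarDemo {
    public static void main(String[] args) {
        Integer a = 2;                    // autoboxing uses Integer.valueOf(2)
        Integer b = 2;
        System.out.println(a == b);       // true: small values come from a shared cache

        Integer c = new Integer(2);       // distinct object (constructor deprecated since Java 9)
        System.out.println(a == c);       // false: different identity
        System.out.println(a.equals(c));  // true: same value
    }
}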

Arne
 

Joshua Cranmer

It's impossible. Whatever they mean when they say "remove primitives"
cannot possibly be what those words actually mean.

The term probably refers to unifying the type hierarchy such that the
primitive types are logically subtypes of Object. In other words, remove
the distinction between primitive and reference types.
5.length() or 5.size()? Well if 5 is an object I should be able to
over-ride it.

Class 6 Extends 14 {}

5 is an object instance, not a type that can be extended. Just like I
can't say class Allegro extends System.out {}...
Is that what they mean, or do they mean they will just treat numbers
/like/ objects? I guess I need more information. In the absence of a
good reason, I don't believe such a change will ever actually make
it into Java.

My guess is the main goal is to allow things like a true List<int>
(where the T data would be `int data') instead of List<Integer>.
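To make that concrete, a sketch of what the boxed version costs today; the
hoped-for List<int> appears only in the comment, since it is not legal
syntax now:

import java.util.ArrayList;
import java.util.List;

public class BoxedListDemo {
    public static void main(String[] args) {
        // Today: every element is stored as a java.lang.Integer reference
        // rather than a bare int.
        List<Integer> values = new ArrayList<>();
        for (int i = 0; i < 5; i++) {
            values.add(i);              // autoboxing: Integer.valueOf(i)
        }

        int sum = 0;
        for (int v : values) {          // auto-unboxing on every read
            sum += v;
        }
        System.out.println(sum);        // prints 10

        // A true List<int> would let the backing storage hold the primitive
        // values directly, with no per-element wrapper objects.
    }
}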
 

Arne Vajhøj

On 4/19/2012 11:27 PM, Tsukino Usagi wrote:
...
...

I'm sure if the literal 6 were mapped to an Object, it would be an
instance of Integer or some other final class, preventing overriding.

In C#, value types cannot be extended.

In Scala, the value types are a fixed set.

So it seems very likely that int would be final if this
change were implemented.

Arne
 
