No more primitive data types in Java (JDK 10+). What do you think?

Lew

BGB said:
http://en.wikipedia.org/wiki/Dozen

as noted, that page lists senses of "dozen" meaning 13, 14, and 10, in
addition to the usual 12.

I stand corrected, in the sense that any number can be used approximately.

When someone says, "I had twelve phone calls", they might mean more or less
than twelve, but the meaning of the word "twelve" remains precise.

No one I've heard of before you denied that "dozen" means twelve, even when
used approximately like any other number. So yes, "dozen" can mean
"approximately a dozen" in the same way that any number "N" can mean
"approximately N". Do you dispute that "12" means exactly a particular number
even though it can be used approximately?
 
BGB

How exactly is that ironic?

because not everything goes in the same direction; sometimes things go
in different directions.


for example:
language A has primitives as a subtype of Class/Instance Objects;
language B has Class/Instance Objects as a subtype of primitives and
value-types.

the irony is that a difference in organization of this sort (a difference
in type-ontology) undercuts the basic assumption of any such ontology:
that there is a single, fixed hierarchical relationship between the types
in question.

or such...
 
BGB

Lew said:
I stand corrected, in the sense that any number can be used approximately.

When someone says, "I had twelve phone calls", they might mean more or
less than twelve, but the meaning of the word "twelve" remains precise.

No one I've heard of before you denied that "dozen" means twelve, even
when used approximately like any other number. So yes, "dozen" can mean
"approximately a dozen" in the same way that any number "N" can mean
"approximately N". Do you dispute that "12" means exactly a particular
number even though it can be used approximately?

well, given that 12 is a number, it indicates a specific value.

"dozen" normally means 12, and "teens" normally means 13-19, but in both
cases I have seen occasional variation on these points regarding usage
of the term.


oddly, when applied to people, the term "teens" is sometimes also used
to refer to people in their 20s, despite this being technically
incorrect (maybe it has something to do with movies and TV, where most
"teenage" characters are played by people in their 20s or sometimes
30s... although I know of at least one show where a 20-year-old actress
plays a character who is supposed to be 9, maybe because she is small...).


or such...
 
glen herrmannsfeldt

(snip)
I stand corrected, in the sense that any number can be used
approximately.
When someone says, "I had twelve phone calls", they might
mean more or less than twelve, but the meaning of the word
"twelve" remains precise.
No one I've heard of before you denied that "dozen" means
twelve, even when used approximately like any other number.

OK, but how many is "dozens"? Is it only integer multiples of 12,
or other multiples?

-- glen
 
Lew

glen said:
OK, but how many is "dozens?" Is it only integer multiples of 12,
or other multiples?

I would apply the normal rules for plurals, unless you're referring to the
modern version of flyting, in which case the word is a plural construed as a
singular.
 
Gene Wirchenko

Bernd Nawothnig said:
My proposal was quite the contrary: simplification of things, i.e.
removal of unnecessary data types by unifications.

Keep in mind: the compiler is not the programmer!


See above: don't mix up the compiler, the machine, and implementation
details with the programmer. Things should be simple for the
*programmer*, not necessarily for the compiler or the machine, even if
that may be preferable. But preferable is not necessary ...

I am not confusing them, but one does have to consider them all.
If a language cannot be easily compiled, that creates problems. Even
if the compilation is simply slower, that may discourage use of it.

Sincerely,

Gene Wirchenko
 
BGB

Gene Wirchenko said:
I am not confusing them, but one does have to consider them all.
If a language cannot be easily compiled, that creates problems. Even
if the compilation is simply slower, that may discourage use of it.

yep...

(pardon any exaggeration... this can be taken purely as personal
opinion...).


programmer considers whether to use C or C++ for something:
tries compiling code with a C compiler, kind of slow, but livable (well,
my project rebuilds in about 20 seconds);
tries compiling code with a C++ compiler, takes a while, programmer
wanders off, gets coffee, comes back, compiler is still churning
along... (and maybe continues to take an additional 30 minutes).

programmer concludes, regarding C++: "no... this just isn't worth it...".

OTOH: Java compiles fairly fast...
(but, admittedly, Java isn't really perfect either).

so, ultimately, the programmer chooses things based on a mixture of what
they are most comfortable with, and what annoyances they are inclined to
put up with, ...


so:
the C++ programmer lives with absurdly long build times (and says "but
at least I have features!").

the C programmer lives with a world where doing OO stuff/... is
generally painful, and thinks "what problem is there which can't be
solved with a little pointer arithmetic?" (maybe followed by using an
#ifdef to check the CPU type, writing a few bytes into a buffer, and
then calling it as if it were a function pointer...), and concluding
"all this OO stuff is nothing really beyond syntax sugar over a few
structs and function pointers anyways... so why care?...".


and the Java programmer lives in a world where writing code in general
is painful (and then thinks, "well, at least I have this giant class
library, I just have to find the relevant SomeClassWhichDoesTaskX",
never mind that anyone should have to go through the pain of writing
some actual code to do something...). but it is all good, "for safety!".


say:

C:
void *p;
long long j;
int i;
...
i=p; //compiler: warning: I don't like the way this looks, but ok...
i=j; //compiler: <remains silent>

C++:
void *p;
long long j;
int i;
...
i=p; //compiler: error: you need to cast this crap...
i=j; //compiler: <remains silent>

Java:
Void p;
long j;
int i;
...
i=j; //compiler: error: OMG WTF!
... (error message spanning several lines)
... i=j;
... (gotta make sure, ^, the programmer sees this crap)
...

so user has to remember to type "i=(int)j;" or they might cut themselves
on the sharp edges of numeric precision, and meanwhile "i=(int)p;"
doesn't even come close to working (but, to be fair, there is little in
the language design to say what the value "should" be, if it were
in-fact to work).
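
for concreteness, a tiny self-contained demo of what "i=(int)j;" actually
does to the bits (the class name "NarrowingDemo" is just made up for the
example):

public class NarrowingDemo
{
    public static void main(String[] args)
    {
        long j = (1L << 32) + 42;  // 2^32 + 42, too big for an int
        int i = (int) j;           // javac insists on the explicit cast
        System.out.println(i);     // prints 42: the high 32 bits are silently dropped
    }
}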


and maybe some of:

C:
printf("Have you seen this? %f\n", sin(M_PI));

C++:
cout << "Have you seen this?" << sin(M_PI) << endl;
( somewhere in history: "hey, have you seen the spiffy new feature, I
can make these operators do whatever random crap I want!", someone else:
"hard-core yeah! using shift for printing! why don't we make this a
standard feature!" ).

Java:
System.out.println("Have you seen this? " + Math.sin(Math.PI));
("no one will notice...").

well, and:
C:
#include <stdio.h>

int main()
{
    printf("yay!\n");
    return 0;
}

C++:
#include <iostream>

using namespace std;

int main()
{
    cout << "yay!" << endl;
    return 0;
}

Java:
public class MyClass
{
    public static void main(String[] args)
    {
        System.out.println("yay!");
    }
}

just saying is all...


(decided to leave out some other examples here, potentially more likely
to create controversy, which isn't really my intent here).


or such...
 
Lew

BGB said:
so user has to remember to type "i=(int)j;" or they might cut themselves
on the sharp edges of numeric precision, and meanwhile "i=(int)p;"
doesn't even come close to working (but, to be fair, there is little in
the language design to say what the value "should" be, if it were
in-fact [sic] to work).

There is, too, such a statement in the language spec - the value should be a compiler error.

The notion of 'what the value "should" be' is not mentioned because, naturally enough, it's not a valid construct. If it were to work, it wouldn't be Java any more. It's such a fundamentally opposed construct to the Java ethos that such a thing cannot happen in Java, ever. It's part of the foundational type-safe approach that is the heart of Java. So put aside such self-contradictory absurdities as, "what that which is utterly forbidden and anathema to the language philosophy would look like if it were allowed."

or such...
 
BGB

BGB said:
so user has to remember to type "i=(int)j;" or they might cut themselves
on the sharp edges of numeric precision, and meanwhile "i=(int)p;"
doesn't even come close to working (but, to be fair, there is little in
the language design to say what the value "should" be, if it were
in-fact [sic] to work).

Lew said:
There is, too, such a statement in the language spec - the value should be a compiler error.

well, either way, it doesn't work (a compiler error isn't really a value).

Lew said:
The notion of 'what the value "should" be' is not mentioned because, naturally enough, it's not a valid construct. If it were to work, it wouldn't be Java any more. It's such a fundamentally opposed construct to the Java ethos that such a thing cannot happen in Java, ever. It's part of the foundational type-safe approach that is the heart of Java. So put aside such self-contradictory absurdities as, "what that which is utterly forbidden and anathema to the language philosophy would look like if it were allowed."

doesn't seem like a relevant argument, because such things *are* defined
in various ways in other languages, and a language is more about
specific syntax, behaviors, and semantics than it is about constructs
being "anathema" or having a "philosophy" (unless of course the spec
were to define this as well).


I am not saying it has to do the same thing as C or C++; it should have
been clear from context that I was implying it would *not* do the same
thing as in C or C++, but in fact probably something very different
(like, say, implementing an interface for coercing the type to a number
or something).


so, the thing is, the compiler rejects it, and the spec doesn't define
semantics for what should happen in this case (apart from an error
condition).

this doesn't mean that semantics for this case couldn't be defined; say,
for example, the type implementing a "ConvertsToNumber" interface or similar.

example:
public interface ConvertsToNumber {
    public int intValue();
    public long longValue();
    public float floatValue();
    public double doubleValue();
}
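
say, a hypothetical implementing class (the "Fraction" name is invented
purely for illustration); under the imagined rule, "i=(int)f;" would
simply desugar to "i=f.intValue();", an ordinary type-checked interface
call:

public class Fraction implements ConvertsToNumber {
    private final int num, den;

    public Fraction(int num, int den) { this.num = num; this.den = den; }

    // truncating conversions, analogous to Java's primitive narrowing
    public int intValue()       { return num / den; }
    public long longValue()     { return (long) num / den; }
    public float floatValue()   { return (float) num / den; }
    public double doubleValue() { return (double) num / den; }
}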

but, yes, given it is not defined in the language spec, it is sort of a
moot argument.


granted, yes, this makes about as much sense as implementing "structs"
in Java via a sort of magic "ValueType" class or interface, say:

public interface ValueType {
    public ValueType copyValue();
    public void dropValue();
}

with any object implementing such an interface effectively gaining
pass-by-value semantics (say, if done internally by invoking the
"copyValue()" method to get a new copy on assignment, and the
"dropValue()" method whenever an instance goes out of scope).
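
a minimal sketch of an implementing class, under those made-up semantics
(the "Vec2" type here is invented for the example):

public class Vec2 implements ValueType {
    public double x, y;

    public Vec2(double x, double y) { this.x = x; this.y = y; }

    // what the runtime would (hypothetically) invoke on assignment or argument passing
    public ValueType copyValue() { return new Vec2(x, y); }

    // what the runtime would (hypothetically) invoke when the value leaves scope
    public void dropValue() { /* nothing to release here */ }
}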

taken slightly further, and combined with some syntax sugar, a person
could *also* implement an interface supporting things resembling C
pointer operations, but with no actual pointers involved... (and then
maybe use it for things like walking strings or arrays or similar).
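
something vaguely like this, perhaps (all of the names here are invented;
a sketch, not a worked-out proposal):

public interface PointerLike<T> {
    T deref();                 // like "*p"
    PointerLike<T> add(int n); // like "p+n"
}

class StringPtr implements PointerLike<Character> {
    private final String s;
    private final int pos;

    public StringPtr(String s, int pos) { this.s = s; this.pos = pos; }

    public Character deref()    { return s.charAt(pos); }
    public StringPtr add(int n) { return new StringPtr(s, pos + n); }
}

walking a string then looks like "p = p.add(1); char c = p.deref();",
with no actual pointers anywhere in sight.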


or something...
 
Lew

BGB said:
doesn't seem like a relevant argument, because such things *are* defined in
various ways in other languages, and a language is more about specific syntax,
behaviors, and semantics than it is about constructs being "anathema" or
having a "philosophy" (unless of course the spec were to define this as well).

What do other languages have to do with it?

This is Java.
 
BGB

Lew said:
What do other languages have to do with it?

This is Java.

languages don't exist in a vacuum.

most borrow features, syntax, and semantics, from other languages,
particularly when in similar domains.

if something is useful, and the next guy over does it, the usual answer
is simple: do likewise.
 
Lew

BGB said:
languages don't exist in a vacuum.

That's not germane here.
most borrow features, syntax, and semantics, from other languages,
particularly when in similar domains.

That process happened for Java a long time since, and having made certain
choices it's not going to reverse them now.
if something is useful, and the next guy over does it, the usual answer is
simple: do likewise.

Big ifs, especially for Java. The construct in question, casting 'Void' to
'int', is vanishingly unlikely to be adopted, because it is oppositional to
the fundamental structure of the language, i.e., rigid type safety.
Consequently that change would break just about everything. Unless the
stewards of the language have utterly lost their minds, this won't happen.

All your fine but unrelated generalities notwithstanding.

And how is that answer "usual"? Have you looked at how slowly Java adopts the
latest groovy fads?

Backwards compatibility and the cost-benefit analysis of feature changes, much
less core paradigm-shifting changes like breaking Java's basic promise of type
safety, are much stronger forces than you credit. Languages do not willy-nilly
adopt features "usually" as you claim.

Or such...
 
BGB

Lew said:
That's not germane here.

I disagree; I have seen frequent references by others to, among other
languages, C#, Ruby, Scala, ...

That process happened for Java a long time since, and having made
certain choices it's not going to reverse them now.

taken over a longer time period, they have kept adding features.

for example, Generics, and more recently, Lambdas, ...


some of the features mentioned in the referenced document also seemed to
have some amount of vague similarity to those in C#, ...

Big ifs, especially for Java. The construct in question, casting 'Void'
to 'int', is vanishingly unlikely to be adopted, because it is
oppositional to the fundamental structure of the language, i.e., rigid
type safety. Consequently that change would break just about everything.
Unless the stewards of the language have utterly lost their minds, this
won't happen.

except, it doesn't have to be by violating type-safety.

yes, granted, the Void case would probably remain an error (it was used
merely as an example of "something which doesn't work for a good number
of reasons", and not as an example of "something I think should actually
work"), but there are other cases where such a cast could be allowed for
other object types without necessarily violating type safety (so long as
an interface is defined for which sensible behavior can also be defined
and implemented).


the point would *not* be that of expecting Java to suddenly become some
loosely-typed dynamic language, abandon static type-checking, or turn
into C or C++, as that would be missing the point (and kind of pointless
/ stupid as well).


anyways, the original comment was not meant to say whether certain
things were actually good or bad, but how they are often perceived by
developers of the other languages. hence, the presentations of all 3
languages (C, C++, and Java) were intended more as straw-men than as
accurate portrayals (which is why it was mentioned up-front that it was
exaggerated).


All your fine but unrelated generalities notwithstanding.

And how is that answer "usual"? Have you looked at how slowly Java
adopts the latest groovy fads?

yes, I have noticed.

this is not to say there is a specific time-frame, or that it
necessarily happens quickly, but it can be noted that to some extent
this has still been the practice (if slowly). the same goes for C and
C++ as well.

a faster-moving language is C#, which would be a much more direct
example of this practice (adopting features much more quickly).

Backwards compatibility and the cost-benefit analysis of feature
changes, much less core paradigm-shifting changes like breaking Java's
basic promise of type safety, are much stronger forces than you credit.
Languages do not willy-nilly adopt features "usually" as you claim.

it takes years, but it is worth noting that such a change would be
unlikely to impact backwards compatibility, FWIW.

how or why it impacts type-safety would depend in large part on how the
feature were defined and implemented.


given nothing has actually been defined for what would or would not
happen here, how can it be said what impact, if any, there would be on
either the type-safety or the semantics?

to know the impact requires first knowing what the feature *is*, and not
simply how it may be expressed.
 
Tsukino Usagi

(snip)

What's your point?
 
BGB

On Sat, 21 Apr 2012 10:20:41 +0200, Bernd Nawothnig wrote:

On 2012-04-20, Gene Wirchenko wrote:
These implementation details should better be hidden and invisible for
most cases. Let the compiler automatically detect and generate possible
optimisations.

If you complicate things, the compiler then has to work to
decomplicate (optimise). Why not just keep it simple?

My proposal was quite the contrary: simplification of things, i.e.
removal of unnecessary data types by unifications.

Keep in mind: the compiler is not the programmer!

A programming language should be as simple and orthogonal as
possible.

One application of keeping it simple would be to use primitives
where possible -- since they are simpler than objects -- and only use
objects where they are needed.

See above: don't mix up the compiler, the machine, and implementation
details with the programmer. Things should be simple for the
*programmer*, not necessarily for the compiler or the machine, even if
that may be preferable. But preferable is not necessary ...

I am not confusing them, but one does have to consider them all.
If a language cannot be easily compiled, that creates problems. Even
if the compilation is simply slower, that may discourage use of it.

(snip)

Tsukino Usagi said:
What's your point?


that there may be relative tradeoffs between languages, some of which
may be seen as non-issues by one developer but as critical failings by
another (as well as multiple ways in which a given language may be
perceived by developers using a different language, ...).

compilation speed was one example, but there are many other possibilities.

for example, type-semantics / type-safety, which may be seen as an
important feature by one developer, and an annoyance or hindrance by
another (and may even be defined differently for each developer), ...


so, in effect, language perceptions tend to be relative, to some extent,
and it isn't really possible to design a "one true language" with which
everyone will be happy, or which necessarily applies equally well to
every use-case.
 
