Style Police (a rant)

Lew

Patricia said:
Eric Sosman wrote:
...

The criterion could be described better. Maybe it should be something
like "needs very careful design and documentation to achieve correct
extension". A big chunk of the Object API documentation, for example,
discusses the rules for equals, hashCode, and the relationship between
them in classes that override them.

It seems to me that the best use of this type of tool for that sort of
check is to prevent such things being done casually. The extra
hoop-jumping for turning off the check in a class is not that big
compared to the work that needs to be done to ensure correct extension.

Of course, if the powers-that-be forbid turning off the check, there is
a much bigger problem.

What we're overlooking here is that 'Object' actually does violate the rule, and that it's not actually a bad rule.

The exception probes the rule - what we need to understand is _why_ 'Object' breaks the rule, and the dangerous behaviors it permits thereby.

As Patricia aptly points out, it takes a mountain of documentation, comprising not only the voluminous Javadocs but volumes of articles, chapters in _Effective Java_, and innumerable blog posts, to prevent errors involving mismatches between 'equals()', 'hashCode()', 'toString()', and where applicable, 'compareTo()', because 'Object' is most definitely not "designed for extension" in the sense intended by the principle.
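
For instance, a minimal sketch of the classic mismatch ('Point' and its fields are hypothetical, not from any post in this thread):

import java.util.HashSet;
import java.util.Set;

class Point {
    final int x, y;
    Point(int x, int y) { this.x = x; this.y = y; }

    @Override public boolean equals(Object o) {
        return o instanceof Point && ((Point) o).x == x && ((Point) o).y == y;
    }
    // hashCode() is inherited from Object, so two equal Points
    // almost certainly land in different hash buckets.
}

class MismatchDemo {
    public static void main(String[] args) {
        final Set<Point> points = new HashSet<>();
        points.add(new Point(1, 2));
        System.out.println(points.contains(new Point(1, 2))); // typically false!
    }
}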

A class designed for extension in this context means designed *for the compiler to enforce extensibility*. The downside to the rule-breaking (intentional though it be) in 'Object' is that it allows, nay encourages, error. The exact sort of error that Checkstyle is trying to prevent you from committing.

So shut the front up, get over yourself, and understand the reasons for the rule. Yes, it can and even should be violated under certain circumstances. But don't present 'Object' as a counterexample when in fact it is the poster child for why the rule is valid.
 
Daniele Futtorovic

As Patricia aptly points out, it takes a mountain of documentation,
comprising not only the voluminous Javadocs but volumes of articles,
chapters in _Effective Java_, and innumerable blog posts to prevent
errors involving mismatches between 'equals()', 'hashCode()',
'toString()', and where applicable, 'compareTo()'

That's more a matter of exposure than anything else. Explaining the
problem and the solution can be done fairly succinctly.
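
(Indeed, the succinct version is roughly: whenever you override 'equals()', override 'hashCode()' over the same fields. A sketch, for a hypothetical two-field value class:)

class Point {
    final int x, y;
    Point(int x, int y) { this.x = x; this.y = y; }

    @Override public boolean equals(Object o) {
        if (!(o instanceof Point)) return false;
        final Point p = (Point) o;
        return p.x == x && p.y == y;
    }

    @Override public int hashCode() {
        return 31 * x + y; // derived from exactly the fields equals() compares
    }
}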

For the record, I disagree with the rule you're defending by relabelling
counter-examples as exceptions to it. Leaving the theory aside for a
moment, I regard it as relatively pernicious in practice. Code properly
structured into logical, functional blocks is resistant enough with
respect to the dangers of ill-advised overriding, making that rule in
effect unnecessarily restrictive IMHO.
 
Andreas Leitgeb

Wanja Gayk said:
It's a shame that in Java not all references are implicitly final and
only real variables get marked with "var" instead - that would serve the
same purpose with less effort and less visual clutter.

It seems like your general coding style differs from mine. The
percentage of re-assigned variables versus those assigned only
once is large enough that a "var" keyword would cause more
clutter than putting "final" on every other variable.


I'd favor a different change: let final variables optionally
have their type inferred:
final myList = new ArrayList<String>();
After all, it is just a handle for some previously obtained value.
 
Lew

Andreas said:
It seems like your general coding style differs from mine. The
percentage of re-assigned variables versus those assigned only
once is large enough that a "var" keyword would cause more
clutter than putting "final" on every other variable.

I'd favor a different change: let final variables optionally
have their type inferred:
final myList = new ArrayList<String>();
After all, it is just a handle for some previously obtained value.

That idea is more complex than it appears. Should the inferred type be 'List<String>', 'AbstractList<String>' or 'ArrayList<String>'? I suspect you would say the last, but really, is it so very, very bad to type

final List<String> myList = new ArrayList<>();

? I mean, if you want to talk hard work, an extra second or so of typing really doesn't qualify, especially since the lack of the supposedly "redundant" types would harm readability for maintainers. As one whose career has involved far more maintenance of existing code than development of new, I much prefer the explicit typing.
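
To make the trade-off concrete, a small sketch (the inferred form is hypothetical syntax, shown only in a comment):

import java.util.ArrayList;
import java.util.List;

class ExplicitTypes {
    static void explicit() {
        // Explicit: the type is visible right at the declaration site.
        final List<String> names = new ArrayList<>();
        names.add("maintainer-friendly");
    }

    // Hypothetical inferred form (not valid Java):
    //     final names = new ArrayList<String>();
    // A later reader must hunt down the initializer to learn the type.
}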

Wanja's main point is that 'final' has a semantic purpose, not an optimization one. I agree that the evidence for the advantage of a 'var' keyword over a 'final' is lacking, but it's moot since 'final' is already part of the language and the addition of 'var' is unlikely to occur. Also, styles vary widely, as you aptly point out. As long as the purpose is semantic, I recommend broad acceptance of different ways to use 'final'.

To repeat, the key point about 'final' is that it prevents reassignment of variables, overriding of methods, or extension of classes. Such prevention is a matter of art, but that is the purpose. Not optimization, not really ease of reading (which, as I stated, I find important), but prevention of these various forms of reassignment. Use 'final' when you mean for something to be final, not for false economies.
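
A compact illustration of those three uses (all names invented for the example):

class Connection {
    final void handshake() { }     // final method: cannot be overridden
}

final class AuthToken { }          // final class: cannot be extended
// class ForgedToken extends AuthToken { }   // would not compile

class FinalDemo {
    public static void main(String[] args) {
        final int maxRetries = 3;  // final variable: cannot be reassigned
        // maxRetries = 4;         // would not compile
        System.out.println(maxRetries);
    }
}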
 
Arne Vajhøj

That doesn't keep me from using final on references (not
methods/classes) wherever possible and also refactoring my code to use
that keyword when it does not compromise readability too much. The
reason is that it makes understanding code easier for me.
Once you're used to seeing "final" everywhere, you'll especially notice
those places which are not marked final; because they are the minority,
they stand out and catch your attention. These are the variables - in
the very sense of the word. The fewer variables you've got to keep in
mind, the easier it is to understand what the code does. It has served
me very well in the past 10 years and I do think it's worth the effort.
You may argue that it is no good for methods which are basically just
3 lines long, but I'll do it there anyway: to avoid blurring the concept,
to keep the habit, and to keep the "not final means attention!" sense sharp.
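
For illustration, a toy sketch of that style (the numbers and names are invented):

import java.util.Arrays;
import java.util.List;

class FinalStyle {
    static int totalAfterDiscount(final List<Integer> prices, final int discount) {
        int total = 0;                 // NOT final: the one real variable stands out
        for (final int price : prices) {
            total += price - discount;
        }
        return total;
    }

    public static void main(String[] args) {
        final List<Integer> prices = Arrays.asList(10, 20, 30);
        final int discount = 2;
        System.out.println(totalAfterDiscount(prices, discount)); // prints 54
    }
}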

So I'd like to use "final" to express my intention in code very
precisely; I'm not doing it to optimize things.
It's a shame that in Java not all references are implicitly final and
only real variables get marked with "var" instead - that would serve the
same purpose with less effort and less visual clutter.

Also, with "final" in front of all method parameters, long parameter
lists begin to smell and cry for a refactoring earlier, which
isn't really a bad thing in my opinion (depends on code-formatting
though).

There are plenty of languages with the behavior you want (the default
being val, not var).

I am a bit skeptical about trying to use a language in a significantly
different way from what was intended.

The code becomes more difficult to read for others and there
is always the risk of having to be inconsistent due to
constraints in the actual language.

Sure, you can learn your special style, but that does not
help much when somebody else inherits your code.

Better pick a language that works the way you want to code.

Arne
 
Andreas Leitgeb

Arne Vajhøj said:
Sure, you can learn your special style, but that does not
help much when somebody else inherits your code.

Better pick a language that works the way you want to code.

Imagining myself in the role of a future maintainer of some code,
I'd surely feel more comfortable with Java-code written in some
strange (but consistent) style, than with code written in some
non-mainstream language Xyz. But maybe that's just me...
 
Eric Sosman

Imagining myself in the role of a future maintainer of some code,
I'd surely feel more comfortable with Java-code written in some
strange (but consistent) style, than with code written in some
non-mainstream language Xyz. But maybe that's just me...

I'd guess it's just you. Seriously.

Today's mainstream language is tomorrow's "legacy" language,
which means that tomorrow's maintainers will be more in tune with
whatever has become popular than with old, out-of-fashion Java.
Even those who were once Java whizzes will have seen their Java
skills rust with disuse. When they pick up a chunk of old Java
after years of working only in Snazzy, they will be on ground that
is far less familiar than once it was. (Are your own personal
COBOL/FORTRAN/ALGOL/... skills as sharp as they used to be?)

It will therefore be helpful (or at least less un-helpful) if
the code they see is not only self-consistent, but consistent with
all the other Java code they once knew. "All" is surely too high
a bar to clear, but "most" is worth striving for. When writing code
that's intended to survive (code has extraordinary and surprising
longevity), you should have a really compelling reason to depart
from norms, not just a stylistic preference.
 
Andreas Leitgeb

Lew said:
That idea is more complex than it appears. Should the inferred type be
'List<String>', 'AbstractList<String>' or 'ArrayList<String>'?
I suspect you would say the last,

You suspected right, which is no surprise, as it is the only reasonable
choice in that particular context.
but really, is it so very, very bad to type
final List<String> myList = new ArrayList<>();

I find the direction of the inference somewhat unfortunate.
I'd rather infer the type from right (expression) to left (variable)
than the other way round. I guess that was discussed thoroughly
before it was introduced, so I might just be missing those perfect
arguments against my preference.
? I mean, if you want to talk hard work, an extra second or so of
typing really doesn't qualify,

It seems that you're biased towards reading others' code versus
writing your own. Typing long code isn't the problem - typing
redundant boilerplate is ... fatiguing. It dulls the joy
of programming.
especially since the lack of the supposedly "redundant" types would
harm readability for maintainers.

That is a claim of whose truth I'm not at all convinced.
If in a particular situation, the type really wasn't obvious,
then there'd still be the choice of making it explicit.
As one whose career has involved far more maintenance of existing
code than development of new, I much prefer the explicit typing.
That only goes to affirm the diagnosed bias...
Wanja's main point is that 'final' has a semantic purpose, not an
optimization one.
So far I agree.
I agree that the evidence for the advantage of a 'var' keyword
over a 'final' is lacking,

That Wanja suggested the "var" keyword is an indication that he has
a bias towards functional programming. Unlike Arne, I don't say that
"pick a different language" is an appropriate answer. It's OK to program
functionally in Java, as far as possible in Java syntax, even if this
may lead to quite a lot of "final"s.
I just disagree with Wanja as to making "final" the default: it's
like bullying others into functional style in Java.
 
Arne Vajhøj

You suspected right, which is no surprise, as it is the only reasonable
choice in that particular context.


I find the direction of the inference somewhat unfortunate.
I'd rather infer the type from right (expression) to left (variable)
than the other way round. I guess that was discussed thoroughly
before it was introduced, so I might just be missing those perfect
arguments against my preference.

I prefer the Java way.

I believe in some cases you just want to read the left side
and then move further down the code.

Having to read the right side to see what type it is
will just make it more difficult (in that case).

Arne
 
Arne Vajhøj

That Wanja suggested the "var" keyword is an indication that he has
a bias towards functional programming. Unlike Arne, I don't say that
"pick a different language" is an appropriate answer. It's OK to program
functionally in Java, as far as possible in Java syntax, even if this
may lead to quite a lot of "final"s.

Java allows it.

I am just a bit skeptical about the maintainability.

Many years ago I saw a person who liked Pascal do the following
in C:

#define begin {
#define end }

The result was not pretty.

I like Pascal, but if I want to code in Pascal then I will use
Pascal, not C.
I just disagree with Wanja as to making "final" the default: it's
like bullying others into functional style in Java.

Arne
 
Arne Vajhøj

Imagining myself in the role of a future maintainer of some code,
I'd surely feel more comfortable with Java-code written in some
strange (but consistent) style, than with code written in some
non-mainstream language Xyz. But maybe that's just me...

I see it otherwise.

But it could be because I have an implicit assumption that
developers hired to maintain code in language Xyz will know
Xyz - mainstream or not mainstream.

Arne
 
Arne Vajhøj

Well, in which way was this not intended?
The final keyword is to mark those "variables" that never change, isn't
it? I'm using it for exactly that.

Obviously the semantics of final are what they are.

But based on early examples coming out of Sun, it seems very likely
that the keyword was intended for the special cases where something had
to be final, not for use everywhere except where something had to be
non-final.
And I have yet to meet the colleague who could not read my code just
because of some "final" keywords in it.

Sure they can read it.

They probably can read anything that is valid Java syntax.

That does not imply that the form is optimal.

Otherwise such things as coding conventions would
not exist.

Arne
 
Arved Sandstrom

I'd guess it's just you. Seriously.

Today's mainstream language is tomorrow's "legacy" language,
which means that tomorrow's maintainers will be more in tune with
whatever has become popular than with old, out-of-fashion Java.
Even those who were once Java whizzes will have seen their Java
skills rust with disuse. When they pick up a chunk of old Java
after years of working only in Snazzy, they will be on ground that
is far less familiar than once it was. (Are your own personal
COBOL/FORTRAN/ALGOL/... skills as sharp as they used to be?)

It will therefore be helpful (or at least less un-helpful) if
the code they see is not only self-consistent, but consistent with
all the other Java code they once knew. "All" is surely too high
a bar to clear, but "most" is worth striving for. When writing code
that's intended to survive (code has extraordinary and surprising
longevity), you should have a really compelling reason to depart
from norms, not just a stylistic preference.
I agree, the problems are already more than big enough. I figure that
my Java-based work alone can be described as roughly 2/3 maintenance,
1/3 new code, as of now and for the past few years.

The new coding is pretty much all JDK 1.6 (1.7 won't start being widely
used in any projects I'm connected with until next year at the
earliest), and on the Java EE side mostly Java EE 5 with a smattering of
Java EE 6. I keep up with Java EE 6 and all the associated stuff like
JSF 2 and JPA 2 and CDI and servlet 3.0, mostly on my own time, because
not many of our clients are using those APIs yet; Java EE 5 is still the
main thing for new development because the new dev is usually additions
to other stuff in a Java EE 5 environment.

For maintenance it's still mostly JDK 1.6 actually, with some JDK 1.4,
but on the Java EE side about an even split of pure Java EE 5 and hybrid
J2EE 1.4/Java EE 5 (Oracle Application Server/oc4j 10 being one of the
main examples). My colleagues, acquaintances, and I don't see Java
EE 6 to maintain... yet. Leastways, in my neck of the woods I'd be
surprised if Java EE 6 became predominant before about 2013/2014.

Overall the main challenge lies in the APIs, not in the core language.
Sure, there was that generics transition going to JDK 1.5, but in the
big scheme of things that was minor. Mostly it's that there's not enough
hours in the day to keep track of "current", "old", and "latest" APIs in
both Java SE and Java EE. It's a challenge to keep switching between
"obsolete", "predominant", "current" and "cutting edge" development hats
on any given day or from week to week. Just as one example I have to
remember JSF 1.1, 1.2, and 2.0 in full, and what I can and cannot do as
I move from one to the other.

So about the last thing a body needs is non-standard (hence
inconsistent) core Java.

AHS
 
Lew

Wanja said:
class ApplicationWorkerUtil {

    static interface InternalExceptionHandler {
        void handle(final TimeoutException e);

        void handle(final Exception e);
    }

Please, please, *please* do not use TAB characters to indent Usenet code posts!

Your post is virtually unreadable to Google Groups because for some odd reason GG eliminates all such indentation, and to real newsreaders because they expand TABs so widely.

Please, please, *please* only use spaces to indent code on Usenet (maximum indent 4 spaces per level).

Please. I really would have liked to read your code, but such a large example with munged indentation just isn't worth it.
 
Lew

Andreas said:
You suspected right, which is no surprise, as it is the only reasonable
choice in that particular context.

No, it isn't. A variable type of 'List<String>' is entirely reasonable.
 
Lars Enderin

2011-09-11 18:42, Lew wrote:
Please, please, *please* do not use TAB characters to indent Usenet code posts!

Your post is virtually unreadable to Google Groups because for some odd reason GG eliminates all such indentation, and to real newsreaders because they expand TABs so widely.

Please, please, *please* only use spaces to indent code on Usenet (maximum indent 4 spaces per level).

Please. I really would have liked to read your code, but such a large example with munged indentation just isn't worth it.

I would copy the code and use Emacs to format it.
 
Lew

Wanja said:
(e-mail address removed) says...

Damn google groups.

The problem isn't limited to GG, it's only different for GG.
Sorry for that.
Fortunately you can still copy and paste it into your IDE and format
the source.

Unfortunately, the folks who insist that I make the extra effort to read their posts because they're too inconsiderate to post politely only let me know that they really don't care about me, so in turn I really don't care about them.

I'm fairly certain that you will not regret my lack of interaction with your failure to be courteous.
 
Arne Vajhøj

I agree, the problems are already more than big enough. I figure that
my Java-based work alone can be described as roughly 2/3 maintenance,
1/3 new code, as of now and for the past few years.

And 1/3 new code is probably above industry average.
The new coding is pretty much all JDK 1.6 (1.7 won't start being widely
used in any projects I'm connected with until next year at the
earliest), and on the Java EE side mostly Java EE 5 with a smattering of
Java EE 6. I keep up with Java EE 6 and all the associated stuff like
JSF 2 and JPA 2 and CDI and servlet 3.0, mostly on my own time, because
not many of our clients are using those APIs yet; Java EE 5 is still the
main thing for new development because the new dev is usually additions
to other stuff in a Java EE 5 environment.

For maintenance it's still mostly JDK 1.6 actually, with some JDK 1.4,
but on the Java EE side about an even split of pure Java EE 5 and hybrid
J2EE 1.4/Java EE 5 (Oracle Application Server/oc4j 10 being one of the
main examples). My colleagues, acquaintances, and I don't see Java
EE 6 to maintain... yet. Leastways, in my neck of the woods I'd be
surprised if Java EE 6 became predominant before about 2013/2014.

Overall the main challenge lies in the APIs, not in the core language.
Sure, there was that generics transition going to JDK 1.5, but in the
big scheme of things that was minor. Mostly it's that there's not enough
hours in the day to keep track of "current", "old", and "latest" APIs in
both Java SE and Java EE. It's a challenge to keep switching between
"obsolete", "predominant", "current" and "cutting edge" development hats
on any given day or from week to week. Just as one example I have to
remember JSF 1.1, 1.2, and 2.0 in full, and what I can and cannot do as
I move from one to the other.

I still occasionally see Java 1.3.1 + J2EE 1.2/1.3 + Struts 1
in the field.

Arne
 
Cthun

On 11/09/2011 3:20 PM, Wanja Gayk wrote:
public <T> List<T> withoutDupes(final List<T> xs) {
    return new Object() {
        <T> List<T> withoutDupes(final List<T> head, final List<T> tail) {
            if (tail.isEmpty()) { return head; }
            if (head.contains(tail.get(0))) {
                return withoutDupes(head, tail.subList(1, tail.size()));
            }
            return withoutDupes(
                new ArrayList<T>(head) {{ add(tail.get(0)); }},
                tail.subList(1, tail.size())
            );
        }
    }.withoutDupes(new ArrayList<T>(), xs); // start the recursion with an empty 'head'
}
This is a whole different beast (and prone to crash with a
StackOverflowError on larger lists, by the way).
Admittedly, it is not entirely functional due to the "add" call, but quite
close. Still, it is pretty compact code (there is a certain beauty in
recursion, isn't there?) and not hard to understand either.

Bullying someone into functional code would be pretty stupid, as
current JVMs still have a hard time detecting tail recursion and
Java SE lacks data structures that do lazy evaluation.

There's a way around that, and it's called Clojure. It compiles to JVM
bytecode and has both lazy lists and a special operator for doing tail
recursion (the compiler turns it into an iteration).
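
(And staying within plain Java, a loop-based sketch sidesteps the stack-depth problem as well - equivalent behavior assumed, not taken from the original post:)

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

class DedupDemo {
    static <T> List<T> withoutDupes(final List<T> xs) {
        final List<T> result = new ArrayList<>();
        for (final T x : xs) {
            if (!result.contains(x)) {  // same O(n^2) membership test as the recursive version
                result.add(x);
            }
        }
        return result;                  // iteration, so no stack growth on large lists
    }

    public static void main(String[] args) {
        System.out.println(withoutDupes(Arrays.asList(1, 2, 2, 3, 1))); // prints [1, 2, 3]
    }
}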
 
