From: Tom Anderson on
On Mon, 12 Oct 2009, Lew wrote:

> Lew wrote:
>>> Every time you omit 'final' in a class declaration you should consider
>>> carefully.  Read the item in /Effective Java/ referenced upthread for a
>>> thorough explanation of why to prefer non-heritability.
>
> Rzeźnik wrote:
>> If there is such guidance there then the book is useless rubbish - but
>> I can't say for sure because I actually did not read this.
>
> You didn't read it, yet you feel confident in refuting the arguments
> you didn't read?
>
> Interesting.
>
> FWIW, far from being "useless rubbish", /Effective Java/ is arguably
> the most useful book published to help one write idiomatic and, well,
> effective Java code.

Nonetheless, it does contain at least one piece of pretty questionable
advice, namely this one.

I say that despite the fact that I haven't read the book, and don't intend
to. I have, however, come across problems in my work that we could solve
by subclassing an existing class in a third-party library and overriding
some of its methods, in a way that its designers almost certainly did not
intend. If they'd taken JoBo's advice of finalising everything that wasn't
explicitly intended to be overridable, we wouldn't have been able to do
that.
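
Schematically, the kind of fix I mean looks like this - both classes
below are invented stand-ins, not the real library:

// Stand-in for a third-party class that wasn't designed for
// extension, but luckily wasn't declared final either.
class ThirdPartyParser {
    public String parse(String input) {
        // imagine this chokes on a leading byte-order mark
        return input.trim();
    }
}

// The unintended-but-lifesaving subclass: override, patch, delegate.
class PatchedParser extends ThirdPartyParser {
    @Override
    public String parse(String input) {
        if (input.startsWith("\uFEFF")) {
            input = input.substring(1); // strip the BOM first
        }
        return super.parse(input);
    }
}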

Now, you could counter that what the Blochfather really meant was that you
*should* design your classes to be subclassable, and use final to protect
the bits that must remain unchanged even when subclassed - the emphasis
being on enabling subclassing, not preventing it. If the designers of the
libraries I've had to mangle in this way had done that, then I would still
have been able to fix things by subclassing, and everybody would be happy.
But I think this requires superhuman effort, bordering on ominousness -
they'd have to have anticipated everything someone might usefully do with
their code and provided for it.
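
What that would look like, if they'd pulled it off (names invented):

// The skeleton that carries the invariant is final; the steps a
// subclass might usefully change are explicit, documented hooks.
abstract class ReportGenerator {
    // the overall sequence must not change, so it is final...
    public final String generate() {
        return header() + body() + footer();
    }

    // ...but the individual steps are intended extension points
    protected String header() { return "== report ==\n"; }
    protected abstract String body();
    protected String footer() { return "== end ==\n"; }
}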

I think it's better to take a more permissive approach - finalise the
things that absolutely must not be overridden (of which there will be
fairly few, I would think - mostly security stuff, or very fundamental
support code), and leave the rest open to change, with a large "caveat
overridor" sign on it.

tom

--
Science of a sufficiently advanced form is indistinguishable from magic
From: Rzeźnik on
On 12 Oct, 19:15, Tom Anderson <t...(a)urchin.earth.li> wrote:
> On Mon, 12 Oct 2009, Rzeźnik wrote:

>
> > You are omniscient :-) I meant 'omniscient' :-)
>
> I suspected this, but am disappointed. I would love 'ominous' to become a
> technical software engineering term.
>
> public final void append(ominous String s) ...
>
> tom

Yeah, sounds great - lmao :-)))

From: Lew on
Rzeźnik wrote:
> I am not sure whether we understand each other. Let me reiterate: you
> said that one should NOT predict, with which I agree. But you clearly
> do not want to see that 'final' is one endless bag of predictions.
> Every 'final' you put in your code cries: I PREDICT THAT THIS METHOD/
> CLASS HERE IS WRITTEN IN STONE. While sometimes predictions like these
> may be valid, more often than not they aren't.
>

Your point here is incorrect. Declaring a class 'final' is not a
prediction, it's a constraint. You are not predicting that "this
method / class is written in stone" with 'final'. You are forcing it
to be. No prediction involved, just control. I advocate dictating
the use, not predicting it.
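
To make the distinction concrete - a tiny sketch, with invented names,
where the compiler does the dictating:

final class Sealed {
}

// class Cracked extends Sealed { }
// does not compile; javac rejects it with something like:
//   error: cannot inherit from final Sealed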

> I am declaring rubbish not his reasoning per se, but his reasoning as
> you described it - that may be two different things. Anyway, there is

I described it thus:
>> Every time you omit 'final' in a class declaration you should consider
>> carefully. Read the item in /Effective Java/ referenced upthread for a
>> thorough explanation of why to prefer non-heritability.

In other words, "Read the book. See for yourself."

What part do you disagree with: that you should consider carefully, or
the reasoning expressed in the book to which I referred?

> no VALID argument against inheritance in OO language. One may argue

No one is arguing against inheritance, just its abuse. Read the
book. Decide for yourself.

> that inheritance should be thought out and thoroughly reviewed but one

Ummm, yeah-ah.

> cannot state that it should be abandoned as it is the only way to make
> sure that OO system is opened for future modifications while being, at
>

Straw man, straw man, straw man. I also am not arguing against
inheritance, or for its abandonment, only against its abuse.

> the same time, closed so that it is able to execute. The more final
> you use, the more closed your class hierarchy becomes - which is
> almost always 'the bad thing'.
>

I am simply saying that one should design and document for inheritance
(of concrete classes), or else prohibit it, in line with and for the
reasons stated in /Effective Java/.

If you don't design a class to be heritable but allow it to be, then
you have problems. You cannot use the circular argument that a
recommended practice for object-oriented programming violates the
principles of O-O without demonstrating that it does, in fact, do so.
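
Bloch's own illustration of those problems, from the item on preferring
composition to inheritance, goes roughly like this (reconstructed from
memory, not a verbatim quote):

import java.util.Arrays;
import java.util.Collection;
import java.util.HashSet;

// HashSet allows inheritance but was not designed for it: addAll()
// happens to be built on top of add(), so the subclass's bookkeeping
// double-counts.
class InstrumentedHashSet<E> extends HashSet<E> {
    private int addCount = 0;

    @Override public boolean add(E e) {
        addCount++;
        return super.add(e);
    }

    @Override public boolean addAll(Collection<? extends E> c) {
        addCount += c.size();    // counts 3 here...
        return super.addAll(c);  // ...and 3 more through add()
    }

    public int getAddCount() {
        return addCount;
    }

    public static void main(String[] args) {
        InstrumentedHashSet<String> s = new InstrumentedHashSet<String>();
        s.addAll(Arrays.asList("a", "b", "c"));
        System.out.println(s.getAddCount()); // prints 6, not 3
    }
}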

So let's try again - drop the straw man and circular arguments.

--
Lew
From: Lew on
Rzeźnik wrote:
> So I judge 'final' to be usable when
> it is obviously known that class will not be redefined ever
>

If you declare a class 'final', then it is obviously known that it
will not be subclassed. ("Redefined" does not apply here - we're
talking about inheritance, not redefinition.) You're putting the cart
before the horse.

No prediction needed - it is a dictatorship.

If you do not declare a class 'final', then you had better darn well
make sure that it's conditioned for inheritance. If you do not
declare a class 'final', you are giving permission for it to be
inherited. No prediction needed.

The API writer does not predict, he permits.

--
Lew
From: Lew on
Rzeźnik wrote:
> Now I have clearer picture of what Block [sic] really wrote.
>

Bloch.

--
Lew