From: Rzeźnik on 12 Oct 2009 09:12

On 12 Paź, 14:33, Lew <no...(a)lewscanon.com> wrote:
> Roedy Green wrote:
> > On Sun, 11 Oct 2009 15:38:17 -0700 (PDT), Lionel van den Berg
> > <lion...(a)gmail.com> wrote, quoted or indirectly quoted someone who
> > said:
> >
> >> Just wondering what arguments there are out there for making methods
> >> and classes final when they are not expected to be overridden/
> >> extended? Typically I would make them final, but when I'm doing code
> >> reviews I don't raise this as an issue because I see it as relatively
> >> trivial.
> >
> > See http://mindprod.com/jgloss/final.html
> >
> > For APIs you don't publish, I think you should make anything final you
> > can, for documentation purposes.
>
> For APIs you do publish, you should make every class final that you
> intend not to be heritable, which should be most of the concrete
> classes.

Lew, I am sorry to say this, but this does not make any sense. It goes
against the very spirit of OOP. If you write your published APIs in the
way you described, then your APIs are useless in the face of design
changes. You can get away with this as long as you write programs for
yourself only, or programs with a very short life cycle. Why do you fear
inheritance so much? I know of the so-called FBC (fragile base class)
problem, and I am well aware of what it means, but it is no argument
against inheritance.

> --
> Lew
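[A minimal sketch of the final-unless-designed-for-inheritance position under discussion; the class and its methods are invented for illustration, not taken from any poster's code:]

```java
// Hypothetical value class: declared final because it was not designed
// for subclassing, per the "prohibit inheritance by default" advice.
final class Temperature {
    private final double celsius;

    Temperature(double celsius) {
        this.celsius = celsius;
    }

    double celsius() {
        return celsius;
    }

    // Because the class is final, no subclass can redefine this
    // conversion, so callers can rely on its documented behavior.
    double fahrenheit() {
        return celsius * 9.0 / 5.0 + 32.0;
    }
}
```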
From: Alessio Stalla on 12 Oct 2009 09:31

On Oct 12, 4:30 am, Lew <no...(a)lewscanon.com> wrote:
> Lionel van den Berg wrote:
> >>> Just wondering what arguments there are out there for making methods
> >>> and classes final when they are not expected to be overridden/
> >>> extended? Typically I would make them final, but when I'm doing code
> >>> reviews I don't raise this as an issue because I see it as relatively
> >>> trivial.
>
> Arne Vajhøj wrote:
> >> In my opinion none; it is a leftover from the very early days of Java.
> >>
> >> See:
> >> http://www.ibm.com/developerworks/java/library/j-jtp1029.html
>
> Sorry about the accidental post. I had intended to write:
>
> Excellent link, Arne. However, that very article gives many good
> reasons for using the 'final' keyword, contradicting your notion that
> there aren't good reasons for it.
>
> From an API writer's perspective, that is, anyone writing classes
> intended to be used, 'final' on a class or method indicates that it
> should not, and therefore cannot, be extended / overridden. As Mr.
> Goetz said in the referenced article, this decision should be
> documented (in the Javadocs).

There's a difference between something you "should not do" and
something you are prohibited from doing. The creator of a class should
clearly document that s/he didn't design it to be extended through
inheritance, but s/he should think twice before making it
non-extensible forever. Sometimes the classes we write can be used in a
way we didn't anticipate, and that's not automatically a bad thing.

> Josh Bloch in /Effective Java/ suggests that one should prefer
> composition to inheritance, and that inheritance is somewhat abused.
> ("Design and document for inheritance or else prohibit it") He advises
> making classes final unless you explicitly and properly make them
> heritable.

I would gladly accept such advice if there were such a thing as
composition as a first-class concept in Java; e.g. if you were able to
say

public class Example extends X implements Y uses Z(z) {
  private Z z; // Error if z is not assigned a value by a constructor
  public Example(Z z) {
    this.z = z;
  }
  ...
}

and automatically have the methods of the implemented interfaces
delegated to z unless you override them. That is not the case, and
composition, while being the right thing in certain cases, is far more
cumbersome and "foreign" than inheritance, and thus can't be used as a
general substitute for it. It's like saying that pure functions are to
be preferred over functions with side effects: that may be true in a
language with heavy support for functional programming, but giving it
as advice for good Java coding would be wrong.

Alessio
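[For comparison, this is roughly what the same delegation looks like in actual Java, where every forwarding method must be written or generated by hand — which is the boilerplate the hypothetical 'uses' clause above would eliminate. The interface and class names here are invented for the sketch:]

```java
// A small interface standing in for Y in the sketch above.
interface Greeter {
    String greet(String name);
}

// A reusable implementation standing in for Z.
class PoliteGreeter implements Greeter {
    public String greet(String name) {
        return "Hello, " + name;
    }
}

// Composition: Example forwards to a Greeter instead of extending one.
// The forwarding method below is exactly the kind of boilerplate a
// first-class 'uses' clause would write for us.
final class Example implements Greeter {
    private final Greeter delegate;

    Example(Greeter delegate) {
        this.delegate = delegate;
    }

    // Hand-written delegate method; behavior can be adjusted here
    // without inheriting from PoliteGreeter.
    public String greet(String name) {
        return delegate.greet(name) + "!";
    }
}
```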
From: Patricia Shanahan on 12 Oct 2009 09:46

Rzeźnik wrote:
> On 12 Paź, 14:33, Lew <no...(a)lewscanon.com> wrote:
>> Roedy Green wrote:
>>> On Sun, 11 Oct 2009 15:38:17 -0700 (PDT), Lionel van den Berg
>>> <lion...(a)gmail.com> wrote, quoted or indirectly quoted someone who
>>> said:
>>>> Just wondering what arguments there are out there for making methods
>>>> and classes final when they are not expected to be overridden/
>>>> extended? Typically I would make them final, but when I'm doing code
>>>> reviews I don't raise this as an issue because I see it as relatively
>>>> trivial.
>>> See http://mindprod.com/jgloss/final.html
>>> For APIs you don't publish, I think you should make anything final you
>>> can, for documentation purposes.
>> For APIs you do publish, you should make every class final that you
>> intend not to be heritable, which should be most of the concrete
>> classes.
>
> Lew, I am sorry to say this, but this does not make any sense. It goes
> against the very spirit of OOP. If you write your published APIs in the
> way you described, then your APIs are useless in the face of design
> changes. You can get away with this as long as you write programs for
> yourself only, or programs with a very short life cycle. Why do you fear
> inheritance so much? I know of the so-called FBC problem, and I am well
> aware of what it means, but it is no argument against inheritance.

For the final-by-default approach to work, it has to be combined with
the use of interfaces, rather than classes, in argument type
declarations. A programmer who needs to substitute a different
implementation can do so by writing a class that implements the
interface, possibly using composition to reuse some of the existing
implementation. Eclipse automates the generation of delegate methods.

Making a class final and then requiring a reference to that class as an
argument, rather than a reference to an interface it implements, would
indeed create problems.

Patricia
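[Patricia's point can be sketched as follows: the shipped concrete class is final, but because the API accepts the interface rather than the class, a caller can still substitute an implementation. All names here are invented for illustration:]

```java
import java.util.ArrayList;
import java.util.List;

interface Storage {
    void save(String record);
}

// The shipped implementation is final...
final class FileStorage implements Storage {
    public void save(String record) {
        // (pretend this writes to disk)
    }
}

// ...but the API declares its argument as the interface, not the final
// class, so substitution remains possible without inheritance.
class AuditLog {
    private final Storage storage;

    AuditLog(Storage storage) {
        this.storage = storage;
    }

    void record(String event) {
        storage.save(event);
    }
}

// A caller-provided substitute: no need to extend FileStorage.
class InMemoryStorage implements Storage {
    final List<String> records = new ArrayList<String>();

    public void save(String record) {
        records.add(record);
    }
}
```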
From: Rzeźnik on 12 Oct 2009 09:55

On 12 Paź, 15:31, Alessio Stalla <alessiosta...(a)gmail.com> wrote:
> On Oct 12, 4:30 am, Lew <no...(a)lewscanon.com> wrote:
> > Lionel van den Berg wrote:
> > >>> Just wondering what arguments there are out there for making
> > >>> methods and classes final when they are not expected to be
> > >>> overridden/extended? Typically I would make them final, but when
> > >>> I'm doing code reviews I don't raise this as an issue because I
> > >>> see it as relatively trivial.
> >
> > Arne Vajhøj wrote:
> > >> In my opinion none; it is a leftover from the very early days of
> > >> Java.
> > >>
> > >> See:
> > >> http://www.ibm.com/developerworks/java/library/j-jtp1029.html
> >
> > Sorry about the accidental post. I had intended to write:
> >
> > Excellent link, Arne. However, that very article gives many good
> > reasons for using the 'final' keyword, contradicting your notion
> > that there aren't good reasons for it.
> >
> > From an API writer's perspective, that is, anyone writing classes
> > intended to be used, 'final' on a class or method indicates that it
> > should not, and therefore cannot, be extended / overridden. As Mr.
> > Goetz said in the referenced article, this decision should be
> > documented (in the Javadocs).
>
> There's a difference between something you "should not do" and
> something you are prohibited from doing. The creator of a class should
> clearly document that s/he didn't design it to be extended through
> inheritance, but s/he should think twice before making it
> non-extensible forever. Sometimes the classes we write can be used in
> a way we didn't anticipate, and that's not automatically a bad thing.
>
> > Josh Bloch in /Effective Java/ suggests that one should prefer
> > composition to inheritance, and that inheritance is somewhat abused.
> > ("Design and document for inheritance or else prohibit it") He
> > advises making classes final unless you explicitly and properly make
> > them heritable.
>
> I would gladly accept such advice if there were such a thing as
> composition as a first-class concept in Java; e.g. if you were able to
> say
>
> public class Example extends X implements Y uses Z(z) {
>   private Z z; // Error if z is not assigned a value by a constructor
>   public Example(Z z) {
>     this.z = z;
>   }
>   ...
> }

This is a nice, powerful construct. Are you aware of any language that
supports composition in this way?

> and automatically have the methods of the implemented interfaces
> delegated to z unless you override them. That is not the case, and
> composition, while being the right thing in certain cases, is far more
> cumbersome and "foreign" than inheritance, and thus can't be used as a
> general substitute for it.
> It's like saying that pure functions are to be preferred over
> functions with side effects: that may be true in a language with heavy
> support for functional programming, but giving it as advice for good
> Java coding would be wrong.
>
> Alessio

+1
From: Lew on 12 Oct 2009 10:20
Lew wrote:
>> Every time you omit 'final' in a class declaration you should consider
>> carefully. Read the item in /Effective Java/ referenced upthread for a
>> thorough explanation of why to prefer non-heritability.

Rzeźnik wrote:
> If there is such guidance there then the book is useless rubbish - but
> I can't say for sure because I actually did not read this.

You didn't read it, yet you feel confident in refuting the arguments
you didn't read? Interesting.

FWIW, far from being "useless rubbish", /Effective Java/ is arguably
the most useful book published to help one write idiomatic and, well,
effective Java code.

Lew:
>> As an API writer (that is, one who writes a class intended for use),
>> one should control how the API is used, not predict it.

Rzeźnik:
> Agreed. So if we are on the same page - why do you try to predict how
> your API will be used and 'contaminate' it with 'final's?

Begging the question. You assume that 'final' is "contamination" in
order to argue that it's a bad thing.

And I'm not talking about prediction, as you can clearly see from the
passage you quoted, where I say one should NOT predict, but control,
how the API is used. So that adds a straw man to the fallacies you're
using to attempt to refute my points. A rather bald straw man, too: you
took the exact opposite of my argument, in the simplest possible terms,
without even a pretense of logic or evidence. Using such a weak
argument completely undermines your points.

Since you said nothing that speaks to my arguments, I can only refer
you to what I've already said. If you have points that refute mine, or
even pretend to, we can continue. And do try to read a book before you
judge its content. I referred you to Mr. Bloch's explanations because
they're already far more lucid and detailed than anything I can hope to
achieve in a Usenet post. It is not valid to declare his reasoning
"rubbish" simply because it confronts your prejudices.

--
Lew