From: Rhino on 23 Feb 2010 16:45

I'm worried that my knowledge of industry standard testing procedures for Java is getting a bit out of date.

Can anyone refer me to a website that focuses on testing for Java code and that describes the best practices and tools for doing that testing?

I am a one-man development team at the moment and I want to be sure that my code is going to be considered thoroughly tested when I eventually release it.

Some guidance on what should be tested and how thoroughly it should be tested - as well as what does NOT need to be tested - would be very helpful.

--
Rhino
From: markspace on 23 Feb 2010 17:23

Rhino wrote:
> I'm worried that my knowledge of industry standard testing procedures for
> Java is getting a bit out of date.
>
> Can anyone refer me to a website that focuses on testing for Java code and
> that describes the best practices and tools for doing that testing?

http://en.wikipedia.org/wiki/Software_testing

> I am a one-man development team at the moment and I want to be sure that my
> code is going to be considered thoroughly tested when I eventually release
> it.
>
> Some guidance on what should be tested and how thoroughly it should be
> tested - as well as what does NOT need to be tested - would be very helpful.

Test everything.
From: Mike Schilling on 23 Feb 2010 17:36

"markspace" <nospam(a)nowhere.com> wrote in message news:hm1khl$k2r$1(a)news.eternal-september.org...
>> Some guidance on what should be tested and how thoroughly it should be
>> tested - as well as what does NOT need to be tested - would be very
>> helpful.
>
> Test everything.

And write automated tests, so that you can rerun them easily to catch regressions. Don't worry about the time required to create and automate the tests -- it will more than pay for itself in the end.
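[A minimal sketch of the kind of automated, rerunnable test Mike is describing, written with JUnit 4 (a common choice for Java, though not the only one). The joinWords method is a toy stand-in invented for this example; the point is that each test is tiny, named after the behaviour it checks, and can be rerun by the IDE or build after every change.]

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class JoinWordsTest {

    // Toy method under test, included only to keep the example self-contained.
    static String joinWords(String[] words) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < words.length; i++) {
            if (i > 0) sb.append(" ");
            sb.append(words[i]);
        }
        return sb.toString();
    }

    @Test
    public void joinsWordsWithSingleSpaces() {
        assertEquals("one two three", joinWords(new String[] {"one", "two", "three"}));
    }

    @Test
    public void emptyArrayGivesEmptyString() {
        assertEquals("", joinWords(new String[0]));
    }
}

[Running the whole suite after every change is then a single command or IDE click, which is what makes catching regressions cheap.]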
From: Martin Gregorie on 23 Feb 2010 17:45

On Tue, 23 Feb 2010 16:45:34 -0500, Rhino wrote:
> I am a one-man development team at the moment and I want to be sure that
> my code is going to be considered thoroughly tested when I eventually
> release it.
>
I think the biggest win comes from automated regression testing. IME it's well worth the effort of writing a scriptable regression test harness for any chunk of code that's more complex than, say, a package that writes and parses CSV files. Done right, the harness code is relatively simple and can be used to implement a whole bunch of tests merely by writing a well thought out set of scripts. (A rough sketch of such a harness appears after this post.)

Think about capturing human-readable test output so that the test harness can compare it with the expected results. Design the scripts to be self-documenting and easy to understand: allowing comments in them is an enormous help. Design the harness so its output lists the script with interspersed test results. This makes debugging much easier. Ideally it will output a single line reporting a successful test and show understandable diagnostics if the test fails. Just comparing test output with expected output and displaying the differences can do this if you design the test output to be readable.

A well written regression test setup will accept a single, simple command that runs the entire set of tests and produces a small, unambiguous report indicating pass/fail at a glance. Do this and you'll tend to run regression tests after any and all code changes. The effect on code reliability can be dramatic.

Write the tests from the code specifications. The effort of writing a clear specification is never wasted. Write it hierarchically so it has an overview plus class and method descriptions. This pays off in:
- better code
- more complete test coverage
- a specification that can be turned into javadocs

> Some guidance on what should be tested and how thoroughly it should be
> tested - as well as what does NOT need to be tested - would be very
> helpful.
>
See above, but you need a decent specification as the basis for test coverage. Develop a test set that covers all specified functions in terms of correct functioning, corner cases and error handling. It's essential to write tests from the specification, not the code, or you'll just end up testing what the code does, not what it's meant to do. There is a good case for writing the test harness and scripts before you start coding: this way you may spot errors in the specification, and you will certainly avoid writing tests that merely mirror the code you actually wrote.

Never write code that can't be tested. Design it with this in mind.

--
martin@   | Martin Gregorie
gregorie. | Essex, UK
org       |
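[Along the lines Martin describes, here is a rough sketch of a tiny scriptable harness in plain Java. The script format ("input|expected" per line, '#' for comments that are echoed into the report) and the systemUnderTest method are invented for this example; a real harness would drive whatever code you are actually shipping and use whatever script format suits it.]

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class RegressionHarness {

    // Stand-in for the real code under test.
    static String systemUnderTest(String input) {
        return input.trim().toUpperCase();
    }

    public static void main(String[] args) throws IOException {
        int passed = 0, failed = 0;
        BufferedReader script = new BufferedReader(new FileReader(args[0]));
        String line;
        while ((line = script.readLine()) != null) {
            if (line.isEmpty() || line.startsWith("#")) {
                System.out.println(line);          // echo comments into the report
                continue;
            }
            String[] parts = line.split("\\|", 2); // one test per line: input|expected
            String actual = systemUnderTest(parts[0]);
            if (actual.equals(parts[1])) {
                passed++;
                System.out.println("PASS  " + line);
            } else {
                failed++;
                System.out.println("FAIL  " + line + "  (got: " + actual + ")");
            }
        }
        script.close();
        System.out.println(passed + " passed, " + failed + " failed");
    }
}

[A script for it might look like this, and the whole suite runs with "java RegressionHarness tests.txt", giving the one-glance pass/fail report Martin recommends:

# trims whitespace and upper-cases
 hello |HELLO
abc|ABC
]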
From: Joshua Cranmer on 23 Feb 2010 18:00
On 02/23/2010 05:45 PM, Martin Gregorie wrote:
> IME it's well worth the effort of writing a scriptable regression test
> harness for any chunk of code that's more complex than, say, a package
> that writes and parses CSV files.

After having found a bug in a simple getter, I've found it easier to recommend always writing a test for everything, no matter how simple it is.

Also, any time you find and fix a bug, add a check for it to your test suite to make sure you don't regress the bug fix.

--
Beware of bugs in the above code; I have only proved it correct, not tried it. -- Donald E. Knuth
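[A hypothetical illustration of pinning a bug fix with a test, as Joshua suggests, again using JUnit 4. The Price class and the cents/dollars bug are invented for the example; the pattern is that once a bug is fixed, a small test keeps it from creeping back in during a later refactoring.]

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class PriceRegressionTest {

    // Toy class standing in for the real one that had the bug.
    static class Price {
        private final int cents;
        Price(int cents) { this.cents = cents; }
        // The (now fixed) bug: this once returned 'cents' directly instead of dollars.
        double getDollars() { return cents / 100.0; }
    }

    @Test
    public void getDollarsConvertsFromCents() {
        assertEquals(12.34, new Price(1234).getDollars(), 0.0001);
    }
}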