From: Boudewijn Dijkstra on
On Thu, 01 Apr 2010 08:59:33 +0200, D Yuniskis
<not.going.to.be(a)seen.com> wrote:
> -jg wrote:
>> On Apr 1, 5:02 am, d_s_klein <d_s_kl...(a)yahoo.com> wrote:
>>> To answer your last question, most software is not formally tested. I
>>> asked a developer at a large software company how the product was
>>> tested, and the reply was "we outsource that". When I asked how they
>>> determined if it was tested _properly_, the reply was "we'll outsource
>>> that too".
>> Seems they follow that universal regression test rule,
>> the one that realizes that within the words
>> "THE CUSTOMER"
>> you can always find these words too :
>> "CHUM TESTER"
>> Using the customer is the ultimate outsourcing coup!!
>
> Unfortunately, this attitude seems to cause many
> firms TO CURSE THEM instead of embracing them.

Unless of course they are CUTE MOTHERS who CHEER UTMOST or MUTTER ECHOS.


--
Made with Opera's revolutionary e-mail program:
http://www.opera.com/mail/
(remove the obvious prefix to reply by mail)
From: Paul E. Bennett on
-jg wrote:

> On Apr 1, 5:02 am, d_s_klein <d_s_kl...(a)yahoo.com> wrote:
>> To answer your last question, most software is not formally tested. I
>> asked a developer at a large software company how the product was
>> tested, and the reply was "we outsource that". When I asked how they
>> determined if it was tested _properly_, the reply was "we'll outsource
>> that too".
>
> Seems they follow that universal regression test rule,
> the one that realizes that within the words
> "THE CUSTOMER"
> you can always find these words too :
> "CHUM TESTER"
>
> Using the customer is the ultimate outsourcing coup!!
>
> -jg

One might cynically say,
that is the MS way.

For anyone interested in what true testing is about, I would recommend a
very nice book by Gerald M. Weinberg (he who gave us the bible on Technical
Reviews and Inspections). The book is called "Perfect Software and other
illusions about testing", ISBN 978-0-932533-69-9. I'll say no more than that
every developer should read it and then make their management read it too.

One rule that all development companies should make hard and fast is: a
code developer shall only submit code that has undergone the sanity checks
(compiles clean without warnings, Lint, static analysis, test build and
functional test...) performed by himself. That way he knows he has submitted
a reasonable piece of work and the tester's role has been eased somewhat.
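A minimal sketch of what that pre-submission functional test might look like (the function, names, and values are purely illustrative, not from any real project): an assert-instrumented routine with a self-test that compiles away in the production build.

```c
#include <assert.h>
#include <stdio.h>

/* Hypothetical module function: clamp a raw ADC reading into range. */
static int clamp_reading(int raw, int lo, int hi)
{
    assert(lo <= hi);               /* invariant: sane bounds */
    if (raw < lo) return lo;
    if (raw > hi) return hi;
    return raw;
}

/* Minimal self-test the developer runs before submitting the module;
 * compiled only when SELF_TEST is defined, so it costs nothing in
 * the production build. */
#ifdef SELF_TEST
int main(void)
{
    assert(clamp_reading(-5,   0, 1023) == 0);    /* below range */
    assert(clamp_reading(2000, 0, 1023) == 1023); /* above range */
    assert(clamp_reading(512,  0, 1023) == 512);  /* in range */
    puts("clamp_reading: all self-tests passed");
    return 0;
}
#endif
```

Building with -DSELF_TEST and running the binary before every submit gives the developer the "functional test performed by himself" at near-zero cost.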

I consider that writing code for a project should always be just a small
proportion of the development time. Most time should be spent in developing,
testing and correcting the technical specification before the coding (or
hardware build) is started.

--
********************************************************************
Paul E. Bennett...............<email://Paul_E.Bennett(a)topmail.co.uk>
Forth based HIDECS Consultancy
Mob: +44 (0)7811-639972
Tel: +44 (0)1235-510979
Going Forth Safely ..... EBA. www.electric-boat-association.org.uk..
********************************************************************

From: d_s_klein on
On Mar 31, 5:35 pm, D Yuniskis <not.going.to...(a)seen.com> wrote:
> d_s_klein wrote:
> > On Mar 28, 11:15 am, D Yuniskis <not.going.to...(a)seen.com> wrote:
> >> Hi,
>
> >> I use a *lot* of invariants in my code.  Some are there
> >> just during development; others "ride shotgun" over the
> >> code at run-time.
>
> >> So, a lot of my formal testing is concerned with
> >> verifying these things *can't* happen *and*, verifying
> >> the intended remedy when they *do*!
>
> >> I'm looking for a reasonably portable (in the sense of
> >> "not toolchain dependant") way of presenting these
> >> regression tests that won't require the scaffolding that
> >> I particularly use.
>
> >> For example, some structures (not "structs") that I
> >> enforce may not be possible to create *in* a source
> >> file.  For these, I create "initializers" that
> >> actively build a nonconforming image prior to unleashing
> >> the code-under-test.  If written correctly, the code
> >> being tested *should* detect the "CAN'T HAPPEN"
> >> conditions represented in that image and react
> >> accordingly (of course, this differs from the production
> >> run-time behavior as I *want* to see its results).
>
> >> I can't see any other flexible way of doing this that
> >> wouldn't rely on knowing particulars of the compiler
> >> and target a priori.
>
> >> Or, do folks just not *test* these sorts of things (formally)?
>
> > Yes, they do get tested when "formal" testing is done.  However, I
> > think that you can appreciate that a lot of software is being sent out
> > without formal testing.
>
> So, folks just write code and *claim* it works?  And their
> employers are OK with that??  (sorry, I'm not THAT cynical... :< )
>
> > For example, "formal" testing requires that every case in a switch
> > statement be entered; even the default.
>
> Yes.  Hence the advantages of regression testing -- so you
> don't have to *keep* repeating these tests each time you
> commit changes to a file/module.
>
> > <rant>
> > Most companies task the junior engineers with testing - the
> > inexperienced that really don't know where to look for errors, or how
> to expose them.  These companies contend that the persons experienced
> > enough to do the job are "too expensive" for "just testing".  IMnsHO
> > this is why most of the software out there is pure cr(a)p.
> > </rant>
>
> I would contend that those folks are "too expensive" to be
> writing *code*!  That can easily be outsourced and/or
> automated.  OTOH, good testing (the flip side of "specification")
> is something that you can *only* do if you have lots of experience
> and insight into how things *can* and *do* go wrong.
>
> > To answer your last question, most software is not formally tested.  I
> > asked a developer at a large software company how the product was
> > tested, and the reply was "we outsource that".  When I asked how they
> > determined if it was tested _properly_, the reply was "we'll outsource
> > that too".
>
> <frown>  I refuse (?) to believe that is the case across the board.
> You just can't stay in business if you have no idea as to the
> quality of your product *or* mechanisms to control same!

Please don't refuse to believe.

There are companies that are doing -quite- well that have no fancy
idea how to test their product(s). People respond to the advertising,
buy the product, and that's all that really matters.

"And it was my idea" is an excellent example. It's cheaper to find
new customers than it is to fix the defects.

RK
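For what it's worth, the "nonconforming image" technique quoted above can be sketched in portable C. Everything here is illustrative (the struct, field names, and return codes are invented for the example): a run-time invariant check whose switch default is itself one of the "CAN'T HAPPEN" cases a formal test must reach, plus a regression driver that deliberately builds broken images to prove the check fires.

```c
#include <assert.h>
#include <string.h>

/* Hypothetical packet whose fields carry an invariant: 'state' must
 * be one of the named values, and 'len' must not exceed the buffer. */
enum state { IDLE = 0, ACTIVE = 1, DONE = 2 };

struct packet {
    enum state    state;
    unsigned      len;
    unsigned char buf[16];
};

/* Run-time invariant check: 0 if the image is sane, otherwise a code
 * identifying which "CAN'T HAPPEN" condition was hit.  The default
 * case of the switch is itself a path the tests must exercise. */
static int check_packet(const struct packet *p)
{
    if (p->len > sizeof p->buf)
        return 1;               /* CAN'T HAPPEN: length overruns buffer */
    switch (p->state) {
    case IDLE:
    case ACTIVE:
    case DONE:
        return 0;               /* conforming image */
    default:
        return 2;               /* CAN'T HAPPEN: state out of range */
    }
}

/* Regression driver: builds a conforming image, then two deliberately
 * nonconforming ones that normal source-level assignment would never
 * produce, and verifies the checker reacts to each.  Returns 0 iff
 * every case behaved as intended. */
static int regression_cant_happen(void)
{
    struct packet p;

    memset(&p, 0, sizeof p);
    p.state = ACTIVE;
    p.len   = 8;
    if (check_packet(&p) != 0) return -1;    /* sane image must pass */

    p.len = 999;                             /* overrun the buffer bound */
    if (check_packet(&p) != 1) return -2;

    p.len = 8;
    memset(&p.state, 0xFF, sizeof p.state);  /* force an out-of-range enum */
    if (check_packet(&p) != 2) return -3;

    return 0;
}
```

A test main would simply assert that regression_cant_happen() returns 0; because the corruption is done via memset rather than assignment, the sketch stays independent of any particular toolchain.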
From: Peter Dickerson on
"-jg" <jim.granville(a)gmail.com> wrote in message
news:af038acb-68d7-4099-bacb-1c831197285f(a)k19g2000yqn.googlegroups.com...
On Apr 1, 5:02 am, d_s_klein <d_s_kl...(a)yahoo.com> wrote:
> To answer your last question, most software is not formally tested. I
> asked a developer at a large software company how the product was
> tested, and the reply was "we outsource that". When I asked how they
> determined if it was tested _properly_, the reply was "we'll outsource
> that too".

Seems they follow that universal regression test rule,
the one that realizes that within the words
"THE CUSTOMER"
you can always find these words too :
"CHUM TESTER"

Using the customer is the ultimate outsourcing coup!!
----------------------

In the context of analytical instruments, testing was something you did to
prove each instrument worked. The trick was to restrict the testing to only
those tests that are likely to work.

Even very recently I found out that pass bands were being set to the
instrument spec accuracy PLUS the reference material tolerance rather than
MINUS. All reference material calibrations were traceable to NIST, so that's
all right then! When challenged formally to explain themselves, the reply
was "That's how we did it when we worked for <competitor>".

Peter


From: D Yuniskis on
Hi Peter,

Peter Dickerson wrote:
> "-jg" <jim.granville(a)gmail.com> wrote in message
> news:af038acb-68d7-4099-bacb-1c831197285f(a)k19g2000yqn.googlegroups.com...
> On Apr 1, 5:02 am, d_s_klein <d_s_kl...(a)yahoo.com> wrote:
>> To answer your last question, most software is not formally tested. I
>> asked a developer at a large software company how the product was
>> tested, and the reply was "we outsource that". When I asked how they
>> determined if it was tested _properly_, the reply was "we'll outsource
>> that too".
>
> In the context of analytical instruments, testing was something you did to
> prove each instrument worked. The trick was to restrict the testing to only
> those tests that are likely to work.

Ha!

"The light in my office is burned out!"
"Hmmm... *mine* works..."

> Even very recently I found out that pass bands were being set to the
> instrument spec accuracy PLUS the reference material tolerance rather than
> MINUS. All reference material calibrations were traceable to NIST, so that's
> all right then! When challenged formally to explain themselves, the reply
> was "That's how we did it when we worked for <competitor>".

I don't think much thought goes into testing. I think
folks see it as "nit picking". Yet they don't seem to be
bothered when some customer complains about something
that doesn't work (AT ALL!).

"Well, we didn't *test* for that..."

(OK, so what *did* you test for? What product are we
*really* making since we obviously aren't making the
product that we advertise!)