From: Flash Gordon on
Andy Champ wrote:
> Lew wrote:
>>
>> Andy Champ wrote:
>>> In 1982 the manager may well have been right to stop them wasting
>>> their time fixing a problem that wasn't going to be a problem for
>>> another 18 years or so. The software was probably out of use long
>>> before that.
>>
>> Sure, that's why so many programs had to be re-written in 1999.
>>
>> Where do you get your conclusions?
>>
>
> Pretty well everything I saw back in 1982 was out of use by 1999. How
> much software do you know that made the transition?

<snip>

> OK, so how about embedded stuff? That dot-matrix printer became a
> laserjet. The terminal concentrator lost its RS232 ports, gained a
> proprietary LAN, then lost that and got ethernet. And finally
> evaporated in a cloud of client-server computing smoke.

I know there is software flying around today that is running on Z80
processors (well, the military variant of them) and the plan in the late
90s was for it to continue for another 20 years (I don't know the
details, but a customer signed off on some form of ongoing support
contract). Admittedly the software I used was not doing date processing
(apart from the test rigs, which put the date on printouts; I tested
those to "destruction", which turned out to be 2028).

So yes, software from the 80s is still in active use today in the
embedded world and planned to be in use for a long time to come.

> I'm not so up on the mainframe world - but I'll be surprised if the
> change from dumb terminals to PC clients didn't have a pretty major
> effect on the software down the back.
>
> Where do you get your conclusions that there was much software out there
> that was worth re-writing eighteen years ahead of time? Remember to
> allow for compound interest on the money invested on that development...

Remember to allow for the fact that software does continue to be used
for a *long* time in some industries.
--
Flash Gordon
From: Wojtek on
Lew wrote :
> The point of my example wasn't that Y2K should have been handled earlier, but
> that the presence of the bug was not due to developer fault but management
> decision, a point you ignored.

At the time (70's etc) hard drive space was VERY expensive. All sorts
of tricks were being used to save that one bit of storage. Remember
COBOL's packed decimal?

So the decision to drop the century from the date was not only based on
management but on hard economics.

Which, I will grant, is not a technical decision, though the solution
was...

And at the time Y2K was created it was not a bug. It was a money saving
feature. Probably worth many millions.

--
Wojtek :-)


From: Lew Pitcher on
On February 11, 2010 19:15, in comp.lang.c, nowhere(a)a.com wrote:

> Lew wrote :
>> The point of my example wasn't that Y2K should have been handled earlier,
>> but that the presence of the bug was not due to developer fault but
>> management decision, a point you ignored.
>
> At the time (70's etc) hard drive space was VERY expensive. All sorts
> of tricks were being used to save that one bit of storage. Remember
> COBOL's packed decimal?

Packed decimal (the COBOL COMP-3 datatype) wasn't a "COBOL" thing; it was an
IBM S370 "mainframe" thing. IBM's 370 instruction set included a large
number of operations on "packed decimal" values, including data conversions
to and from fixed-point binary, and math operations. IBM's COBOL took
advantage of these facilities with the (non-ANSI) COMP-3 datatype.
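
For anyone who hasn't met it: packed decimal stores two decimal digits per
byte, with the sign in the low nibble of the last byte. Nobody built COMP-3
fields by hand like this, of course (the compiler and the PACK instruction
did that), but a rough C sketch of the layout -- the helper name and the
test values are purely mine -- shows why a four-digit year costs exactly one
more byte than a two-digit one:

#include <stdio.h>

/*
 * Illustrative only: pack an unsigned value into "ndigits" decimal digits
 * of S/370-style packed decimal -- two digits per byte, with the sign
 * nibble (0xC = positive) in the low half of the last byte.
 */
static void pack_decimal(unsigned long value, unsigned char *out, int ndigits)
{
    int nbytes = ndigits / 2 + 1;  /* digits plus a sign nibble, whole bytes */
    int i;

    out[nbytes - 1] = 0x0C | (unsigned char)((value % 10) << 4);
    value /= 10;

    for (i = nbytes - 2; i >= 0; i--) {
        out[i] = (unsigned char)(value % 10);          /* low nibble  */
        value /= 10;
        out[i] |= (unsigned char)((value % 10) << 4);  /* high nibble */
        value /= 10;
    }
}

int main(void)
{
    unsigned char yymmdd[4], yyyymmdd[5];
    int i;

    pack_decimal(991231UL, yymmdd, 6);      /* 1999-12-31, 2-digit year */
    pack_decimal(19991231UL, yyyymmdd, 8);  /* same date, 4-digit year  */

    for (i = 0; i < 4; i++) printf("%02X ", yymmdd[i]);
    printf(" <- 4 bytes\n");                /* 09 91 23 1C    */
    for (i = 0; i < 5; i++) printf("%02X ", yyyymmdd[i]);
    printf(" <- 5 bytes\n");                /* 01 99 91 23 1C */
    return 0;
}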

As for Y2K, there was no "space advantage" in using COMP-3, nor was there an
overriding datatype reason to store dates in COMP-3. While "space
requirements" are often given as the reason for the Y2K truncated dates,
the truncation usually boiled down to three different reasons:
1) "That's what the last guy did" (maintaining existing code and design
patterns),
2) "We'll stop using this before it becomes an issue" (code longevity), and
3) "We will probably rewrite this before it becomes an issue"
(designer/programmer "laziness").

Space requirements /may/ have been the initial motivation for truncated
dates, but that motivation ceased being an issue in the 1970's, with
cheap(er) high(er) density data storage.

FWIW: I spent 30+ years designing, writing, and maintaining S370 Assembler
and COBOL programs for a financial institution. I have some experience in
both causing and fixing the "Y2K bug".

> So the decision to drop the century from the date was not only based on
> management but on hard economics.
>
> Which, I will grant, is not a technical decision, though the solution
> was...
>
> And at the time Y2K was created it was not a bug.

I agree.

> It was a money saving feature. Probably worth many millions.

I disagree. It was a money-neutral feature (as far as it was a feature) that
would have (and ultimately did) cost millions to change.

Alone, it didn't save much: there's enough wasted space at the end of each
of those billions of mainframe records (alignment issues, don't you know)
to easily have accommodated two more digits (one 8-bit byte) in each
critical date recorded.
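
A toy illustration of that alignment point, in C terms rather than COBOL or
Assembler (the layouts are invented, and the exact padding is of course
implementation-dependent): on a typical machine that aligns a 4-byte field on
a 4-byte boundary, both of these records come out the same size, because the
wider date merely eats a pad byte.

#include <stdio.h>

/* Hypothetical record layouts -- real mainframe records were COBOL or
 * Assembler, not C structs.  The 4-byte "amount" wants 4-byte alignment,
 * so the first layout carries a hidden pad byte that the second layout
 * simply puts to use. */
struct rec_2digit_year {
    unsigned char acct[7];      /* account number                     */
    unsigned char yymmdd[4];    /* packed date, 2-digit year          */
    unsigned int  amount;       /* aligned; 1 pad byte precedes this  */
};

struct rec_4digit_year {
    unsigned char acct[7];
    unsigned char yyyymmdd[5];  /* packed date, 4-digit year          */
    unsigned int  amount;       /* pad byte consumed; same total size */
};

int main(void)
{
    printf("2-digit-year record: %lu bytes\n",
           (unsigned long)sizeof(struct rec_2digit_year));
    printf("4-digit-year record: %lu bytes\n",
           (unsigned long)sizeof(struct rec_4digit_year));
    return 0;
}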

The cost would have been in time and manpower (identifying, coding, testing,
& conversion) to expand those date fields after the fact. And, that's
exactly where the Y2K costs wound up. /That's/ the expense that management
didn't want in the 70's and 80's (and got with interest in the 90's).

--
Lew Pitcher
Master Codewright & JOAT-in-training | Registered Linux User #112576
Me: http://pitcher.digitalfreehold.ca/ | Just Linux: http://justlinux.ca/
---------- Slackware - Because I know what I'm doing. ------


From: Seebs on
On 2010-02-12, Lew Pitcher <lpitcher(a)teksavvy.com> wrote:
> Space requirements /may/ have been the initial motivation for truncated
> dates, but that motivation ceased being an issue in the 1970's, with
> cheap(er) high(er) density data storage.

Furthermore, what with the popularity of 30-year mortgages, people were
dealing with Y2K in or before 1970...
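
Right -- with two-digit years the arithmetic breaks the moment a term crosses
the century: 70 + 30 wraps to 00, which then compares as *earlier* than the
origination date. A trivial sketch (the field names are invented):

#include <stdio.h>

int main(void)
{
    int origination_yy = 70;   /* 1970, stored as two digits */
    int term_years     = 30;
    int maturity_yy    = (origination_yy + term_years) % 100;  /* wraps to 0 */

    printf("maturity year field: %02d\n", maturity_yy);
    if (maturity_yy < origination_yy)
        printf("maturity sorts before origination -- the Y2K problem, in 1970\n");
    return 0;
}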

-s
--
Copyright 2010, all wrongs reversed. Peter Seebach / usenet-nospam(a)seebs.net
http://www.seebs.net/log/ <-- lawsuits, religion, and funny pictures
http://en.wikipedia.org/wiki/Fair_Game_(Scientology) <-- get educated!
From: Leif Roar Moldskred on
In comp.lang.java.programmer Wojtek <nowhere(a)a.com> wrote:
>
> And at the time Y2K was created it was not a bug. It was a money saving
> feature. Probably worth many millions.

Not really. Remember, you can pack 256 years into a single 8-bit byte if
you want to, but in most cases of the Y2K problem people had stored a
range of only 100 years in two bytes -- quite wasteful of space.
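
For instance, a year kept as an offset from some epoch (1900 here, chosen
just for the example) covers 256 years in one byte, while two character
digits take twice the space and still only distinguish a single century:

#include <stdio.h>

int main(void)
{
    /* one byte, read as an offset from 1900: covers 1900..2155, Y2K-clean */
    unsigned char year_off = 1999 - 1900;

    /* two character digits: twice the storage, only 100 distinct years */
    char year_digits[2] = { '9', '9' };

    printf("offset form: %u -> %d\n", (unsigned)year_off, 1900 + year_off);
    printf("digit form : %c%c (century unknown)\n",
           year_digits[0], year_digits[1]);
    return 0;
}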

In some cases it came from too tight an adherence to the manual business
process that was modeled -- remember the paper forms with "19" pre-printed
and then two digits' worth of space to fill out? Those got computerised
and the two-digit year tagged along.

In other cases it boiled down to "this is how we've always done it."

--
Leif Roar Moldskred