From: Sven Mascheck on
pk wrote:

> I mean, if the system does not have large file support, and thus "command
> bigfile" fails, then "cat bigfile" would be likely to fail as well.

The example only makes sense on systems with large file support, and I picked
it up from this group (thanks, Alan, for explaining the stat() issue).
I considered it not purely academic, because gzip is often installed
separately on many systems, and there were releases with the bug, not
compiled with large file support by default - but the system's own cat
certainly was. It might have been relevant during the transition period,
when large file support was becoming well established.
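
For illustration, here is a minimal C sketch of that stat()/open() issue
(the path "bigfile" and its >2GB size are assumptions, and the exact error
text varies by platform). Built without large file support on an old 32-bit
system, both calls fail with EOVERFLOW on a big file; fed the same data
through a pipe, as in "cat bigfile | gzip", the program never opens the file
itself and so never hits that check:

/* build without LFS:  cc lfsdemo.c -o lfsdemo                        */
/* build with LFS:     cc -D_FILE_OFFSET_BITS=64 lfsdemo.c -o lfsdemo */
#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/stat.h>
#include <sys/types.h>
#include <unistd.h>

int main(void)
{
    struct stat st;
    int fd;

    /* Without LFS, stat() on a >2GB file fails with EOVERFLOW,
       because st_size does not fit into a 32-bit off_t. */
    if (stat("bigfile", &st) == -1)
        fprintf(stderr, "stat: %s\n", strerror(errno));
    else
        printf("size: %lld bytes\n", (long long) st.st_size);

    /* open() without O_LARGEFILE is rejected the same way. */
    fd = open("bigfile", O_RDONLY);
    if (fd == -1)
        fprintf(stderr, "open: %s\n", strerror(errno));
    else
        close(fd);

    return 0;
}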

BTW, the original award (what else, if not completely for fun?) explicitly
called for counter-examples, and they serve best when contemplated, not
condemned. They only need to be useful or fancy in some way.
--
[typo-supersede]
From: pk on
Sven Mascheck wrote:

> pk wrote:
>
>> I mean, if the system does not have large file support, and thus "command
>> bigfile" fails, then "cat bigfile" would be likely to fail as well.
>
> The example only makes sense on systems with large file support, and I
> picked it up from this group (thanks, Alan, for explaining the stat()
> issue). I considered it not purely academic, because gzip is often
> installed separately on many systems, and there were releases with the
> bug, not compiled with large file support by default - but the system's
> own cat certainly was. It might have been relevant during the transition
> period, when large file support was becoming well established.

Ah, thanks for confirming this.

> BTW, the original award (what else, if not completely for fun?) explicitly
> called for counter-examples, and they serve best when contemplated, not
> condemned. They only need to be useful or fancy in some way.

Sorry if I gave the wrong impression, but I was just trying to understand,
not condemning or criticising anything; I still maintain that your page
(actually, the other shell pages as well) is a great resource.
From: Alan Curry on
In article <hm65o5$qq3$1(a)speranza.aioe.org>, pk <pk(a)pk.invalid> wrote:
|Apparently "gzip" is (or was) such software on some systems.

Look at the timestamps on the releases (from ftp.gnu.org):

-rw-r--r-- 1 1003 65534 220623 Aug 20 1993 gzip-1.2.4.tar.gz
-rw-r--r-- 1 1003 65534 220774 Feb 03 1999 gzip-1.2.4a.tar.gz

Between 1993 and 1999 there was no gzip release. At the beginning of that
time frame, files larger than 2G were not really common, and OS support for
them was still under development. It would have been almost impossible for
gzip to support large files at the time 1.2.4 was released.

But by the time the next release came, large file support at the libc level
was basically universal, and gzip was one of the last stragglers that didn't
make use of it (unless the OS vendor included a patched gzip, or the OS was
full-on 64-bit-only, in which case it was never an issue).

Actually, on taking a closer look, 1.2.4a didn't even include any code
changes, so it didn't support large files either. gzip didn't catch up until
1.3, released in late 2001!
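
As an aside, "making use of it" was a fairly small change at the source
level: either build with the LFS feature-test macros so the plain interfaces
get a 64-bit off_t, or call the transitional 64-bit interfaces (open64(),
stat64(), ...) explicitly. A rough sketch of the latter - illustrative only,
not gzip's actual code:

/* Transitional LFS interfaces, exposed by _LARGEFILE64_SOURCE on
   32-bit systems of that era. */
#define _LARGEFILE64_SOURCE
#include <fcntl.h>

int open_big(const char *path)
{
    /* open64() handles >2GB files even while the program's default
       off_t is still 32 bits. */
    return open64(path, O_RDONLY);
}

The other route, adding -D_FILE_OFFSET_BITS=64 at build time (getconf
LFS_CFLAGS reports the vendor's recommended flags), usually needs little or
no source change and keeps the plain open()/stat() calls.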

--
Alan Curry
From: Sven Mascheck on
pk wrote:

> I was just trying to understand, not condemning or criticising anything

Now I'm sorry, I only meant the examples, not you.
From: Alan Curry on
I must make a revision to my earlier statement:

In article <hm731o$kvr$1(a)speranza.aioe.org>, I wrote:
|Between 1993 and 1999 there was no gzip release. At the beginning of that
|time frame, files larger than 2G were not really common, and OS support for

What I should have said was "At the beginning of that time frame, HARD DRIVES
weren't even that big yet."

--
Alan Curry