From: Lothar Kimmeringer on
Eric Sosman wrote:

> Intervals between contemporary events can (sometimes) be
> measured to nanosecond precision. In the laboratory, femtosecond
> precision may be attainable. But extending the scale to longer
> periods is pure fiction! Claim: You cannot measure the time
> between an event at lunchtime yesterday and one at lunchtime today
> with nanosecond precision.

With intervals of that size, nobody will anyway. The point is that
you don't want to change data structures depending on the
size of the interval. You also want to keep some kind of
reserve for the future, to avoid the problem the runtime libraries
of Borland TurboPascal had, where a cycle-counter value became
larger than the maximum value that can be represented by a Word.
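A minimal sketch of that wraparound failure mode (in Python, not the original TurboPascal):

```python
# A counter held in a 16-bit unsigned "Word" silently wraps to 0
# once it passes 65535, while a wider type still has reserve.
WORD_MAX = 0xFFFF  # largest value a 16-bit Word can hold (65535)

def next_word(counter: int) -> int:
    """Advance a Word-sized counter, wrapping modulo 2**16."""
    return (counter + 1) & WORD_MAX

c = next_word(WORD_MAX)
print(c)  # 0 -- the counter has wrapped, not grown
```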

> You probably can't measure it with
> millisecond precision, and even one-second precision would require
> a good deal of care.

As with all physical measurements you have an error. Assuming it
to be constant (e.g. 0.01%), an interval of 10 µs can be expected
to lie in the range of 9999 ns to 10001 ns, whereas in terms of
a day, the error alone is plus or minus about 9 seconds.
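The arithmetic behind those figures, as a quick sketch (assuming the same constant relative error of 0.01%):

```python
# A constant relative error of 0.01% applied to a short and a long interval.
REL_ERR = 1e-4                   # 0.01%

interval_ns = 10_000             # 10 microseconds expressed in ns
err_ns = interval_ns * REL_ERR   # +/- 1 ns -> 9999 ns .. 10001 ns

day_s = 24 * 60 * 60             # one day = 86400 s
err_day_s = day_s * REL_ERR      # +/- 8.64 s, i.e. roughly 9 seconds

print(err_ns, err_day_s)
```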

> Even in one single lunch hour, you cannot measure the time
> between the swallow and the belch with nanosecond precision.

Most measurements in IT I'm aware of concern the time of a
method call, the execution time of an SQL query, the round-trip
time of a network request, etc. Hopefully most of them
are in the range of micro- or milliseconds, so having a data
structure with some kind of "reserve" for the future isn't the
worst thing to have.
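For those kinds of measurements, a monotonic nanosecond counter is the usual tool; a minimal sketch in Python (the workload is just a stand-in):

```python
import time

# Time a call with the monotonic nanosecond counter. The reading is
# reported in ns, but actual resolution and accuracy are platform-
# dependent -- the unit carries more "reserve" than the clock delivers.
start = time.perf_counter_ns()
sum(range(100_000))              # stand-in for a method call or query
elapsed_ns = time.perf_counter_ns() - start

print(f"elapsed: {elapsed_ns} ns ({elapsed_ns / 1e6:.3f} ms)")
```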


Regards, Lothar
--
Lothar Kimmeringer E-Mail: spamfang(a)kimmeringer.de
PGP-encrypted mails preferred (Key-ID: 0x8BC3CD81)

Always remember: The answer is forty-two, there can only be wrong
questions!
From: Arne Vajhøj on
On 03-03-2010 21:21, Eric Sosman wrote:
> On 3/3/2010 8:57 PM, Arne Vajhøj wrote:
>> On 03-03-2010 20:45, Eric Sosman wrote:
>>> On 3/2/2010 4:28 PM, Peter K wrote:
>>>> [...]
>>>> C# (.net) ticks are based on nanoseconds since 1/1/0001.
>>>
>>> Assertion: The low-order thirty-two bits of such a value
>>> at any given moment (NOW!) are unknown -- and unknowable.
>>
>> It is not 1 ns unit but 100 ns units. And the low 32 bits
>> is around 430 seconds.
>
> Thanks for the information. I'll revise my claim: "The
> low-order twenty-five bits are unknown and unknowable."
>
>> We do probably not have any measurements at 430 seconds accuracy
>> for year 1. But do have it today. And it would be rather inconvenient
>> to use different units for different periods.
>
> Intervals between contemporary events can (sometimes) be
> measured to nanosecond precision. In the laboratory, femtosecond
> precision may be attainable. But extending the scale to longer
> periods is pure fiction! Claim: You cannot measure the time
> between an event at lunchtime yesterday and one at lunchtime today
> with nanosecond precision. You probably can't measure it with
> millisecond precision, and even one-second precision would require
> a good deal of care.
>
> Even in one single lunch hour, you cannot measure the time
> between the swallow and the belch with nanosecond precision.

All true.

But still it is a lot easier to use the same unit for
both long and short intervals.

Arne

From: Eric Sosman on
On 3/4/2010 10:56 PM, Arne Vajhøj wrote:
> On 03-03-2010 21:21, Eric Sosman wrote:
>> On 3/3/2010 8:57 PM, Arne Vajhøj wrote:
>>> On 03-03-2010 20:45, Eric Sosman wrote:
>>>> On 3/2/2010 4:28 PM, Peter K wrote:
>>>>> [...]
>>>>> C# (.net) ticks are based on nanoseconds since 1/1/0001.
>>>>
>>>> Assertion: The low-order thirty-two bits of such a value
>>>> at any given moment (NOW!) are unknown -- and unknowable.
>>>
>>> It is not 1 ns unit but 100 ns units. And the low 32 bits
>>> is around 430 seconds.
>>
>> Thanks for the information. I'll revise my claim: "The
>> low-order twenty-five bits are unknown and unknowable."
>>
>>> We do probably not have any measurements at 430 seconds accuracy
>>> for year 1. But do have it today. And it would be rather inconvenient
>>> to use different units for different periods.
>>
>> Intervals between contemporary events can (sometimes) be
>> measured to nanosecond precision. In the laboratory, femtosecond
>> precision may be attainable. But extending the scale to longer
>> periods is pure fiction! Claim: You cannot measure the time
>> between an event at lunchtime yesterday and one at lunchtime today
>> with nanosecond precision. You probably can't measure it with
>> millisecond precision, and even one-second precision would require
>> a good deal of care.
>>
>> Even in one single lunch hour, you cannot measure the time
>> between the swallow and the belch with nanosecond precision.
>
> All true.
>
> But still it is a lot easier to use the same unit for
> both long and short intervals.

I've no quarrel with measuring *intervals* in tiny units.
The thing that started me ranting and foaming at the mouth was
the statement that "C# (.net) ticks are based on nanoseconds
since 1/1/0001." *That's* the association I regard as fiction,
bordering on nonsense.

I seem to have mislaid those pills the court psychiatrist
ordered me to take. Anybody know where they are? ;-)

--
Eric Sosman
esosman(a)ieee-dot-org.invalid
From: Arne Vajhøj on
On 05-03-2010 09:14, Eric Sosman wrote:
> On 3/4/2010 10:56 PM, Arne Vajhøj wrote:
>> On 03-03-2010 21:21, Eric Sosman wrote:
>>> On 3/3/2010 8:57 PM, Arne Vajhøj wrote:
>>>> On 03-03-2010 20:45, Eric Sosman wrote:
>>>>> On 3/2/2010 4:28 PM, Peter K wrote:
>>>>>> [...]
>>>>>> C# (.net) ticks are based on nanoseconds since 1/1/0001.
>>>>>
>>>>> Assertion: The low-order thirty-two bits of such a value
>>>>> at any given moment (NOW!) are unknown -- and unknowable.
>>>>
>>>> It is not 1 ns unit but 100 ns units. And the low 32 bits
>>>> is around 430 seconds.
>>>
>>> Thanks for the information. I'll revise my claim: "The
>>> low-order twenty-five bits are unknown and unknowable."
>>>
>>>> We do probably not have any measurements at 430 seconds accuracy
>>>> for year 1. But do have it today. And it would be rather inconvenient
>>>> to use different units for different periods.
>>>
>>> Intervals between contemporary events can (sometimes) be
>>> measured to nanosecond precision. In the laboratory, femtosecond
>>> precision may be attainable. But extending the scale to longer
>>> periods is pure fiction! Claim: You cannot measure the time
>>> between an event at lunchtime yesterday and one at lunchtime today
>>> with nanosecond precision. You probably can't measure it with
>>> millisecond precision, and even one-second precision would require
>>> a good deal of care.
>>>
>>> Even in one single lunch hour, you cannot measure the time
>>> between the swallow and the belch with nanosecond precision.
>>
>> All true.
>>
>> But still it is a lot easier to use the same unit for
>> both long and short intervals.
>
> I've no quarrel with measuring *intervals* in tiny units.
> The thing that started me ranting and foaming at the mouth was
> the statement that "C# (.net) ticks are based on nanoseconds
> since 1/1/0001." *That's* the association I regard as fiction,
> bordering on nonsense.

Nanoseconds in year 1 are absurd.

But it is not absurd to measure nanoseconds (or at least
milliseconds) today.

And it is not absurd to be able to store dates many years back.

And it is not absurd to use the same unit for all times.

So we have now proven that:
3 x not absurd = absurd

Arne


From: Peter K on


"Eric Sosman" <esosman(a)ieee-dot-org.invalid> wrote in message
news:hmr3jt$biv$1(a)news.eternal-september.org...
> On 3/4/2010 10:56 PM, Arne Vajhøj wrote:
>> On 03-03-2010 21:21, Eric Sosman wrote:
>>> On 3/3/2010 8:57 PM, Arne Vajhøj wrote:
>>>> On 03-03-2010 20:45, Eric Sosman wrote:
>>>>> On 3/2/2010 4:28 PM, Peter K wrote:
>>>>>> [...]
>>>>>> C# (.net) ticks are based on nanoseconds since 1/1/0001.
>>>>>
>>>>> Assertion: The low-order thirty-two bits of such a value
>>>>> at any given moment (NOW!) are unknown -- and unknowable.
>>>>
>>>> It is not 1 ns unit but 100 ns units. And the low 32 bits
>>>> is around 430 seconds.
>>>
>>> Thanks for the information. I'll revise my claim: "The
>>> low-order twenty-five bits are unknown and unknowable."
>>>
>>>> We do probably not have any measurements at 430 seconds accuracy
>>>> for year 1. But do have it today. And it would be rather inconvenient
>>>> to use different units for different periods.
>>>
>>> Intervals between contemporary events can (sometimes) be
>>> measured to nanosecond precision. In the laboratory, femtosecond
>>> precision may be attainable. But extending the scale to longer
>>> periods is pure fiction! Claim: You cannot measure the time
>>> between an event at lunchtime yesterday and one at lunchtime today
>>> with nanosecond precision. You probably can't measure it with
>>> millisecond precision, and even one-second precision would require
>>> a good deal of care.
>>>
>>> Even in one single lunch hour, you cannot measure the time
>>> between the swallow and the belch with nanosecond precision.
>>
>> All true.
>>
>> But still it is a lot easier to use the same unit for
>> both long and short intervals.
>
> I've no quarrel with measuring *intervals* in tiny units.
> The thing that started me ranting and foaming at the mouth was
> the statement that "C# (.net) ticks are based on nanoseconds
> since 1/1/0001." *That's* the association I regard as fiction,
> bordering on nonsense.

Yes, sorry, I mis-quoted the definition from Microsoft.

The .NET DateTime structure represents dates and times ranging from 1/1/0001
to 31/12/9999. The values are measured in 100 ns units called ticks.

http://msdn.microsoft.com/en-us/library/system.datetime.aspx
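A quick back-of-the-envelope check on that representation (a Python sketch; the figures follow from the 100 ns unit discussed above, not from the .NET source):

```python
# One tick = 100 ns, so 10**7 ticks per second.
TICKS_PER_SECOND = 10_000_000

# Roughly 10,000 years of ticks fit comfortably in a signed 64-bit long.
ten_millennia_s = 10_000 * 365.25 * 86400
assert ten_millennia_s * TICKS_PER_SECOND < 2**63

# The low 32 bits of a tick count span about 430 seconds.
low32_span_s = 2**32 / TICKS_PER_SECOND
print(round(low32_span_s))  # 429
```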

But is your quarrel that if I actually went back the billions of nanoseconds
from today's nanosecond value, I wouldn't actually end up at
1/1/0001 - due to vagaries in the Earth's orbit, spin etc?