From: Eric Sosman on 3 Mar 2010 20:45

On 3/2/2010 4:28 PM, Peter K wrote:
> [...]
> C# (.net) ticks are based on nanoseconds since 1/1/0001.

Assertion: The low-order thirty-two bits of such a value at any
given moment (NOW!) are unknown -- and unknowable.

Y'know those "Star Trek" moments where Scotty looks at the enormous
alien space ship and says "It's huge! It must be half a mile across!"
and Spock says "Zero point five eight three two two miles, to be
precise?" Spock's folly of over-precision (who measures the alien
space ship to plus-or-minus six inches?) is as nothing compared to
that of a time standard that pretends to measure two millennia's
worth of itsy-bitsy wobbles in Earth's rotation.

My claim that thirty-two bits are unknowable says nothing more than
"We don't know the history of Earth's rotation to plus-or-minus four
seconds over the last two thousand years," and I'll stand by the
claim. Put it this way: Can you think of ANY physical quantity that
has been measured to (let's see: 1E9 nanoseconds in a second, 86,400
seconds in a day ignoring leap seconds, 365.25 days in a year
ignoring adjustments, 2010 years, 63,430,776,000,000,000,000
nanoseconds in all) TWENTY decimal places?

Add to this the fact that light travels only ~1 foot per nanosecond.
Every mile between you and the time standard amounts to five
*micro*seconds' worth of slop ...

--
Eric Sosman
esosman(a)ieee-dot-org.invalid
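[Eric's back-of-the-envelope figures are easy to check mechanically. A quick sanity check in Python, using the same simplifying assumptions he states (365.25-day years, no leap seconds, light at roughly one foot per nanosecond):]

```python
# Sanity-check the figures in the post above, under the same
# simplifying assumptions (365.25-day years, no leap seconds).

NS_PER_SECOND = 10**9
SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = int(365.25 * SECONDS_PER_DAY)  # 31,557,600

# Total nanoseconds in 2010 such years.
total_ns = 2010 * SECONDS_PER_YEAR * NS_PER_SECOND
print(total_ns)            # 63430776000000000000
print(len(str(total_ns)))  # 20 -- the "TWENTY decimal places"

# Light travels roughly one foot per nanosecond, so each mile
# (5280 ft) of distance from the time standard is ~5.3 microseconds.
print(5280 / 1000)         # ~5.28 microseconds per mile
```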
From: Arne Vajhøj on 3 Mar 2010 20:57

On 03-03-2010 20:45, Eric Sosman wrote:
> On 3/2/2010 4:28 PM, Peter K wrote:
>> [...]
>> C# (.net) ticks are based on nanoseconds since 1/1/0001.
>
> Assertion: The low-order thirty-two bits of such a value
> at any given moment (NOW!) are unknown -- and unknowable.

It is not 1 ns units but 100 ns units. And the low 32 bits amount to
around 430 seconds.

We probably do not have any measurements at 430-second accuracy for
year 1. But we do have them today. And it would be rather
inconvenient to use different units for different periods.

Arne
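[Arne's figure is straightforward to verify; a .NET tick is indeed 100 ns. A quick sketch in Python:]

```python
# A .NET DateTime tick is 100 ns, so the low 32 bits of a tick
# count span:
TICK_NS = 100
print((2**32 * TICK_NS) / 10**9)  # 429.4967296 -- "around 430 seconds"

# Under Eric's original 1 ns reading, 32 bits would have spanned
# only about 4.3 seconds:
print(2**32 / 10**9)              # 4.294967296
```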
From: Eric Sosman on 3 Mar 2010 21:21

On 3/3/2010 8:57 PM, Arne Vajhøj wrote:
> On 03-03-2010 20:45, Eric Sosman wrote:
>> On 3/2/2010 4:28 PM, Peter K wrote:
>>> [...]
>>> C# (.net) ticks are based on nanoseconds since 1/1/0001.
>>
>> Assertion: The low-order thirty-two bits of such a value
>> at any given moment (NOW!) are unknown -- and unknowable.
>
> It is not 1 ns unit but 100 ns units. And the low 32 bits
> is around 430 seconds.

Thanks for the information. I'll revise my claim: "The low-order
twenty-five bits are unknown and unknowable."

> We do probably not have any measurements at 430 seconds accuracy
> for year 1. But do have it today. And it would be rather inconvenient
> to use different units for different periods.

Intervals between contemporary events can (sometimes) be measured to
nanosecond precision. In the laboratory, femtosecond precision may be
attainable. But extending the scale to longer periods is pure
fiction! Claim: You cannot measure the time between an event at
lunchtime yesterday and one at lunchtime today with nanosecond
precision. You probably can't measure it with millisecond precision,
and even one-second precision would require a good deal of care. Even
in one single lunch hour, you cannot measure the time between the
swallow and the belch with nanosecond precision.

--
Eric Sosman
esosman(a)ieee-dot-org.invalid
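[The revised twenty-five-bit figure can be checked the same way; a sketch assuming the 100 ns tick unit from Arne's correction:]

```python
TICK_NS = 100  # .NET tick = 100 ns

# Seconds spanned by the low-order n bits of a tick count:
for n in (25, 26, 32):
    print(n, (2**n * TICK_NS) / 10**9)
# 25 -> ~3.36 s, 26 -> ~6.71 s, 32 -> ~429.5 s
```

[With plus-or-minus four seconds of uncertainty in the history of Earth's rotation, the unknowable boundary falls right around 25-26 low bits, matching the revised claim.]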
From: Roedy Green on 3 Mar 2010 23:52

On Wed, 3 Mar 2010 22:56:52 +0000, Dr J R Stockton
<reply1009(a)merlyn.demon.co.uk> wrote, quoted or indirectly quoted
someone who said :
>
> Ten dates (no days) dropped. Later, parts of Canada dropped 11 dates.

The full story is quite complex. Different parts of the world
accepted the Gregorian calendar at different times. There are parts
of the world today still on the Julian calendar. BigDate works off
two different definitions, the papal and the British adoption.

--
Roedy Green Canadian Mind Products
http://mindprod.com
The major difference between a thing that might go wrong and a thing
that cannot possibly go wrong is that when a thing that cannot
possibly go wrong goes wrong it usually turns out to be impossible to
get at or repair.
~ Douglas Adams (born: 1952-03-11 died: 2001-05-11 at age: 49)
From: Thomas Pornin on 4 Mar 2010 08:56
According to Eric Sosman <esosman(a)ieee-dot-org.invalid>:
> On 3/2/2010 4:28 PM, Peter K wrote:
> > [...]
> > C# (.net) ticks are based on nanoseconds since 1/1/0001.
>
> Assertion: The low-order thirty-two bits of such a value
> at any given moment (NOW!) are unknown -- and unknowable.

Only if you do not use the "right" definition. Nobody on January 1st,
1 AD had any notion of what a second could be, let alone a
nanosecond, and neither did they imagine that their year was to be
numbered "1". That calendar was applied retroactively, several
centuries (for the year count) or millennia (for seconds and
nanoseconds) later. The effect is that "1/1/0001" is defined to be a
past date computed back from now using the absolute definition of the
second that we use nowadays, which has nothing to do with the
rotation of Earth. What is unknown (with 100 ns precision) is how
well that synthetic reconstructed instant matches the position of the
stars in the sky of Rome during that night, under the rule of
Augustus.

Strangely enough, we have some measurements of the average variation
of Earth's rotation over the last two millennia, thanks to some
ancient reports of solar eclipses. The zone of a total eclipse is a
narrow band, a few dozen kilometers in width; a report of an
observation of a total eclipse at a given place yields a measure of
Earth's orientation with a precision equivalent to two or three
minutes of Earth rotation. Chinese astronomers, in particular, were
quite meticulous in writing down the particulars of an observed
eclipse. Of course this is all relative to how well we can
reconstruct the eclipse parameters themselves, but it seems that the
Moon's orbital parameters, while complex, are still quite a bit
easier to extrapolate than the messy artefacts of Earth's rotational
variations. So we _can_ pinpoint the synthetic "1/1/0001" date within
the Roman calendar framework to within a few minutes, which is much
better than the clocks the Romans used could do.
At that point, simply imposing our notions of nanoseconds on the
Romans seems hardly unfair. And they would not have noticed anyway.

--Thomas Pornin
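[The orders of magnitude in Pornin's eclipse argument hold up; a rough Python sketch (the 40,075 km equatorial circumference is my figure, not from the post):]

```python
# How far does a point on the equator move per minute of Earth
# rotation?
EQUATOR_KM = 40_075        # equatorial circumference (assumed figure)
MIN_PER_DAY = 24 * 60
km_per_min = EQUATOR_KM / MIN_PER_DAY
print(km_per_min)          # ~27.8 km per minute of rotation

# So an eclipse track a few dozen km wide constrains Earth's
# orientation to very roughly one to three minutes of rotation,
# e.g. for a 60 km band:
print(60 / km_per_min)     # ~2.2 minutes
```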