From: Andy 'Krazy' Glew on 28 May 2010 22:24

On 5/28/2010 3:07 PM, MitchAlsup wrote:
> On May 28, 11:11 am, n...(a)cam.ac.uk wrote:
>> And that's the point. Despite common belief, computer architecture
>> has been about both hardware and software (and their interaction)
>> for at least the past 45 years - and all that the Itanic did was
>> to simplify the hardware designer's life a little while making the
>> software designer's one almost impossible.
>
> Minor quibble:
>
> Computer Architecture IS the contract between hardware and software
> (and has been since the Anderson paper in 1963).
> Computer microarchitecture is how the hardware goes about giving the
> software writer the illusion that that architecture spec is being
> obeyed.
>
> Almost all of the OutofOrder machines give the software writer the
> illusion that the machine remains completely InOrder (as seen by the
> instruction pointer, and by interested third parties outside the CPU
> boundary). Thus, the microarchitecture is doing a bunch of hocus pocus
> to run instructions out of order, and keeping a trail of crumbs so
> that when the InOrder illusion becomes violated, order is restored
> before anyone sees the OutofOrderness.
>
> Some OutofOrder machines go so far as to make the software tell the
> machine when to appear InOrder. By and large these machines have
> either failed to become mainstream, are dying out, or have died out.
>
> Mitch

Some programming systems try to take advantage of non-deterministic
out-of-order hardware execution that is visible to them, but hide it
from the users of the system. I.e. non-determinism is exposed to the
layer of software closest to the hardware, but not to the layers
above it. I'm not sure whether Transmeta falls into this camp or not.

I once said to a proponent of such an idea, who worked for a major
software vendor: "If we do what you suggest, then *YOU* are in charge
of the instruction set semantics, not us at the hardware company."

He agreed.
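[Editorial aside: Mitch's "trail of crumbs" is essentially a reorder
buffer: instructions may *complete* out of order, but they *retire* -
become architecturally visible - strictly in program order. A minimal
Python sketch of that idea, with illustrative names and latencies that
model no real machine:]

```python
import heapq

def run_out_of_order(program):
    """Simulate out-of-order completion with in-order retirement.

    `program` is a list of (name, latency) pairs in program order.
    For simplicity all instructions issue at time 0, so latency doubles
    as finish time.  Completion happens whenever an instruction's
    latency expires (out of order), but retirement walks a reorder-
    buffer-like 'done' array from the oldest slot, so the in-order
    illusion is preserved.
    """
    # (finish_time, slot) heap: models out-of-order completion.
    finish = [(lat, slot) for slot, (_, lat) in enumerate(program)]
    heapq.heapify(finish)

    done = [False] * len(program)
    completion_order, retirement_order = [], []
    head = 0  # oldest un-retired instruction (the ROB head)

    while finish:
        _, slot = heapq.heappop(finish)
        done[slot] = True
        completion_order.append(program[slot][0])
        # Retire from the head as long as the oldest instruction is done.
        while head < len(program) and done[head]:
            retirement_order.append(program[head][0])
            head += 1
    return completion_order, retirement_order

# 'mul' is slow, so 'add' completes first - yet retirement stays in
# program order: mul, add, load.
prog = [("mul", 5), ("add", 1), ("load", 3)]
completed, retired = run_out_of_order(prog)
```

[The machines Mitch mentions that "make the software tell the machine
when to appear InOrder" effectively expose `completion_order` to
software; the mainstream ones expose only `retirement_order`.]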
From: nmm1 on 29 May 2010 06:12

In article <4C007AD9.9040303(a)patten-glew.net>,
Andy 'Krazy' Glew <ag-news(a)patten-glew.net> wrote:
>
> Some programming systems try to take advantage of non-deterministic
> out-of-order hardware execution that is visible to them, but hide
> it from the users of the system. I.e. non-determinism is exposed
> to the layer of software closest to the hardware, but not to layers
> above it.

Minor niggle: almost all do!

Almost all architectures simplify the hardware's task by omitting
certain complexities, and taking an interrupt on certain events:
TLB misses, floating-point fixup, performance counters, sometimes
ECC handling, and so on. It is very, very rare for the interface to
the fixup routine to be entirely deterministic - and, regrettably,
far too many modern architectures leave it undefined.

Regards,
Nick Maclaren.
From: Andy 'Krazy' Glew on 29 May 2010 10:58

On 5/29/2010 3:12 AM, nmm1(a)cam.ac.uk wrote:
> In article <4C007AD9.9040303(a)patten-glew.net>,
> Andy 'Krazy' Glew <ag-news(a)patten-glew.net> wrote:
>>
>> Some programming systems try to take advantage of non-deterministic
>> out-of-order hardware execution that is visible to them, but hide
>> it from the users of the system. I.e. non-determinism is exposed
>> to the layer of software closest to the hardware, but not to layers
>> above it.
>
> Minor niggle: almost all do!
>
> Almost all architectures simplify the hardware's task by omitting
> certain complexities, and taking an interrupt on certain events.
> TLB misses, floating-point fixup, performance counters, sometimes
> ECC handling, and so on. It is very, very rare for the interface to
> the fixup routine to be entirely deterministic - and, regrettably,
> far too many modern architectures leave it undefined.

Non-deterministic aspects that are considered part of the "system"
architecture, not the user architecture:

* TLBs
* Performance counters
* ECC handling

While I have encountered non-deterministic aspects of floating-point
fixup, and have even proposed some, I am not aware of any mass-market
CPU having such. I.e. not x86.
From: nmm1 on 29 May 2010 12:09

In article <4C012B82.1030107(a)patten-glew.net>,
Andy 'Krazy' Glew <ag-news(a)patten-glew.net> wrote:
>
> Non-deterministic aspects that are considered part of the "system"
> architecture, not the user architecture:
> * TLBs
> * Performance counters
> * ECC handling

Yes. However, the boundary is very fluid, and an author of a language
run-time system needs to get involved at that level to do a good job.
Also, when the hardware and software are produced by completely
separate organisations, the interface is critical.

> While I have encountered non-deterministic aspects of floating-point
> fixup, and have even proposed some, I am not aware of any mass-market
> CPU having such. I.e. not x86.

I would have to look at the architecture again, but am 90% sure that
the x86 has several non-deterministic aspects in that area! The
original 80286/7 floating-point unit was asynchronous with respect to
the main CPU, and I am pretty sure that I remember seeing several
unspecified properties in the MMX or SSE stuff. Note that I am talking
about the architecture as such, and not what the CPUs did or do -
i.e. the freedom that Intel left itself to change its mind later.

Regards,
Nick Maclaren.
From: Andy 'Krazy' Glew on 30 May 2010 00:46
On 5/29/2010 9:09 AM, nmm1(a)cam.ac.uk wrote:
> In article <4C012B82.1030107(a)patten-glew.net>,
> Andy 'Krazy' Glew <ag-news(a)patten-glew.net> wrote:
>>
>> Non-deterministic aspects that are considered part of the "system"
>> architecture, not the user architecture:
>> * TLBs
>> * Performance counters
>> * ECC handling
>
> Yes. However, the boundary is very fluid, and an author of a language
> run-time system needs to get involved at that level to do a good job.
> Also, when the hardware and software are produced by completely
> separate organisations, the interface is critical.
>
>> While I have encountered non-deterministic aspects of floating-point
>> fixup, and have even proposed some, I am not aware of any mass-market
>> CPU having such. I.e. not x86.
>
> I would have to look at the architecture again, but am 90% sure that
> the x86 has several non-deterministic aspects in that area! The
> original 80286/7 floating-point unit was asynchronous with respect
> to the main CPU, and I am pretty sure that I remember seeing several
> unspecified properties in the MMX or SSE stuff. Note that I am
> talking about the architecture, as such, and not what the CPUs did
> or do - i.e. the freedom that Intel left itself to change its mind
> later.

We are using different terminology.

When I say "non-deterministic" I mean a feature that, when executed,
may give different answers if it is run several times on the same
machine. "Non-reproducible" is a related term; I often talk about
"deterministic reproducibility".

Shared-memory multiprocessor code is often not deterministically
reproducible, at least from the point of view of a user. (It may be
deterministically reproducible if rebooted and started from scratch.
Silicon debug depends on that: see Intel PSMI,
http://download.intel.com/technology/itj/2003/volume07issue02/art04_validation/vol7iss2_art04.pdf)

Single-threaded code is normally deterministically reproducible.
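[Editorial aside: the shared-memory point can be seen even from a
high-level language. With several threads touching a shared list, the
*contents* of the result are fixed by the program, but the interleaving
- the analogue of the microarchitectural execution order - can differ
from run to run. A small sketch; the random delays are illustrative:]

```python
import random
import threading
import time

def racy_order(n_threads=4):
    """Return the order in which threads reached a shared list.

    Each thread sleeps for a random interval before appending its id,
    so the observed order depends on the scheduler and on timing: the
    multiset of results is deterministic, the interleaving is not -
    i.e. the run is not deterministically reproducible.
    """
    order = []
    lock = threading.Lock()  # protects the shared list

    def worker(tid):
        time.sleep(random.random() / 100)  # nondeterministic delay
        with lock:
            order.append(tid)

    threads = [threading.Thread(target=worker, args=(i,))
               for i in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return order

# sorted(run) is always [0, 1, 2, 3]; run itself varies between runs.
run = racy_order()
```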
However, the original Intel x86 performance monitoring interrupts were
imprecise, and might not be delivered at the same instruction in every
run of the program.

Many performance counter events are not deterministically reproducible
from the point of view of a single thread - e.g. cache misses, which
may include speculative effects and misses induced by interrupts.

However^2, certain performance counter events have been, or are
supposed to be, deterministic from the point of view of a single
thread - e.g. instructions retired, memory references retired. Also,
in recent processors performance counter interrupts have been made
precise (in a very ugly and inelegant way).

You, Nick, seem to be using "non-deterministic" to mean "undefined".
True, the broad meaning of undefined might include non-deterministic.
However, much progress in computer architecture has involved refining
the meaning of various undefined features.
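[Editorial aside: the distinction Glew draws - instructions retired is
a deterministic count, while sampled or interrupt-driven events are
not - has a software analogue. A count taken synchronously in the
instruction stream is reproducible run to run; anything keyed to
asynchronous interrupts or wall-clock time is not. A hypothetical
Python illustration using the interpreter's trace hook, not a model of
any real PMU:]

```python
import sys

def count_line_events(fn, *args):
    """Count 'line' trace events while fn runs.

    This is the analogue of a deterministic event such as "instructions
    retired": it depends only on the executed instruction stream, so
    two runs of the same single-threaded code give the same count.
    A count driven by timer interrupts - like an imprecise performance
    monitoring sample - would generally not be reproducible.
    """
    count = 0

    def tracer(frame, event, arg):
        nonlocal count
        if event == "line":
            count += 1
        return tracer  # keep tracing inside called frames

    sys.settrace(tracer)
    try:
        fn(*args)
    finally:
        sys.settrace(None)  # always remove the trace hook
    return count

def work(n):
    total = 0
    for i in range(n):
        total += i
    return total
```

[Two calls of `count_line_events(work, 200)` always agree; two
wall-clock timings of `work(200)` almost never do.]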