From: Robert Myers on
On May 14, 12:11 am, MitchAlsup <MitchAl...(a)aol.com> wrote:

>
> Thus, I conclude that:
> 6) running out of space to evolve killed off microarchitectural
> innovation.
>
> {And with the general caveat that no company actually does
> architectural or microarchitectural research; each does development
> based on short- to medium-term goals. Research happens in-the-large as
> various companies show their wares and various competitors attempt to
> incorporate or advance their adversary's developments. Much like
> biological evolution.}
>
When the race for single-threaded performance was still on, the path
to further evolution seemed fairly obvious to me: open up the
instruction window, get more instructions in flight, and let the
processor claw its way forward by finding opportunities to speculate,
perhaps by inspired guesses as to where to start a new (hardware-
generated or compiler-hinted) thread, fixing the errors of mis-
speculation on the fly.

That is to say, I would have put my money on the hardware guys to find
parallelism in the instruction stream while software guys were still
dithering about language aesthetics. I thought this way all the way
up to the 90nm step for the P4.

Much of the architectural research we have been living off of was done
in a completely different environment, where cost and immediate payoff
were not nearly the considerations they are today.

One can imagine a development path where single-threaded performance
could be improved by developing aggressive hardware that made no sense
from the point of view of either power or economics, and THEN figuring
out how to make the power and mass-market requirements work. The memory
of German and then Russian scientists who did not work under the
constraints of market capitalism, and of the threat they posed, has
faded, and with it the opportunity to do research that way.

To be fair, IBM nearly burned itself to the ground with cost-no-object
research, Intel made a couple of ballsy bets that it survived only
because it is Intel, and the government has completely run out of
money for anything except trying to explain and fix the mistakes of
its second-stringers and political hacks. The problem, though, is not
that there are no possibilities left to explore.

Robert.
From: nmm1 on
In article <78b2b354-7835-4357-92e1-21700cc0c05a(a)z17g2000vbd.googlegroups.com>,
Robert Myers <rbmyersusa(a)gmail.com> wrote:
>On May 14, 12:11 am, MitchAlsup <MitchAl...(a)aol.com> wrote:
>
>> Thus, I conclude that:
>> 6) running out of space to evolve killed off microarchitectural
>> innovation.

Only because they hamstrung themselves with the demented constraint
that the only programs that mattered were made up of C/C++ spaghetti,
written in as close to a pessimally efficient style as the O-O
dogmatists could get to. Remove that, and there is still room to
evolve.
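
For the avoidance of doubt, the sort of code I mean looks like the
pointer chase below, set against its flat-array equivalent (a toy
sketch in C, not taken from any real program):

#include <stddef.h>

/* "O-O spaghetti" flavour: every element lives behind a pointer, so
   each iteration's load depends on the previous one.  The dependent
   load chain sets the pace, and no width of instruction window can
   recover parallelism that the data structure has thrown away. */
struct node { double val; struct node *next; };

double sum_list(const struct node *p)
{
    double s = 0.0;
    for (; p != NULL; p = p->next)      /* serial pointer chase */
        s += p->val;
    return s;
}

/* The same reduction over a flat array: the loads are independent,
   so an out-of-order core (or a vectorising compiler) can overlap
   many of them and actually use its execution resources. */
double sum_array(const double *a, size_t n)
{
    double s = 0.0;
    for (size_t i = 0; i < n; i++)
        s += a[i];
    return s;
}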

>When the race for single-threaded performance was still on, the path
>to further evolution seemed fairly obvious to me: open up the
>instruction window, get more instructions in flight, and let the
>processor claw its way forward by finding opportunities to speculate,
>perhaps by inspired guesses as to where to start a new (hardware-
>generated or compiler-hinted) thread, fixing the errors of mis-
>speculation on the fly.
>
>That is to say, I would have put my money on the hardware guys to find
>parallelism in the instruction stream while software guys were still
>dithering about language aesthetics. I thought this way all the way
>up to the 90nm step for the P4.

As you know, I didn't. The performance/clock factor (which is what
the architecture delivers) hasn't improved much.

>Much of the architectural research we have been living off of was done
>in a completely different environment, where cost and immediate payoff
>were not nearly the considerations they are today.

Yup.

>To be fair, IBM nearly burned itself to the ground with cost-no-object
>research, Intel made a couple of ballsy bets that it survived only
>because it is Intel, and the government has completely run out of
>money for anything except trying to explain and fix the mistakes of
>its second-stringers and political hacks. The problem, though, is not
>that there are no possibilities left to explore.

All balls and no brains gets posthumous medals, but doesn't win wars.


Regards,
Nick Maclaren.
From: Kai Harrekilde-Petersen on
Andy 'Krazy' Glew <ag-news(a)patten-glew.net> writes:

> On 5/19/2010 4:35 AM, ned wrote:
>> Piotr Wyderski wrote:
>>
>>> nedbrek wrote:
>
>> "Low power" would be ~10 W. Filling the whole laptop space (5W-60W).
>> Anything below that is "ultra low" aka "Not interesting" :)
>
> We share a common interest in advanced microarchitecture, Ed,
> but we differ greatly wrt low or ultra low power.
>
> Not interesting????
>
> I want to work on the computers that will run my contact lens
> displays. They gotta be low power. (Unless you want to extract circa
> 10W from the body somehow - buy our wearable computer system and lose
> weight!)

Even if extracting 10W from the body were doable, you'd still have the
formidable task of making sure that the dissipation of your 10W
contact lens doesn't burn your eyes into charcoal.

For contact-lens-sized computers, I'd say significantly less than
1W. After all, 1W is a lot of heat when applied directly to your
skin!
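
A quick back-of-envelope makes the point (the lens area below is just
my rough guess, nothing authoritative):

#include <stdio.h>

int main(void)
{
    /* Rough assumption: a soft contact lens is about 14 mm across,
       so call its area roughly 1.5 cm^2. */
    const double lens_area_cm2 = 1.5;
    const double watts[] = { 10.0, 1.0, 0.1 };

    for (int i = 0; i < 3; i++)
        printf("%4.1f W on the lens -> %5.2f W/cm^2 against the cornea\n",
               watts[i], watts[i] / lens_area_cm2);

    /* Even the 1 W case works out to roughly 0.7 W/cm^2 of continuous
       heating of living tissue, which is why I want the budget well
       below a watt before anyone puts it in an eye. */
    return 0;
}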


Kai
--
Kai Harrekilde-Petersen <khp(at)harrekilde(dot)dk>
From: ned on
nmm1(a)cam.ac.uk wrote:

> In article <78b2b354-7835-4357-92e1-21700cc0c05a(a)z17g2000vbd.googlegroups.com>,
> Robert Myers <rbmyersusa(a)gmail.com> wrote:
>>On May 14, 12:11 am, MitchAlsup <MitchAl...(a)aol.com> wrote:
>>
>>> Thus, I conclude that:
>>> 6) running out of space to evolve killed off microarchitectural
>>> innovation.
>
> Only because they hamstrung themselves with the demented constraint
> that the only programs that mattered were made up of C/C++ spaghetti,
> written in as close to a pessimally efficient style as the O-O
> dogmatists could get to. Remove that, and there is still room to
> evolve.

The programs that matter are the ones customers have paid for (or paid
to have developed). A lot of those programs are spaghetti. It's not
our place to tell customers to rewrite their software. We are to serve
them.

>>When the race for single-threaded performance was still on, the path
>>to further evolution seemed fairly obvious to me: open up the
>>instruction window, get more instructions in flight, and let the
>>processor claw its way forward by finding opportunities to speculate,
>>perhaps by inspired guesses as to where to start a new (hardware-
>>generated or compiler-hinted) thread, fixing the errors of mis-
>>speculation on the fly.
>>
>>That is to say, I would have put my money on the hardware guys to find
>>parallelism in the instruction stream while software guys were still
>>dithering about language aesthetics. I thought this way all the way
>>up to the 90nm step for the P4.
>
> As you know, I didn't. The performance/clock factor (which is what
> the architecture delivers) hasn't improved much.

Near the end of my uarch career, I came to realize that much of "the
game" is keeping perf/clock from collapsing while ramping clock. At
least, that is about the only thing that has been successful.
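
In crude terms, delivered performance is just IPC times frequency, so
the little calculation below (made-up numbers, purely illustrative) is
the whole game: double the clock, give back a third of the IPC, and
most of the ramp evaporates.

#include <stdio.h>

int main(void)
{
    /* Made-up numbers, only to show the shape of the trade-off. */
    double base_ghz = 2.0, base_ipc = 1.5;   /* baseline core       */
    double ramp_ghz = 4.0, ramp_ipc = 1.0;   /* deeper, faster pipe */

    double base_perf = base_ghz * base_ipc;  /* instructions per ns */
    double ramp_perf = ramp_ghz * ramp_ipc;

    printf("clock ramp: %.2fx\n", ramp_ghz / base_ghz);
    printf("ipc change: %.2fx\n", ramp_ipc / base_ipc);
    printf("net perf:   %.2fx\n", ramp_perf / base_perf);
    return 0;
}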

Ned

From: ned on
MitchAlsup wrote:

> On May 19, 6:35 am, ned <nedb...(a)yahoo.com> wrote:
>> Piotr Wyderski wrote:
>> > nedbrek wrote:
>> I guess I'm showing my age...
>>
>> When Itanium first shipped, it was 130 W. Itanium II was the same. I
>> felt that if Itanium were to compete against the 200+ W (aptly
>> named) Power parts from IBM, it would need a similar power budget.
>>
>> We are now in an age where a 30 W laptop part is "high power" (scaling
>> to 60 or 100 W).
>
> Heck, I remember when high power was 300KVA.

Hehe, nice.

>> "Low power" would be ~10 W. �Filling the whole laptop space (5W-60W).
>> Anything below that is "ultra low" aka "Not interesting" :)
>
> How about a sub 1W part so your laptop has enough energy in the
> battery to be left on all day long, and the only part needing power
> throttling is the display.

Get better battery technology. Maybe methane fuel cells... :) Maybe
pocket fusion for a 300 KVA laptop :P
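
Though to be fair, the arithmetic behind the sub-1W wish isn't crazy
(the battery and display figures below are just my guesses):

#include <stdio.h>

int main(void)
{
    double battery_wh = 50.0;   /* assumed typical laptop pack   */
    double cpu_w      = 1.0;    /* the hoped-for sub-1W part     */
    double display_w  = 3.0;    /* guess at a throttled display  */

    printf("CPU alone:     %.0f hours\n", battery_wh / cpu_w);
    printf("CPU + display: %.1f hours\n", battery_wh / (cpu_w + display_w));
    return 0;
}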

Ned