From: nmm1 on 27 Apr 2010 13:57

In article <21acb3e0-aab1-47e5-a080-1cc92a37651a(a)29g2000yqp.googlegroups.com>,
Quadibloc <jsavard(a)ecn.ab.ca> wrote:
>
>> Well, actually, I blame the second-raters who turned some seminal
>> results into dogma.
>>
>> None of Turing, Von Neumann or the best mathematicians and computer
>> people would ever say that the model is the last word, still less
>> that hardware architecture and programming languages must be forced
>> into it.
>
>The reason that, so far, parallel architectures are used to execute
>programs which basically were written for a von Neumann machine, but
>chopped into bits that can run in parallel, is not so much the fault
>of a blind dogmatism as it is of the absence of a clear alternative.

That, I regret, is the positive side of the problem :-(

I wasn't just being insulting by referring to dogma, but referring to
actual incidents. There are a remarkable number of people around,
including some very influential ones, who believe that the Von Neumann
model is the only possible solution - and who will prevent alternative
approaches from being considered.

Regards,
Nick Maclaren.
From: Robert Myers on 27 Apr 2010 14:22

On Apr 26, 9:12 pm, Rick Jones <rick.jon...(a)hp.com> wrote:
>
> Sounds like child rearing. I could handle a computer behaving like my
> nine year-old, at least most of the time. I'm not sure I want my
> computer behaving like my five year-old :)
>

Artificial intelligence seems to think it has much to learn from
neurophysiology. I'm suggesting that developmental psychology might be
a more promising place to look.

Robert.
From: MitchAlsup on 27 Apr 2010 14:59

On Apr 27, 12:57 pm, n...(a)cam.ac.uk wrote:
> I wasn't just being insulting by referring to dogma, but referring
> to actual incidents. There are a remarkable number of people around,
> including some very influential ones, who believe that the Von Neumann
> model is the only possible solution - and who will prevent alternative
> approaches from being considered.

Another serious and huge problem is the amount of cubic dollars needed
before any new paradigm has sufficient critical mass to stand on its
own two feet and begin its rise to prominence. The RISC episode was but
a small paradigm shift compared to a non-von Neumann model of
computing. Back then, it took only a fifth to a third of a billion
dollars, and that was for a small shift, in the dollars of twenty years
ago.

And thus, we may have evolved ourselves into a corner from which there
is no economically sound way out.

Mitch
From: Robert Myers on 27 Apr 2010 15:13

On Apr 27, 2:59 pm, MitchAlsup <MitchAl...(a)aol.com> wrote:
> And thus, we may have evolved ourselves into a corner from which there
> is no economically sound way out.

I don't believe that. The RISC "revolution" was simply absorbed into a
more important revolution: the attack of the killer micros. There were
no cubic dollars at the beginning of the attack of the killer micros,
and right up until it nearly went out of business, IBM was certain that
its business model could not be defeated.

Robert.
From: nmm1 on 27 Apr 2010 15:22
In article <cf25e322-30f4-4915-947d-aebdbe28eec1(a)d12g2000vbr.googlegroups.com>,
MitchAlsup <MitchAlsup(a)aol.com> wrote:
>
>> I wasn't just being insulting by referring to dogma, but referring
>> to actual incidents. There are a remarkable number of people around,
>> including some very influential ones, who believe that the Von Neumann
>> model is the only possible solution - and who will prevent alternative
>> approaches from being considered.
>
>Another serious and huge problem is the amount of cubic dollars needed
>before any new paradigm has sufficient critical mass so that it can
>stand on its own two feet and begin its rise to prominence. The RISC
>episode was but a small paradigm shift compared to a non-von Neumann
>model of computing. Back then, it took only a fifth to a third of a
>billion dollars, and that was for a small shift, in the dollars of
>twenty years ago.

Yes, that's true - for CPU architectures - but I was also thinking of
programming languages! The point there is that, if the only languages
available are extremely Von Neumann, any approach to a different class
of architecture would be stillborn, because nobody would be able to use
it. That is one of the reasons that many of the specialist
supercomputers have been essentially usable only from Fortran - it is a
far less Von Neumann language than C/C++ and their followers.

The ridiculous thing is that there have been very successful languages
based on different models in the past, and there are still some of them
around (the functional ones being an obvious category).

>And thus, we may have evolved ourselves into a corner from which there
>is no economically sound way out.

Agreed.

Regards,
Nick Maclaren.
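[Editor's note: Nick's point about language models can be made concrete. In a purely functional language such as Haskell (one of the "obvious category" he mentions), a computation is an expression whose evaluation is constrained only by data dependencies, not by an implicit mutable store and program counter - which is precisely the property that leaves an implementation free to target non-sequential hardware. A minimal illustrative sketch, not taken from the thread:]

```haskell
-- A dot product written as a pure expression: the semantics specify
-- no program counter and no mutable store, only data dependencies.
-- An implementation may evaluate the multiplications in any order,
-- or all at once, without changing the result.
dot :: [Double] -> [Double] -> Double
dot xs ys = sum (zipWith (*) xs ys)

main :: IO ()
main = print (dot [1, 2, 3] [4, 5, 6])  -- prints 32.0
```

[Compare the C idiom `for (i = 0; i < n; i++) s += x[i]*y[i];`, which over-specifies a sequential chain of updates to `s` that a parallelising compiler must then prove it can safely undo.]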