From: Richard Tobin on 19 Jun 2010 19:05

In article
<413163476298659733.792487jim-magrathea.plus.com(a)news.individual.net>,
jim <jim(a)magrathea.plus.com> wrote:

>> "Every program has a branching and converging bootstrap ancestry."

>> I've written code without any ancestry at all.

>I doubt that. Not unless you've developed your own CPU from first
>principles.

For my original formulation of the question, Rowland may well be
right. I didn't consider the question of how the hardware was
produced, only the direct use of software to produce more software.

--
Richard
From: Richard Tobin on 19 Jun 2010 19:08

In article
<634779192298680850.843587usenet-alienrat.co.uk(a)news.individual.net>,
Woody <usenet(a)alienrat.co.uk> wrote:

>> Lancaster University at the time I was there ran an experimental
>> Dynix machine that used 16 386 chips (I forget the megahertz) -
>> this was considered advanced stuff.

>Wow, 16,386 is a lot :)

Probably a typo for 16,384.

--
Richard
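Richard's deadpan turns on arithmetic: 16,384 is 2^14, the sort of
power of two a machine's spec sheet would plausibly carry, whereas
Woody is (jokingly) misreading "16 386 chips", i.e. sixteen 80386s.
A throwaway check in C, purely illustrative - nothing below comes
from the thread beyond the numbers themselves:

#include <stdio.h>

int main(void)
{
    /* 16,384 = 2^14: the "plausible typo" reading of 16,386 */
    printf("1 << 14 = %d\n", 1 << 14);
    return 0;
}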
From: D.M. Procida on 20 Jun 2010 04:31

Woody <usenet(a)alienrat.co.uk> wrote:

> Ps: sorry if quoting is screwed. I took the PowerBook to the Apple
> Store today to get its long-term problem fixed, and am now on the
> iPad full time.

Don't worry about it, lots of people find it hard to quit. Maybe
you're just not a quitter.

Daniele
From: D.M. Procida on 20 Jun 2010 04:31

Rowland McDonnell <real-address-in-sig(a)flur.bltigibbet.invalid> wrote:

> > Suppose that we lost all our technology, right down to our hammers
> > and nails, and had to start again with stone hand-tools
>
> This is your first conceptual mistake: we cannot lose all our
> technology and remain anything remotely human.
>
> Technology is more than just the things - it's sets of techniques
> for doing stuff, including the things needed to do the stuff.
> Tool-making and tool-use is /pre/-human. We can't lose technology
> and remain remotely human at all.
>
> > (but still had our knowledge, language, writing and so on).
>
> Thus keeping some technology, thus making your proposal, erm, silly?
> Non-existent? Something like that.

You're right, I should have been less sloppy. What I meant was the
technological artefacts, rather than the skills (techne) themselves.
And of course it's not that simple, because skills and artefacts are
intimately related.

What's more, simply having the artefacts and learning to use them
well are quite different things, which is partly why I think Pd's
'lifetime' answer is over-optimistic. It might not take too long to
build a blast furnace, but learning to use it efficiently could take
generations (in principle - I don't know enough about blast furnaces
in particular to know whether that's plausible).

Daniele
From: Richard Kettlewell on 20 Jun 2010 04:35
Justin C <justin.1006(a)purestblue.com> writes:

> In article <883cceFgicU1(a)mid.individual.net>, Chris Ridd wrote:
>
>> What sort of blind alleys would we avoid?
>
> x86?
>
> I think it would help to know, in advance, that while 32-bit was
> liberating at first, it became a millstone, and that backward
> compatibility issues hindered progress. 64-bit good, 128-bit
> better, 256-bit? Would we try to find a way to avoid this kind of
> thing completely?
>
> Massive parallelism? To get more bits, add another CPU?
>
> I think that the architecture could be very different. This is the
> sort of question it would be great to ask of those people who have
> worked with 8-bit, 16-bit, 32-bit and 64-bit machines - if you
> could start again, how would you do it?

The 68000's somewhat hybrid 16/32-bit nature was a good stab, IMO. I
suspect straddling more than one level at a time might not be very
practical, though: 8 bytes per pointer, when you were only going to
use 3 or 4 of them, might not have been very popular at 1980s RAM
prices.

--
http://www.greenend.org.uk/rjk/
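That last aside carries a concrete cost argument: a pointer stored at
64 bits on a machine that only needs 24- or 32-bit addresses wastes
half its bytes, and in the 1980s those bytes were expensive. A
back-of-envelope sketch in C - the pointer count and the RAM price
are invented for illustration, not figures from the thread:

#include <stdio.h>

int main(void)
{
    const long   n_pointers = 100000L; /* hypothetical heap full of pointers */
    const double usd_per_mb = 500.0;   /* assumed 1980s-ish RAM price, illustrative */

    long bytes_32 = n_pointers * 4;    /* 32-bit pointers: only 3-4 bytes meaningful */
    long bytes_64 = n_pointers * 8;    /* full 64-bit pointers */

    double waste_mb = (bytes_64 - bytes_32) / (1024.0 * 1024.0);
    printf("Extra RAM for 64-bit pointers: %.2f MB (~$%.0f at $%.0f/MB)\n",
           waste_mb, waste_mb * usd_per_mb, usd_per_mb);
    return 0;
}

At those made-up figures the wider pointers cost roughly an extra
$190 of RAM for a single program's pointer tables - the flavour of
objection being made about 8-byte pointers in a 3-or-4-byte world.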