From: Paul on 13 Sep 2009 01:12

On 2009-09-12 19:44:07 -0500, "Pete Dashwood"
<dashwood(a)removethis.enternet.co.nz> said:

> Michael Wojcik wrote:
>> Pete Dashwood wrote:
>>>
>>> Certainly the paradigm that COBOL represented has been replaced in
>>> Client/Server processing. As client/server (networking, the
>>> Internet, etc.) is where MOST of the computer use in the world is
>>> occurring, it is fair to say that COBOL is being replaced.
>>
>> Client/server computing is not "most of the computer use in the
>> world".
>
> Yes it is.

I fall somewhere in the middle on this. Take payroll, for example. Data entry
for payroll has been "client/server" for a long time, even back to the days of
RJE. Certainly it is today, with time data being collected via the Web, smart
devices, and so forth. But processing the payroll, printing the checks, doing
the electronic funds transfers, and printing out the stubs is still pretty
much a batch job. That's true even in tiny shops using QuickBooks. Oh, that
batch process is probably using DB2 or some other relational DBMS, but that
hardly makes it client/server. The inherent design of the process is batch
oriented.
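To make "batch oriented" concrete: strip away the RDBMS and the GUI front end
and the core of a payroll run is still the classic read-a-record, compute,
write-a-record loop. Here is a minimal sketch of that kind of batch step, in
GnuCOBOL-style fixed format; the file names, record layouts, and the
deliberately naive gross-pay calculation are all made up for illustration, not
taken from any real package:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. PAYBATCH.
      * Sketch only: file names, record layouts and the pay
      * calculation are invented for illustration.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT TIME-FILE ASSIGN TO "TIMECARD.DAT"
               ORGANIZATION IS LINE SEQUENTIAL.
           SELECT STUB-FILE ASSIGN TO "PAYSTUB.DAT"
               ORGANIZATION IS LINE SEQUENTIAL.
       DATA DIVISION.
       FILE SECTION.
       FD  TIME-FILE.
       01  TIME-REC.
           05  TR-EMP-ID   PIC X(6).
           05  TR-HOURS    PIC 9(3)V99.
           05  TR-RATE     PIC 9(4)V99.
       FD  STUB-FILE.
       01  STUB-REC.
           05  SR-EMP-ID   PIC X(6).
           05  SR-GROSS    PIC 9(7)V99.
       WORKING-STORAGE SECTION.
       01  WS-EOF          PIC X VALUE "N".
       PROCEDURE DIVISION.
       MAIN-PARA.
           OPEN INPUT TIME-FILE OUTPUT STUB-FILE
           PERFORM UNTIL WS-EOF = "Y"
               READ TIME-FILE
                   AT END MOVE "Y" TO WS-EOF
                   NOT AT END PERFORM PAY-ONE-EMPLOYEE
               END-READ
           END-PERFORM
           CLOSE TIME-FILE STUB-FILE
           STOP RUN.
       PAY-ONE-EMPLOYEE.
      * Gross pay only; taxes, deductions and the EFT step are left out.
           MOVE TR-EMP-ID TO SR-EMP-ID
           COMPUTE SR-GROSS = TR-HOURS * TR-RATE
           WRITE STUB-REC.

Nothing in that loop knows or cares whether the TIMECARD data arrived over the
Web, from a smart device, or from a punched card; the design is sequential,
start-to-finish batch.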
On the other hand, based on pure quantity, client/server stuff using the Web
or dedicated clients is without a doubt the consumer of the vast majority of
cycles running today. You hear the complaint that "everything" is a server
over and over today. ;)

[lots of good stuff snipped]

>> Various forms of distributed processing, from web applications to
>> service-oriented architectures to massive server farms to cloud
>> computing, are certainly getting a lot of attention these days; and I
>> do think that's the right way to go for many kinds of applications,
>> including most of the things that were done as big online or batch
>> applications in the past. But they don't constitute "most computing"
>> unless you use a very narrow definition of "computing".
>
> I'm not going to argue how wide my definition is for the sake of this
> argument. I stand by my original statement.

There is room to debate, but the vast majority of computer cycles today are
spent in client/server processing. That is really hard to argue against. What
is more difficult to filter out is that one of the reasons this is so is that
typical batch processing is far more efficient, in an absolute sense. What may
take an hour in a client/server process (which will still have a
batch-oriented component, by the way) may take only a few minutes as a batch
process; the structure of the processing is simply more efficient. On the
other hand, of course, our PCs today are so much more powerful, and so cheap,
that hardly anyone cares about spending the cycles. Nor should they, I
suppose.

COBOL really missed the chance to become ubiquitous, in my opinion, because
the vendors were so focused on getting every dollar possible from the limited
customer base they started out with. You don't get better OO COBOL than the
old IBM VisualAge COBOL. It had everything going for it, including the most
object-oriented screen-design GUI system on the market. IBM would give it away
to partners, there were no runtimes, and best of all, you got IBM COBOL
support. The guys out at the Santa Teresa labs were utterly fantastic. But it
didn't take off, because IBM's main focus with it was on mainframes, not on
the PC world, which, like so many of us, they chose to ignore and look down
upon. Even I bought into some of that. :)

Micro Focus has a good chance to do a lot of things. Imagine where MF could
have been if they had written a COBOL back end for a platform like Java or
.NET. Instead, we have the COBOL market today in the sorry shape that it is
in, with little expectation it will get better in any foreseeable future.
Bahhh!

-Paul

>>
>> This is the same error we see from Web 2.0 pundits, New Media
>> enthusiasts, "long tail" proponents and the like - they ignore the
>> sectors of the industry that don't fit their models, and consequently
>> mistake the innovations of the vanguard for a revolution of the
>> masses.
>
> I'm not convinced.
>
> Talk to anyone under thirty and ask them what a computer is.
>
> It isn't a cellphone or a mainframe.
>
> AND they all have one and have been using it all their lives. You completely
> overlooked the fact that the internet is taking around 2 billion web page
> hits a day, much of this off social _NETWORKS_ (my emphasis).
>
> Client server networks are definitely the largest part of computing in
> industry and the home. Sorry if it isn't "interesting" but reality sometimes
> isn't.. :-)
>
> Pedantically there may be more computing cycles consumed by car engine
> management systems, but that has no bearing on COBOL, which is what I was
> discussing.
>
> Pete.
From: Howard Brazee on 14 Sep 2009 12:44

On Sat, 12 Sep 2009 20:44:49 +1200, "Pete Dashwood"
<dashwood(a)removethis.enternet.co.nz> wrote:

>> CoBOL hasn't been replaced by an other language.
>
> Perhaps not. It depends on how you look at it.
>
> Certainly the paradigm that COBOL represented has been replaced in
> Client/Server processing. As client/server (networking, the Internet, etc.)
> is where MOST of the computer use in the world is occurring, it is fair to
> say that COBOL is being replaced.

Client/Server processing isn't "an other language". And it really doesn't
replace CoBOL so much as supersede it. It's sort of like "replacing" the train
that takes one to New York with a satellite feed that connects you to London.

--
"In no part of the constitution is more wisdom to be found, than in the clause
which confides the question of war or peace to the legislature, and not to the
executive department." - James Madison
From: Howard Brazee on 14 Sep 2009 12:46

On Sat, 12 Sep 2009 15:21:00 -0400, Michael Wojcik <mwojcik(a)newsguy.com>
wrote:

> Client/server computing is not "most of the computer use in the
> world". Most of the computers sold in recent years are embedded
> systems (and a majority of those are 8-bitters). The type of computer
> with the most users, worldwide, are mobile phones - a quintessential
> peer-to-peer application. If compute cycles is our metric, most
> computer use is in scientific number crunching.

Is the Z-80 still the top selling computer chip in the world? It's real cheap,
and plenty fast enough to run a stop light.

--
"In no part of the constitution is more wisdom to be found, than in the clause
which confides the question of war or peace to the legislature, and not to the
executive department." - James Madison
From: robertwessel2 on 14 Sep 2009 19:00

On Sep 14, 11:46 am, Howard Brazee <how...(a)brazee.net> wrote:
> Is the Z-80 still the top selling computer chip in the world? It's
> real cheap, and plenty fast enough to run a stop light.

No. The Z-80 *chip* sells in tiny quantities. It would be far, far, far too
expensive to use a discrete microprocessor in almost any embedded design
suited for that class of CPU. OTOH, Z-80 *cores*, as embedded into other chips
(including freely available soft cores that can be programmed into FPGAs), are
fairly popular. In terms of shipped volume, the cores in the PIC, AVR, and
8051 families, at least, are likely an order of magnitude higher in volume
than Z-80 cores. ARMs likely ship in higher volumes than Z-80s as well. FWIW,
reliable figures are notoriously difficult to get a hold of.
From: Michael Wojcik on 16 Sep 2009 12:46
Paul wrote:
>
> There is room to debate, but the vast majority of computer cycles today are
> spent in client/server processing. That is really hard to argue against.

No, it's easy to argue against. There are orders of magnitude more embedded
CPUs than general-purpose CPUs. They have lower clock rates, but those clocks
add up. Add in scientific number crunching, with fewer systems but vastly more
ops per second, and general-purpose computing is already out of the lead.
(Just take a look at the ratings listed on top500.org.) Add in cellphones, and
wave goodbye to client/server processing as it disappears behind you.

The only way to support the claim that client/server processing represents a
majority of compute cycles is to broaden the definition to absurdity and call
things like MPP and cellphone traffic "client/server". At that point the term
is no longer useful. You might as well claim the embedded CPU in a USB
keyboard is a client and the PC it's attached to is a server, and say that's
client/server as well.

--
Michael Wojcik
Micro Focus
Rhetoric & Writing, Michigan State University