From: Mok-Kong Shen on
mike wrote:

> I think that most of the problems inherent in any large-scale
> programming project result from the inherent 'fragility' of all
> programming languages.
>
> If you compare a large computing project with a large engineering
> project there are clear similarities, but one very significant
> difference is that almost any undetected error in code has the potential
> to result, somewhere down the line, in catastrophic outcomes; whereas if
> a nail is not quite hammered in as far as specified or if a light
> fitting (or even a window) is put in the wrong place then the building
> usually remains safe, functional, and fit for purpose.
>
> If someone has some ideas about a programming language/paradigm that is
> fault-resistant, not only in the sense that it reduces the number of
> actual bugs and/or errors produced, but also in the sense that many bugs
> have no significant effect on function and behaviour then large-scale
> projects may be a lot easier to manage.
>
> Whether this will ever be a possibility remains, in my opinion, unknown
> to date.

In engineering works there are "factors of safety" to take account
of the variability of materials, the unpredictability of actual
loadings, inaccuracies in construction, and so on. In computing,
something analogous has been done. To control critical (including,
in particular, potentially dangerous) processes, one employs
duplicated or triplicated hardware to take care of the possibility
of hardware malfunction. As far as I know, in order to better
detect programmer errors, one similarly employs different and
independent teams of programmers to do the same job and then
compares the resulting programs against test cases. However, if
I am not mistaken, this comparison is only done during the design
phase of the software; afterwards, only the work of one of the
teams is selected for practical application. A safer way, I think,
would be to have these presumably equivalent programs (produced by
different teams, preferably using different programming languages
and environments) always run in parallel on multiple hardware
(possibly of different types) in actual production, so as to
further reduce the risk of errors, since testing in the design
phase may not be thorough enough to uncover all the errors present.
Of course, in accordance with Murphy's Law, errors can never be
"absolutely" eradicated.
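The "run the independent versions in production and compare" idea can
be sketched in a few lines of Python. This is only a toy (the version
functions and the seeded bug are invented for illustration, and real
N-version systems would run the versions on separate hardware), but it
shows how a majority vote can mask a fault in one implementation:

```python
from collections import Counter

def version_a(x):
    return x * x                          # team A's version

def version_b(x):
    return x ** 2                         # team B's independent version

def version_c(x):
    return x * x + (1 if x == 7 else 0)   # team C's version, with a seeded bug

def vote(versions, x):
    """Run every version and return the answer the majority agrees on."""
    tally = Counter(v(x) for v in versions)
    answer, count = tally.most_common(1)[0]
    if count <= len(versions) // 2:
        raise RuntimeError("no majority among the versions")
    return answer

print(vote([version_a, version_b, version_c], 7))  # 49: the vote masks C's bug
```

Note that the voter itself becomes a single point of failure, which is
why such schemes are usually kept as simple as possible.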

M. K. Shen
From: Nick Keighley on
On 26 Oct, 22:22, c...(a)tiac.net (Richard Harter) wrote:

> WHAT ARE DATA FLOW LANGUAGES?

> Some significant advantages:
>
> * Concurrency and parallelism are natural.  Code can be
> distributed between cores and even across networks.  Many of the
> problems associated with threads disappear.

hurrah!
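to illustrate why the concurrency falls out naturally: in a data flow
style each node touches only its own input and output queues, so the
stages can run on separate threads (or machines) with no shared state.
A minimal Python sketch (the two-stage pipeline and its functions are
invented for illustration):

```python
import queue
import threading

def node(fn, inq, outq):
    """A data flow node: read from inq, apply fn, write to outq."""
    while True:
        item = inq.get()
        if item is None:       # sentinel: propagate end-of-stream and stop
            outq.put(None)
            return
        outq.put(fn(item))

q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
threading.Thread(target=node, args=(lambda x: x + 1, q1, q2)).start()
threading.Thread(target=node, args=(lambda x: x * 2, q2, q3)).start()

for x in [1, 2, 3]:
    q1.put(x)
q1.put(None)

results = []
while (item := q3.get()) is not None:
    results.append(item)
print(results)  # [4, 6, 8]
```

the queues are the "wires" of the diagram; nothing else is shared
between the stages.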

> Some significant disadvantages:
>
> * The mindset of data flow programming is unfamiliar to most
> professional programmers.  Most dataflow programming languages
> are niche languages used by non-professional programmer users.

didn't we use to design programs with data flow diagrams and then
convert them into procedural programs...



--
Nick Keighley


From: Nick Keighley on
On 28 Oct, 19:34, user923005 <dcor...(a)connx.com> wrote:
> On Oct 28, 10:54 am, c...(a)tiac.net (Richard Harter) wrote:
> > On Wed, 28 Oct 2009 17:29:04 +0100, "Dmitry A. Kazakov"
> > <mail...(a)dmitry-kazakov.de> wrote:
> > >On Wed, 28 Oct 2009 16:08:06 GMT, Richard Harter wrote:
> > >> On Mon, 26 Oct 2009 17:04:36 -0700 (PDT), user923005
> > >> <dcor...(a)connx.com> wrote:
> > >>>On Oct 26, 3:22 pm, c...(a)tiac.net (Richard Harter) wrote:


> > >>>> SHOULD IT BE TURTLES ALL THE WAY UP?
>
> > >>>> In the famous anecdote, the little old lady replies to the noted
> > >>>> philosopher, "It's turtles all the way down."  When it comes to
> > >>>> writing software many writers on software design and many
> > >>>> programming language creators seem to believe that it is turtles
> > >>>> all the way up.
>
> > >>>> What do I mean by "turtles all the way up"?  By this I mean the
> > >>>> thesis that the techniques and programming language concepts that
>
> > >>>> are used in the small can be extended indefinitely to programming
>
> > >>>> in the large.  In other words if we use language X for 100 line
> > >>>> programs and 1000 line programs we can also use it for 1000000
> > >>>> line programs.  We may have to add some extensions and new
> > >>>> features along the way, but we can increase the size of the
> > >>>> programs we write indefinitely using the same ideas.
>
> > >>>I've never seen a convincing argument to show that this is wrong.
>
> > >>>We can use a 26 letter alphabet to make little words.
<snip>
> > >>>We can use a 26 letter alphabet to make entire libraries.
>
> > >>>Why isn't the same thing true of programming languages?

you don't make a library out of letters. Your own analogy demonstrates
the lack of scalability of the letter.

> > >> It is.  We can use 1's and 0's to build software.
>
> > >Yes, but that is the machine language.

right so we change the unit as we scale


> > > When communicating ideas about
> > >software to other people we are using natural languages.

I use programming languages.

There was a guy who explained to me that a telecommunications
switch was "just shift registers!". Whilst probably true, this was
unhelpful. This lack of appreciation for scale, and distrust of
abstraction, seems very odd in a field that is all about abstraction.

Boolean algebra is much easier if you just write out the wave equation
for the electron and the hole.


> > >> Similarly human brains are made out of atoms.

quarks! It's quarks all the way down!


<snip>

> > >The argument is that it is not clear what a hierarchy of programming
> > >languages adds.
>
> > You are missing the point.  He made an observation and called it
> > an argument.  
>
> I made an analogy and never called it an argument.
>
> > It wasn't.  He did go on to question whether a
> > hierarchy of programming languages had value.  
>
> I went on to say

you did not. I mean really, where did you say that?

> that we do the same thing with programming
> languages.

and then you go on to demonstrate that we don't...


> For instance, in C, I will write simple functions to build
> libraries.

expression -> function -> library

that's three layers of abstraction right there



> I will use libraries to build filter programs.  I will
> pipe filter programs in a chain to accomplish complicated tasks.

ooo look! data flow programming!


> This
> is a metaphor that works pretty well.  

it does but the opposite way from what you wanted.
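the expression -> function -> library -> pipeline ladder can even be
mimicked inside one language. A hedged Python sketch, using generators
as the "filter programs" (the grep/upper filters here are invented
stand-ins for real programs chained in a Unix pipe):

```python
def grep(pattern, lines):
    """Filter: pass through only the lines containing pattern."""
    return (line for line in lines if pattern in line)

def upper(lines):
    """Filter: upper-case every line in the stream."""
    return (line.upper() for line in lines)

# chain the filters exactly as a shell pipeline would:
#   grep error | upper
lines = ["error: disk full", "ok", "error: timeout"]
print(list(upper(grep("error", lines))))  # ['ERROR: DISK FULL', 'ERROR: TIMEOUT']
```

each filter only consumes a stream and yields a stream, so they compose
without knowing anything about each other -- which is the data flow
point again.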

Don't get me wrong, I've spent too much time debugging designs encoded
in arcane graphical formats that then have to be debugged all over
again when converted to code (generate the code you say? why not just
write the code?) to be a fan of graphical representations; but the
fact is we do need higher level things to reason with than raw C. And
we have them.


> A lot of computer science seems to me to be largely mythology.  
> What I mean by that is (for instance)
> the notion that OO programming will allow less expensive complicated
> systems.  It is possible to write complicated systems in OO languages
> for a reasonable cost.  But empirical studies have shown that the cost
> is not lower than writing the same systems in a simple language like C
> [1].  So now people are trying other paradigms like Aspect Oriented
> programming.  I am not saying that alternative paradigms are bad.  In
> fact, I program almost exclusively in C++ and love the OO metaphor.  I
> am just saying that a change in paradigm doesn't usually change much.
> For instance, in C++, by creating objects we do NOT reduce
> complexity.  We only hide it.

<snip>

--
Nick Keighley

"Object-oriented programming is an exceptionally bad idea
that could only have originated in California."
(Dijkstra)

From: Nick Keighley on
On 27 Oct, 08:55, "Dmitry A. Kazakov" <mail...(a)dmitry-kazakov.de>
wrote:

> * Unmaintainable code. Look at large data flow programs (e.g. in DiaDem,
> Simulink, LabView). It is impossible to use reasonable software processes
> on them. How to compare two diagrams?

convert it to a textual representation, then run diff on it. I'm not
saying it's trivial, but I don't think it's intractable either.
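a rough sketch of that in Python (the diagram encoding is invented;
the point is only that a canonical, sorted serialization makes two
diagrams that differ only in layout diff cleanly):

```python
import difflib

def canonical(diagram):
    """Serialize a diagram to sorted text, independent of layout order."""
    nodes = sorted(diagram["nodes"])
    edges = sorted(diagram["edges"])
    return [f"node {n}" for n in nodes] + [f"edge {a} -> {b}" for a, b in edges]

d1 = {"nodes": ["A", "B"], "edges": [("A", "B")]}
d2 = {"nodes": ["A", "B", "C"], "edges": [("A", "B"), ("B", "C")]}

# an ordinary textual diff now shows exactly what changed in the diagram
for line in difflib.unified_diff(canonical(d1), canonical(d2), lineterm=""):
    print(line)
```

real tools would also have to canonicalize node names and nested
subdiagrams, which is where the non-trivial part lives.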


> When are they semantically equivalent?

when they are the same. Code gives you exactly the same problem. The
pretty picture is only a representation! You're programmers, for
Turing's sake! What else do you do apart from manipulate
representations?

> How do I validate a diagram?

there were tools that could do this in the 80s


> How to search for anything in a diagram?

solved in the 80s

what's the quote about being condemned to repeat it?

I don't *like* graphical design methodologies, but I don't pretend
they can't do things that they can do!

Dammit, I'm running out of exclamation marks!

> Another example of this problem is represented by GUI libraries,
> which are mostly event controlled. Practically none of them can be easily
> used in a multitasking environment. It is a horror to debug hangups or
> generators in messages routed from one message loop to another. And the
> proposal is to build *everything* this way. Shudder.

though you may have a point here


--
Nick Keighley


As our circle of knowledge expands,
so does the circumference of darkness surrounding it.
(Albert Einstein)


From: Nick Keighley on
On 28 Oct, 15:44, p...(a)informatimago.com (Pascal J. Bourguignon)
wrote:
> Joshua Cranmer <Pidgeo...(a)verizon.invalid> writes:
> > On 10/28/2009 12:32 AM, Pascal J. Bourguignon wrote:
> >> Can you not specify all programming problems in less than a few
> >> thousand lines of specification?
>
> >> Well, you can always write more detailed specifications, but I can
> >> assure you that sales people will always be able to put the whole
> >> specification of your software on a 2-page booklet.
>
> > The PDF 1.6 specification was, if I recall correctly, approximately
> > 1400 pages of text. My draft of C++0x weighs in at a whopping 1314
> > pages. The JVM spec is smaller, at only 542 pages. The full x86_64 ISA
> > comes in volumes; even the ancient MC6800 ISA took 40 pages or so to
> > explain.
>
> > Many specifications, to approach the degree of completeness required
> > for independent implementation, need to drag on for hundreds or
> > thousands of pages.
>
> Why should you care about the detail of the micro-instructions of your
> MC6800?  Can you not write a specification for a microprocessor in two
> lines?  Let the system take care of the details itself.

talk to the Sparc and ARM people. They license their designs. I bet
they come to more than two lines. Oh, I forgot: if they'd used Lisp
they'd be able to hide it all in one macro!