From: D Yuniskis on
cs_posting(a)hotmail.com wrote:
> What was interesting was that the other half of the lab in which the
> freshmen were wearing out the () keys of a collection of Apollos had
> juniors hand-wiring modules into a 4 bit (or was it 8?) processor,
> which then software-emulated a 32 bit one, at an effective clock rate
> of, oh, maybe a Hz or two. So in a way it eventually did all tie
> together, but the lack of initial pragmatism was a little shocking to
> someone who had grown up with small micros.

<grin> Try designing with *analog* computers (a project from
my high school days) :-/
From: D Yuniskis on
Nobody wrote:

[snips throughout]

> On Thu, 03 Sep 2009 11:28:23 +0100, Phil O. Sopher wrote:
>
>> It is a mystery to me as to how recent graduates of Computer Science
>> are vaunted as experts on computers, yet haven't a clue about the actual
>> operation of a computer at the assembly language (or even machine code)
>> level.

> Beyond that, there is a common view that a focus on low-level details can
> be harmful.
>
> In my experience, there is some justification for that view; e.g. I've
> seen experienced assembly-language programmers spend hours shaving
> clock cycles off the inner loop of an O(n^2) algorithm when a naive
> implementation of an O(n.log(n)) algorithm turned out to be significantly
> faster.

Yup. I can recall my first Pascal program to control a DataI/O
PROM programmer: writing things like "printHex()" to convert
a 4 bit value to a printable hex character [0-9A-F]; then using
that in "printByte()" to print an 8 bit value as a pair of
hex characters; then using *that* to make "printAddress()"
to print a string of 4 (or 8) bytes as a set of hex characters,
etc.

<:-(
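
In C terms, that layering looked roughly like this (a sketch from
memory -- the original was Pascal, and the names are approximate):

    #include <stdio.h>
    #include <stdint.h>

    /* print one 4 bit value as a hex character [0-9A-F] */
    static void printHex(uint8_t nibble)
    {
        putchar("0123456789ABCDEF"[nibble & 0x0F]);
    }

    /* print an 8 bit value as a pair of hex characters */
    static void printByte(uint8_t byte)
    {
        printHex(byte >> 4);
        printHex(byte);
    }

    /* print an address, MSB first, as a string of hex characters */
    static void printAddress(uint32_t address)
    {
        for (int shift = 24; shift >= 0; shift -= 8)
            printByte((uint8_t)(address >> shift));
    }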

Part of learning is sorting out when to change "styles"
as the tools and environment change around you!

> I've seen the same programmers struggle as the industry has moved from a
> small number of large projects employing many programmers for years to
> millions of projects taking a single programmer a week.
>
> Once upon a time, embedded development wasn't all that much different to
> developing for larger systems. These days, they're completely different
> fields. The skills required for embedded development are more likely to be
> taught on an electronic engineering course than computer science.

Wow! My experience has been exactly the opposite! Years ago
(my first design was in the 70's) "software" was just another
aspect of the hardware design. And, very much in *tune* with it.
The processor was just a different way of doing something
that you would otherwise do with sequential logic (e.g., build
an FSM out of tonka-toy logic). It was EEs that did the work
(at the time, my college didn't even offer a "CS" degree;
it was "Electrical Engineering with the Computer Science
option").

Nowadays, tools have made designing embedded systems *so*
much more like designing for a desktop system. E.g., you
don't have to burn ROMs, stuff them into your target,
cross your fingers and *hope*. You don't have to rely on
setting up "scope loops" so you can watch your code *with*
a 'scope to try to figure out what is happening. You don't
have to count *bits* of memory the way you did when a hundred
bytes was a *big* system. Back then, you were lucky to have *an*
assembler from *one* vendor for your processor. You had
no choice of HLLs (I recall PL/M being "a big deal", and
it's little more than "structured assembly language").
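
(For the youngsters: a "scope loop" was just something like the
fragment below -- park the code in a tight loop wiggling an output
you can trigger the 'scope on, then poke at the hardware while it
runs. The port address is, of course, made up.)

    /* minimal "scope loop" sketch -- OUTPUT_PORT is a hypothetical
       memory-mapped register; the 'scope triggers on the square wave */
    #define OUTPUT_PORT (*(volatile unsigned char *)0x4000)

    void scope_loop(void)
    {
        for (;;) {
            OUTPUT_PORT = 0x01;   /* pin high */
            OUTPUT_PORT = 0x00;   /* pin low  */
        }
    }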

I think the current alleged problem is due to the fact that
it *is* possible, nowadays, to do an embedded system
design without even having the *hardware* available to
run your code! (the idea of a simulator "way back when"
was just a luxury to drool over; an *emulator* was something
you shared with every other developer in the company as they
were too damn expensive to provide "per seat")

> If you read the electronics groups, you'll find EEs complaining about
> current EE graduates understanding FPGAs and microcontrollers but not
> understanding the operation of a simple amplifier.

The observation has been made (in other fields) that we
suffer from the fact that we just "inherit" knowledge and
don't go through the equivalent of an apprenticeship
(like folks in The Trades). As a result, we don't get a "feel"
for what we have "absorbed". And, don't get to appreciate
what is behind it, etc.

I suspect this is largely true. However, the flip side -- the
long apprenticeships, etc. -- would slow the pace of "progress"
considerably in this field (i.e., how much have *toilets*
changed in the past few hundred years...? :> )
From: D Yuniskis on
Hi Niklas,

Niklas Holsti wrote:
> D Yuniskis wrote:

[snips throughout]

>> Personal experience. The language is just too "big" and
>> clumsy.
>
> So you don't like it, and of course you have that right.

<shrug> I just want to "solve problems" with the widest range
of options at my disposal. Ada limits the size
of the problem that you can practically solve and the
hardware on which you can solve it. E.g., are you
going to design a control system for a microwave oven
on a little 8 (or even *4*!) bit processor in Ada?
Are you going to find a compiler vendor who will support
"that" processor?

Or, are you just forced to use more hardware (than necessary)
in order to use that toolset?

> (By the way, was your experience with the original Ada, the 1983 form
> that is, or the updates from 1995 or 2005? Quite a lot of flexibility
> and power have been added in the newer forms of the language.)

Quite old. I recall the 1995 stuff being *announced*, so
this predated that.

>> With good discipline, I think you can produce
>> applications that are as reliable as anything developed
>> under Ada *without* that burden.
>
> No doubt. But I like the help I get from Ada to observe that discipline.
> I'm fallible -- sometimes more, sometimes less, but rarely zero. Doing
> it in C feels like harder work. In addition, Ada gives me tools that C
> just doesn't have: modularity and encapsulation with packages, strong
> separation of interface and implementation for operations *and* data
> structures, run-time error checking (if I want it).

You can get all of that with C -- but you have to *work* at it!
:>
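
E.g., you can get package-like encapsulation in C with an opaque
type split across a header and its implementation file -- the
compiler just won't *force* you to do it. A sketch (names invented):

    /* counter.h -- the interface; clients never see the representation */
    typedef struct counter counter_t;             /* incomplete ("opaque") type */
    counter_t *counter_create(int initial);
    void       counter_bump(counter_t *c);
    int        counter_value(const counter_t *c);

    /* counter.c -- the implementation, hidden at file scope */
    #include <stdlib.h>
    #include "counter.h"

    struct counter { int value; };                /* the "private" part */

    counter_t *counter_create(int initial)
    {
        counter_t *c = malloc(sizeof *c);
        if (c != NULL)
            c->value = initial;
        return c;
    }

    void counter_bump(counter_t *c)        { c->value++; }
    int  counter_value(const counter_t *c) { return c->value; }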

I find the bigger bang (nowadays, with larger projects) is to
put the effort into the OS. Let *it* serve as a safety net
for the developer, etc. And, let it enhance the programming
environment so the developer can call on richer subsystems
to offload some problems at run-time that he would otherwise
have to solve himself.

For example, I've started implementing a "rational, decimal,
variable precision math package" (yikes! that's a mouthful!)
to offload some of the pesky "fringe areas" that standard
math libraries don't handle well. Sure, a savvy user can
order his math operators to preserve the most precision
in his answers, etc. But, it seems easier to use resources
(time and space) to provide a service that the developer can
call upon to do these things "perfectly" with less effort.
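
As a trivial illustration of what I mean, exact rational addition
keeps the "perfect" answer around so the caller doesn't have to
order his operations to dodge rounding. (A sketch of the idea only --
not the actual package; overflow handling omitted.)

    #include <stdint.h>

    typedef struct { int64_t num, den; } rational_t;   /* den > 0 */

    static int64_t gcd64(int64_t a, int64_t b)
    {
        while (b != 0) { int64_t t = a % b; a = b; b = t; }
        return (a < 0) ? -a : a;
    }

    /* exact sum of two rationals, reduced to lowest terms */
    static rational_t rat_add(rational_t a, rational_t b)
    {
        rational_t r = { a.num * b.den + b.num * a.den, a.den * b.den };
        int64_t g = gcd64(r.num, r.den);
        if (g > 1) { r.num /= g; r.den /= g; }
        return r;
    }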

>> ... I.e., its just
>> an inconvenience that gets in your way.
>
> It may feel like that sometimes, but there are several reports that
> indicate that in the end Ada helps programmers, in several contexts.

Yes, but note that almost all of these are larger projects
[included below for context] with large development staffs
on "off the shelf hardware" (i.e., where the hardware was
chosen as a consequence of the software requirements and
not the other way around)

I'd be more impressed if Apple had claimed that they had developed
the iPhone in Ada (or Google/Nokia their phones, etc.)

> For example, John McCormick reports dramatically better results from
> students in a real-time programming course when they implemented a
> model-railroad system in Ada than in C. See
> http://www.sigada.org/conf/sigada99/proceedings/p111-mccormick.pdf.
>
> Pratt-Whitney compared its production of software for military jet
> engines (initially mandated to use Ada) and for commercial engines
> (initially using whatever they wanted). The Ada side had twice the
> productivity and one-fourth the number of errors. See
> http://www.adaic.com/atwork/pw.html. Ok, the commercial side used lots
> of assembly code. But according to this report, the government side of
> Pratt-Whitney is staying with Ada, although the "Ada mandate" has been
> lifted, so they could switch if they wanted to.
>
> Stephen Zeigler compared the effectiveness of C and Ada programming at
> Rational, a compiler and SW tool business (now part of IBM), as Rational
> was gradually switching from C to Ada. See
> http://archive.adaic.com/docs/present/ajpo/pll-cost/html/tsld058.htm.
> Some quotes: "Ada cost almost half of what the C code cost, and
> contained significantly fewer defects per 1000 SLOC by 7x the C code
> (700%). Even on the new C++ code, Ada still has 440% fewer defects."
>
> In fairness, there are also studies that suggest that the languages are
> about equally effective. See
> http://www.stsc.hill.af.mil/crosstalk/1996/07/quantify.asp.
>
> YMMV, sure.
From: D Yuniskis on
Hi Chris,

Chris Stratton wrote:
> On Sep 2, 2:18 am, D Yuniskis <not.going.to...(a)seen.com> wrote:
>
>>> Personally, what I dislike is having to mention "objects" in multiple
>>> places. What I want is a system where I can instantiate things inline
>>> as I need them ("make me a menu with these choices"), but have all the
>>> allocations be predetermined during compilation (no runtime memory
>>> management surprises) and summarized for me.
>> Well, sometimes that just isn't possible. Or, at least not
>> easily automated (especially if you are resource constrained).
>> Memory management is always a tricky subject. At upper application
>> levels, you can afford (somewhat) to let the application deal
>> with memory problems (running out of heap, etc.). At lower levels
>> (e.g., within the OS), you often have to "guarantee" that
>> "/* Can't Happen */" REALLY CAN'T HAPPEN!
>
> Actually what I want to do is not very complicated. The end result I
> want can be achieved by manually typing the right things in the right
> places in the source files. What I dislike is that when I add say a
> menu to my user interface, then I have to go back to a different place
> in the file and list it's choices. I've been considering writing a
> sort of pre-preprocessor to take find these "as needed" mentions and
> move them to a place where the compiler will tolerate... but then
> compiler error locations would be off, and source-level debugging not
> reflect the editable sources...

Ah, why can't you just define them as static, file scope?
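
I.e., something along these lines -- the table lives at file scope,
the compiler sizes everything, and nothing gets allocated at
run-time. (Sketch only; the struct and names are invented for
illustration.)

    typedef struct {
        const char *label;
        void (*action)(void);
    } menu_item_t;

    static void do_start(void) { /* ... */ }
    static void do_stop(void)  { /* ... */ }

    /* all storage fixed at compile time, at file scope */
    static const menu_item_t main_menu[] = {
        { "Start", do_start },
        { "Stop",  do_stop  },
    };

    #define MAIN_MENU_ITEMS (sizeof main_menu / sizeof main_menu[0])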

>>>> Are there any folks who have successfully deployed larger
>>>> applications in an OO language? No, I'm not talking about
>>>> desktop apps where the user can reboot when the system
>>>> gets munged. I'm working in a 365/24/7 environment so
>>>> things "just HAVE to work".
>>> The android phone I've been playing with comes close... pseudo-java
>>> with an active garbage collector...
>> <frown> I think anything that relies on GC will bite me
>> as it is hard to get deterministic performance when you don't
>> know how/when the GC will come along. Hence my suggestion
>> that new() be heavily overloaded (probably relying on lots
>> of "buffer pools" for the corresponding objects) just to
>> make sure "automatic allocation" can work in a reasonably
>> deterministic fashion (like purely static allocation)
>
> Not sure if I can blame that or not, but it crashed when my alarm
> clock went off this morning after about a week and a half of
> uptime... fortunately the "alarm" that continued to sound was a
> decent audio track

It was probably up late the night before! ;-)

From: D Yuniskis on
Hi Boudewijn,

Boudewijn Dijkstra wrote:
> On Wed, 02 Sep 2009 08:42:33 +0200, D Yuniskis
> <not.going.to.be(a)seen.com> wrote:
>> Arlet wrote:
>>> On Sep 1, 4:43 pm, D Yuniskis <not.going.to...(a)seen.com> wrote:
>>>
>> [...] I'm just afraid others won't be able to keep up that
>> discipline or understand fully what is going on (e.g., lots of
>> function pointers -- people seem to not like pointers... let
>> alone pointers to functions! :< )
>
> People who are not comfortable with pointers haven't done enough
> assembly language programming.

Exactly! I routinely do things through pointers (including
changing program flow) as they are often an excellent
efficiency hack. And, often let you do things that would
otherwise be very difficult or clumsy to do (e.g., "Here's
a pointer to a filter function that takes a pointer to a
struct of data specific to that filter function and returns
<something>")

I find many modern languages that *don't* let me use pointers
(for fear that I will "hurt myself") to be very irritating.
Sure, they usually provide a way to do something functionally
equivalent but often at some extra *cost*.
