From: Niklas Holsti on 2 Sep 2009 12:09

D Yuniskis wrote:
> Andrew Reilly wrote:
> ...
>> When a similar discussion came up on Comp.arch last week, there was
>> quite a bit of vocal support for both contemporary Ada and D, both
>> of which are
>
> <groan> Stay away from Ada. It *really* doesn't seem worth the
> effort! :<

If that groan and warning are based on actual experience, rather than
hearsay, it would be interesting to hear more about your reasons for
saying so.

--
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi . @ .
From: Vladimir Vassilevsky on 2 Sep 2009 12:25

D Yuniskis wrote:
> Hi Vladimir,
>
> Vladimir Vassilevsky wrote:
>> D Yuniskis wrote:
>>> Are there any folks who have successfully deployed larger
>>> applications in an OO language? No, I'm not talking about
>>> desktop apps where the user can reboot when the system
>>> gets munged. I'm working in a 365/24/7 environment so
>>> things "just HAVE to work".
>>
>> We have a relatively big embedded system (several M of the source
>> code, filesystem, TCP/IP, RTOS, etc.) developed by a team of five
>> programmers. This system works in the field, 24/7, unattended.
>
> With a filesystem, I am assuming (not a "given") that you have
> lots of resources available (?).

Not a lot. Only 600MHz, 64M. These days, that's nothing. :-)
Actually, the OS, filesystem and TCP/IP are not very resource
consuming: depending on the number of threads, files, sockets, etc.,
the system needs are practically in the ~100K range. At minimum, we
can run from the BlackFin CPU L1 memory.

> E.g., does the OS support VM?

Unfortunately, the BlackFin has only a rudimentary MMU. A
full-featured MMU would be very useful; I will certainly consider a
CPU with an MMU for the next project like that.

> Or, are all of the tasks (processes) running on it known to have
> bounded resource requirements?

You feel my pain...

> Was the OS written in C++ or just the applications?

The RTOS was written in C++. The multitasking and hardware
abstraction concepts fit nicely with the C++ paradigm; this was one
of the arguments for using C++. I had been very frustrated with the
C-call nuisance of µC/OS-II and ADI VDK before.
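To make "the hardware abstraction concepts fit nicely with the C++
paradigm" concrete, here is a minimal sketch of the memory-mapped
peripheral idiom; the device, register layout, and address below are
invented for illustration and are not from the system being
described:

    #include <cstdint>

    // A memory-mapped peripheral wrapped in a class: the constructor
    // takes the base address, member functions name the operations,
    // and callers never touch raw addresses or bit masks.
    class Uart {
    public:
        explicit Uart(std::uintptr_t base)
            : regs_(reinterpret_cast<volatile Regs*>(base)) {}

        void send(std::uint8_t byte) {
            while (regs_->status & TX_BUSY) {
                // spin until the transmitter is free
            }
            regs_->tx = byte;
        }

    private:
        struct Regs {                   // invented register layout
            std::uint32_t status;
            std::uint32_t tx;
        };
        static const std::uint32_t TX_BUSY = 1u << 0;

        volatile Regs* regs_;
    };

    // Application code then reads naturally:
    //     Uart console(0xFFC00400);    // made-up base address
    //     console.send('!');

Each board variant instantiates its own devices; the application sees
only the class interface.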
> Does the OS have provisions to detect (and recover from)
> crashed applications? Or, does a crashed application
> bring the system to its knees?

A crashed application can very well screw up everything. However, if
this happens, we fall into the bootloader, so we can recover.

>> At the very beginning, there was the usual trivial argument about C
>> vs C++, and it was decided to use C++. Now I can say that was a
>> wise choice; it would be difficult to tackle the complexity in C.
>
> Agreed. But, the problem (IMO) with C++ (or other 4G languages)
> is that it is often hard to find folks who *really* know what's
> happening "under the hood". I.e., the sort of intuitive
> understanding of exactly what the compiler will generate for
> *any* arbitrary code fragment.

That's the whole point: making application programming available for
dummies. The OO system is supposed to protect them from themselves.

> E.g., I have been writing in C++ for many years now and I am
> constantly surprised by things that happen "unexpectedly".
> There's just way too many little things that go on that
> catch me off guard. If I am writing for a desktop
> environment, I can usually shrug and deal with it. But,
> when I have a fixed, tightly constrained set of resources
> to work within (TEXT, DATA and "time"), these "judgment
> lapses" quickly get out of hand. :<
>
>>> Any tips you can share that can help me get C-like
>>> behavior from a C++-like implementation? (besides the
>>> obvious: "use only the C subset of C++" :> )
>>
>> "There is no and there can't be any substitute for the
>> intelligence, experience, common sense and good taste"
>> (Stroustrup).
>
> (sigh) But said by someone working for a firm with lots of
> re$ource$ to devote to staff, etc. Things seem to be considerably
> different in the "real world" (I am continually disappointed
> with the caliber of the "programmers" I meet... "just get it
> done" seems to be their mantra -- note that "right" is not
> part of that! :< )

On the other hand, the bulk of the programmer's work is nothing more
than legwork; it doesn't have to be done brilliantly; it just has to
work somehow.

> It is exactly this problem that has me vacillating about whether
> a "highly structured C approach" would be better or worse than
> doing it in C++ (or other 4G HLL). I.e., which are "average Joes"
> least likely to screw up? :-/

Just recently I had to fix a project in C developed by "average
Joes". There were the usual C problems: something not initialized,
not enough memory provided for something, and indexing past the end
of an array. So I think C++ is the better way to do things.

Vladimir Vassilevsky
DSP and Mixed Signal Design Consultant
http://www.abvolt.com
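To make those "usual C problems" concrete, a minimal sketch of the
initialization and bounds guarantees a C++ container gives for free.
It uses std::array, which was standardized after this thread
(std::tr1::array or boost::array were the 2009 equivalents), and the
names are invented:

    #include <array>
    #include <cstddef>
    #include <cstdio>

    // The C version of this pattern invites exactly the defects
    // listed above:
    //
    //     int readings[8];                /* never initialized */
    //     for (int i = 0; i <= 8; i++)    /* off-by-one: writes [8] */
    //         readings[i] = i;
    //
    // The C++ container cannot be left uninitialized by accident and
    // can check its own bounds:

    int main()
    {
        std::array<int, 8> readings{};   // value-initialized: all zeros

        for (std::size_t i = 0; i < readings.size(); ++i)
            readings.at(i) = static_cast<int>(i);  // at() throws on overrun

        std::printf("last reading: %d\n", readings.back());
        return 0;
    }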
From: D Yuniskis on 2 Sep 2009 14:35

Hi Niklas,

Niklas Holsti wrote:
> D Yuniskis wrote:
>> Andrew Reilly wrote:
>> ...
>>> When a similar discussion came up on Comp.arch last week, there
>>> was quite a bit of vocal support for both contemporary Ada and D,
>>> both of which are
>>
>> <groan> Stay away from Ada. It *really* doesn't seem worth the
>> effort! :<
>
> If that groan and warning are based on actual experience, rather
> than hearsay, it would be interesting to hear more about your
> reasons for saying so.

Personal experience. The language is just too "big" and clumsy. With
good discipline, I think you can produce applications that are as
reliable as anything developed under Ada *without* that burden.

I liken Ada to automobiles that *force* you to wear a seatbelt prior
to starting. I.e., it imposes itself on the developer/application on
the assumption that the developer won't Do The Right Thing
(deliberately or unintentionally). This might be A Good Thing for
defense contracts, very large applications developed by large teams
with diverse capabilities and skill levels, etc. But it's a heavy
price to pay for "everything" (and, apparently, Industry seems to
concur with that).

[I wonder how long it would take to develop the iPhone if it had
been written in Ada? :-/ ]

Imagine having to have a person sitting in your automobile (so that
it will start) just so *you* can check the level of the automatic
transmission fluid. I.e., it's just an inconvenience that gets in
your way.
From: Hans-Bernhard Bröker on 2 Sep 2009 17:33

Chris Stratton wrote:
> I've been considering writing a sort of pre-preprocessor to find
> these "as needed" mentions and move them to a place the compiler
> will tolerate... but then compiler error locations would be off,
> and source-level debugging would not reflect the editable
> sources...

Not necessarily. C code-generating programs have been around since
effectively forever, and so have methods that allow the compiler and
other tools to refer all the way back to the real source text. That's
what the #line directive was invented for.
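A minimal sketch of the idiom: what such a (hypothetical) generator
might emit, with made-up file names and line numbers:

    // Emitted by the pre-preprocessor into gen.cpp. The #line
    // directives tell the compiler (and, through the debug info, the
    // debugger) to report positions in the hand-edited source rather
    // than in this generated file.

    #line 10 "widget.src"   // the next line counts as line 10 of widget.src
    int widget_count = 0;

    #line 42 "widget.src"            // a diagnostic in this function is
    int widget_add(int x, int y)     // reported at widget.src:42 ff.
    {
        return x + y;
    }

So error messages point at the file the programmer actually edits,
and source-level debugging follows the same mapping.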
From: Boudewijn Dijkstra on 3 Sep 2009 05:55
On Wed, 02 Sep 2009 08:42:33 +0200, D Yuniskis
<not.going.to.be(a)seen.com> wrote:
> Arlet wrote:
>> On Sep 1, 4:43 pm, D Yuniskis <not.going.to...(a)seen.com> wrote:
>>> [...]
>
> I'm just afraid others won't be able to keep up that
> discipline or understand fully what is going on (e.g., lots of
> function pointers -- people seem to not like pointers... let
> alone pointers to functions! :< )

People who are not comfortable with pointers haven't done enough
assembly language programming.

--
Made with Opera's revolutionary e-mail program: http://www.opera.com/mail/
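For reference, the function-pointer style being groaned about is
small; a minimal sketch of a dispatch table, with invented handler
names:

    #include <cstdio>

    // A table of pointers to functions: dispatch by indexing instead
    // of by switch. Adding a state is one handler plus one table entry.
    typedef void (*Handler)(void);   // pointer to function: void -> void

    void on_idle(void)  { std::puts("idle");  }
    void on_run(void)   { std::puts("run");   }
    void on_fault(void) { std::puts("fault"); }

    Handler handlers[] = { on_idle, on_run, on_fault };

    int main()
    {
        for (unsigned state = 0; state < 3; ++state)
            handlers[state]();       // call through the pointer
        return 0;
    }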