From: Royston Vasey on

"D Yuniskis" <not.going.to.be(a)seen.com> wrote in message
news:hu7j6t$1q8$1(a)speranza.aioe.org...
> Hi Royston,
>
> Royston Vasey wrote:
>> I once read somewhere that if you can write in assembler then C should be
>> easy - either I misunderstood or whoever said it was WRONG! :)
>
> The problem you are having is that C hides a bit more of the
> hardware from you than ASM would.
>
> E.g., in ASM, you would be more aware of what's on the stack, etc.
> You wouldn't do something like:
>
> MAIN:
> ...
> CALL FOO
> ...
>
>
> FOO: CALL BAR
> ...
>
>
> BAR: CALL BAZ
> ...
>
>
>
> BAZ: JUMP MAIN
>
> (because you'll munge the stack!)
>
>> Now to implement it!

That's true. I guess the advantage of C when I become more conversant will
be the speed of getting code up & running.


From: RockyG on
>
>"D Yuniskis" <not.going.to.be(a)seen.com> wrote in message
>news:hu7j6t$1q8$1(a)speranza.aioe.org...
>> Hi Royston,
>>
>> Royston Vasey wrote:
>>> I once read somewhere that if you can write in assembler then C should be
>>> easy - either I misunderstood or whoever said it was WRONG! :)
>>
>> The problem you are having is that C hides a bit more of the
>> hardware from you than ASM would.
>>
<snip>
>>> Now to implement it!
>
>That's true. I guess the advantage of C when I become more conversant will
>be the speed of getting code up & running.
>
>
Changing platforms (different processor etc.) becomes a lot easier,
especially if you abstract the hardware properly (a HAL).
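A minimal sketch of what such a HAL boundary can look like (all names are made up for illustration; on a real PIC18 the implementation would write a port latch register, here a plain variable stands in so the sketch compiles anywhere):

```c
#include <stdint.h>
#include <stdbool.h>

/* --- hal_led.h: the portable interface --------------------------- */
void hal_led_set(bool on);

/* --- hal_led_pic18.c: one target's implementation ---------------- */
/* Only this file changes when the hardware does.  A real port would
 * replace fake_latb with the device's output latch register. */
static uint8_t fake_latb;
#define LED_MASK 0x01u

void hal_led_set(bool on)
{
    if (on)
        fake_latb |= LED_MASK;
    else
        fake_latb &= (uint8_t)~LED_MASK;
}

/* --- application code: never touches registers directly ---------- */
void app_blink_once(void)
{
    hal_led_set(true);    /* works unchanged on any port of the HAL */
    hal_led_set(false);
}
```

The application layer only ever sees `hal_led_set()`, so moving to a different processor means rewriting one small file, not the whole program.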

---------------------------------------
Posted through http://www.EmbeddedRelated.com
From: George Neuner on
On Fri, 4 Jun 2010 12:08:28 +0800, "Royston Vasey" <royston(a)vasey.com>
wrote:

>"Albert van der Horst" <albert(a)spenarnc.xs4all.nl> wrote
>> Royston Vasey <royston(a)vasey.com> wrote:
>>
>>> I'm teaching myself C using Microchip's C18 compiler.
>>
>> Although this is c.l.e I would recommend using Turbo C 2.0 to learn
>> C. If you can get it. No distraction from the language itself.
>
>Thanks Albert, but I'm using C18 as my objective is to create an embedded
>device and the direct route suits me best.

I get that you want to dive into hardware, but you'd really be better
off learning the language *before* you try to use it for an embedded
project. The problem with C is that it _looks_ simple - the truth is
that it will be quite a while before you will be able to write
reliable programs.

Compilers for small MPUs, DSPs and PICs (the generic "PIC") tend to
have non-standard features, weird limitations and just more plain old
bugs than compilers for popular desktop OSes. And cross-compiling for
an embedded target creates build and test issues that desktop systems
don't have. All these things are confusing distractions that you
don't need while you're trying to learn a language.

There are decent, free compilers available for just about any OS.
Except for GCC, most won't be C99 compilers, but any ANSI compiler
will do for learning.

George
From: Paul Keinanen on
On Wed, 02 Jun 2010 08:35:24 -0600, hamilton <hamilton(a)nothere.com>
wrote:

>On 6/2/2010 6:24 AM, Royston Vasey wrote:

>> Using assembly I would have used "goto" to steer execution where I wanted
>> it, but how is it approached in C?
>
>There is no "goto" in C.
>
>"goto" in C is bad (very bad) practice.

Up to the 1960s, the only way to alter program execution was usually
some kind of jump/branch/goto instruction, together with primitive loop
constructs in some high-level languages (such as the DO loop in
Fortran IV), so gotos had to be used almost exclusively.

With languages containing some structured features that are easy to
use, the need for gotos was significantly reduced, but not eliminated
completely.

The C language lacks several features, such as loop naming (which would
allow exiting multiple nested loops at once) or structured error
handlers at the end of a function, and thus gotos are still required.
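Two sketches of those well-placed gotos (the functions and data are invented for illustration): exiting nested loops, and a single cleanup path at the end of a function.

```c
#include <stdlib.h>

/* 1. Breaking out of nested loops: C has no labelled break, so a
 *    goto past both loops does the job directly. */
static int find(int grid[3][3], int wanted)
{
    int found = 0;
    for (int r = 0; r < 3; r++)
        for (int c = 0; c < 3; c++)
            if (grid[r][c] == wanted) {
                found = 1;
                goto done;          /* exits both loops at once */
            }
done:
    return found;
}

/* 2. One cleanup path at the end of the function, in place of very
 *    deeply nested if-statements or status variables. */
static int work(void)
{
    int err = -1;
    char *a = malloc(16);
    if (a == NULL) goto out;
    char *b = malloc(16);
    if (b == NULL) goto free_a;

    err = 0;                        /* the actual work would go here */

    free(b);
free_a:
    free(a);
out:
    return err;
}
```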

I would consider the slogan "goto considered harmful" as a harmful
statement, since applying it blindly has created a lot of unreadable
and hence unmaintainable code (such as weird status variables or very
deeply nested if-statements) instead of using one or two well placed
gotos to simplify the program structure.

After all, the Dijkstra/Wirth slogan "goto considered harmful" was
intended to advocate the structured programming model and languages
based on that model.

From: David Brown on
George Neuner wrote:
> On Fri, 4 Jun 2010 12:08:28 +0800, "Royston Vasey" <royston(a)vasey.com>
> wrote:
>
>> "Albert van der Horst" <albert(a)spenarnc.xs4all.nl> wrote
>>> Royston Vasey <royston(a)vasey.com> wrote:
>>>
>>>> I'm teaching myself C using Microchip's C18 compiler.
>>> Although this is c.l.e I would recommend using Turbo C 2.0 to learn
>>> C. If you can get it. No distraction from the language itself.
>> Thanks Albert, but I'm using C18 as my objective is to create an embedded
>> device and the direct route suits me best.
>
> I get that you want to dive into hardware, but you'd really be better
> off learning the language *before* you try to use it for an embedded
> project. The problem with C is that it _looks_ simple - the truth is
> that it will be quite a while before you will be able to write
> reliable programs.
>
> Compilers for small MPUs, DSPs and PICs (the generic "PIC") tend to
> have non-standard features, weird limitations and just more plain old
> bugs than compilers for popular desktop OSes. And cross-compiling for
> an embedded target creates build and test issues that desktop systems
> don't have. All these things are confusing distractions that you
> don't need while you're trying to learn a language.
>
> There are decent, free compilers available for just about any OS.
> Except for GCC, most won't be C99 compilers, but any ANSI compiler
> will do for learning.
>
> George

I disagree with that advice. Programming C on a "big system" and
programming C on an embedded system are very different. People who have
learned C by reading books (or doing courses) and programming on
Windows, Linux, or whatever, often have a lot of unlearning to do before
they can write decent embedded software. They'll use "int" everywhere
with no consideration for the underlying cpu and they'll use floating
point, memory space, "printf" and "malloc" as though they were as cheap
as on a PC. They will miss out all understanding of interrupts,
volatiles, hardware access, resource limitations, etc.
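A small sketch of two of those habits (the register address and names are hypothetical): fixed-width types instead of bare "int", and volatile for anything shared with hardware or an interrupt.

```c
#include <stdint.h>

/* Use fixed-width types: "int" may be only 16 bits on a small MCU,
 * so a millisecond counter in a plain int overflows in about 33 s. */
static uint32_t uptime_ms;

/* Hardware registers must be volatile, or the compiler may cache the
 * value in a CPU register and miss updates.  (Made-up address.) */
#define TIMER_REG (*(volatile uint16_t *)0x0040u)

/* Data shared with an ISR needs volatile for the same reason. */
static volatile uint8_t tick_flag;

void timer_isr(void)                 /* sketch of the interrupt side */
{
    tick_flag = 1;
}

void poll_tick(void)                 /* sketch of the main-loop side */
{
    if (tick_flag) {                 /* re-read each pass: volatile */
        tick_flag = 0;
        uptime_ms++;
    }
}
```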

I'd agree that the PICs (at least the small ones) are awkward devices
to learn on and have a lot of idiosyncrasies. My recommendation here is
simply to drop the PIC and choose a better microcontroller. But if you
want to learn programming microcontrollers, learn with microcontrollers.