From: John on
Hi Miem,

In article <1159777724.426016.42900(a)b28g2000cwb.googlegroups.com>,
MiemChan(a)gmail.com says...
> As an amateur embedded circuit player, I have used a couple of AVR
> and PIC microcontrollers in the past.
>
> These days it is not too hard to find small, ARM-based, ready-to-use
> embedded boards for under $100. They seem to have faster clock speeds
> than most of the AVR and PIC boards.
>
> Can anybody briefly compare ARM with PIC and AVR in terms of (a)
> performance, (b) software support and (c) price?

Unless the project requires an ARM, I would say stick with an AVR (my
first choice).

I've finished one project using some AVRs and now I'm attempting to use
an NXP/Philips LPC2103. I went with the LPC2103 mainly because it has a
fast A/D and it's inexpensive. I've worked with 32-bit processors on
other projects, including ARMs.

Here's my lengthy comparison of AVR vs. ARM development...

For the AVR I use CodeVision and, from a user perspective, I find it to
be a very good compiler. The peripheral wizard in CAVR is *very* handy
-- you can start using a peripheral very quickly, and you don't have to
remember the sometimes complicated initialization sequence or register
settings. With CAVR, when you're done compiling, you get useful
information on RAM and Flash utilization. I use UltraEdit32 for writing
code, so I didn't use CAVR's IDE that much, but what I did use of it
was sufficient.
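
To give a feel for what the wizard saves you from typing, here is the
kind of init code it generates, written out by hand -- a rough sketch
only, using avr-gcc register/bit names for an ATmega16-class part at
7.3728 MHz, so check the datasheet for your own chip and clock:

    #include <avr/io.h>

    /* USART: 9600 baud, 8 data bits, no parity, 1 stop bit */
    void usart_init(void)
    {
        UBRRH = 0;
        UBRRL = 47;                         /* 7372800/(16*9600) - 1       */
        UCSRB = (1 << RXEN) | (1 << TXEN);  /* enable receiver/transmitter */
        UCSRC = (1 << URSEL) | (1 << UCSZ1) | (1 << UCSZ0);  /* 8N1        */
    }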

I did my debugging using the Atmel JTAG ICE mkII, AVR Studio and
debugWire. I didn't think it would work very well, but surprisingly I
have very few complaints. The debugging capabilities of the new AVRs
(JTAG or debugWire) are quite good, single-stepping is very fast (you
hit a key and it steps instantly), and overall AVR Studio worked well.
You can do all the standard things you want: look at registers and
memory locations, watch variables, etc. Since AVR Studio is written by
Atmel, you get views of the peripheral registers by name, with their
port bits broken out, and you can toggle the bits as you see fit. There
are some rough spots (my major gripe is that enabling/disabling
debugWire should happen automatically when you go into programming mode
or debug mode). CAVR also has some nice extensions, like PORTC.3 = 1 to
set bit 3 of port C to 1. Those kinds of extensions, I found, are very
handy in embedded programming.
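
For instance, here is the same bit twiddle both ways -- the CAVR form
on top, and the plain C you'd write under something like avr-gcc below
(the device header is just an example, use the one for your part):

    #include <mega16.h>   /* CodeVision device header */

    void pulse_pin(void)
    {
        PORTC.3 = 1;      /* CAVR extension: set bit 3 of PORTC */
        PORTC.3 = 0;      /* ...and clear it again              */

        /* Portable equivalent in plain C:
               PORTC |=  (1 << 3);
               PORTC &= ~(1 << 3);
        */
    }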

Contrast this with my current setup for the LPC2103. I am using the
GNUARM toolchain (thanks Rick/Pablo/everyone else who put it together),
which in itself works. I followed a tutorial written by "Jim Lynch"
which shows how to get GNUARM, the Eclipse IDE and the OpenOCD GDB
daemon all working together. I have an existing piece of JTAG hardware
that works with OpenOCD, so I didn't have any additional hardware costs.

With the ARM development you'll have to choose between putting your
code in Flash ROM and executing from there (can be slower, but usually
more code space) or putting it in RAM (not much room). This is a
limitation of working with a CPU rather than a microcontroller. A big
deal with ARM7TDMI devices is that they only have two hardware
breakpoints, and single-stepping code that runs from Flash uses both of
them. With the open source tools, that means you can almost forget
about single-stepping and setting meaningful breakpoints in Flash. If
you want software breakpoints, you'll need to put your code in the
limited RAM. This is a big tradeoff: the LPC2103 has 32 KBytes of Flash
but only 8 KBytes of RAM.

Getting GNUARM + Eclipse + OpenOCD working is a time-consuming setup,
in my opinion. The compiler works, but you'll spend a decent amount of
time mucking with the C run-time files (crt0.s), assembly
initialization code, linker scripts and other things. Thankfully the
LPC2000 forum at Yahoo has some pre-existing examples you can use as a
starting point.
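
For reference, the heart of an LPC2103 linker script is only a few
lines -- something like the sketch below (the real script also needs
the startup/vector section and whatever symbols your crt0.s expects, so
treat this as a starting point, not a working example):

    MEMORY
    {
        flash (rx)  : ORIGIN = 0x00000000, LENGTH = 32K  /* on-chip Flash */
        ram   (rwx) : ORIGIN = 0x40000000, LENGTH = 8K   /* on-chip SRAM  */
    }

    SECTIONS
    {
        .text : { *(.text*) *(.rodata*) } > flash
        .data : { *(.data*) } > ram AT > flash  /* crt0 copies from Flash */
        .bss  : { *(.bss*)  } > ram             /* crt0 zeroes this       */
    }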

Eclipse has (in my opinion) an overly complicated user interface that
can be quite slow and unresponsive at times. It seems very
customizable, but if you start digging you'll find you can't streamline
it all that much. Using the Eclipse IDE for writing code works OK, but
using the "Zylin Embedded CDT Debugger" is not a pleasant experience
(at least with OpenOCD); I found it very unreliable. I have since
switched to the Insight debugger, with my code executing from RAM.

Insight works OK, but single-stepping takes 4-5 seconds per step! The
AVR setup single-steps instantly (or so it feels). Insight of course
has no knowledge of the chip's peripherals, so if you want to twiddle
enable bits or look at peripheral settings, you'll have to dump the
memory location and work backwards.
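
One small workaround is to collect the register addresses in a header
yourself, so you at least know what to dump and poke. Something like
this for the LPC2103's GPIO port 0 (addresses as I read them in the
user manual -- double-check before trusting them):

    #define IO0PIN (*(volatile unsigned long *)0xE0028000)  /* pin state  */
    #define IO0SET (*(volatile unsigned long *)0xE0028004)  /* set bits   */
    #define IO0DIR (*(volatile unsigned long *)0xE0028008)  /* direction  */
    #define IO0CLR (*(volatile unsigned long *)0xE002800C)  /* clear bits */

    void blip_p0_8(void)
    {
        IO0DIR |= (1UL << 8);   /* P0.8 as an output */
        IO0SET  = (1UL << 8);   /* drive it high     */
        IO0CLR  = (1UL << 8);   /* and low again     */
    }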

So, on paper, using one of these ultra-cheap ARM "microcontrollers"
looks good, but I think you'll find there's a decent-sized leap to get
one going. I had been thinking of using these ARM parts in some
personal projects, but for now I'm sticking with the AVRs.

Someone might be quick to point out that a commercial compiler would
work better, and that it is unfair to compare CAVR, a commercial
compiler, to the free GNU toolset. That might be true, but commercial
ARM compilers usually cost more than a few hundred $$ and usually only
work with their own JTAG debug tools, so you're very quickly locked in.
And many of the commercial ARM toolchains (Keil and Rowley, for
example) are based on the GNU toolchain, so all of those limitations
come along for the ride.

My $0.02

John.
From: Ulf Samuelsson on
linnix wrote:
>> Almost all ARMs have JTAG, so if you need OCD you lose multiple pins.
>
> That is the positive side of ARM: JTAG is always there, and reliable.
> AVR JTAG, on the other hand, could be disabled, and is thus unreliable
> by definition.

JTAG can be disabled in AT91SAM7 circuits as well.
Disabling it is *MANDATORY* if you want any type of code protection...
(Boundary scan will still work, of course.)

--
Best Regards,
Ulf Samuelsson
ulf(a)a-t-m-e-l.com
This message is intended to be my own personal view and it
may or may not be shared by my employer Atmel Nordic AB


From: Ulf Samuelsson on
Buddy Smith wrote:
> steve <bungalow_steve(a)yahoo.com> wrote:
>
>> AVR and PIC aren't really comparable with ARM; the first two are very
>> low-cost, low-power 8-bit machines, while the ARM is a higher-power,
>> higher-cost 32-bit machine. If you need to make a device that has to
>> run on a coin cell for 2 years, you can't pick an ARM processor; if
>> you need a CPU that can do a real-time FFT, a PIC won't do it.
>
> I thought so too, but the products from luminary micro
> (luminarymicro.com), discussed in this newsgroup recently and in
> Circuit Cellar, have changed my mind.
>
> They make ARM CPUs with very little RAM and flash, on the cheap....
> they say less than one dollar in 10k quantities (from an advertising
> spiel)

LMI make Cortex chips which are incompatible with most of the others.
Apparently they are financed by ARM themselves.
I guess that is one reason why the uptake is not dramatic.

> ttyl,
>
> --buddy

--
Best Regards,
Ulf Samuelsson
ulf(a)a-t-m-e-l.com
This message is intended to be my own personal view and it
may or may not be shared by my employer Atmel Nordic AB


From: Joseph on
Ulf Samuelsson wrote:

>
>
> LMI make Cortex chips which are incompatible with most of the others.
> Apparently they are financed by ARM themselves.
> I guess that is one reason why the uptake is not dramatic.
>

Hi Ulf,

You might have been seriously misinformed :-)
LMI is not financed by ARM. We are two different companies, and LMI is
an ARM partner.

The definition of "incompatible" is a bit unclear.
Like any Cortex-M3 chip, the LuminaryMicro Cortex-M3 chips are not
binary compatible with traditional ARM processors. The Thumb
instructions are the same (except for the BLX and SETEND instructions),
but startup code, interrupt handlers and system control code (e.g. mode
switching) will have to be rewritten.
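
As a rough illustration (gcc syntax, simplified, and the handler names
are made up), an ARM7 interrupt handler needs special entry/exit code,
while on Cortex-M3 the NVIC stacks registers in hardware, so the
handler is an ordinary C function named in the vector table:

    /* ARM7TDMI style (e.g. LPC2000 / AT91SAM7) */
    void timer_isr(void) __attribute__((interrupt("IRQ")));
    void timer_isr(void)
    {
        /* service the timer, then acknowledge the VIC/AIC */
    }

    /* Cortex-M3 style: entered straight from the vector table */
    void Timer_Handler(void)
    {
        /* service the timer */
    }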

However, application code developed for LuminaryMicro parts will work
on any other Cortex-M3 part (of course some code might need to be
changed if the peripherals / memory map are different).

regards,
Joseph
From: Isaac Bosompem on

John wrote:
> Hi Miem,
> *snip *

I guess I should add my $0.02 as well. I did not find the transition
from the PIC/8051 MCUs I was working with before to ARM chips to be
very difficult at all. Yes, I had to write my own initialization code
and linker scripts, but they are quite easy to learn. At first I was
scared of linker scripts because every time I opened one up I'd be like
"what the hell is this?", but after learning the syntax it's not so bad.

I am working with the AT91SAM7S256, which is a pretty pleasant chip to
work with.

I also read the tutorial, though I didn't read through all of it.
Eclipse is damn terrible: it consumes a large amount of memory
(seriously, on my system it uses almost as much physical memory as that
FEAR game) and is very slow.

Since I am working on a VERY limited budget, I use Crimson Editor to
edit and compile my code and then use Insight to debug it. For me it's
simple: press Ctrl+2 to do a make clean and Ctrl+1 to build the source
into both an ELF and a binary. I'd say learn this stuff, because there
may come a time when you need a 32-bit MCU, and you don't want the
additional burden of learning it all at that point.

Also, if you are working with the 8-bit AVR now, why not try the MSP430
as well? I have a cheap board based on one that is powered by a watch
battery and it just keeps going (of course the CPU is running off the
internal DCO, which is only around 800 kHz).

-Isaac