From: David Brown on
Oliver Betz wrote:
> David Brown wrote:
>
> [...]
>
>> Even their paid-for subscription versions have source tarballs - you can
>> re-compile them yourself with the node-locking code disabled if you want
>
> Sure? As far as I understand, you don't get the library sources in the
> personal version.
>

I was thinking here about the tools themselves - gcc, binutils, gdb,
etc. I believe you get the sources to some parts of the library with
the personal version, but not all. As with many companies that make
their livings selling toolchains based on gcc, CodeSourcery provide
libraries that give you something more than the traditional C library
(more functions, better optimised code, and so on).
From: David Brown on
John Devereux wrote:
> David Brown <david.brown(a)hesbynett.removethisbit.no> writes:
>
>
> [...]
>
>>
>> For building this sort of thing, I'd recommend using virtual machines
>> -
>> install Virtual Box (it's free, and runs on Linux and Windows) and
>> make your build machines as virtual installations. That way you can
>> easily try out different distros and keep your test builds isolated
>> and under control. Sometimes these things work better with particular
>> versions of particular tools (though gcc should build cleanly with
>> most tool versions), and with virtual machines for testing you can
>> avoid messing around with different versions of gcc on your main
>> machine.
>
> That seems like a lot of work - I just set the PREFIX in the build
> script and use that path subsequently in project makefiles. E.g. install
> to /opt/arm-elf-4.4.0. You can also set CC before building if you want
> to build with a different compiler version.
>

Nah, VirtualBox machines are very easy once you've tried them a couple of
times. Typical Linux distros install quickly and easily since you've got
no complicated hardware, a fast virtual CD (i.e., an iso file on your
disk), and for this sort of thing you can skip most of the interactive
software and configuration (no need to find yourself a theme that
matches your office wallpaper). It is also very easy to take snapshots,
archive your build machines, etc. And when you are following a how-to
that starts "I used Fedora 10..." and you've got Ubuntu 9.04, you can
just make a Fedora 10 machine and save yourself some work.

And of course, if you really screw things up, you haven't messed with
your main system.

As I said, gcc typically builds cleanly with most tool versions, so
running ../configure with a suitable --prefix is often enough. The
VirtualBox setup works particularly well for more complex systems, such
as buildroot setups for an entire embedded Linux system.
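
Something along these lines is usually all it takes - the version
numbers, target and prefix below are only placeholders for whatever you
actually use, so treat it as a sketch rather than a tested recipe:

  PREFIX=/opt/arm-elf-4.4.0
  TARGET=arm-elf

  # binutils first
  tar xjf binutils-2.19.1.tar.bz2
  mkdir binutils-build && cd binutils-build
  ../binutils-2.19.1/configure --target=$TARGET --prefix=$PREFIX --disable-nls
  make && make install
  cd ..

  export PATH=$PREFIX/bin:$PATH

  # then gcc (just the compiler here; a complete toolchain also needs newlib)
  tar xjf gcc-4.4.0.tar.bz2
  mkdir gcc-build && cd gcc-build
  ../gcc-4.4.0/configure --target=$TARGET --prefix=$PREFIX \
      --enable-languages=c,c++ --with-newlib --disable-nls
  make all-gcc && make install-gcc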

An alternative "lighter" solution is to use something like OpenVZ - it's
a sort of advanced chroot. There is much less overhead than with a full
virtual machine, but you still get separate distro installations in each
OpenVZ container.
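
Roughly, it looks like this - the container ID and template name are
only examples, so use whatever templates your OpenVZ installation
provides:

  vzctl create 110 --ostemplate fedora-10-x86_64   # template name is an example
  vzctl set 110 --hostname gcc-build --save
  vzctl start 110
  vzctl enter 110                                  # shell inside the container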

> Lately I've been archiving the entire compiler along with each project's
> source code. That is, the stripped toolchain binaries are in a
> subdirectory of the project and are put under revision control along
> with it. Also there is a compiler build script as part of the project
> which can fetch the source code and rebuild the compiler if needed.
>

That's a good idea - it means you always have access to the tools you
used, even if you are later using a completely different system.

An alternative here is to do all your builds for a project within a
virtual machine, and archive the entire virtual machine. It takes a bit
more space, but is perhaps the most complete archive of the build
environment.
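
VirtualBox can also export the whole machine as an OVF appliance, which
is a bit tidier to archive than the raw machine folder - something like
this (the machine name is just an example):

  VBoxManage export gcc-build-vm -o gcc-build-vm.ovf
  # produces the .ovf descriptor plus the disk image(s) alongside it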
From: John Devereux on
David Brown <david.brown(a)hesbynett.removethisbit.no> writes:

> John Devereux wrote:
>> David Brown <david.brown(a)hesbynett.removethisbit.no> writes:
>>
>>
>> [...]
>>
>>>
>>> For building this sort of thing, I'd recommend using virtual machines
>>> -
>>> install Virtual Box (it's free, and runs on Linux and Windows) and
>>> make your build machines as virtual installations. That way you can
>>> easily try out different distros and keep your test builds isolated
>>> and under control. Sometimes these things work better with particular
>>> versions of particular tools (though gcc should build cleanly with
>>> most tool versions), and with virtual machines for testing you can
>>> avoid messing around with different versions of gcc on your main
>>> machine.
>>
>> That seems like a lot of work - I just set the PREFIX in the build
>> script and use that path subsequently in project makefiles. E.g. install
>> to /opt/arm-elf-4.4.0. You can also set CC before building if you want
>> to build with a different compiler version.
>>
>
> Nah, Virtual Box machines are very easy once you've tried it a couple
> of times. Typical Linux distros install quickly and easily since you've
> got no complicated hardware, a fast virtual CD (i.e., an iso file on
> your disk), and for this sort of thing you can completely ignore most
> user interactive software or configuration (no need to find yourself a
> theme that matches your office wallpaper). It is also very easy to
> take snapshots, archive your build machines, etc. And when you are
> following a how-to that starts "I used Fedora 10..." and you've got
> Ubuntu 9.04, you can just make a Fedora 10 machine and save yourself
> some work.

VirtualBox is great - I do Visual C++ development in it on my Debian
system. It is also very good for testing Windows software releases on
multiple versions of Windows, letting you quickly roll back the
installation process each time.
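
The roll-back can even be scripted from the host, roughly like this (the
VM name is just an example):

  VBoxManage snapshot WinXP-test take clean-install   # baseline before installing
  # ... install and test the release inside the VM ...
  VBoxManage controlvm WinXP-test poweroff
  VBoxManage snapshot WinXP-test restore clean-install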

[...]

>
>> Lately I've been archiving the entire compiler along with each project's
>> source code. That is, the stripped toolchain binaries are in a
>> subdirectory of the project and are put under revision control along
>> with it. Also there is a compiler build script as part of the project
>> which can fetch the source code and rebuild the compiler if needed.
>>
>
> That's a good idea - it means you always have access to the tools you
> used, even if you later are using a completely different system.

With git doing the revision control it is very fast and compact too.
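
The layout is nothing fancy - roughly this, with the paths and names
being just examples:

  cd myproject
  mkdir -p tools
  cp -a /opt/arm-elf-4.4.0 tools/
  strip tools/arm-elf-4.4.0/bin/* || true   # host binaries, so the native strip
  git add tools
  git commit -m "archive the toolchain used to build this project"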

> An alternative here is to do all your builds for a project within a
> virtual machine, and archive the entire virtual machine. It takes a
> bit more space, but is perhaps the most complete archive of the build
> environment.

But will your copy of VirtualBox 10 years from now be able to read
today's virtual machine snapshot? Aha, but you will be able to install a
copy of today's VirtualBox on a new virtual machine, and use that! :)

--

John Devereux
From: Carlo Caione on
On 04/05/2010 20:09, Tim Wescott wrote:
> Is anyone out there doing development for the ARM Cortex (specifically
> the m3) with the Gnu tools?

Yep...

> Are you using the CodeSourcery set, or are you building your own?

CodeSourcery is fine for me...

> If so, how are things going? There seems to be a welter of "how to"
> pages on this, but nearly all of them seem to be as old as the hills.

I agree....
I have just setup an environment with:
CodeSourcery + Eclipse CDT + GNUARM eclipse plugin

> My spare-time job right now is bringing up a set of tools that'll work
> on Linux and will let me develop on the TI LM3S811. I'm trying to keep
> everything 100% open source; since CodeSourcery is exceedingly coy about
> coughing up source code (I certainly haven't found it) and because their
> install scripts don't seem to be terribly compatible with my Linux
> installation (Ubuntu Karmic) I'm building from scratch.

In Arch Linux (and Windows) the installation is OK...

> Things seem to be going well, although not completely straightforward --
> my current task is to write or find the obligatory startup code to
> establish a C++ run-time environment so that the rest of my code will
> work, and to verify that OpenOCD really does function on this machine.

I have an ST-LINK JTAG adapter, so I used the ST-LINK_gdbserver rather than OpenOCD.
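
Either way it ends up as an ordinary GDB remote connection, roughly like
this - the port number is just an example (check what your gdbserver
prints at startup), and "monitor reset halt" is OpenOCD-specific:

  $ arm-none-eabi-gdb firmware.elf
  (gdb) target remote localhost:3333
  (gdb) monitor reset halt
  (gdb) load
  (gdb) continue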

> Aside from "you're crazy, see a shrink!" does anyone have any useful
> observations on the process? Any known-fresh web pages?

Well.... the steps for me:
0) install CodeSourcery
1) install Eclipse
2) install Eclipse C/C++ Development Tooling - CDT
3) install GNU ARM Eclipse Plugin
(http://sourceforge.net/projects/gnuarmeclipse/)
4) install CMSIS and the StdPeriph Driver
5) take a standard linker script for STM32 (do not forget ENTRY_POINT - see the sketch after this list)
6) configure everything :)
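
For step 5, the command line that Eclipse ends up running is essentially
just the following - the file names are only examples, and the exact
flags depend on your project:

  arm-none-eabi-gcc -mcpu=cortex-m3 -mthumb -O2 \
      -T stm32_flash.ld -nostartfiles \
      startup_stm32.s main.c -o firmware.elf
  arm-none-eabi-objdump -f firmware.elf        # start address comes from ENTRY()
  arm-none-eabi-objcopy -O binary firmware.elf firmware.bin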

Enjoy...

--
Carlo


From: Oliver Betz on
David Brown wrote:

[...]

>>> Even their paid-for subscription versions have source tarballs - you can
>>> re-compile them yourself with the node-locking code disabled if you want
>>
>> Sure? As far as I understand, you don't get the library sources in the
>> personal version.
>
>I was thinking here about the tools themselves - gcc, binutils, gdb,

For these, there seems to be little or no difference between the
versions.

>etc. I believe you get the sources to some parts of the library with
>the personal version, but not all. As with many companies that make

"Professional Edition also includes debuggable versions of the
run-time libraries".

I will ask them when I start evaluating the differences again. When I
tried last time, other urgent work prevented me from finishing my tests.

Oliver
--
Oliver Betz, Munich
despammed.com might be broken, use Reply-To: