From: Przemek Klosowski on
On Fri, 21 May 2010 09:43:57 +0100, Chris H wrote:

> There is no need except for amusement. GCC is a LONG way behind the main
> commercial compilers.

Well, I am sure that some commercial compilers, especially those written
by smart guys like Walter or by CPU designers like ARM, will beat GCC.
At the same time, here's an example of how x86 GCC does quite well in a
contest against the Intel, Sun, Microsoft and LLVM compilers:

http://www.linux-kongress.org/2009/slides/compiler_survey_felix_von_leitner.pdf

It's an interesting paper in several ways---he points out that compilers
are often so good that tactical source-level optimizations no longer
make sense.
From: Walter Banks on


Przemek Klosowski wrote:

> On Fri, 21 May 2010 09:43:57 +0100, Chris H wrote:
>
> > There is no need except for amusement. GCC is a LONG way behind the main
> > commercial compilers.
>
> Well, I am sure that some commercial compilers, especially those written
> by smart guys like Walter or by CPU designers like ARM, will beat GCC.
> At the same time, here's an example of how x86 GCC does quite well in a
> contest against the Intel, Sun, Microsoft and LLVM compilers:
>
> http://www.linux-kongress.org/2009/slides/compiler_survey_felix_von_leitner.pdf
>
> It's an interesting paper in several ways---he points out that compilers
> are often so good that tactical source-level optimizations no longer
> make sense.

The paper deals with a dozen or so optimizations and shows the
variation in the generated code, which is quite useful. What is
missing from the paper is any analysis of when a compiler should
apply a specific optimization and how each of the compilers made
that choice.

The paper touches on source-level ways to improve the quality of
source-level debugging information. Source-level debugging is
important, but in many fundamental ways it is one of the major
aggravating factors in gcc. One of the fundamental ways to ship
reliable code is to ship the code that was actually debugged and
tested. Code motion and other simple optimizations leave gcc's
source-level debug information significantly broken, forcing many
developers to debug applications with much of the optimization
turned off and then recompile with optimization on, shipping code
that is largely untested.
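
As a minimal sketch of the effect (my own example, not one from the
paper; the details vary with gcc version and target):

    int sum_squares(int n)
    {
        int i, total = 0;         /* at -O2, gdb frequently reports   */
        for (i = 0; i < n; i++)   /* i and total as <optimized out>   */
            total += i * i;       /* once the loop has been rewritten */
        return total;
    }

Single-stepping this at -O2 tends to jump between source lines out of
order, and "print i" may simply answer <optimized out> -- which is
exactly why people fall back to unoptimized builds for debugging.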



Regards,


Walter..
--
Walter Banks
Byte Craft Limited
http://www.bytecraft.com

From: David Brown on
On 25/05/2010 14:57, Walter Banks wrote:
>
>
> Przemek Klosowski wrote:
>
>> On Fri, 21 May 2010 09:43:57 +0100, Chris H wrote:
>>
>>> There is no need except for amusement. GCC is a LONG way behind the main
>>> commercial compilers.
>>
>> Well, I am sure that some commercial compilers, especially those written
>> by smart guys like Walter or by CPU designers like ARM, will beat GCC.
>> At the same time, here's an example of how x86 GCC does quite well in a
>> contest against the Intel, Sun, Microsoft and LLVM compilers:
>>
>> http://www.linux-kongress.org/2009/slides/compiler_survey_felix_von_leitner.pdf
>>
>> It's an interesting paper in several ways---he points out that compilers
>> are often so good that tactical source-level optimizations no longer
>> make sense.
>
> The paper deals with a dozen or so optimizations and shows the
> variation in the generated code, which is quite useful. What is
> missing from the paper is any analysis of when a compiler should
> apply a specific optimization and how each of the compilers made
> that choice.
>

That wasn't really the point of the paper. I believe the author was
aiming to show that it is better to write logical, legible code than
"smart" code, because it makes the code easier to read, easier to
debug, and gives the compiler a better chance to generate good code.
There was a time when you had to "hand optimize" your C code to get
the best results - the paper shows that this is no longer the case,
whether you are using gcc or another compiler (for the x86 or amd64
targets at least). It also shows that gcc is at least as smart as,
and often smarter than, the other compilers tested for these cases.
But I did not see it as any kind of general analysis of the
optimisations and code quality of gcc or other compilers - it does
not make any claims about which compiler is "better". It only claims
that the compiler knows more about code generation than the
programmer does.
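
To give the flavour of the paper's argument (this particular example
is my own sketch, not lifted from the slides):

    /* The obvious version... */
    unsigned div8(unsigned x)  { return x / 8; }

    /* ...and the hand-"optimized" version. */
    unsigned div8s(unsigned x) { return x >> 3; }

Any current optimising compiler turns the first function into the
same single shift as the second, so the "clever" version buys you
nothing and is less obvious about its intent.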

> The paper touches on source-level ways to improve the quality of
> source-level debugging information. Source-level debugging is
> important, but in many fundamental ways it is one of the major
> aggravating factors in gcc. One of the fundamental ways to ship
> reliable code is to ship the code that was actually debugged and
> tested. Code motion and other simple optimizations leave gcc's
> source-level debug information significantly broken, forcing many
> developers to debug applications with much of the optimization
> turned off and then recompile with optimization on, shipping code
> that is largely untested.
>

I don't really agree with you here. There are three points to
remember. The first is that /all/ compilers that generate tight code
will re-arrange and manipulate the code. This includes constant
folding, strength reduction, inlining, dead-code elimination, etc., as
well as re-ordering code for maximum pipeline throughput and cache
effects (which applies more to bigger processors than small ones).
You can't generate optimal code and expect to be able to step through
your code line by line in logical order, or view (and change) all
local variables. Top-range debuggers can fake some of this based on
debugging information from the compiler, but it is still faked.
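
A minimal sketch (assuming gcc at -O1 or above; any optimising
compiler will do much the same):

    int f(void)
    {
        int a = 6;
        int b = 7;      /* constant folding: neither 6, 7 nor the    */
        int c = a * b;  /* multiply survives - the compiler emits    */
        return c;       /* "return 42", leaving nothing to step over */
    }

There are four source lines here but typically only one or two
machine instructions, so a line-by-line debugger has nothing sensible
to map them to.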

I make no claims that gdb is such a "top-range" debugger, and it is
definitely the case that while many pre-packaged gcc toolchains
include the latest and greatest compiler version, they are often lax
about using newer and more powerful gdb versions. Add to that the
fact that many people use a simple "-g" flag with gcc to generate
debugging information, rather than flags giving more detailed
debugging information (gcc can even include macro definitions in the
debugging information if you ask it nicely), and you can see that
people often don't get as powerful a debugging setup as gcc can
provide. That's a failing in the way gcc is often packaged and
configured, rather than a failing in gcc or gdb.
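
For the concrete flags (foo.c is just a stand-in here):

    # plain -g: line tables and variable locations, but no macros
    gcc -Os -g   -c foo.c

    # -g3 (or -ggdb3) also records the macro definitions, so you
    # can examine and expand your macros from inside gdb
    gcc -Os -g3  -c foo.c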

Secondly, gcc can generate useful debugging information even when fully
optimising, without affecting the quality of the generated code. Many
commercial compilers I have seen give you a choice between no debug
information and fast code, or good debug information and slower code.
gcc gives you the additional option of reasonable debug information and
fast code. I can't generalise as to how this compares to other
commercial compilers - it may be that the ones I used were poor in this
regard.
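
This is easy to check for yourself (again, foo.c is a stand-in):

    gcc -O2    -c foo.c -o plain.o
    gcc -O2 -g -c foo.c -o debug.o
    # apart from objdump's file-name header, the listings match on
    # the gcc versions I have tried - adding -g does not change the
    # generated instructions:
    objdump -d plain.o | tail -n +3 > plain.lst
    objdump -d debug.o | tail -n +3 > debug.lst
    diff plain.lst debug.lst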

Thirdly, there are several types of testing and several types of
debugging. When you are debugging your algorithms, you want easy and
clear debugging, with little regard for speed. You then use low
optimisation settings, avoid inlining functions, use extra "volatile"
variables, etc. Once your algorithm works, you can compile it at full
speed for testing - at that point, you don't need the same kind of
line-by-line debugging. But that does not mean your full-speed
version is not debugged or tested! Thus you do some of your
development work with a "debug" build at "-O1 -g" or even "-O0 -g",
and some with a "release" build at "-Os -g" or "-O3 -g".
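
Concretely, something like this (a sketch only - app.c and the exact
flag choices are placeholders):

    # debug build: easy stepping while working on the algorithms
    gcc -O0 -g3 -fno-inline -o app-debug app.c

    # release build: full speed, symbols kept for post-mortem work
    gcc -Os -g -o app app.c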

mvh.,

David



From: Grant Edwards on
On 2010-05-25, Przemek Klosowski <przemek(a)tux.dot.org> wrote:
> On Fri, 21 May 2010 09:43:57 +0100, Chris H wrote:
>
>> There is no need except for amusement. GCC is a LONG way behind the main
>> commercial compilers.
>
> Well, I am sure that some commercial compilers, especially those written
> by smart guys like Walter, and the CPU designers like ARM, will beat GCC.
> At the same time, here's an example how x86 GCC does quite well in a
> contest against Intel, Sun, Microsoft and LLVM compilers:
>
> http://www.linux-kongress.org/2009/slides/compiler_survey_felix_von_leitner.pdf
>
> It's an interesting paper in several ways

Is the paper available somewhere?

--
Grant Edwards grant.b.edwards Yow! I am NOT a nut....
at
gmail.com
From: Albert van der Horst on
In article <4BFBC92D.9019CE15(a)bytecraft.com>,
Walter Banks <walter(a)bytecraft.com> wrote:
<SNIP>
>
>The paper touches on source-level ways to improve the quality of
>source-level debugging information. Source-level debugging is
>important, but in many fundamental ways it is one of the major
>aggravating factors in gcc. One of the fundamental ways to ship
>reliable code is to ship the code that was actually debugged and
>tested. Code motion and other simple optimizations leave gcc's
>source-level debug information significantly broken, forcing many
>developers to debug applications with much of the optimization
>turned off and then recompile with optimization on, shipping code
>that is largely untested.

Tanenbaum once said in a lecture:
"Global optimisers and symbolic debuggers are each other's
arch-enemies."
A moment of thought should be enough to convince oneself of
the truth of this.

I fail to see how this situation is different for GCC than for
any compiler.

By the way:
- The very best code is tested but never debugged,
  because there is no need.
  (Chuck Moore, the inventor of Forth, reportedly never debugs.
  He checks his code and it works. Mostly his subprograms are one
  line. That makes it easier, of course.)
- I always run tests on shipped code. Don't you?
- If you expect the outcomes of different optimisation levels
  to differ, you're living a dangerous life, because apparently
  you don't trust your code to be free of undefined behaviour
  (a sketch of what can go wrong follows below).
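
A minimal sketch of the kind of thing I mean (my own example; signed
overflow is undefined in C, so the compiler may assume it never
happens):

    /* -O0 keeps this as a real test; gcc at -O2 folds it to a
       constant 0, because a conforming program can never make a
       signed int overflow. */
    int will_wrap(int x)
    {
        return x + 1 < x;
    }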


Groetjes Albert

--
Albert van der Horst, UTRECHT,THE NETHERLANDS
Economic growth -- being exponential -- ultimately falters.
albert(a)spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst
