From: Andy on
On Jan 12, 2:19 pm, karthikbalaguru <karthikbalagur...(a)gmail.com>
wrote:
> Maybe it would be better if it
> could do this in front of the user
> rather than behind the scenes, so
> that the user is aware of the kinds
> of optimizations being done and
> can take care of them. It is also
> needed so that the developer can
> handle further development of the
> same program/software, and also
> maintenance activities on that
> program/software.

As a hardware designer using an HDL (VHDL), I have also struggled with
this issue of writing "the best" code for a given purpose. I used to
fret over every gate and delay when I coded something, trying to make
my code lead directly to the optimal implementation (usually for
performance or size). However, as I have gained more experience (and,
I hope, knowledge too), I have come to recognize that "the best" code
is code that is easy to read, write, understand (review), and
maintain, while meeting its performance (speed, power, area)
requirements. I start out with the most straightforward
implementation of the algorithm, and see whether it meets my
requirements by the time the tools are finished optimizing it. If it
does meet them, I'm done. Only if/when it does not meet performance
will I start to sacrifice the read/write/understand/maintain goals.

In the same manner, if I can code a piece of SW such that it is
easiest to use (r/w/u/m), yet the tools still optimize it well enough
to meet performance requirements, that's what I should do. Coding
something so that it is harder to r/w/u/m for no or negligible
improvement in final performance is a losing battle.

As an academic exercise, it may be helpful to understand how code can
be written well or poorly WRT un-optimized performance, especially if
you were writing an optimizer, but for everyday use I don't see the
benefit. If you are talking about much higher-level optimizations than
current tools can perform, optimization technology will have to
grow quite a bit before it becomes available in either a "behind the
scenes" or an "up-front" (source modification) form.

Andy
From: karthikbalaguru on
On Jan 15, 3:26 am, Hans-Bernhard Bröker <HBBroe...(a)t-online.de>
wrote:
> David Brown wrote:
> > Nothing stops the compiler from doing this sort of thing in /theory/.
> > But /practice/ is a different matter.
>
> The same thing applies to the original question.  If a compiler's
> full-blown optimizer, given practically infinite time to ponder the
> problem, can't get that analysis job done, then no other tool can, and
> certainly not while just looking over the programmers' shoulders as they
> type.

I think the tool should be trained to develop small pieces of logic
by feeding it small pieces of information. I think the tool should
also be trained to develop its own decision-making capability by
giving it lots of small tasks that initially require only tiny
decisions. That should make it robust at finding alternative logic.
Maybe it could also draw on reliable sources on the internet or
other servers to get more information in case it runs out of steam.
But I think it is better to avoid internet dependencies, as
sometimes we might need to use a PC that is not connected to the
internet!

Thanks in advance,
Karthik Balaguru
From: Chris McDonald on
Andrew Poelstra <apoelstra(a)localhost.localdomain> writes:

> .......
>What if all it did was count ops to get a big-O estimate, and check
>the function name for words like 'sort' or 'hash' or 'find' to suggest
>algorithms?

This has "Halting Problem" written all over it!

--
Chris.
From: Lie Ryan on
On 01/15/10 12:51, karthikbalaguru wrote:
> On Jan 15, 3:26 am, Hans-Bernhard Bröker <HBBroe...(a)t-online.de>
> wrote:
>> David Brown wrote:
>>> Nothing stops the compiler from doing this sort of thing in /theory/.
>>> But /practice/ is a different matter.
>>
>> The same thing applies to the original question. If a compiler's
>> full-blown optimizer, given practically infinite time to ponder the
>> problem, can't get that analysis job done, then no other tool can, and
>> certainly not while just looking over the programmers' shoulders as they
>> type.
>
> I think the tool should be trained to develop small pieces of logic
> by feeding it small pieces of information. I think the tool should
> also be trained to develop its own decision-making capability by
> giving it lots of small tasks that initially require only tiny
> decisions. That should make it robust at finding alternative logic.
> Maybe it could also draw on reliable sources on the internet or
> other servers to get more information in case it runs out of steam.
> But I think it is better to avoid internet dependencies, as
> sometimes we might need to use a PC that is not connected to the
> internet!

I found a similar tool:
http://xkcd.com/416/

From: Nick Keighley on
On 13 Jan, 16:43, dj3va...(a)csclub.uwaterloo.ca.invalid wrote:
> In article <4b4def88$0$22938$e4fe5...(a)news.xs4all.nl>,
> [Jongware] <so...(a)no.spam.net> wrote:
> >Walter Banks wrote:

> >> Defining goals at a much higher level than C opens the possibilities
> >> for automating algorithmic choices at the function level.
>
> >Aha -- wouldn't the logical end point be a programming language where
> >you type "word processor", save it as source, compile, and have a word
> >processor?
>
> Why bother to compile it?  Just have it interpret on-the-fly.
> That way you could even run it in interactive mode, and it's
> sufficiently high-level that even non-programmers could usefully use
> it.
>
> Unix people call this a "shell".

I'm guessing you're trying to be funny/ironic. But in case you aren't:
Unix has dozens of strangely incompatible command-line interfaces
that Unix people call "shells". None of them is a word processor.


--
Campaign Against Unix Bigotry