From: Andrew Poelstra on 14 Jan 2010 18:56

On 2010-01-14, Hans-Bernhard Bröker <HBBroeker(a)t-online.de> wrote:
> David Brown wrote:
>
>> Nothing stops the compiler from doing this sort of thing in /theory/.
>> But /practice/ is a different matter.
>
> The same thing applies to the original question.  If a compiler's
> full-blown optimizer, given practically infinite time to ponder the
> problem, can't get that analysis job done, then no other tool can, and
> certainly not while just looking over the programmers' shoulders as
> they type.

What if all it did was count ops to get a big-O estimate, and check the
function name for words like 'sort' or 'hash' or 'find' to suggest
algorithms?  For as little as most programmers write sorting and
searching code by hand these days, it's a lot of effort to remember all
the different algorithms out there.
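Andrew's name-and-nesting heuristic can be sketched in a few lines. The
Python below is a toy illustration only, not a real analyzer; the names
`loop_nesting` and `suggest` and the keyword list are invented for this
sketch, and the scanner is easily fooled by comments, strings, or
brace-less loop bodies:

```python
import re

KEYWORDS = ("sort", "search", "find", "hash")

def loop_nesting(source):
    """Deepest loop nesting found in a C-like source string.

    Tokenizes only 'for', 'while' and braces; a toy scanner, not a
    parser.
    """
    depth = 0          # current brace depth
    loop_stack = []    # brace depths at which loop bodies opened
    deepest = 0
    pending_loop = False
    for tok in re.findall(r"\bfor\b|\bwhile\b|[{}]", source):
        if tok in ("for", "while"):
            pending_loop = True
        elif tok == "{":
            depth += 1
            if pending_loop:
                loop_stack.append(depth)
                deepest = max(deepest, len(loop_stack))
                pending_loop = False
        else:  # "}"
            if loop_stack and loop_stack[-1] == depth:
                loop_stack.pop()
            depth -= 1
    return deepest

def suggest(name, source):
    """Return (big-O guess from loop nesting, name-based keyword hints)."""
    depth = loop_nesting(source)
    if depth == 0:
        big_o = "O(1)"
    elif depth == 1:
        big_o = "O(n)"
    else:
        big_o = "O(n^%d)" % depth
    hints = [k for k in KEYWORDS if k in name.lower()]
    return big_o, hints
```

Two nested loops in a function named `bubble_sort` would yield
`("O(n^2)", ["sort"])`, at which point the tool could propose a
different algorithm.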
From: Måns Rullgård on 14 Jan 2010 20:58

karthikbalaguru <karthikbalaguru79(a)gmail.com> writes:

> On Jan 15, 3:26 am, Hans-Bernhard Bröker <HBBroe...(a)t-online.de>
> wrote:
>> David Brown wrote:
>>> Nothing stops the compiler from doing this sort of thing in /theory/.
>>> But /practice/ is a different matter.
>>
>> The same thing applies to the original question.  If a compiler's
>> full-blown optimizer, given practically infinite time to ponder the
>> problem, can't get that analysis job done, then no other tool can,
>> and certainly not while just looking over the programmers' shoulders
>> as they type.
>
> I think the tool should be trained to develop tiny logics by giving
> tiny infos to it.  I think the tool should also be trained to develop
> its own self-decision capabilities by giving it lots of small tasks
> that might require tiny decisions initially.  The above should make
> it robust in finding alternate logic.  Maybe it can also bank on
> reliable sources on the internet or other servers to get more info in
> case it runs out of steam.  But I think it is better to avoid
> internet dependencies, as sometimes we might need to use a PC that is
> not connected to the internet!

There's a name for that: outsourced engineer.

--
Måns Rullgård
mans(a)mansr.com
From: Paul Keinanen on 15 Jan 2010 00:54

On Thu, 14 Jan 2010 23:56:05 GMT, Andrew Poelstra
<apoelstra(a)localhost.localdomain> wrote:

>On 2010-01-14, Hans-Bernhard Bröker <HBBroeker(a)t-online.de> wrote:
>> David Brown wrote:
>>
>>> Nothing stops the compiler from doing this sort of thing in /theory/.
>>> But /practice/ is a different matter.
>>
>> The same thing applies to the original question.  If a compiler's
>> full-blown optimizer, given practically infinite time to ponder the
>> problem, can't get that analysis job done, then no other tool can,
>> and certainly not while just looking over the programmers' shoulders
>> as they type.
>
>What if all it did was count ops to get a big-O estimate, and check
>the function name for words like 'sort' or 'hash' or 'find' to suggest
>algorithms?

Which will generate a lot of false alarms if the tool is not capable of
knowing the actual data sizes used.  There is not much point in using
some complex sorting algorithm if the actual number of data elements is
small, say 5-25.  The code size and startup time can matter more than
the total sort time for such a small data set.

If the tool is used for small embedded systems, it should not suggest
algorithms that could use deep recursion (excessive stack usage) with
some pathological input data.  On some platforms, the use of dynamic
memory or recursion might be strongly discouraged or even impossible.

A tool that nags about insignificant things and/or makes useless or bad
suggestions is simply useless or directly harmful.
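Paul's size caveat is exactly why production sorts are hybrids: below
some cutoff, a "dumb" quadratic sort wins on code size and constant
factors.  A minimal Python sketch of the idea (the threshold value is
illustrative, real cutoffs are tuned per platform); it falls back to
insertion sort on small partitions and, addressing the stack-depth
concern, recurses only on the smaller side:

```python
THRESHOLD = 16  # illustrative cutoff; tune for the actual platform

def insertion_sort(a, lo, hi):
    # In-place insertion sort of a[lo..hi]; cheap for tiny runs.
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def hybrid_sort(a, lo=0, hi=None):
    # Quicksort that hands small partitions to insertion sort.
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        if hi - lo + 1 <= THRESHOLD:
            insertion_sort(a, lo, hi)
            return
        pivot = a[(lo + hi) // 2]
        i, j = lo, hi
        while i <= j:                 # classic two-index partition
            while a[i] < pivot:
                i += 1
            while a[j] > pivot:
                j -= 1
            if i <= j:
                a[i], a[j] = a[j], a[i]
                i += 1
                j -= 1
        # Recurse on the smaller side, loop on the larger: bounds the
        # recursion depth to O(log n) even for pathological input.
        if j - lo < hi - i:
            hybrid_sort(a, lo, j)
            lo = i
        else:
            hybrid_sort(a, i, hi)
            hi = j
```

For Paul's 5-25 element case the quicksort machinery is never even
entered; the whole array goes straight to the insertion sort.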
From: Walter Banks on 15 Jan 2010 06:44

David Brown wrote:

> Nothing stops the compiler from doing this sort of thing in /theory/.
> But /practice/ is a different matter.  It is not often that the
> compiler can find out what code (of a size large enough to be called
> an "algorithm") actually does - and it has to be sure that any
> replacements do at least as "good" a job.  That means, in the words
> of the Wine project, bug-for-bug compatibility.  It's no easy job,
> especially with C where you often can't (or don't, even if you /can/)
> properly express what you want the code to do, but merely how you
> want it to do it.

David,

To add to your comment: an optimizing compiler, when facing a bubble
sort, will generate the most efficient bubble sort that it can, but the
bubble sort still will not compete with a quicksort in most cases.  A
good compiler's job is to do what it is told as efficiently as
possible.

Regards,

--
Walter Banks
Byte Craft Limited
http://www.bytecraft.com
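Walter's point can be made concrete by counting comparisons instead of
timing anything: no amount of per-instruction optimization changes the
n^2 growth.  A rough Python sketch (the counts are illustrative
approximations, not a benchmark, and the quicksort tally charges one
comparison per element per partitioning pass):

```python
import random

def bubble_sort(a):
    """In-place bubble sort with early exit; returns comparison count."""
    comparisons = 0
    for end in range(len(a) - 1, 0, -1):
        swapped = False
        for i in range(end):
            comparisons += 1
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:          # the "optimized" part: stop when sorted
            break
    return comparisons

def quicksort(a):
    """Functional quicksort; returns (sorted list, approx comparisons)."""
    if len(a) <= 1:
        return list(a), 0
    pivot = a[len(a) // 2]
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    sorted_less, c1 = quicksort(less)
    sorted_greater, c2 = quicksort(greater)
    # One comparison per element per pass: roughly n log n in total.
    return sorted_less + equal + sorted_greater, len(a) + c1 + c2

data = random.sample(range(10000), 500)
bubble_count = bubble_sort(list(data))
result, quick_count = quicksort(data)
# At n = 500, expect roughly 125,000 bubble comparisons versus a few
# thousand for quicksort.
```

The compiler can shave the cost of each of those 125,000 comparisons;
it cannot turn them into 5,000.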
From: Walter Banks on 15 Jan 2010 07:04
Hans-Bernhard Bröker wrote:

> David Brown wrote:
>
>> Nothing stops the compiler from doing this sort of thing in /theory/.
>> But /practice/ is a different matter.
>
> The same thing applies to the original question.  If a compiler's
> full-blown optimizer, given practically infinite time to ponder the
> problem, can't get that analysis job done, then no other tool can,
> and certainly not while just looking over the programmers' shoulders
> as they type.

The compiler optimizer's primary goal is to map an application onto a
target processor.  Most good optimizers have a lot of information about
the application code and the resources required to implement a specific
instance.  A possible implementation might involve trying alternative
approaches and using compiler metrics.  This could be automated.

Regards,

--
Walter Banks
Byte Craft Limited
http://www.bytecraft.com
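Walter's "try alternative approaches and use compiler metrics" loop can
be automated crudely even outside a compiler.  A hypothetical Python
sketch (the function name `pick_best` and the wall-clock cost model are
invented here; a compiler would use its own internal metrics such as
code size or cycle estimates): run each candidate implementation on
representative input and keep the cheapest.

```python
import timeit

def pick_best(candidates, make_input, repeats=3):
    """Measure each candidate on the same representative input and
    return the name of the one with the lowest observed cost -- a toy
    stand-in for a compiler choosing among implementations by metric."""
    best_name, best_cost = None, float("inf")
    for name, func in candidates.items():
        # Fresh input per run so mutating candidates don't cheat.
        cost = min(
            timeit.timeit(lambda: func(make_input()), number=1)
            for _ in range(repeats)
        )
        if cost < best_cost:
            best_name, best_cost = name, cost
    return best_name

candidates = {
    "builtin": sorted,
    # Selection sort via repeated min-extraction: O(n^2) on purpose.
    "quadratic": lambda a: [a.pop(a.index(min(a))) for _ in range(len(a))],
}
```

On any non-trivial input size the quadratic candidate loses and the
tool "decides" on the better algorithm, which is as close as a metric-
driven search gets to the original question's optimizing tool.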