From: Henry Bigelow on 26 Sep 2006 17:02

> > but i'm not sure why certain lisp programs in the shootout are so
> > large. how do i interpret the results of the shootout? for example,
> > lisp fannkuch is 26 times slower than the c version, and lisp chameneos
> > is 120x larger.
> > http://shootout.alioth.debian.org/gp4/benchmark.php?test=fannkuch&lang=sbcl&id=0
>
> 120x larger?

sorry for the confusion. by "larger" i meant memory use. here are the stats for Chameneos:

SBCL:  62,656 KB
GCC C:    524 KB

see:
http://shootout.alioth.debian.org/gp4/benchmark.php?test=chameneos&lang=sbcl
http://shootout.alioth.debian.org/gp4/benchmark.php?test=chameneos&lang=gcc
From: Henry Bigelow on 26 Sep 2006 17:11

pbunyk(a)gmail.com wrote:
> Welcome here!

thank you.

> > but i'm not sure why certain lisp programs in the shootout are so
> > large. how do i interpret the results of the shootout? for example,
> > lisp fannkuch is 26 times slower than the c version, and lisp chameneos
> > is 120x larger.
>
> It would be useful to post a link to the actual data page -- at least
> to see what they count as "program size". Unless one does lisp -> C ->
> executable conversion (which is totally possible with some less popular
> lisp dialects), I'd assume that the resulting lisp executable is a complete
> image, which would include all of the library AND compiler.

sorry for the confusion, by "larger" i meant memory use. here's the link:

http://shootout.alioth.debian.org/gp4/benchmark.php?test=all&lang=sbcl&lang2=gcc

> > is it possible to improve on these numbers?
>
> Minimize consing, use lisp arrays, make sure to declare your variables
> (to let the compiler know what it is allowed to optimize) -- in the end,
> memory requirements should not be much worse with lisp than for C or
> Fortran (except for the above-mentioned constant addition of having
> the compiler and library "always there", which is a bonus ;-) ).
>
> As to speed -- this is the required reading:
> http://www.flownet.com/gat/papers/lisp-java.pdf ;-)

thanks for the link. looks very encouraging.

a more important question is:

is this shootout really definitive? are all the programs up there written such that one would consider them both elegant and efficient, or are some very poorly written? is it a popular shootout, or are there other benchmarks that people like better?

thanks,
henry

> Of course one would want to use a compiled version of lisp, check out
> SBCL...
>
> Just my $0.02,
>
> Paul B.
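[The advice above -- minimize consing, use lisp arrays, declare your variables -- can be sketched in a short example. `sum-of-squares` is a made-up illustration, not code from the shootout:]

```lisp
;; Sketch of the optimization advice quoted above: a specialized
;; array type, type-declared accumulator, and an inner loop that
;; does all its float arithmetic unboxed, so it conses nothing.
(defun sum-of-squares (v)
  (declare (type (simple-array double-float (*)) v)
           (optimize (speed 3) (safety 0)))
  (let ((acc 0d0))
    (declare (type double-float acc))
    (dotimes (i (length v) acc)
      (incf acc (* (aref v i) (aref v i))))))
```

[With `(speed 3)` in effect, SBCL emits compiler notes wherever a declaration is missing and it has to fall back to generic, consing arithmetic -- those notes are the usual way to chase down exactly the kind of memory overhead discussed in this thread.]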
From: Robert Dodier on 26 Sep 2006 17:21

Henry Bigelow wrote:
> i'm a bioinformatics student, writing bayesian network software with
> a very demanding memory requirement, and a potentially long running time
> for training. it predicts protein structure, and must have hundreds of
> nodes for each protein, each with a matrix storing relationships
> between amino acids and structural states. if this is done
> efficiently, it certainly fits in a 2GB machine.

My advice to you is to use R (http://www.r-project.org). It is a pleasant programming language, and there is a lot of contributed code, which includes Bayesian stuff and bioinformatics (not sure if it includes the intersection of the two). In any event R has a large, active user community, and chances are good you'll find people with similar interests. I say this after having written a lot of statistical code (including Bayesian inference) in a variety of languages.

It's not clear to me that Lisp's particular strength (code = data) is going to be much of a win for you. If you were writing a general-purpose Bayesian inference package, probably so. But I'm guessing that what you are going to do is derive some equations by hand, code them, and then run them on enormous data sets. I don't see much scope for code = data there. YMMV. Lisp isn't a bad choice in this context; it is probably better than C or Perl.

FWIW
Robert Dodier
From: pbunyk on 26 Sep 2006 18:44

There is this page on the Shootout site:

http://shootout.alioth.debian.org/gp4/miscfile.php?file=benchmarking&title=Flawed%20Benchmarks

and the first link off it is quite educational... ;-)

> a more important question is:
>
> is this shootout really definitive? are all the programs up there
> written such that one would consider them both elegant and efficient,
> or are some very poorly written?
> is it a popular shootout, or are there other benchmarks that people
> like better?

This is the first time I've heard of this particular competition (and set of benchmarks) -- it looks rather cool! I doubt it has the same clout as SPEC, though... :-)

Hey, while you are learning lisp, why not debug that 120x memory program to see why this happens? profile and time are your friends... ;-)

CL-USER> (time (main 5000000))
10000000
Evaluation took:
  34.466 seconds of real time
  12.708794 seconds of user run time
  21.369335 seconds of system run time
  [Run times include 0.044 seconds GC run time.]
  0 page faults and
  79,975,616 bytes consed.
NIL
CL-USER>

-- yes, almost 80 MB consed -- but why the heck is the threading overhead so high? (most of the runtime is in system time, uh-huh...)

Paul B.
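[The "profile and time are your friends" suggestion can be sketched like this in SBCL, assuming `main` is the chameneos entry point from the transcript above; the snippet is an illustration, not part of the original post:]

```lisp
;; Sketch of profiling the benchmark entry point with SBCL's
;; bundled deterministic profiler, to see where the time and
;; consing reported by TIME actually go.
(require :sb-profile)           ; profiler ships with SBCL
(sb-profile:profile main)       ; instrument MAIN (names are unevaluated)
(main 5000000)                  ; run the benchmark once
(sb-profile:report)             ; per-function calls, seconds, and consing
(sb-profile:unprofile main)     ; remove the instrumentation afterwards
```

[The report breaks the 80 MB of consing down per function, which is a reasonable first step toward finding the threading overhead Paul wonders about.]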
From: JShrager on 26 Sep 2006 18:57
Much as I appreciate Paolo's nod to BioBike, let me second Robert's recommendation of R if all you need to do is crunch a bunch of numbers that represent a bayes net. R has a quasi-reasonable quasi-Lisp programming language and numerous packages to support mathematical and statistical computing (incl. extensive bayesian machinery). BioBike is designed for knowledge-based, not numbers-based, biocomputing. (Although, of course, it can do both, but why try to shoe-horn every kind of computation into the same programming language even if you could, esp. when it's so easy to make multiple languages live happily together?)

To do most "typical" biocomputing -- sequence processing, stats, bayes nets, and those sorts of off-the-shelf numerical things -- we call down to non-Lisp (R-based or C-based) code and just interface it up to BioBike's Lisp, so that you can use Lisp to do the more difficult knowledge processing that Lisp is good at. If you are interested in using BioBike to do knowledge processing on top of your bayes net machinery, I'm sure that the BioBike support team will be happy to help you out.