From: Randy Howard on 21 Jun 2005 20:34

In article <42b878cf.827209258(a)news.sbtc.net>, cri(a)tiac.net says...
> This is often an important consideration. For example, suppose that
> algorithm A has a cost of 1000000*n and algorithm B has a cost of n*n,
> and that n will never be greater than 1000. The catch, of course, is
> that the O(n*n) algorithm doesn't scale.

In this case, his version of 1000000 is only 32. I think we can safely
assume that n<32 is not a guarantee and move on.

--
Randy Howard (2reply remove FOOBAR)
"I don't really care about being right you know, I just care about
success." --Steve Jobs
From: pete on 21 Jun 2005 21:44

Tim Rentsch wrote:
>
> pete <pfiland(a)mindspring.com> writes:
>
> > Richard Harter wrote:
> > >
> > > On Tue, 21 Jun 2005 13:23:53 GMT,
> > > pete <pfiland(a)mindspring.com> wrote:
> > >
> > > > A simple O(n*log(n)) solution for floating types
> > > > would be to heapsort the array
> > > > and do a sequential check for equality.
> > >
> > > And, of course, one can do it in O(n) time and O(1)
> > > space without the xor trick. To quote someone (me):
> > >
> > > "If the array may be rearranged then we can again
> > > find the singleton with O(1) space and O(n) time.
> > > The general idea is to select a pivot value
> > > and then partition the array into elements smaller than the
> > > pivot, equal to the pivot, and larger than the pivot.
> > > If the pivot is the singleton we are done;
> > > otherwise repeat on the partition having an
> > > odd number of elements."
> >
> > I think that algorithm suffers from the same O(n * n)
> > worst case as simple quick sort.
>
> Richard's algorithm is making use of finding the median
> value of the array, which can be done in O(n) time. If you
> can guarantee that the array is split nearly in half each
> time, then it's possible to subdivide and zero in on the
> unique element in O(n) time; using the median as the pivot
> value provides that guarantee.

If your pivot selection is O(n) and it partitions into
near even halves, then you're going to have to do pivot
selection O(log(n)) times, and you're working on an
O(n * log(n)) algorithm.

> > It's possible that in a worst case
> > all of your even sized partitions
> > might be two elements in size.
> > In that case, the partitioning will be O(n),
> > and the iterations will be O(n).
>
> At each step there are only two partitions

Richard Harter described three partitions:
"The general idea is to select a pivot value and then
partition the array into elements smaller than the pivot,
equal to the pivot, and larger than the pivot."

--
pete
From: pete on 21 Jun 2005 21:49

Richard Harter wrote:
>
> On Tue, 21 Jun 2005 19:14:15 GMT, pete <pfiland(a)mindspring.com> wrote:
>
> > Richard Harter wrote:
> > >
> > > On Tue, 21 Jun 2005 13:23:53 GMT, pete <pfiland(a)mindspring.com> wrote:
> > >
> > > > spinoza1111(a)yahoo.com wrote:
> > >
> > > (almost nothing worth quoting)
> > >
> > > [snip]
> > >
> > > > A simple O(n*log(n)) solution for floating types
> > > > would be to heapsort the array
> > > > and do a sequential check for equality.
> > >
> > > And, of course, one can do it in O(n)
> > > time and O(1) space without the
> > > xor trick. To quote someone (me):
> > >
> > > "If the array may be rearranged then we can again
> > > find the singleton with O(1) space and O(n) time.
> > > The general idea is to select a pivot
> > > value and then partition the array into elements smaller than the
> > > pivot, equal to the pivot, and larger than the pivot.
> > > If the pivot is the singleton we are done;
> > > otherwise repeat on the partition
> > > having an odd number of elements."
> >
> > I think that algorithm suffers from the same O(n * n)
> > worst case as simple quick sort.
> >
> > It's possible that in a worst case
> > all of your even sized partitions
> > might be two elements in size.
> > In that case, the partitioning will be O(n),
> > and the iterations will be O(n).
>
> Yes and no. Recall that there are algorithms[1] for finding the
> median that are strictly O(n) (aka theta(n) in some circles) so we can
> guarantee that each partitioning produces two equal[2] sized
> partitions. Ergo, the worst case can be forced to be O(n).

If you only had to do a single pivot selection,
then the whole algorithm would be O(n),
but you have to do pivot selections O(log(n)) times.
I'll take back the O(n * n) worst case claim,
but an O(n) operation, done O(log(n)) times,
is an O(n * log(n)) algorithm.

--
pete
From: pete on 21 Jun 2005 22:02

Richard Harter wrote:

> To do him justice (would that he were done justice) the text you quote
> implies the converse, that he does understand the O() concept. His
> point, such as it is, might be that in practice what is important is
> the cost in the range of n in interest. (At least that is what it
> appears that he is grasping at, other than an assortment of straws.)

That's exactly what he's grasping at.
When the specification calls for an O(n) algorithm,
what does that suggest about the size of n?
Does it suggest that we are interested in large n?
Does it suggest that we are interested in n so small
that it doesn't make a difference what order the algorithm is?
I think it suggests that we are more interested in large n.
Specifying an order for the algorithm
is probably not the best way for the puzzle to suggest
that the order of the algorithm doesn't really matter.

--
pete
From: pete on 21 Jun 2005 22:04
Richard Harter wrote:

> however the cost of comparisons and the
> cost of data move grows with n.

Do you mean that as n increases,
the cost of each comparison
and each data move also increases?
If so, how?

--
pete