From: Richard Harter on 21 Jun 2005 23:53

On Wed, 22 Jun 2005 01:49:41 GMT, pete <pfiland(a)mindspring.com> wrote:

>Richard Harter wrote:
>>
>> On Tue, 21 Jun 2005 19:14:15 GMT, pete <pfiland(a)mindspring.com> wrote:
>>
>> >Richard Harter wrote:
>> >>
>> >> On Tue, 21 Jun 2005 13:23:53 GMT, pete <pfiland(a)mindspring.com> wrote:
>> >>
>> >> >spinoza1111(a)yahoo.com wrote:
>> >>
>> >> (almost nothing worth quoting)
>> >>
>> >> [snip]
>> >>
>> >> >A simple O(n*log(n)) solution for floating types
>> >> >would be to heapsort the array
>> >> >and do a sequential check for equality.
>> >>
>> >> And, of course, one can do it in O(n) time and O(1) space without the
>> >> xor trick. To quote someone (me):
>> >>
>> >> "If the array may be rearranged then we can again find the singleton
>> >> with O(1) space and O(n) time. The general idea is to select a pivot
>> >> value and then partition the array into elements smaller than the
>> >> pivot, equal to the pivot, and larger than the pivot. If the pivot is
>> >> the singleton we are done; otherwise repeat on the partition having
>> >> an odd number of elements."
>> >
>> >I think that algorithm suffers from the same O(n * n)
>> >worst case as simple quick sort.
>> >
>> >It's possible that in a worst case,
>> >all of your even sized partitions
>> >might be two elements in size.
>> >In that case, the partitioning will be O(n),
>> >and the iterations will be O(n).
>>
>> Yes and no. Recall that there are algorithms[1] for finding the
>> median that are strictly O(n) (aka theta(n) in some circles), so we can
>> guarantee that each partitioning produces two equal[2] sized
>> partitions. Ergo, the worst case can be forced to be O(n).
>
>If you only had to do a single pivot selection,
>then the whole algorithm would be O(n),
>but you have to do pivot selections O(log(n)) times.
>
>I'll take back the O(n * n) worst case claim,
>but an O(n) operation, done O(log(n)) times,
>is an O(n * log(n)) algorithm.

You are missing a point. There are indeed log(n) selections, but the
costs are n, n/2, n/4, etc., for a total cost of 2n, which is O(n).
Think about it; if you really don't see it, ask again and I will be
tediously detailed in the analysis.

Richard Harter, cri(a)tiac.net
http://home.tiac.net/~cri, http://www.varinoma.com
Save the Earth now!!
It's the only planet with chocolate.
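A minimal C sketch of the approach Harter quotes above may make the
exchange easier to follow. It is not code from the thread. It assumes
the problem's precondition (every value occurs exactly twice except
one, so every subarray recursed into has odd length), and for brevity
the hypothetical choose_pivot helper uses a crude median-of-three,
which only gives the O(n) bound in expectation; for the guaranteed
worst case you would substitute an exact linear-time median selection
(e.g. median of medians).

#include <stddef.h>
#include <stdio.h>

/* Pivot selection. Here: median of first, middle, and last element.
 * For the worst-case O(n) bound discussed above, replace this with an
 * exact linear-time median selection (median of medians). */
static double choose_pivot(const double *a, size_t n)
{
    double x = a[0], y = a[n / 2], z = a[n - 1];
    if ((x <= y && y <= z) || (z <= y && y <= x)) return y;
    if ((y <= x && x <= z) || (z <= x && x <= y)) return x;
    return z;
}

/* Find the value that occurs once, assuming every other value occurs
 * exactly twice. O(1) extra space; the array may be rearranged. */
double find_singleton(double *a, size_t n)
{
    while (n > 1) {
        double pivot = choose_pivot(a, n);
        size_t lt = 0, eq = 0, mid, i;

        /* Pass 1: move values smaller than the pivot to the front and
         * count the values equal to the pivot. */
        for (i = 0; i < n; i++) {
            if (a[i] < pivot) {
                double tmp = a[lt]; a[lt] = a[i]; a[i] = tmp;
                lt++;
            } else if (a[i] == pivot) {
                eq++;
            }
        }
        if (eq % 2 == 1)        /* odd count: the pivot is the singleton */
            return pivot;

        /* Pass 2: gather the values equal to the pivot right after the
         * "smaller" block, so the "larger" block ends up last. */
        mid = lt;
        for (i = lt; i < n; i++) {
            if (a[i] == pivot) {
                double tmp = a[mid]; a[mid] = a[i]; a[i] = tmp;
                mid++;
            }
        }

        /* Continue on whichever side holds an odd number of elements.
         * With an exact median pivot each round works on about half as
         * many elements as the last, so total work is n + n/2 + ... < 2n. */
        if (lt % 2 == 1) {
            n = lt;             /* singleton is among the smaller values */
        } else {
            a += mid;           /* singleton is among the larger values  */
            n -= mid;
        }
    }
    return a[0];
}

int main(void)
{
    double a[] = {4.0, 7.0, 1.0, 7.0, 4.0};   /* 1.0 is the singleton */
    printf("%g\n", find_singleton(a, sizeof a / sizeof a[0]));
    return 0;
}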
From: Richard Harter on 21 Jun 2005 23:57

On Wed, 22 Jun 2005 01:44:57 GMT, pete <pfiland(a)mindspring.com> wrote:

>Tim Rentsch wrote:
>>
>> pete <pfiland(a)mindspring.com> writes:
>>
>> > Richard Harter wrote:
>> > >
>> > > On Tue, 21 Jun 2005 13:23:53 GMT,
>> > > pete <pfiland(a)mindspring.com> wrote:
>> > >
>> > > >A simple O(n*log(n)) solution for floating types
>> > > >would be to heapsort the array
>> > > >and do a sequential check for equality.
>> > >
>> > > And, of course, one can do it in O(n) time and O(1)
>> > > space without the xor trick. To quote someone (me):
>> > >
>> > > "If the array may be rearranged then we can again find the singleton
>> > > with O(1) space and O(n) time. The general idea is to select a pivot
>> > > value and then partition the array into elements smaller than the
>> > > pivot, equal to the pivot, and larger than the pivot. If the pivot is
>> > > the singleton we are done; otherwise repeat on the partition having
>> > > an odd number of elements."
>> >
>> > I think that algorithm suffers from the same O(n * n)
>> > worst case as simple quick sort.
>>
>> Richard's algorithm is making use of finding the median
>> value of the array, which can be done in O(n) time. If you
>> can guarantee that the array is split nearly in half each
>> time, then it's possible to subdivide and zero in on the
>> unique element in O(n) time; using the median as the pivot
>> value provides that guarantee.
>
>If your pivot selection is O(n) and it partitions
>into near even halves, then you're going to have to
>do pivot selection O(log(n)) times,
>and you're working on an O(n * log(n)) algorithm.

You snipped the important part that showed why it was O(n).

>
>> > It's possible that in a worst case,
>> > all of your even sized partitions
>> > might be two elements in size.
>> > In that case, the partitioning will be O(n),
>> > and the iterations will be O(n).
>>
>> At each step there are only two partitions
>
>Richard Harter described three partitions:
>"The general idea is to select a pivot value and then partition the
> array into elements smaller than the pivot, equal to the pivot,
> and larger than the pivot."

So I did; however the "equal to the pivot" partition has either one or
two elements. It is the other partitions that have size n/2, n/4, etc.

Richard Harter, cri(a)tiac.net
http://home.tiac.net/~cri, http://www.varinoma.com
Save the Earth now!!
It's the only planet with chocolate.
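To spell the arithmetic out (my summary of the claim, not Harter's
words): with an exact median as the pivot, each round does work
proportional to the subarray still in play, and that subarray at least
halves from round to round, so although there are O(log n) rounds the
total work is bounded by a geometric series:

    n + n/2 + n/4 + ... + 1 < 2n, which is O(n).

The log(n) factor never materializes because only the first pivot
selection runs over all n elements; each later one runs over about
half as many as the one before.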
From: Richard Harter on 22 Jun 2005 00:28

On Wed, 22 Jun 2005 02:04:30 GMT, pete <pfiland(a)mindspring.com> wrote:

>Richard Harter wrote:
>
>> however the cost of comparisons and the
>> cost of data moves grows with n.
>
>Do you mean that as n increases, that the cost of each
>comparison and each data move also increases?
>If so, how?

Recall that we are talking about costs as n grows arbitrarily large.
That said, here are the considerations:

Data move costs:

As n increases, the average physical distance that data elements move
increases. The speed at which they can move is bounded by the speed
of light; ergo the cost of data moves must increase with n as n grows
arbitrarily large. This is a theoretical restriction, but it is
mirrored in real computers; storage is hierarchical.

Cost of comparisons:

This is a bit more complicated. If the keys are distinct then they
must be at least log(n) bits long. If we compare two random keys, the
average number of bits that need to be inspected is O(1). However, as
we progress in the sort process the keys being compared are closer
together, which implies that the bit at which they differ moves from
the first bit toward the last. A naive comparison will have an average
cost of O(log n) bit inspections. One can track the position at which
the comparison must begin; however, this has a cost of at least
O(log(log n)) operations.

In practice sorts are done on two kinds of keys - numbers and strings.
The length of the key in bits is considerably larger than log2(n), and
the hardware has parallelism. In consequence, comparisons can be
treated as having a fixed cost given that n is limited.

Richard Harter, cri(a)tiac.net
http://home.tiac.net/~cri, http://www.varinoma.com
Save the Earth now!!
It's the only planet with chocolate.
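A small illustration of the comparison-cost point (mine, not from the
thread): the bits_inspected function below counts how many bit
positions a most-significant-bit-first comparison of two unsigned keys
looks at before it finds a differing bit. For two independently random
keys the expected count is about 2, matching the O(1) claim; for two
nearly equal keys, which share a long common prefix, the count
approaches the full key width, matching the O(log n) claim.

#include <limits.h>
#include <stdio.h>

/* Count how many bit positions an MSB-first comparison of a and b
 * inspects before it finds a bit where they differ.
 * If a == b, every bit must be inspected. */
static unsigned bits_inspected(unsigned a, unsigned b)
{
    unsigned width = sizeof(unsigned) * CHAR_BIT;
    unsigned count = 0;
    unsigned i;

    for (i = 0; i < width; i++) {
        unsigned mask = 1u << (width - 1 - i);
        count++;
        if ((a & mask) != (b & mask))
            break;          /* the first differing bit decides the order */
    }
    return count;
}

int main(void)
{
    /* Two unrelated keys usually differ near the top ... */
    printf("%u\n", bits_inspected(0x9E3779B9u, 0x2545F491u));  /* 1 */
    /* ... while two nearly equal keys share a long common prefix. */
    printf("%u\n", bits_inspected(0xF0F0F0F0u, 0xF0F0F0F1u));  /* 32 with
                                                     32-bit unsigned int */
    return 0;
}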
From: Gerry Quinn on 22 Jun 2005 05:47

In article <42b8e1f5.18183896(a)news.sbtc.net>, cri(a)tiac.net says...
>
> As n increases the average physical distance that data elements move
> increases. The speed at which they can move is bounded by the speed
> of light; ergo the cost of data moves must increase with n as n grows
> arbitrarily large.

Indeed, if you go far enough general relativity starts to bite, insofar
as there are plausible conjectures limiting the amount of information,
or combinations of energy and information, that can be packed into a
region of a given size without collapsing into a black hole.

- Gerry Quinn
From: Michael Wojcik on 22 Jun 2005 08:41
In article <1119251599.617332.209790(a)g43g2000cwa.googlegroups.com>, spinoza1111(a)yahoo.com writes:

> Michael Wojcik wrote:
> > In article <d8sd3h$2sp2$1(a)grapevine.lcs.mit.edu>, Christopher Barber <cbarber(a)curl.com> writes:
> > > spinoza1111(a)yahoo.com wrote:
> > > > Christopher Barber wrote:
> >
> > The current (ISO 2002) COBOL standard includes bitwise operations, too.
> > So does APL. And Scheme. Fortran 95 has an ieor intrinsic, I believe.
> > Perl has bitwise xor. Hey, Ruby has it too.
>
> Of course, this remains a stunning irrelevance to the fact that the XOR
> algorithm cannot be said to be O(n), because it replaces equality in an
> operation that is from the standpoint of big O theory not valid.

If by "big O theory" you mean "big-O notation", then you are wrong.
If you mean something else by the former phrase, then your argument is
irrelevant.

> You
> have to assume that "in addition to test for equality, we can regard
> numbers as bitwise accessible".

1. This assumption does not affect the representation of the
algorithm's efficiency in big-O notation.

2. This assumption is unnecessary; XOR can be implemented without
access to the numbers' bitwise representation, since it's possible to
compute a number's base-two representation (in linear time).

> > I seem to be able to see the bitwise representation of an integer in
> > all of those languages, too. Apparently they're all dialects of C.
>
> You can spew these "counterexamples" all you like. All they show is a
> failure to understand how there can be a group of abstract disciplines
> each demanding minimal preconditions.

Yes. However, it's a failure on your part, not mine.

> > Indeed. Out of curiosity, Tisdale, how would endianness, or any
> > ordering, affect any Boolean operator?
>
> What part of shift don't you understand, Homer?

The part where they've magically become Boolean operators. How do you
"shift" a single binary value?

> > There are *bit-level* operations in which the ordering of bits in
> > larger structures makes a difference, but they're not "bitwise".
>
> Because that makes you happy?

Because that's what the word properly means. If "bitwise" means
anything other than "an operation combining one or more bits chosen
independently", then *all* operations are "bitwise", and the word is
empty.

> I've seen what truth you'll sacrifice in
> a savage attack, so this does not surprise.

I don't believe you've ever seen me mount "a savage attack", so I
greatly doubt that.

> > > > But owing to corporate surveillance and its Benthamite potential, the
> > > > speech is silenced by posters who have to treat the theory as a set of
> > > > dead answers.
> > >
> > > You keep spewing this "corporate surveillance and Benthamite potential"
> > > mantra. What does it mean?
> >
> > It's a half-assed allusion to Michel Foucault's _Discipline and
> > Punish_ [snip].
>
> This is nonsense.

You brought it up. Or do you mean my precis of Foucault and Spivak is
wrong? Feel free to provide citation and argument.

> As "corporate MIS professionals" your biggest fear is job loss

Hardly.

> and as such you sit and snipe like maiden aunts at each other
> which effectively silences cross-company collegiality and solidarity.

I enjoy quite a bit of cross-company collegiality and solidarity. From
where I sit, the silencing process has been decidedly ineffective, I'm
afraid.

> You reify existing platforms and your knowledge is the knowledge of
> "mistakes, carried through to perfection".

How is an "existing platform" not already reified?
> You prove Bentham illustrative.

I never said he wasn't; I merely noted that you haven't yet used him
to illustrate anything.

> > Doesn't matter. ERT's [should be EGN's]
> > claim that someone is being silenced is vacant
> > and closed to refutation, because you'd have to prove a negative to
> > refute it. Unless and until he comes up with a real argument on the
> > subject, there's no point in paying any attention to it. (Even then,
> > of course, there may not be any point.)
>
> This argument is cop logic and as such morally repugnant.

It's merely pointing out the difference between a rigorous argument
about silencing - like Spivak's - and your handwaving. What I find
repugnant, frankly, is your habit of citing serious thinkers in a
failed attempt at argumentum ad verecundiam to bolster your own vapid
claims. It's rhetorically cheap, intellectually empty, and insulting
to those of us who actually spend time thinking about their ideas and
whether and how they apply to a situation.

> It declares
> that the victim, insofar as she is prevented to speak, can never
> "prove" anything.

Nice try. It does nothing of the sort. It doesn't apply to Subaltern
Studies (from Spivak on out), for example, because it critiques your
(defective) argument, not its ostensible object. It doesn't attach to
real arguments about silencing at all.

Your statement was a bare existential claim (there are people who have
been silenced) with no logical content. That's not an argument - it's
a protestation of faith. It *implies* an argument, but unless and
until that argument is made explicit, then like all articles of faith
it's unanswerable, and so unworthy of discussion.

Or in terms of propositional logic: you asserted Q, an existential
proposition ("exists X"). That implies an argument consisting of a set
of postulates P and a relation P -> Q. A challenge to the truth-
status of P (namely, that none of its members have been stated) says
*nothing* about the truth-status of Q.

> FYI, humane forensics can indeed establish and have
> established in diverse legal fields that victims were silenced.

Congratulations - you have almost presented an actual argument.
Unfortunately it is irrelevant, as it argues a point which is not
being contested.

> > (The quintessential critical-theory argument for the silencing of a
> > group through competing discourses is Gayatri Spivak's classic "Can
> > the Subaltern Speak?"...
>
> This is a mastery of word games

It's a mastery of the relevant material - something you've yet to
demonstrate. (I don't see how citing specific authors and works
constitutes a "word game", except in a very broad and not very useful
Derridean sense, but call it that if you like. Call it a game of
quoits if it pleases you.)

> which in the American academy ensure
> that critical theory is PERCEIVED as a useless word game.

So perceived by some, certainly. Somehow I doubt that dropping jargon
from critical theory would eliminate that perception, however.

> Furthermore, it is foolish to say that Foucault focused on *compulsive*
> as opposed to *repressive* power. It misses the entire boat, for
> Foucault's work deconstructed the opposition between *compulsive* and
> *repressive* power.

To say Foucault "deconstructed" anything is to fundamentally
misunderstand Foucault's relationship to structuralism and
poststructuralism.
Your reading of Foucault's understanding of power appears (though it's
difficult to tell from this snippet) to misapprehend both the genealogy
of modern structures of power from his earlier work, particularly
_Discipline and Punish_, and the poststructuralist reexamination of
power from the later work, particularly the first volume of _History
of Sexuality_. But none of this is relevant to comp.programming, and I
don't care to debate it further here - especially not with someone
who's done little to demonstrate any real command of the material.
(Feel free to snipe at that all you like.)

> If you've read him, you'd understand (for example) that the Yuppie,
> working out in the gym, is neither compulsive nor repressed and that
> the Freudian categories don't apply. He freely chooses to work out "in
> order" to succeed but also takes pleasure in the aerobics "high".

A vapid example. A classic Freudian reading works equally well, with
the Pleasure Principle and the Reality Principle operating in tandem.

> Power, in other words, is fully capillary, pushing itself into daily
> "choices". Cf., for example, the Sunday paper in any major city. It
> advises people in the register of pleasure but also restricts itself,
> of commercial necessity, to pushing choices that its advertisers favor:
> for example, it's really questionable whether people in their twenties
> should buy lavish homes as opposed to seeing the world, but in the
> Sunday paper view of the world, "seeing the world" is exclusively
> restricted to purchasing a closed-loop vacation package AFTER buying a
> house and creating a retirement nest-egg.

That, on the other hand, is a decent one. Congratulations. That'd
probably get you a pat on the back in a first-year seminar.

> You don't understand Foucault.

Nice try. Should I ever come to care about your evaluation of my
understanding I'll be sure to take steps to remedy that.

> But what's worse is you don't understand
> that power-as-capillary-pleasure creates bullying campaigns (such as
> the eradication of the homeless world-wide or the Roma in Europe) when
> the capillary mechanisms fail.

When they fail? They don't fail. That's the damn *point* of _HoS_ 1.
(Now, if you wanted to cite, say, Judith Butler's _The Psychic Life of
Power_ in making that argument, I might find it a bit more persuasive.
But Foucault? No. This reading reminds me of William Kerrigan's
description of Foucault as "paranoid" - a basic error in understanding
Foucault's tone. In some ways Foucault is closer to Zizek than he is
to Freud, or certainly than he is to Marx.)

> Here, we're supposed to take "pleasure" in the supposed "power" of C to
> refute O(n) theory

That supposition exists only in your fevered imagination.

--
Michael Wojcik
michael.wojcik(a)microfocus.com

Painful lark, labouring to rise!
The solemn mallet says:
In the grave's slot
he lies. We rot.
   -- Basil Bunting
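Wojcik's point 2 near the top of his post - that XOR does not require
bit-level access to the operands - can be illustrated with a short
sketch (mine, not from the thread): the same result falls out of plain
division and remainder by 2, i.e. by computing the base-two digits
arithmetically. The name arithmetic_xor is just an illustrative label.

#include <stdio.h>

/* Compute a XOR b using only arithmetic (division and remainder by 2),
 * with no bitwise operators. Extracting the base-two digits this way
 * touches each of the O(log max(a,b)) digits once, so the whole
 * operation is linear in the length of the representation. */
static unsigned long arithmetic_xor(unsigned long a, unsigned long b)
{
    unsigned long result = 0;
    unsigned long place = 1;

    while (a > 0 || b > 0) {
        unsigned long bit_a = a % 2;    /* current base-two digit of a */
        unsigned long bit_b = b % 2;    /* current base-two digit of b */
        if (bit_a != bit_b)             /* XOR: digits differ -> 1 */
            result += place;
        a /= 2;
        b /= 2;
        place *= 2;
    }
    return result;
}

int main(void)
{
    printf("%lu\n", arithmetic_xor(12, 10));  /* prints 6, same as 12 ^ 10 */
    return 0;
}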