From: Michael Wojcik on

In article <d8sd3h$2sp2$1(a)grapevine.lcs.mit.edu>, Christopher Barber <cbarber(a)curl.com> writes:
> spinoza1111(a)yahoo.com wrote:
> > Christopher Barber wrote:
>
> >>Not true. Most modern languages do support bitwise operations on integers.
> >>What languages are you thinking of that do not? True that many languages
> >>will not let you access the bit-pattern of arbitrary data structures.
> >
> > Incorrect. Only C lets you "see" the bitwise representation of an
> > integer.
>
> C, C++, C#, Java, VB all support efficient bitwise operations, including XOR,
> on integers. What language are you thinking of that doesn't support this?

The current (ISO 2002) COBOL standard includes bitwise operations, too.
So does APL. And Scheme. Fortran 95 has an ieor intrinsic, I believe.
Perl has bitwise xor. Hey, Ruby has it too.

I seem to be able to see the bitwise representation of an integer in
all of those languages, too. Apparently they're all dialects of C.

> > Using XOR, you just happen to get lucky, in the sense that XOR just
> > happens to be independent of bigendian and smallendian.
>
> It doesn't "just happen" to be independent of byte order, it is an inherent
> quality of the operation.

Indeed. Out of curiosity, Tisdale, how would endianness, or any
ordering, affect any Boolean operator?

> > But the
> > corrupting effect of this puzzle is that it will teach the tyro to keep
> > on the lookout for bitwise stunts and sooner or later, said tyro will
> > find some stunt that does depend on the order of bits.
>
> Any "stunt" involving XOR will not depend on the order of the bits.

And ditto any "bitwise" stunt, which by definition cannot depend on
the order of the bits. That's what makes it "bitwise".

There are *bit-level* operations in which the ordering of bits in
larger structures makes a difference, but they're not "bitwise".
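
A minimal C sketch of the distinction, with illustrative values of my
own choosing: XOR combines bit i of one operand with bit i of the
other and nothing else, so the order of bytes in memory cannot matter;
a shift, by contrast, moves information between bit positions, which
makes it bit-level rather than bitwise.

    #include <stdio.h>

    int main(void)
    {
        unsigned a = 0x12345678u, b = 0xCAFEBABEu;

        /* Bitwise: bit i of the result depends only on bit i of each
           operand, so the memory layout of the bytes is irrelevant. */
        unsigned x = a ^ b;

        /* Bit-level: a shift moves information between bit positions,
           so it depends on the ordering of bits within the value. */
        unsigned s = a << 8;

        printf("xor   = %08X\n", x);
        printf("shift = %08X\n", s);
        return 0;
    }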

> > But owing to corporate surveillance and its Benthamite potential, the
> > speech is silenced by posters who have to treat the theory as a set of
> > dead answers.
>
> You keep spewing this "corporate surveillance and Benthamite potential"
> mantra. What does it mean?

It's a half-assed allusion to Michel Foucault's _Discipline and
Punish_ (the title of the English translation), which elaborates a
theory of the conditioning of the subject through surveillance, based
in part on the example of Jeremy Bentham's "Panopticon" prison
design.

In brief, the "Benthamite" part of Foucault's argument goes like
this: the point of the Panopticon is that prisoners cannot tell when
they are being observed by the guards. Because they are always
potentially under surveillance, they learn to police their own
actions - and so they internalize the function of the guards,
becoming guards of themselves.

D&P is actually one of Foucault's more accessible books, and it
contains some interesting ideas. Like any other body of critical
theory, though, it's easy to claim it applies in any given situation,
and rather harder to make a convincing case for it.

> Whose speech is being silenced?

Doesn't matter. ERT's claim that someone is being silenced is vacant
and closed to refutation, because you'd have to prove a negative to
refute it. Unless and until he comes up with a real argument on the
subject, there's no point in paying any attention to it. (Even then,
of course, there may not be any point.)

(The quintessential critical-theory argument for the silencing of a
group through competing discourses is Gayatri Spivak's classic "Can
the Subaltern Speak?", either in its original form from _Oxford
Literary Review_ 9, or the longer version that appears in _Marxism
and the Interpretation of Culture_, which includes a lengthy coda
explaining Spivak's choices regarding theories of culture. But
"Subaltern" is based primarily on Jacques Derrida's deconstruction
and secondarily on the schema Freud outlines in "A Child is Being
Beaten"; in the long version of the article, Spivak explicitly
rejects Foucault's methodology for her project. Foucault, conversely,
was rarely interested in the *repressive* aspects of power, focussing
instead on the *compulsive* ones - so he'd be more appropriate for
considering how people are compelled to speak than how they're
prevented from doing so.)

--
Michael Wojcik michael.wojcik(a)microfocus.com

The presence of those seeking the truth is infinitely preferable to
the presence of those who think they've found it. -- Terry Pratchett
From: Randy Howard on
In article <d8urr80a5r(a)news3.newsguy.com>, mwojcik(a)newsguy.com says...
> The current (ISO 2002) COBOL standard includes bitwise operations, too.
> So does APL. And Scheme. Fortran 95 has an ieor intrinsic, I believe.
> Perl has bitwise xor. Hey, Ruby has it too.
>
> I seem to be able to see the bitwise representation of an integer in
> all of those languages, too. Apparently they're all dialects of C.

Through the eyes of a sociologist, that might be the case.

> > > Using XOR, you just happen to get lucky, in the sense that XOR just
> > > happens to be independent of bigendian and smallendian.
> >
> > It doesn't "just happen" to be independent of byte order, it is an inherent
> > quality of the operation.
>
> Indeed. Out of curiosity, Tisdale, how would endianness, or any
> ordering, affect any Boolean operator?

I think you have ERT and nilgewater confused. Much as they both exhibit
some similar traits, albeit in different newsgroups, I am fairly well
convinced that not even ERT is this badly confused about programming
concepts.

> Doesn't matter. ERT's claim that someone is being silenced is vacant
> and closed to refutation, because you'd have to prove a negative to
> refute it.

s/ERT/nilgewater

Otherwise, correct.

--
Randy Howard (2reply remove FOOBAR)
"I don't really care about being right you know,
I just care about success." --Steve Jobs
From: pete on
Michael Wojcik wrote:
>
> In article <d8sd3h$2sp2$1(a)grapevine.lcs.mit.edu>,
> Christopher Barber <cbarber(a)curl.com> writes:
> > spinoza1111(a)yahoo.com wrote:
> > > Christopher Barber wrote:

> Tisdale

Nilges

> ERT's

EGN's

.... unless ER Tisdale is doing a perfect EG Nilges imitation.

--
pete
From: spinoza1111 on


Christopher Barber wrote:
> spinoza1111(a)yahoo.com wrote:
> > Christopher Barber wrote:
>
> >>Why bother with optimizing a bad algorithm? The only time you would use
> >>such an algorithm is when n is very small so there is no need to perform
> >>any other optimizing tricks. When n is large you had better be using
> >>an O(n) algorithm.
> >
> > This is what optimizing compilers do, mate.
>
> Uh, no they don't. No optimizing compiler I know of would optimize your
> bad algorithm, but if you know of one, please speak up.

Look, I've discussed how the language of "bad" (as opposed to order(x))
misleads people. But if you want to persist in this biographical mode,
then you are ignorant of optimizing compilers, which perform operations
including constant-subexpression evaluation and loop unrolling
(doubling the work done per iteration of for loops).
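
For the record, here is a minimal C sketch of the loop transformation
I mean; the function names are mine, the first form is what the
programmer writes, and the second is roughly what an unrolling
compiler emits in its place:

    #include <stddef.h>

    /* As written by the programmer: */
    long sum_simple(const long *a, size_t n)
    {
        long sum = 0;
        size_t i;
        for (i = 0; i < n; i++)
            sum += a[i];
        return sum;
    }

    /* Roughly what unrolling by a factor of two produces: the same
       result with half the loop-control overhead. */
    long sum_unrolled(const long *a, size_t n)
    {
        long sum = 0;
        size_t i;
        for (i = 0; i + 1 < n; i += 2) {
            sum += a[i];
            sum += a[i + 1];
        }
        if (i < n)          /* odd n leaves one element over */
            sum += a[i];
        return sum;
    }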

Of course, many posters here are ignorant of these issues because when
they think of a constant subexpression they think of 1+1-a and say,
this be "bad", Commie, terrorist, child molester code because of the
"unnecessary" constant evaluation.

Then you show them how it is common praxis for readability to split
long constants as in

a = "Bnarg bnarg bnargbnarg bnarg bnarg bnarg bnarg bnarg " +
"bnargbnarg bnarg bnarg bnarg bnarg bnarg bnarg bnarg " +
b

where the concatenation of the two string literals is a constant
subexpression, and they are like d'oh.
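
In C the same idiom is cheaper still, since adjacent string literals
are pasted together by the translator itself, and an ordinary folding
compiler evaluates constant arithmetic once, at compile time. A
minimal sketch, assuming a typical optimizing compiler:

    #include <stdio.h>

    int main(void)
    {
        /* Adjacent string literals are concatenated at translation
           time, so splitting a long constant for readability costs
           nothing at run time. */
        const char *msg =
            "Bnarg bnarg bnargbnarg bnarg bnarg bnarg bnarg bnarg "
            "bnargbnarg bnarg bnarg bnarg bnarg bnarg bnarg bnarg";

        /* A folding compiler evaluates this constant subexpression
           to 86400 at compile time, not on every execution. */
        long seconds_per_day = 60L * 60L * 24L;

        printf("%s (%ld)\n", msg, seconds_per_day);
        return 0;
    }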


>
> > And in software maintenance
> > you need to know how to make low-risk code improvements to "bad",
> > Commie, terrorist, child molester algorithms.
>
> Sure, and you also know when not to.
>
> > There are no "bad" algorithms just as there are no bad dogs.
>
> There are bad algorithms (though not in the moral sense) and there
> are bad dogs too (and I am a dog owner).

"Bad" as a technical predicate HAS a pre-existing moral sense and
therefore you cannot say "though not in the moral sense". These is
weasel words because you are either making a serious charge or else you
are jerking off.

>
> >>>I agree that an O(log n) algorithm is in some nonmoral sense better
> >>>than O(n), which is better than O(n^y)! But in real programming there's
> >>>much more to know.
> >>
> >>Like what?
> >>
> >>Your algorithm doesn't appear to have any practical use and you have yet
> >>to suggest one.
>
> > What IS of practical use to me is the ability to discourse wittingly
> > and in an amusing fashion about the actual properties of real software.
>
> Then what's the problem? Haven't we been doing that?

Most of us have been quite witty and amusing, but just outside the tent
there are nasty trolls such as Randy Howard who, unlike you, haven't
separated "bad" from "Commie, terrorist, loser, rents and does not own,
child molester". Your language attracts the flies.

>
> >>>The problem is that you seem more concerned with biography than
> >>>anything else. OK, the terminology is unfamiliar.
> >>
> >>It is not the terminology that is unfamiliar, it is your non-standard use of
> >>it. I have never heard anyone speak of an O(n^2) algorithm as being "close
> >>to" an O(n) one in performance. I have no idea what you mean by "biography".
> >
> > Then you are unread, for when n<10e1 the O(n) and O(n^2) algorithms are
> > "close" in significant cases.
> >
> > By "biography" I mean wasting time in code reviews and here with an
> > attack on the speaker's or poster's professional biography.
>
> I have made no mention of your professional biography in this thread and know
> nothing about you other than what you have posted about yourself. Perhaps you
> are referring to someone else.
>
You have been restrained, which is why I answer. But there is an unused
way of avoiding words which imply moral predicates.
> > Basically, human dignity and economic good sense should require that
> > no-one on a job of work should be either permitted or required to
> > establish his right to be a part of a dialog. The "biographical"
> > approach wastes time in attacks on competence and knowledge.
>
> No one said that you should be excluded from the dialogue. Those that feel
> that way have probably already put you in their kill file. But being admitted
> to the dialogue does not mean that you can say anything you want without fear
> of being contradicted.
>
Of course not. But what is being censored is any positive effort to
focus strictly on the algorithm and its properties, which Weinberg
showed is how we make progress.

> >>>But one reason for the 80% failure rate of enterprise
> >>>systems is the unwillingness to speak of code without snap judgements
> >>>and one-upmanship in a field that has been RUINED by laissez faire
> >>>"competition".
> >>
> >>I don't think anyone here made a "snap judgement". I also don't see
> >>how suggesting improvements to algorithms or program designs is going
> >>to lead to failures in enterprise systems.
> >
> > Not in itself. But failure to review because everybody's pulling their
> > pud does cause failures.
>
> No kidding, but that is what you yourself are doing.

Not at all. I have been the prime mover in this discussion, despite the
trolling by Randy Howard and this Pete character, and once again I have
made it the fabulous discussion it was.

I had the chops to suggest *sans peur* a "bad", Commie, child molester,
terrorist, doesn't-floss algorithm, and to note how it could be
improved and what properties it had. You then contributed in response a
most illuminating discussion of the XOR solution, which should have
been posted from day one (it was, as I have said, nonsense to claim
that you were protecting other people by withholding the answer,
because they should not look at posts until they get their own answer).

I then pointed out that by conflating two different topics (Stupid C
Tricks and algorithms) the XOR solution is unacceptable. I also had a
massive brain fart, which I immediately corrected, in that I confused
NAND and XOR. This contributed the idea that the known Boolean
operators are reducible, not to XOR but to NAND, which will be most
illuminating for anybody who hasn't taken Boolean Algebra for
Boneheads.
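
For anybody who wants to check that claim without taking the course,
here is a minimal C sketch on single bits (the function names are
mine); every operator below is built out of NAND alone:

    #include <stdio.h>

    static unsigned nand(unsigned a, unsigned b) { return !(a && b); }

    /* NOT, AND, OR, and XOR, each reduced to NAND: */
    static unsigned not_(unsigned a)             { return nand(a, a); }
    static unsigned and_(unsigned a, unsigned b) { return nand(nand(a, b), nand(a, b)); }
    static unsigned or_ (unsigned a, unsigned b) { return nand(nand(a, a), nand(b, b)); }
    static unsigned xor_(unsigned a, unsigned b)
    {
        unsigned n = nand(a, b);
        return nand(nand(a, n), nand(b, n));
    }

    int main(void)
    {
        unsigned a, b;

        /* Print the full truth table to verify the reductions. */
        for (a = 0; a <= 1; a++)
            for (b = 0; b <= 1; b++)
                printf("a=%u b=%u  NOT(a)=%u AND=%u OR=%u XOR=%u\n",
                       a, b, not_(a), and_(a, b), or_(a, b), xor_(a, b));
        return 0;
    }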

>
> >>>And even if it makes no sense to speak of the execution time of a
> >>>program approaching a limit, becoming "close to O(n)", approaching an
> >>>infinitesimal but nonzero value for larger N, note that there's a whole
> >>>field of code improvement within compiler optimization in which the
> >>>improvements are beneath the radar screen of complexity theory.
> >>
> >>There is no such thing as an infinitesimal integer value. It is also not
> >
> > No, but there are small integer values.
>
> Then why did you use the word "infinitesimal"?

I was wondering out loud whether you could use ideas from calculus to
enrich the boring topic of integers.

>
> >>interesting to claim that algorithms tend to converge in performance as the
> >>sample size goes down because all algorithms are going to perform just about
> >>the same for n == 0. By this argument, an O(n^n) algorithm is "close to"
> >>O(n)!
> >
> > You're not interested in trivial, degenerate, null cases because as a
> > non-mathematician, as at best a techie, you view those cases as members
> > of a bogus moral category.
> >
> > A REAL mathematician, such as Nash, remains I think until the end of
> > his life interested in the fact that when you multiply by 0, you get 0:
> > that when you multiply n times 1, you get n.
>
> Well, I was a math major in college, and even study my old books from time
> to time but indeed would not call myself a "mathematician". But I think you
> are mistaken that "real" mathematicians are especially interested in the
> trivial cases.

My experience was different, because when I worked with John Nash, he
was solving the "trivial" problem of doing simple arithmetic to
arbitrary precision in C so that he could then solve non-trivial
problems in number theory, including the Riemann hypothesis.

I also knew at the time John Horton Conway in the sense that I would
encounter him occasionally in the convenience store, both of us being
the type of man who works late and then must buy a hot pocket at WaWa
to survive. I also attended talks at which Nash and Conway were
present.

Conway gave an entire talk on "trivial" problems of recreational math
including calculating the date of Easter which he showed were royal
roads to deep issues. During that talk, he said he would often prove to
his repeated satisfaction "trivial" lemmas because he'd forgotten
whether he'd proved them.
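
The Easter computation is a nice instance of that royal road. Here is
a C sketch of the anonymous Gregorian computus (the function name is
mine; the one-letter variables follow the traditional presentation);
it packs the Metonic cycle and the Gregorian corrections into a dozen
lines of integer arithmetic:

    #include <stdio.h>

    /* Date of Easter Sunday (month: 3 = March, 4 = April) for a
       year in the Gregorian calendar. */
    static void easter(int year, int *month, int *day)
    {
        int a = year % 19;                 /* place in the Metonic cycle */
        int b = year / 100, c = year % 100;
        int d = b / 4, e = b % 4;
        int f = (b + 8) / 25;
        int g = (b - f + 1) / 3;
        int h = (19 * a + b - d - g + 15) % 30;    /* epact term */
        int i = c / 4, k = c % 4;
        int l = (32 + 2 * e + 2 * i - h - k) % 7;  /* weekday term */
        int m = (a + 11 * h + 22 * l) / 451;

        *month = (h + l - 7 * m + 114) / 31;
        *day   = (h + l - 7 * m + 114) % 31 + 1;
    }

    int main(void)
    {
        int mo, da;

        easter(2005, &mo, &da);
        printf("Easter 2005: month %d, day %d\n", mo, da);  /* 3 27 */
        return 0;
    }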

>
> > My own experience is that a program that fails for either zero or one
> > input is a bad program insofar as its text fails thereby to be a
> > recursive proof of the program's correctness.
>
> Sure, but we were not talking about correctness but about execution speed.

Ooops. "Execution speed", properly considered, should NOT even be
considered a surd if the program is not bloody correct: in the strict
sense, if the program is incorrect, a formal calculus or metaobject for
reasoning about programs should return a special value such as
NOT_BLOODY_CORRECT, or Null, or infinity.

Missing here is a moral grammar of three different subject areas:
Stupid C Tricks, complexity, and correctness. Correctness should be,
IMO, logically prior to complexity, which should be way prior to Stupid
C Tricks.

Which is why, I think, Princeton's computer science department has no
class in C.

>
> >>I don't really know what you are talking about w/ regard to compiler
> >>optimization, but based on this discussion I wouldn't want you optimizing our
> >>compiler.
> >
> > Note how you reduce everything to the lower-middle class anxiety of who
> > has a job.
>
> Who said anything about who has a job? I assume that you have one. I just
> wouldn't want you working on our compiler (our product is a software platform
> and programming language) based on what you have said in this ng.

You pretend to be oh, so very value-free, and then you come up with a
judgement. And I remain uninterested in working for your corporation,
based on what you have said in this ng, because I've walked away from
jobs in which the corporation has destroyed logical dependencies in the
name of profit.

If on the job you bullied me into using an XOR based algorithm where I
in good conscience felt that future changes would in a hidden fashion
make the algorithm no longer valid, I'd walk.
>
> > And note that given your responses, I am not interested in
> > working with you.
>
> I am not surprised. I am sure that you would much rather work
> with people who never disagree with you.

This is a completely invalid inference. You reason from one instance to
a completely unwarranted generalization. However, it is true based on
my views that the set of people who disagree with me is large and
growing. This is part of the problem, and I believe it causes a high
rate of failure in enterprise systems.

>
> >>>But I also like to fight with people who deal with their own
> >>>feelings of incompetence by transferring them onto innocent bystanders.
> >>
> >>I don't know who you are taking about, but it isn't me.
> >
> > Excuse me, you've been harassing me in this thread since day one.
>
> I have been contradicting you. I know that for you that counts as
> harassment, but I don't think that anyone else thinks so. I know
> perfectly well that I have no "feelings of incompetence" in this
> particular domain, and your suggesting that this characterizes me
> is a rude, personal attack, pure and simple.

Well, this is where a refusal to examine the language leads: charges of
rudeness and an absence of collegiality. My claim is that a computing
culture which prizes a utilitarian efficiency above all else produces
this unprofessional charge.

I warned you that the language produces the bad feeling because this
was my observation in the very early 1970s, when I observed
pre-Weinberg that people simultaneously were ego-driven while in denial
about being so, and in the 1990s, when I saw the return of this
nightmare of computing's childhood...in which a false efficiency and
stunts like XOR create nothing more than confusion.

>
> - Christopher

From: CBFalconer on
spinoza1111(a)yahoo.com wrote:
> Christopher Barber wrote:
>
.... snip ...
>>
>> Uh, no they don't. No optimizing compiler I know of would optimize
>> your bad algorithm, but if you know of one, please speak up.
>
> Look, I've discussed how the language of "bad" (as opposed to
> order(x)) misleads people. But if you want to persist in this
> biographical mode, then you are ignorant of optimizing compilers
> which perform operations including constant subexpression
> evaluation and doubling of the effect of for loops.

.... snip much prattling ...

You failed to specify the compiler that chooses algorithms. Can we
assume that you are familiar with the relative penalties and gains
from choosing bubblesort versus mergesort, and the relationship to
the input data size and condition? I want the name and vendor of
the compiler that will replace one of those algorithms with the
other, and the conditions under which it will so do. Please
explain how it knows whether or not the input data is pre-sorted
(or nearly sorted), which may affect the decision. Please also
explain how constant subexpression evaluation affects the
decisions.

I'll even accept a reference to an available compiler that
specializes in bubble vs merge sort decisions. It just has to
choose (accurately) at compile time.
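
A sketch of the data-dependence in question, under the usual textbook
formulation of bubblesort with an early-exit test: on input that is
already sorted the function returns after one O(n) pass, while on
reversed input it does the full O(n^2) work - and which case arises is
a property of the run-time data, invisible at compile time.

    #include <stdio.h>
    #include <stddef.h>

    /* Bubble sort with early exit: one O(n) pass when the input is
       already sorted, O(n^2) when it is reversed. */
    static void bubble_sort(int *a, size_t n)
    {
        int swapped = 1;

        while (n > 1 && swapped) {
            size_t i;

            swapped = 0;
            for (i = 0; i + 1 < n; i++) {
                if (a[i] > a[i + 1]) {
                    int t = a[i];
                    a[i] = a[i + 1];
                    a[i + 1] = t;
                    swapped = 1;
                }
            }
            n--;   /* the largest element has bubbled into place */
        }
    }

    int main(void)
    {
        int v[] = { 5, 4, 3, 2, 1 };   /* worst case for bubblesort */
        size_t i;

        bubble_sort(v, sizeof v / sizeof v[0]);
        for (i = 0; i < 5; i++)
            printf("%d ", v[i]);
        putchar('\n');
        return 0;
    }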

--
Chuck F (cbfalconer(a)yahoo.com) (cbfalconer(a)worldnet.att.net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net> USE worldnet address!

