From: Seebs on
On 2010-05-12, Daniel T. <daniel_t(a)earthlink.net> wrote:
> The fact is, it's quite hard (though maybe not impossible) to find a
> justification for using the example provided, even if the example is
> good for justifying SEME. (Even Seebs' example of a dungeon game doesn't
> work, because it is highly unlikely that such a game would contain a 3-D
> array that is ragged in each dimension.)

In each dimension, no, but I have seen many in which the rows of locations
were not contiguous for each level, and where width and height could vary
from one level to another. So while it's true that adjacent rows are not
usually of different widths, it is quite common for adjacent rows to be
separate allocations: rows on different levels have different widths, so
foo[x][y] would not do the right thing if you tried to make them all one
big hunk.
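To illustrate the point, here is a minimal sketch (the `Dungeon` type and its member names are mine, purely hypothetical, not from any real game) of per-level storage where each level's rows are separate allocations with their own width and height:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch: each level owns its own rows, and levels may
// differ in both width and height, so the storage is many separate
// allocations rather than one contiguous block.
struct Dungeon {
    // level -> row -> column; every inner vector is its own allocation
    std::vector<std::vector<std::vector<int>>> cells;

    void add_level(std::size_t height, std::size_t width) {
        // one fresh allocation per row of the new level
        cells.emplace_back(height, std::vector<int>(width, 0));
    }

    int& at(std::size_t level, std::size_t row, std::size_t col) {
        return cells[level][row][col];
    }
};
```

Because level 0 might be 80x25 while level 1 is 40x60, there is no single width W or height H that makes a flat index `level*H*W + row*W + col` valid across levels, which is the sense in which foo[x][y] over one big hunk breaks.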

-s
--
Copyright 2010, all wrongs reversed. Peter Seebach / usenet-nospam(a)seebs.net
http://www.seebs.net/log/ <-- lawsuits, religion, and funny pictures
http://en.wikipedia.org/wiki/Fair_Game_(Scientology) <-- get educated!
From: Lie Ryan on
On 05/12/10 23:46, Nick Keighley wrote:
> On 12 May, 12:44, Richard Heathfield <r...(a)see.sig.invalid> wrote:
>> Nick Keighley wrote:
>>> On 12 May, 03:00, pete <pfil...(a)mindspring.com> wrote:
>>
>> <snip>
>>
>>>> I find SEME to be less objectionable in functions
>>>> which are small enough so that you can't look at one exit
>>>> without noticing all the rest of the exits.
>>
>>> is there any other sort of function?
>>
>> Hell yes!
>
> I once counted 11 pages (of blue stripey listing) between the "if" and
> its corresponding "else". And that was 11 pages of dense and tangled
> code. I think it had gotos as well. Wish I'd framed it.

Written by an imperative programmer who hasn't got accustomed to making
functions/methods/procedures/whatever?
From: Lie Ryan on
On 05/12/10 21:42, Ben Bacarisse wrote:
> Lie Ryan <lie.1296(a)gmail.com> writes:
>
>> On 05/11/10 09:56, Ben Bacarisse wrote:
>>> Seebs <usenet-nospam(a)seebs.net> writes:
>>>
>>>> On 2010-05-10, Ben Bacarisse <ben.usenet(a)bsb.me.uk> wrote:
>>> <snip>
>>>>> You can't assume that the data is contiguous or, to be a little more
>>>>> general, that a single [] operation means the same as three of them.
>>>>> This is, after all, C++ not C.
>>>>
>>>> In C++ it's at least conceivable that someone could come up with some
>>>> exceptionally sneaky overloading of [] to make it viable.
>>>
>>> That's why I said "you can't assume...". It is conceivable but it is
>>> not a sensible assumption for the poster[1] to make. The post was about
>>> a re-write to simplify some code. If that involves a "sneaky
>>> overloading of []" any argument about simplification is blown away.
>>
>> If it hides the complexity, then it could be argued that it's simpler,
>> at least from the outside.
>
> For the purposes of evaluating the complexity of a solution, you have to
> look at all the code. If a re-definition of [] is involved, then
> that is part of the solution and we never got to see it.

That is objective complexity.

> I said that the argument is blown away -- not necessarily the
> conclusion. All code can be simplified to "find_solution_to_problem()"
> (or some variation thereof) but that's not a sound argument.

If that function exists, then at that specific level of viewpoint, the
specific problem is trivial. In subjective complexity, the current level
of viewpoint is all that matters; the fact that one of the functions
being called is complex is considered irrelevant.

In short, if you break your code into many small, trivial functions, it
would generally score very well in terms of subjective complexity, even
though its objective complexity may actually be equal to (or even
slightly higher than) when the code is all in one place.
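A purely illustrative sketch of that point (function names are mine): the same computation written as one block and as a pair of trivially named helpers. Each helper is simple at its own level of viewpoint, even though the total amount of code is slightly larger:

```cpp
#include <vector>

// One-block version: all the logic is visible at once.
int range_inline(const std::vector<int>& v) {
    int lo = v[0], hi = v[0];
    for (int x : v) {
        if (x < lo) lo = x;
        if (x > hi) hi = x;
    }
    return hi - lo;
}

// Decomposed version: each function is trivial viewed on its own,
// though objectively there is a bit more code overall.
int smallest(const std::vector<int>& v) {
    int lo = v[0];
    for (int x : v) if (x < lo) lo = x;
    return lo;
}

int largest(const std::vector<int>& v) {
    int hi = v[0];
    for (int x : v) if (x > hi) hi = x;
    return hi;
}

int range_split(const std::vector<int>& v) {
    return largest(v) - smallest(v);
}
```

At the top level, `range_split` reads as trivial; the objective complexity has merely been pushed down into the helpers, not removed.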

>> C/C++ hides the complexity of assembly, who
>> would argue that?
>>
>> I think there are two separate definitions of complexity:
>
> Only two? :-)

correction: "at least two"
From: Nick Keighley on
On 12 May, 09:18, gwowen <gwo...(a)gmail.com> wrote:
> On May 11, 6:44 pm, Seebs <usenet-nos...(a)seebs.net> wrote:
> > On 2010-05-11, Richard Heathfield <r...(a)see.sig.invalid> wrote:
> > > Seebs wrote:

> > >>> You already have a list of foos, properly ordered for fast searching.
> > >> No, I don't.
> > > Then I suggest you get one.
>
> > It's often completely unsuitable to the problem at hand.
>
> If your problem is not amenable to Richard H's potted solutions, the
> fault lies with your problem, not Richard's solution.  You should know
> this by now.
>
> So if you have a disk-backed self-caching nested container storing a
> 2million * 2million * 2million high-resolution MRI scan image,
> scrupulously reconstructed using Radon transforms, and you're
> searching for pixels of a certain shade (that indicate a possible
> tumor, say), then the first thing you should do is convert it into a
> one-dimensional in-memory sorted C-array.

how long does the sort take? Presumably you're doing more than one
search.

> The fact that you won't be able
> to obtain the co-ordinates afterwards to display the appropriate slice
> to the physician is irrelevant.

well he might not agree

> Sort and binary search.  Any other
> solution is wrong.

I'm guessing you've actually done this

From: Dann Corbit on
In article <0d8a3e52-4885-4443-9e10-679d56ccd2d3
@a34g2000yqn.googlegroups.com>, nick_keighley_nospam(a)hotmail.com says...
>
> On 12 May, 09:18, gwowen <gwo...(a)gmail.com> wrote:
> > > On May 11, 6:44 pm, Seebs <usenet-nos...(a)seebs.net> wrote:
> > > On 2010-05-11, Richard Heathfield <r...(a)see.sig.invalid> wrote:
> > > > Seebs wrote:
>
> > > >>> You already have a list of foos, properly ordered for fast searching.
> > > >> No, I don't.
> > > > Then I suggest you get one.
> >
> > > It's often completely unsuitable to the problem at hand.
> >
> > If your problem is not amenable to Richard H's potted solutions, the
> > fault lies with your problem, not Richard's solution. You should know
> > this by now.
> >
> > So if you have a disk-backed self-caching nested container storing a
> > 2million * 2million * 2million high resolution MRI scan image,
> > scrupulously reconstructed using Radon transforms. You're searching
> > for pixels of a certain shade (that indicate a possible tumor, say),
> > then the first thing you should do is convert it into a one-
> > dimensional in-memory sorted C-array.
>
> how long does the sort take? Presumably you're doing more than one
> search.
>
> > The fact that you won't be able
> > to obtain the co-ordinates afterwards to display the appropriate slice
> > to the physician is irrelevant.
>
> well he might not agree
>
> >�Sort and binary search. �Any other
> > solution is wrong.
>
> I'm guessing you've actually done this

I hope not. It would be a travesty to perform a sort to locate something
easily found with a linear search.

Exactly N operations (where N is the pixel count) are needed to find
every pixel of a certain color (if K different shades are wanted, then
hash the K wanted values to find the desired hash signatures, and have K
buckets, discarding all unwanted hash values).
Sorting will take N*log(N) operations.
N = (2 * 10^6)^3 = 8 * 10^18 is a lot of operations and will take some
time. Increasing that to C * 43 * 8 * 10^18 {where C is necessarily
larger than 1, and 43 is roughly ln(8 * 10^18)} is not a good idea, as I
see it.
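The linear scan described above can be sketched in a few lines (the names `Pixel` and `find_shades` are illustrative, not from any real imaging library; a real scan of this size would stream from disk rather than hold a vector in memory):

```cpp
#include <cstdint>
#include <cstddef>
#include <unordered_set>
#include <vector>

// Hypothetical sketch: one pass over the N pixels, testing each against
// a hashed set of the K wanted shades -- N expected-O(1) lookups total,
// no sort required.
using Pixel = std::uint8_t;

std::vector<std::size_t> find_shades(const std::vector<Pixel>& image,
                                     const std::unordered_set<Pixel>& wanted) {
    std::vector<std::size_t> hits;
    for (std::size_t i = 0; i < image.size(); ++i)
        if (wanted.count(image[i]))  // expected O(1) per lookup
            hits.push_back(i);       // flat index i still maps back to (x,y,z)
    return hits;
}
```

Note that the flat index is preserved, so the coordinates of each hit can still be recovered (e.g. z = i / (W*H), and so on), which is exactly what a sorted copy of the pixel values throws away.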

BTW, I guess that sorting this image will be rather problematic because
it is going to be an EXTERNAL sort for sure (at least 8 EB if the pixels
can only hold 256 values, and one byte per pixel is doubtful for a full
resolution medical image). That means it will probably take weeks.

I shudder at the sorting suggestion, and I love to sort things.