From: Brian on
On Feb 14, 7:26 am, James Kanze <james.ka...(a)gmail.com> wrote:
> On Feb 13, 5:42 pm, Brian <c...(a)mailvault.com> wrote:
>
> > On Feb 13, 6:19 am, James Kanze <james.ka...(a)gmail.com> wrote:
> > > On 12 Feb, 22:37, Arved Sandstrom <dces...(a)hotmail.com> wrote:
> > > Logically, I think that most of the techniques necessary for
> > > making really high quality software would be difficult to apply
> > > in the context of a free development. And at least up to a
> > > point, they actually reduce the cost of development.
>
> [I really shouldn't have said "most" in the above. "Some"
> would be more appropriate, because there are a lot of
> techniques which can be applied to free development.]
>
> > I'm not sure what you are referring to, but one thing we
> > agree is important to software quality is code reviewing.
> > That can be done in a small company and I'm sometimes
> > given feedback on code in newsgroups and email.
>
> To be really effective, design and code review requires a
> physical meeting. Depending on the organization of the project,
> such physical meetings are more or less difficult.
>
> Code review is *not* just some other programmer happening to
> read your code by chance, and making some random comments on
> it. Code review involves discussion. Discussion works best
> face to face. (I've often wondered if you couldn't get similar
> results using teleconferencing and emacs's make-frame-on-display
> function, so that people at the remote site can edit with you.
> But I've never seen it even tried. And I note that where I
> work, we develop at two main sites, one in the US, and one in
> London, we make extensive use of teleconferencing, and the
> company still spends a fortune sending people from one site to
> the other, because even teleconferencing isn't as good as face
> to face.)


It hadn't really dawned on me that my approach might be
thought of like that. The rabbis teach that G-d controls
everything; there's no such thing as chance or coincidence.
The Bible says, "And we know that all things work together
for good to them that love G-d, to them who are the called
according to His purpose." Romans 8:28. I get a lot of
intelligent and useful discussion on gamedev.net, here and
on Boost. It's up to me though to sift through it and
decide how to use the feedback. I've incorporated at
least three suggestions mentioned on gamedev and quite a
few more from here. The latest gamedev suggestion was to
use variable-length integers in message headers -- say for
message lengths. I rejected that though as a redundant
step since I'm using bzip2 to compress the data. I
thought for a while that was the end of that, but then
remembered that there's a piece of data that isn't
compressed -- the length of the compressed data, which is
sent just ahead of the compressed data. So now, when
someone uses compression, the length of the compressed
data is generally also encoded in variable-length form
with the following: (I say generally because it depends
on the length of the data.)


// Returns the number of bytes that encode() will need for val --
// one byte per 7 bits of magnitude, up to 5 bytes for a uint32_t.
// (For a few boundary values this is an upper bound, because
// encode() subtracts one from the remaining value each round.)
uint8_t
CalculateIntMarshallingSize(uint32_t val)
{
    if (val < 128) {               // 2**7
        return 1;
    } else if (val < 16384) {      // 2**14
        return 2;
    } else if (val < 2097152) {    // 2**21
        return 3;
    } else if (val < 268435456) {  // 2**28
        return 4;
    } else {
        return 5;
    }
}


// Encodes an integer into a variable-length format: low-order
// 7-bit groups first, with the high bit set on every byte except
// the last. The remaining value is decremented after each group,
// so the decoder has to mirror that step.
void
encode(uint32_t N, unsigned char* addr)
{
    while (true) {
        uint8_t abyte = N & 127;
        N >>= 7;
        if (0 == N) {
            *addr = abyte;
            break;
        }
        abyte |= 128;
        *addr = abyte;
        ++addr;
        N -= 1;
    }
}


void
Flush()
{
    // Reserve room at the front of compressedBuf_ for the largest
    // possible encoded length, and compress into the rest of it.
    uint8_t maxBytes =
        CalculateIntMarshallingSize(compressedBufsize_);
    uint32_t writabledstlen = compressedBufsize_ - maxBytes;
    int bzrc = BZ2_bzBuffToBuffCompress(
        reinterpret_cast<char*>(compressedBuf_ + maxBytes),
        &writabledstlen,
        reinterpret_cast<char*>(buf_), index_,
        7, 0, 0);
    if (BZ_OK != bzrc) {
        throw failure("Buffer::Flush -- bzBuffToBuffCompress failed");
    }

    // Encode the compressed length directly in front of the
    // compressed data, then send header and data in one write.
    uint8_t actualBytes = CalculateIntMarshallingSize(writabledstlen);
    encode(writabledstlen, compressedBuf_ + (maxBytes - actualBytes));
    PersistentWrite(sock_, compressedBuf_ + (maxBytes - actualBytes),
                    actualBytes + writabledstlen);
    index_ = 0;
}


Those functions are from this file --
http://webEbenezer.net/misc/SendCompressedBuffer.hh.
compressedBuf_ is an unsigned char*. I've thought that the
calculation of maxBytes should be moved to the constructor,
but I have to update/improve the Resize() code first.
We've discussed the Receive function previously. I now have
a SendBuffer class and a SendCompressedBuffer class. This is
the SendCompressedBuffer version of Receive --

void
Receive(void const* data, uint32_t dlen)
{
    unsigned char const* d2 =
        reinterpret_cast<unsigned char const*>(data);
    // Fill the buffer, flushing (compressing and sending) whenever
    // it becomes full, then append whatever remains.
    while (dlen > bufsize_ - index_) {
        memcpy(buf_ + index_, d2, bufsize_ - index_);
        d2 += bufsize_ - index_;
        dlen -= bufsize_ - index_;
        index_ = bufsize_;
        Flush();
    }

    memcpy(buf_ + index_, d2, dlen);
    index_ += dlen;
}
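
The receiving side has to decode that length header before it
knows how many compressed bytes to read. Something along these
lines could serve as the counterpart to encode() -- this is just
a sketch, it isn't in SendCompressedBuffer.hh and the name is
arbitrary -- note that it has to add back the one that encode()
subtracts after each 7-bit group:


// Sketch only -- not part of SendCompressedBuffer.hh.
// Decodes an integer written by encode(): reads 7-bit groups,
// low-order first, until it finds a byte with the high bit clear,
// adding back the one subtracted for each continued group.
uint32_t
decode(unsigned char const* addr, uint8_t* bytesUsed)
{
    uint32_t value = *addr & 127;
    uint32_t shift = 0;
    uint8_t count = 1;
    while (*addr & 128) {
        ++addr;
        ++count;
        shift += 7;
        value += (static_cast<uint32_t>(*addr & 127) + 1) << shift;
    }
    if (bytesUsed) {
        *bytesUsed = count;
    }
    return value;
}


Over a socket the receiver would read the header a byte at a
time until it sees a byte with the high bit clear, then read
that many compressed bytes and hand them to
BZ2_bzBuffToBuffDecompress.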



>
> > > So theoretically, the quality of commercial software should
> > > be considerably higher than that of free software.
> > > Practically, when I actually check things out... g++ is one
> > > of the better C++ compilers available, better than Sun CC or
> > > VC++, for example.
> > Maybe now that Sun CC and VC++ are free they'll improve. :)
>
> I doubt it. Making something free doesn't change your
> development process. (On the other hand, if it increases the
> number of users, and thus your user feedback, it may help. But
> I don't think any quality problems with VC++ can be attributed
> to a lack of users.)

I think it changes the development process. If it doesn't,
then they probably haven't thought much about the implications
of making it free. They are in a battle of perception. Many
people have long thought of Microsoft as a greedy company that
makes mediocre products. Giving away some software goes against
their nature, but I think they do it to help improve their
image. They are being forced into what would have been
unthinkable 25 years ago. I don't really think it will
radically improve their products either, though. As I've
indicated, I don't think they've come to this decision because
they've had a change of heart; it's more of a necessity being
imposed upon them. However, as I often say -- better late
than never.



Brian Wood
http://webEbenezer.net
(651) 251-938
From: James Kanze on
On Feb 14, 4:45 pm, Seebs <usenet-nos...(a)seebs.net> wrote:
> On 2010-02-14, James Kanze <james.ka...(a)gmail.com> wrote:

> > Really. I've not seen any free software which adopted all
> > of the best practices.

> Bespoke software may. But go to a store that sells discs in
> boxes, and tell me with a straight face that any of those
> boxes contain software developed through a development
> operation which adopted all of the best practices.

I've already stated that most commercial organizations aren't
doing a very good job either. There's a big difference between
what is feasible, and what is actually done.

[...]
> > First, free software doesn't have the highest quality. When
> > quality is really, really important (in critical systems), you
> > won't see any free software.

> I'm not totally sure of this.

I am. If only because such projects require a larger degree of
accountability than free software can offer. I can't see anyone
providing free software with contractual penalties for downtime;
most of the software I worked on in the 1990's had such
penalties.

--
James Kanze
From: James Kanze on
On Feb 14, 4:54 pm, Lew <no...(a)lewscanon.com> wrote:
> James Kanze wrote:
> >> Did you actually try using any free software back in the early
> >> 1990's [sic]?
> Seebs wrote:
> > I did.

> Same here.

> > NetBSD was for the most part reliable and bulletproof during
> > that time; it ran rings around several commercial Unixes. I
> > had no interest in g++; so far as I could tell, at that
> > time, "a C++ compiler" was intrinsically unusable. But gcc
> > was stable enough to build systems that worked reliably, and
> > the BSD kernel and userspace were pretty livable.
> James Kanze wrote:
> >> Neither Linux nor g++ were even usable, and emacs (by

> That's pure fantasy.

> I used a couple of Linux distributions in the early nineties,
> and they worked better than commercial UNIX variants.

And I tried to use them, and they just didn't stop crashing.
Even today, Linux is only gradually approaching the level of the
Unixes back then.

> I used emacs and knew many who used vi back then. They were
> solid.

I used vi back then. It didn't have many features, but it was
solid. It was also a commercial product. Emacs depended on the
version. Some worked, some didn't.

> I used gcc by the mid-90s and it was rock solid, too.

G++ was a joke. A real joke until the mid-1990's. It was usual
to find more bugs in the compiler than in freshly written code.

> I used free software even as far back as the late 80s that
> worked beautifully.

> The facts to back up your assertions are not in evidence.

They are for anyone who is open and honest about it. I did
compiler evaluations back then, so I know pretty well what I'm
talking about. We measured the differences.

--
James Kanze
From: James Kanze on
On Feb 14, 4:56 pm, Seebs <usenet-nos...(a)seebs.net> wrote:
> On 2010-02-14, James Kanze <james.ka...(a)gmail.com> wrote:

> > To be really effective, design and code review requires a
> > physical meeting. Depending on the organization of the project,
> > such physical meetings are more or less difficult.

> Nonsense.

The more channels you have available, the better communication
works.

> > Code review is *not* just some other programmer happening to
> > read your code by chance, and making some random comments on
> > it. Code review involves discussion. Discussion works best
> > face to face.

> IMHO, this is not generally true. Of course, I'm autistic, so
> I'd naturally think that.

There are probably some special exceptions, but other people's
expressions and gestures are a vital part of communication.

Not to mention the informal communication which occurs when you
meet at the coffee pot. I've worked from home, and in the end,
I was frustrated by it because I was missing so much of the
informal exchange that makes things go.

> But I've been watching a lot of code reviews (our review
> process has named reviewers, but also has reviews floating
> about on a list in case anyone else sees something of
> interest, which occasionally catches stuff). And what I've
> seen is that a whole lot of review depends on being able to
> spend an hour or two studying something, or possibly longer,
> and write detailed analysis -- and that kind of thing is
> HEAVILY discouraged for most people by a face-to-face meeting,
> because they can't handle dead air.

That sort of thing is essential for any review. You do it
before the face-to-face meeting. But the reviewer isn't God,
either; the purpose of the meeting is to discuss the issues, not
to say that the coder did it wrong.

> Certainly, discussion is essential to an effective review.
> But discussion without the benefit of the ability to spend
> substantial time structuring and organizing your thoughts will
> feel more effective but actually be less effective, because
> you're substituting primate instincts for reasoned analysis.

> I really don't think that one can be beaten. If what you need
> for a code review is for someone to spend hours (or possibly
> days) studying some code and writing up comments, then trying
> to do it in a face-to-face meeting would be crippling. Once
> you've got the comments, you could probably do them
> face-to-face, but again, that denies you the time to think
> over what you've been told, check it carefully, and so on.
> You want a medium where words sit there untouched by the
> vagaries of memory so you can go back over them.

> But!

> You do need people who are willing and able to have real
> discussions via text media. That's a learned skill, and not
> everyone's learned it.

> It is not universally true that discussion "works best face to
> face".

Almost universally. Ask any psychologist. We communicate
through many different channels.

--
James Kanze
From: Seebs on
On 2010-02-16, James Kanze <james.kanze(a)gmail.com> wrote:
> And I tried to use them, and they just didn't stop crashing.
> Even today, Linux is only gradually approaching the level of the
> Unixes back then.

I guess it depends on which unixes, and which Linux. When I went from
SVR4 Unix to NetBSD, though, I had a LOT less downtime.

> I used vi back then. It didn't have many features, but it was
> solid. It was also a commercial product. Emacs depended on the
> version. Some worked, some didn't.

The version I used (nvi) was nearly-rock-solid. Which is to say, I
found and reported a bug and it was fixed within a day. And I've been
using the same version of nvi that I was using in 1994 ever since, and
I have not encountered a single bug in >15 years.

>> I used gcc by the mid-90s and it was rock solid, too.

> G++ was a joke. A real joke until the mid-1990's. It was usual
> to find more bugs in the compiler than in freshly written code.

I said gcc, not g++. And while it certainly has bugs, so does
every other compiler I've used. I had less trouble with gcc than
with Sun cc. I used a commercial SVR4 which switched to gcc
because it was noticeably more reliable than the SVR4 cc.

> They are for anyone who is open and honest about it. I did
> compiler evaluations back then, so I know pretty well what I'm
> talking about. We measured the differences.

I do not think it is likely that implying that anyone who disagrees
with you is being dishonest will lead to productive discussion. My
experiences with free software were apparently different from yours --
or perhaps my experiences with commercial software were different.

Whatever the cause, the net result is that by the mid-90s, I had a strong
preference for free tools and operating systems, because they had
consistently been more reliable for me.

-s
--
Copyright 2010, all wrongs reversed. Peter Seebach / usenet-nospam(a)seebs.net
http://www.seebs.net/log/ <-- lawsuits, religion, and funny pictures
http://en.wikipedia.org/wiki/Fair_Game_(Scientology) <-- get educated!