From: Pascal J. Bourguignon on
Patricia Shanahan <pats(a)acm.org> writes:

> BGB / cr88192 wrote:
>> "Patricia Shanahan" <pats(a)acm.org> wrote in message
>> news:gPqdncAsILnoyoDWnZ2dnUVZ_tudnZ2d(a)earthlink.com...
>>> BGB / cr88192 wrote:
>>> ...
>>>> for example, pointer arithmetic, ... is just another part of the job...
>>> This, to my mind, is the very worst thing about C.
>>>
>>> I have seen far too many hard-to-debug operating system bugs due to
>>> widespread pointer arithmetic. It would be far better if C pointer
>>> arithmetic were constrained to conform to the standard. That way, when a
>>> data structure gets clobbered the code doing the clobbering would have
>>> to be something related to the data structure.
>>>
>> but it is also essential for pulling off many of the things C is able
>> to do; otherwise, people would have to resort to assembler for these
>> things...
>
> Option 1: Maintain hundreds of thousands of lines of source code any one
> of which could corrupt any data structure.
>
> Option 2: Maintain hundreds of thousands of lines of source code with
> limited, controlled access to data structures, and a few assembly
> language functions to do specific jobs such as turning the address
> calculated by a memory allocator into a pointer.
>
> I have spent plenty of time coping with option 1 in UNIX-derived
> operating systems. I can't help thinking that option 2 might be easier.

That's what I think too. I prefer a bigger spread between the two
programming languages used in a single project, rather than a single
middle-ground programming language that's good at neither assembly-level
nor high-level programming.

--
__Pascal Bourguignon__
From: Bill Cunningham on

"Pascal J. Bourguignon" <pjb(a)informatimago.com> wrote in message
news:87aaxuegia.fsf(a)galatea.local...
> "[Jongware]" <sorry(a)no.spam.net> writes:
>
>> Bill Cunningham wrote:
>>> One reason why I am so attracted to C and not just markup
>>> languages, scripts, and Java is that C is for designing OSes. [...]
>>
>> Having recently converted to Mac OS -- how does Objective-C fit in this?
>> From what I've seen it gives the programmer back what he loses with
>> plain C -- an abstraction away from low-level stuff ("how do I create
>> a new window for my app, insert it in the Windows menu, and allow
>> tabbing through that set of windows?").
>> Its KVC/KVO typing is so powerful that I shudder to introduce my own
>> sloppy programming style :-)
>
> Indeed. Basically, Objective-C = C + Smalltalk.
>
> When you program in Objective-C, you will most of the time work at the
> level of the Smalltalk side, dealing only with objects and sending
> messages around. This object system is entirely dynamic: you can
> create or load new methods at run-time, you can create new classes at
> run-time, all dispatch is dynamic, and you can catch unknown messages to
> process them as you want (i.e., you can trivially implement proxies,
> etc.).
>
> But you still have C at your fingertips at all times, so if you
> need to implement a fast loop, you can fall back to C data structures
> and C functions.
>
> So, compared to C, or to C++, Objective-C is a very nice OO language.
>
> Even more so nowadays, as Apple provides frameworks (GUI libraries)
> compatible with a garbage collector, so that you can activate the
> garbage collector instead of doing the (semi-)manual (refcount-based)
> memory management of OpenStep.
>
>
> It is to be noted that NeXT Computer Inc. implemented drivers in
> NeXTstep in Objective-C, publishing the IOKit, which allowed mere
> programmers to write NeXTstep drivers easily (e.g. subclassing an
> existing generic driver to implement a device-specific one). I haven't
> followed the evolution under Apple closely; it seems that they have
> dropped the IOKit. (They dropped a lot of goodies from NeXT, and have
> slowly put only a few of them back in the successive versions of
> MacOSX.)
>
> But once upon a time I wrote a video chip driver with the IOKit in
> Objective-C.
>
> It is to be noted that the significant Objective-C runtime didn't
> prevent this code from running in kernel space...

I have read that the writer (whom I don't know) is the guy who has
since written Ruby. I know nothing about Ruby, but was it designed for
writing OSes? Can an OS be written in markup languages and scripts using
web design tools?

Bill


From: BGB / cr88192 on

"Pascal J. Bourguignon" <pjb(a)informatimago.com> wrote in message
news:87aaxuegia.fsf(a)galatea.local...
> "[Jongware]" <sorry(a)no.spam.net> writes:
>
>> Bill Cunningham wrote:
>>> One reason why I am so attracted to C and not just markup
>>> languages, scripts, and Java is that C is for designing OSes. [...]
>>
>> Having recently converted to Mac OS -- how does Objective-C fit in this?
>> From what I've seen it gives the programmer back what he loses with
>> plain C -- an abstraction away from low-level stuff ("how do I create
>> a new window for my app, insert it in the Windows menu, and allow
>> tabbing through that set of windows?").
>> Its KVC/KVO typing is so powerful that I shudder to introduce my own
>> sloppy programming style :-)
>
> Indeed. Basically, Objective-C = C + Smalltalk.
>

this is both a good point and a bad point...

they also mix them at the syntax level, which IMO is not so good for one's
aesthetic sensibilities.

it is very possible that the look of Obj-C is one of its major hindrances:
it just does not look like C++ and Java and friends...


> When you program in Objective-C, you will most of the time work at the
> level of the Smalltalk side, dealing only with objects and sending
> messages around. This object system is entirely dynamic: you can
> create or load new methods at run-time, you can create new classes at
> run-time, all dispatch is dynamic, and you can catch unknown messages to
> process them as you want (i.e., you can trivially implement proxies,
> etc.).
>
> But you still have C at your fingertips at all times, so if you
> need to implement a fast loop, you can fall back to C data structures
> and C functions.
>
> So, compared to C, or to C++, Objective-C is a very nice OO language.
>

in general, agreed, although granted I have not used it much, for other
reasons.
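
to make the "catch unknown messages" mechanism above concrete in C terms:
the runtime amounts to a per-class table of named methods plus a fallback
handler. a minimal sketch (the layout and names here are invented for
illustration, and are not the actual Obj-C runtime):

    #include <stdio.h>
    #include <string.h>

    typedef void (*method_fn)(void *self);

    struct method { const char *name; method_fn fn; };

    struct class_info {
        struct method *methods;
        int nmethods;
        /* fallback invoked when no method matches the selector */
        void (*forward)(void *self, const char *name);
    };

    struct object { struct class_info *isa; };

    /* dynamic dispatch: the selector is looked up by name at run-time */
    static void send(struct object *obj, const char *name)
    {
        struct class_info *c = obj->isa;
        for (int i = 0; i < c->nmethods; i++) {
            if (strcmp(c->methods[i].name, name) == 0) {
                c->methods[i].fn(obj);
                return;
            }
        }
        c->forward(obj, name);  /* unknown message: the proxy hook */
    }

    static void greet(void *self) { (void)self; puts("hello"); }

    static void forward_all(void *self, const char *name)
    {
        (void)self;
        printf("forwarding unknown message '%s'\n", name);
    }

    static struct method my_methods[] = { { "greet", greet } };
    static struct class_info my_class = { my_methods, 1, forward_all };

    int main(void)
    {
        struct object obj = { &my_class };
        send(&obj, "greet");   /* found: prints "hello" */
        send(&obj, "resize");  /* not found: goes to forward_all */
        return 0;
    }

a real runtime caches lookups and handles arguments and return values, but
this is the shape of the thing.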


> Even more so nowadays, as Apple provides frameworks (GUI libraries)
> compatible with a garbage collector, so that you can activate the
> garbage collector instead of doing the (semi-)manual (refcount-based)
> memory management of OpenStep.
>

GC has good and bad points, and the same goes for ref-counts.

for my project I use a MM/GC that allows several different "styles":
- pure-manual (via gcmalloc/gcfree), where gcmalloc still allows the
  conservative GC to trace through these objects;
- hybrid manual/conservative GC, which is my main mode, where objects may
  be allocated and freed, or left for the GC to reclaim later (usually if
  the lifespan is non-obvious);
- precise GC mode, which thus far has gone largely unused (since it would
  essentially "split" the heap), and in retrospect was a misguided feature;
- conservative+refcounts, where ref-counts may be used on objects
  specifically allocated to be ref-counted (for non-refcount objects, the
  gcincref and gcdecref functions are ignored).

thus far, I have not used the refcount mode much, since it is rare that I
write code where I can ensure the objects are used in a refcount-safe
manner.
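
roughly, usage looks like this (note: only the four function names are the
ones mentioned above; the signatures are my guesses, and the stubs are just
so the sketch compiles standalone):

    #include <stdlib.h>

    /* stand-ins for the MM/GC entry points described above */
    static void *gcmalloc(size_t n) { return malloc(n); }
    static void  gcfree(void *p)    { free(p); }
    static void  gcincref(void *p)  { (void)p; /* no-op on non-refcounted */ }
    static void  gcdecref(void *p)  { (void)p; /* no-op on non-refcounted */ }

    int main(void)
    {
        /* pure-manual style: allocated and freed explicitly, but the
           conservative GC can still trace through the object */
        char *buf = gcmalloc(256);
        gcfree(buf);

        /* hybrid style: lifespan non-obvious, so just drop the
           reference and let the collector reclaim it later */
        void *tmp = gcmalloc(64);
        (void)tmp;

        /* refcount style: on objects allocated as ref-counted these
           adjust the count; on anything else they are ignored */
        void *shared = gcmalloc(128);
        gcincref(shared);
        gcdecref(shared);
        return 0;
    }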


technically, the GC uses concurrent mark/sweep, with the GC running in
its own thread.
the main weak point is that this was not integrated very well, and so it
is not always reliable (mostly because I was using a software
write-barrier, and not all code consistently uses the write-barrier
function when setting references...).

from what I remember, I made it so that an object has to go unseen for 2
GC passes before it is reclaimed, though...

in general, though, it works...
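
for illustration, the write-barrier amounts to something like this (the
names here are hypothetical, not my actual API):

    #include <stddef.h>

    /* collector hook: re-marks an object so a concurrent mark phase
       does not miss a reference stored while it is running (stubbed) */
    static void gc_mark_gray(void *obj) { (void)obj; }

    /* every store of a heap reference is supposed to go through this */
    static void gcsetref(void **slot, void *value)
    {
        if (value != NULL)
            gc_mark_gray(value);
        *slot = value;
    }

    struct node { struct node *next; };

    static void link_nodes(struct node *a, struct node *b)
    {
        gcsetref((void **)&a->next, b);  /* safe: the GC sees this store */
        /* "a->next = b;" is the failure mode described above: a plain
           store the concurrent collector may never observe */
    }

    int main(void)
    {
        struct node x = { NULL }, y = { NULL };
        link_nodes(&x, &y);
        return 0;
    }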


>
> It is to be noted that NeXT Computer Inc. implemented drivers in
> NeXTstep in Objective-C, publishing the IOKit, which allowed mere
> programmers to write NeXTstep drivers easily (e.g. subclassing an
> existing generic driver to implement a device-specific one). I haven't
> followed the evolution under Apple closely; it seems that they have
> dropped the IOKit. (They dropped a lot of goodies from NeXT, and have
> slowly put only a few of them back in the successive versions of
> MacOSX.)
>
> But once upon a time I wrote a video chip driver with the IOKit in
> Objective-C.
>
> It is to be noted that the significant Objective-C runtime didn't
> prevent this code from running in kernel space...
>

granted, I have not personally looked much into how ObjC works under the
hood, but possibly they included the basic runtime facilities in the
kernel.



From: bartc on

"Pascal J. Bourguignon" <pjb(a)informatimago.com> wrote in message
news:87r5r6cunp.fsf(a)galatea.local...
> Patricia Shanahan <pats(a)acm.org> writes:

>> Option 1: Maintain hundreds of thousands of lines of source code any one
>> of which could corrupt any data structure.
>>
>> Option 2: Maintain hundreds of thousands of lines of source code with
>> limited, controlled access to data structures, and a few assembly
>> language functions to do specific jobs such as turning the address
>> calculated by a memory allocator into a pointer.
>>
>> I have spent plenty of time coping with option 1 in UNIX-derived
>> operating systems. I can't help thinking that option 2 might be easier.
>
> That's what I think too. I prefer a bigger spread between the two
> programming languages used in a single project, rather than a single
> middle-ground programming language that's good at neither assembly-level
> nor high-level programming.

That would be my approach too.

I'd use a 'hard' language of my own, one comfortable with inline (and only
inline) assembler.

And a 'soft' language of my own too, to do the bulk of the work.

(And, for good measure, a scripting language of my own to use with a
command/terminal window.)

But in reality an OS would require cooperation with all sorts of other
languages and software and stuff written by other people. And do we get to
design the OS, or implement someone else's? It sounds like a messy project
in any case.

--
Bartc

From: Pascal J. Bourguignon on
"BGB / cr88192" <cr88192(a)hotmail.com> writes:

> "Pascal J. Bourguignon" <pjb(a)informatimago.com> wrote in message
> news:87fx7nf6hu.fsf(a)galatea.local...
>> "BGB / cr88192" <cr88192(a)hotmail.com> writes:
>>
>>>> That C is not better than high level programming languages for system
>>>> programming since it needs the same extensions to be effective.
>>>>
>>>
>>> not really, since most of these "extensions" are part of the core
>>> language by default, and so are not seen as extensions by those using
>>> the language.
>>>
>>> for example, pointer arithmetic, ... is just another part of the job...
>>
>> This is not a discriminating example. I've been doing pointer
>> arithmetic with languages such as Pascal or Common Lisp with no
>> problem.
>>
>
> Pascal has pointers, yes...
>
> but, I am not debating Pascal (FWIW, most Pascals fall into a similar
> "implementation bracket" to C and C++, in most regards).
>
> the main problem with Pascal is that one would be unlikely to gain much
> real community or industry support when using such a language in a
> project.

On the other hand, Apple has been quite successful with an OS and
Toolbox programmed in Pascal (and assembler), and earned enough money
with it to buy TWO other OSes: Taligent (a failure, written in C++) and
NeXT Computer Inc. (a success, written in C and Objective-C; nowadays
they're adding code written in Java, Python, and Scheme).

> granted, the issue is not strictly with CL, only that CL has its own issues
> which I doubt will ever gain it widespread acceptance...

By the way, since you're mentioning the popularity question, my personal
bet is that Lisp, being a language that has existed for 50 years, will
go on existing for at least 50 more.

Other programming languages created more recently are already
forgotten, or superseded by even more recent programming languages. I
prefer to spend my time programming in a stable environment, rather
than running an arms race with "language designers".



> but, code needs to run in order to bootstrap itself in the
> first place...
>
> unless you are proposing to use an HLL just to write a compiler to
> trans-compile itself to C or ASM in order to get the thing up and
> running; but then it is hard to see the "achievement" in this (if the
> whole point is to use the HLL in all its HLL glory for something like
> kernel development...).

Yes, that's what I propose. That's what is successfully done in
several projects, such as Squeak. Metaprogramming is so easy in
programming languages such as Lisp or Haskell that it is a
competitive way of proceeding. One advantage, when you're after
efficiency, is that you're not limited by the C compiler's constraints,
and you can generate more efficient specialized binary code if you
need to.



> so, what do we have then:
> an OS written in a mix of C, ASM, and Java?...
> AFAIK, this is fairly close to what is used on a lot of cell phones...
>
> but, there is no real achievement here, since in this case C has not
> had its usefulness "disproved"...

You do not need to generate C or even "assembler" code! You directly
compile the HLL to binary, and boot that. You generate the kernel
image from the HLL.
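
As a toy illustration of generating binary directly (sketched in C for
this group's sake, though the point is that the generator would itself be
written in the HLL): a host program can emit a bootable 512-byte x86
image with no compiler or assembler behind the generated bytes. The two
instruction bytes (EB FE, an infinite "jmp $" loop) and the 55 AA boot
signature are standard real-mode facts.

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        unsigned char sector[512];
        memset(sector, 0, sizeof sector);
        sector[0]   = 0xEB;   /* jmp rel8           */
        sector[1]   = 0xFE;   /* -2: back to itself */
        sector[510] = 0x55;   /* boot signature     */
        sector[511] = 0xAA;
        FILE *f = fopen("boot.img", "wb");
        if (f == NULL) return 1;
        fwrite(sector, 1, sizeof sector, f);
        fclose(f);
        return 0;
    }

A real system generates the whole kernel image the same way, just with a
compiler behind it instead of hand-picked bytes.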


> JVMs are typically themselves written in large part in C; it is just
> that much of the non-core machinery is Java...

There's no need to; if you don't have a C compiler available, you can
generate the binary directly.

When you are developing a kernel, you're not in the same situation as
when you're developing an "application" running over a set of
different systems.

> so, the JVM is typically some glob of support machinery:
> the GC, object system, classloader, ...
>
> then the whole rest of the island is Java...
>
> as for the general "Unix-like architecture", this is not likely to go away
> any time soon, as it is currently the most practical architecture on modern
> CPUs (they fit almost hand-in-glove).

Yes, this is a chicken-and-egg situation: unix/C evolves to match the
processors, and the processors evolve to match unix/C.

But this is a technological cul-de-sac. We've been going in circles
since 1969; we're trapped, as in The Matrix.

That's why we need new kinds of systems, which hopefully the users
will like, so that they get some traction and Intel & AMD start
designing their processors with the new needs in mind.


>> Are we discussing popularity? I thought we were on a technical newsgroup.
>>
>
> a programming language's life or death is its popularity...

Then the second most popular programming language is Lisp (the first
is Fortran).


> otherwise, they fall off into the land of essential non-existence...

Lisp, 50 years and counting...


--
__Pascal Bourguignon__