From: jmfbahciv on
In article <dqhet218geimesbosiht3bequ5s09jtmmg(a)4ax.com>,
MassiveProng <MassiveProng(a)thebarattheendoftheuniverse.org> wrote:
>On Sat, 17 Feb 2007 16:56:11 +0000 (UTC), kensmith(a)green.rahul.net
>(Ken Smith) Gave us:
>
>>In article <er6rf6$8ss_002(a)s994.apx1.sbo.ma.dialup.rcn.com>,
>> <jmfbahciv(a)aol.com> wrote:
>>>In article <d1ict2h4c5s3m3e5unsu4aagl2fpj0s49n(a)4ax.com>,
>>> MassiveProng <MassiveProng(a)thebarattheendoftheuniverse.org> wrote:
>>>>On Fri, 16 Feb 07 12:25:03 GMT, jmfbahciv(a)aol.com Gave us:
>>>>
>>>>>> Right now I also have LTSpice running on another
>>>>>>desktop. I'm typing this while it figures.
>>>>>
>>>>>My point is that you should not have to have another computer _system_
>>>>>to do any other task.
>>>>
>>>>
>>>> He said on another desktop. If you had any modern brains, you would
>>>>know that that IS the SAME computer, that has multiple "desktops". A
>>>>feature of Linux GUIs.
>>>
>>>IF that is true, the renaming of this term is going to cause a lot
>>>of problems.
>>
>>It may have been better if a new term was invented. All the existing
>>terms had meanings:
>>
>>"another screen" is bad because many Linux systems have more than one
>>screen
>>
>>"another virtual screen" is bad because many Linux systems have a non
>>graphics virtual screen along with the graphics one. You can configure
>>for only one "desk top" and still have "another virtual screen". Also
>>the "virtual screen" may be larger than the physical hardware screen.
>>
>>"another window" won't do because the term window is used for a part of
>>what is on the screen.
>>
> DESKTOP was always the right term. It refers to the PC's desktop,
>not another physical machine's desktop, and she is an idiot to think
>it would.

Desktop is a fairly new computing term; it originally referred to the
physical dimensions of the computer you were using.
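
For what it's worth, the usage the others have in mind can be seen
directly on a Linux box: an EWMH-compliant window manager advertises
the number of virtual desktops it is providing, on one machine and one
display, as properties of the X root window. A minimal sketch, assuming
Python, the stock xprop utility, and a window manager that actually sets
these properties (KDE, GNOME, Xfce and the like do):

    # Ask the window manager how many virtual desktops this single
    # X display is providing, and which one is current.
    import subprocess

    def root_property(name):
        # xprop prints e.g. "_NET_NUMBER_OF_DESKTOPS(CARDINAL) = 4"
        out = subprocess.check_output(["xprop", "-root", name]).decode()
        return int(out.split("=")[1].strip())

    if __name__ == "__main__":
        total = root_property("_NET_NUMBER_OF_DESKTOPS")
        current = root_property("_NET_CURRENT_DESKTOP")
        print("one machine, one display, %d desktops; currently on #%d"
              % (total, current))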

/BAH
From: jmfbahciv on
In article <87mz3csv1x.fsf(a)nonospaz.fatphil.org>,
Phil Carmody <thefatphil_demunged(a)yahoo.co.uk> wrote:
>jmfbahciv(a)aol.com writes:
>> In article <87fy94udes.fsf(a)nonospaz.fatphil.org>,
>> Phil Carmody <thefatphil_demunged(a)yahoo.co.uk> wrote:
>> >jmfbahciv(a)aol.com writes:
>> >> In article <aaict21nu9t1faaiodh912qu7en2240379(a)4ax.com>,
>> >> MassiveProng <MassiveProng(a)thebarattheendoftheuniverse.org> wrote:
>> >> >On Fri, 16 Feb 07 12:25:03 GMT, jmfbahciv(a)aol.com Gave us:
>> >> >
>> >> >> Other
>> >> >>than instrumentation, there usually isn't any computing task that
>> >> >>has to have the CPU pay attention to it *right now*.
>> >> >
>> >> >
>> >> > A/V stream decoding does. Hell, even MP3 stream decoding does.
>> >> >
>> >> > When I watch Lost episodes on ABC.com, those streams get a LOT of
>> >> >CPU time slices simply because the stream MUST be processed
>> >> >continually.
>> >>
>> >> Son, it is time you learned about buffered mode I/O.
>> >
>> >Idiot. Presume the stream is all buffered in memory - how does that
>> >affect the fact that the processor must constantly be throwing
>> >up frame after frame to the screen?
>>
>> The CPU isn't doing that work. That's what the video card
>> does.
>
>So the CPU doesn't do bistream parsing, DCTs, deblocking,
>interpolation, etc?

I wouldn't put that work on the CPU. The video card should be
doing it, since not all video cards are the same. But that would
require a standard, which is apparently verboten in certain areas
of this biz.
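
Whether a given box is really doing the DCTs and deblocking on the CPU
or on the card is easy enough to check by sampling the player process's
own CPU time while a clip plays. A rough sketch, Linux-only, assuming
Python and that you supply the player's PID yourself; if the decode is
in software the number tracks the playback closely, and if the card is
doing the heavy lifting it barely moves:

    import os, sys, time

    def cpu_seconds(pid):
        with open("/proc/%d/stat" % pid) as f:
            data = f.read()
        # the comm field can contain spaces, so split after the last ")"
        fields = data.rsplit(")", 1)[1].split()
        utime, stime = int(fields[11]), int(fields[12])   # clock ticks
        return (utime + stime) / float(os.sysconf("SC_CLK_TCK"))

    if __name__ == "__main__":
        pid = int(sys.argv[1])        # PID of the running player
        before = cpu_seconds(pid)
        time.sleep(10)                # let it decode for ten seconds
        after = cpu_seconds(pid)
        print("player burned %.1f CPU seconds in 10 wall seconds"
              % (after - before))
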
>
>> > It doesn't. So the buffering
>> >or otherwise is irrelevant. You're completely hatstand.
>>
>> Why does the CPU have to be latched with the video card painting?
>> Not even your computer games work this way. The CPU does not
>> say throw this pixel at that TTY x,y address and then get back to me
>> when you have lit it.
>
>Was that supposed to make sense?

yes.
>
>> >Sure, a reasonably capable processor will only spend a fraction
>> >of the time doing the decoding/filtering/scaling/whatever, but
>> >for that timeslice, it's working on something that must be
>> >processed in real time.
>>
>> Why real time?
>
>Because watching vids is a real time process. Sheesh.

No, it is not a real-time computing application. It is a
sequential task. It doesn't matter how long the movie
takes to get to your screen; all that matters is that it's
displayed sequentially.
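
A toy model of that distinction, for the record: the network side fills
a buffer at whatever rate it manages, the display side drains it at a
fixed frame rate, and so long as the buffer never runs dry it doesn't
matter when any particular frame arrived, only that they come out in
order. (If delivery falls behind for long enough the buffer does run
dry and you get skipping, so the deadline is relaxed, not abolished.)
This is only a sketch; the rates and delays are invented:

    import queue, threading, time, random

    frames = queue.Queue(maxsize=100)     # the playback buffer
    FPS = 25

    def network_side():
        for n in range(200):
            time.sleep(random.uniform(0.0, 0.08))  # bursty delivery
            frames.put(n)                 # blocks if the buffer is full
        frames.put(None)                  # end of stream

    def display_side():
        while True:
            n = frames.get()              # blocks if the buffer is empty
            if n is None:
                break
            time.sleep(1.0 / FPS)         # paced, strictly in order
            print("displayed frame %d (buffer depth %d)"
                  % (n, frames.qsize()))

    threading.Thread(target=network_side).start()
    display_side()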

>
>> The CPU is sitting idle most of time.
>
>That's what I said.
>
>> The idle
>> time can be used for other stuff.
>
>I am fully aware of this.
>
>However, when there is a new field for processing, it needs to
>be processed, it can't be ignored without loss of quality.
>
>> This is not a new concept; it's
>> been around since females had to cook, rear kids, and entertain
>> the males so they would stick around for a while.
>
>Females do not have to do that.

You have a lot to learn.

/BAH
From: jmfbahciv on
In article <cj2ka4-ise.ln1(a)sirius.tg00suus7038.net>,
The Ghost In The Machine <ewill(a)sirius.tg00suus7038.net> wrote:
>In sci.physics, MassiveProng
><MassiveProng(a)thebarattheendoftheuniverse.org>
> wrote
>on Sat, 17 Feb 2007 09:59:49 -0800
><tiget2h5auga6jl46gn46oisadv8ckr322(a)4ax.com>:
>> On Sat, 17 Feb 07 14:08:30 GMT, jmfbahciv(a)aol.com Gave us:
>>
>>>The CPU isn't doing that work. That's what the video card
>>>does.
>>
>>
> WRONG. The CPU is what the video playback applets run on, and THAT
>is 100% CPU intensive for EACH AND EVERY FRAME of video PASSED to the
>video card.
>
>http://www.youtube.com/watch?v=sHzdsFiBbFc
>
>is what I used for metrics. CPU utilization appears to be about 50%
>according to my CPU monitor. (Athlon XP 1600+, 1.4 GHz, 512 MB,
>BT5500 RV250-based video system; OS: Linux 2.6.20, Gentoo 2006.1;
>DSL line incoming.) No skipping was noted on this particular video
>during the initial stream, and playback was possible without network I/O.
>Note that this was in "tinyscreen mode".
>
>(This video is safe for work: "Spiders On Drugs".)
>
>Another test case
>
>http://www.youtube.com/watch?v=xhd2lnCTWQM
>
>skipped horribly on initial load, but that looks to be
>more of a bandwidth problem than a CPU one. CPU utilization was
>slightly lower.
>
>SFW. Its main themes are apparently music, a school
>bus, and dancing. Replay was possible without skipping.
>Full-screen playback utilized almost 90% of the CPU, so that
>might be an issue.
>
>FWIW.
>
If this becomes common usage, it sounds like a dedicated
processor will be installed.
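
The 50% and 90% figures quoted above are easy enough to reproduce: the
kernel publishes cumulative per-mode tick counts in /proc/stat, and the
busy fraction over an interval falls out of two samples. A rough sketch,
Linux-only, assuming Python, run while the clip is playing:

    import time

    def sample():
        # first line: "cpu  user nice system idle iowait irq softirq ..."
        with open("/proc/stat") as f:
            fields = [int(x) for x in f.readline().split()[1:]]
        idle = fields[3] + (fields[4] if len(fields) > 4 else 0)
        return idle, sum(fields)

    if __name__ == "__main__":
        idle1, total1 = sample()
        time.sleep(10)                # measure during playback
        idle2, total2 = sample()
        busy = 1.0 - float(idle2 - idle1) / (total2 - total1)
        print("CPU roughly %.0f%% busy over that interval" % (busy * 100))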

/BAH
From: jmfbahciv on
In article <9lget29959rd2oo2ubspo6l0nuqgpgm8ef(a)4ax.com>,
MassiveProng <MassiveProng(a)thebarattheendoftheuniverse.org> wrote:
>On Sat, 17 Feb 07 14:08:30 GMT, jmfbahciv(a)aol.com Gave us:
>
>>Why does the CPU have to be latched with the video card painting?
>>Not even your computer games work this way. The CPU does not
>>say throw this pixel at that TTY x,y address and then get back to me
>>when you have lit it.
>>>
>
>
> The applet does, and the applet tracks window position and size,
>etc. The codec processes the stream as it is read from the file
>that was buffered from the online stream, and that codec is 100% CPU
>intensive. That processed frame set gets passed to the video card's
>space as an OVERLAY. That's why a screenshot taken with a player
>running does not capture the player's frame.

Barf. This kind of video will need a dedicated processor. Is
the streaming interruptible?
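
For reference, the "overlay" being described is the X-Video (Xv) path:
the codec hands decoded YUV frames to the card, which scales and
colour-converts them on its own and mixes them in over a key colour,
which is why a plain screenshot of the framebuffer usually shows an
empty rectangle where the video was. Whether a given card offers that
path at all can be checked with the stock xvinfo utility; a minimal
sketch, assuming Python and xvinfo:

    import subprocess

    out = subprocess.check_output(["xvinfo"]).decode()
    adaptors = [line.strip() for line in out.splitlines()
                if "Adaptor #" in line]
    if adaptors:
        print("Xv overlay/texture adaptors found:")
        for a in adaptors:
            print("  " + a)
    else:
        print("no Xv adaptors -- playback falls back to plain X11 blits")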

/BAH

From: jmfbahciv on
In article <5sget2h5v9vgso9ekm63run3pn8dm2vf26(a)4ax.com>,
MassiveProng <MassiveProng(a)thebarattheendoftheuniverse.org> wrote:
>On Sat, 17 Feb 07 14:08:30 GMT, jmfbahciv(a)aol.com Gave us:
>
>>Why real time?
>
> Because it is processed, compressed video data.
>It has to be processed to be rendered by the video card.

That's not real time. Real time implies that the image has
to be displayed at the same instant that the image was first
made. What you guys are talking about is a sequential process.
It doesn't matter when the bits are created on your system as
long as they are sequential.
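
Putting numbers on it: at 30 frames per second the per-frame budget is
about 33 ms. With no buffer at all, every single frame has to decode
inside that budget; with even a few frames of buffer, only the average
has to fit, which is the whole difference between a hard real-time
constraint and a paced sequential one. A quick back-of-the-envelope,
with made-up decode times:

    fps = 30.0
    budget_ms = 1000.0 / fps                    # ~33.3 ms per frame

    # hypothetical per-frame decode times, in milliseconds
    decode_times_ms = [20, 22, 60, 21, 19, 58, 23, 20, 21, 22]

    worst = max(decode_times_ms)
    avg = sum(decode_times_ms) / float(len(decode_times_ms))

    print("per-frame budget : %.1f ms" % budget_ms)
    print("worst frame      : %d ms (misses the unbuffered deadline: %s)"
          % (worst, worst > budget_ms))
    print("average frame    : %.1f ms (fits once a few frames are buffered: %s)"
          % (avg, avg < budget_ms))
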
<snip--you just can't stop being a snot, can you?>

/BAH