From: Phil Allison on 26 Feb 2010 17:59

<Meat Plow>

>> ** And if you put the remark back into its context - what it IS relevant
>> to becomes obvious.
>
> No it doesn't.

** Yes it does - you ASD FUCKED TENTH WIT !
From: Arfa Daily on 26 Feb 2010 20:30

<Meat Plow> wrote in message news:3i5job.5qb.19.13(a)news.alt.net...
> On Fri, 26 Feb 2010 02:32:38 -0000, "Arfa Daily"
> <arfa.daily(a)ntlworld.com> wrote:
>
>> "William Sommerwerck" <grizzledgeezer(a)comcast.net> wrote in message
>> news:hm7241$21g$1(a)news.eternal-september.org...
>>> First, the only televisions that use LEDs use OLEDs. There are none
>>> using conventional LEDs.
>>>
>>> Second, there are no strict definitions of what these refresh rates
>>> mean. In some cases, the set generates an interpolated image at that
>>> rate, in others, a blank (black) raster is inserted. Some sets combine
>>> both.
>>>
>>> I don't like this enhancement (which was one of the reasons I bought a
>>> plasma set). It has a nasty side-effect -- it makes motion pictures
>>> look like video. This might be fine for a TV show; it isn't when you're
>>> watching movies. Be sure that whatever set you purchase has some way of
>>> defeating the enhancement.
>>>
>>> You need to actually look at the sets you're considering with program
>>> material you're familiar with.
>>>
>>
>> Seconded on all counts, and also the reason that I recently bought a
>> plasma TV (Panasonic, 50" full HD panel, 400Hz). I have not seen a
>> single thing about this TV that I don't like so far, unlike the LCD TVs
>> that I have in the house, and the LCDs that cross my bench for repair,
>> all of which suffer from motion artifacts, scaling artifacts, and motion
>> blur ...
>>
>> This plasma TV has produced absolutely stunning HD pictures from the
>> Winter Olympics, with not the slightest sign of motion artifacts of any
>> description, even on the fastest content like downhill skiing,
>> bobsleigh, etc. In contrast, the same content that I have seen on LCDs
>> has been perfectly dreadful.
>>
>> Arfa
>
> Maybe I'm not picky, but those motion artifacts just aren't important
> enough for me to want to spend thousands on something that doesn't
> produce them. I have a fairly cheap 32" and while it does produce some
> artifacts, they are insignificant to the overall performance.

But the point is that you no longer have to pay thousands to get that
performance. The plasma that I recently bought was little more to buy than
a 'good' LCD, but the performance is easily a whole order of magnitude
higher. CRT sets did not suffer from motion artifacts, and I wasn't
prepared to 'downgrade' my viewing experience by buying something which
did. The LCD that I have in my kitchen, also 32" and also 'fairly cheap',
does suffer from motion artifacts which are particularly noticeable on
high-speed stuff like the Winter Olympics. I actually do find these
significant and annoying, and I would not consider having to put up with
such a picture on my main TV. Fortunately, the latest generation of
affordable plasmas means that I don't have to :-)

Arfa
From: Sylvia Else on 26 Feb 2010 21:59

On 27/02/2010 1:22 AM, William Sommerwerck wrote:
>> LCDs don't flicker anyway, regardless of their frame rate. The frame
>> rate issue relates to addressing the judder you get as a result of
>> the image consisting of a sequence of discrete images, rather than
>> one that continuously varies.
>
> Not quite, otherwise the issue would occur with plasma displays. Indeed,
> it would with any moving-image recording system.
>
> The problem is that LCDs don't respond "instantaneously". They take a
> finite time to go from opaque to the desired transmission level, and then
> back again. The result is that the image can lag and "smear". (25 years
> ago, the first pocket LCD color TVs from Casio had terrible smear, which
> added an oddly "artistic" quality to sports.)
>
> For reasons not clear to me, adding interpolated images reduces the
> smear. This makes absolutely no sense whatever, as the LCD now has /less/
> time to switch. I've never gotten an answer on this.

Many years ago (using a Sinclair Spectrum, no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to follow
the square, but the square itself was moving in discrete steps. So the eye
was causing the image of the square to be smeared across the retina. I was
seeing this effect on a CRT screen, but the longer the persistence of the
image on the screen, the worse the effect would be. Interpolating the
position of the image on the screen would reduce that effect.

However, I can't explain why this would be less pronounced on a plasma
screen.

>> It doesn't help that much TV material that was recorded on film is
>> transmitted with odd and even interlaced frames that are scans of
>> the same underlying image (or some variation thereon), so that the
>> effective refresh rate is considerably lower than the interlaced rate.
>
> Interlaced images can be de-interlaced. Note that most product reviews
> test displays for how well they do this.

They have to be deinterlaced for display on any screen with significant
persistence, but deinterlacing doesn't increase the underlying frame rate.

Sylvia.
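As a rough illustration of the retinal-smear effect described above: if the
eye tracks an object moving at v pixels per second and each frame is held
on screen for t seconds, the image is dragged over roughly v x t pixels on
the retina. On that simple model, doubling the presentation rate with
interpolated frames halves the hold time and, with it, the smear, even
before panel response time enters the picture - one plausible reading of
why interpolation helps. A minimal Python sketch; the speed and rates are
assumptions for illustration, not measurements of any particular set:

    def smear_px(speed_px_per_s, hold_time_s):
        # Tracked-eye smear for a sample-and-hold display: while the eye
        # pans smoothly, the held frame is painted across the retina for
        # the whole hold time.
        return speed_px_per_s * hold_time_s

    speed = 1920 / 2.0        # assumed: object crosses a 1920-px screen in 2 s

    for name, rate_hz in [("50 Hz native", 50),
                          ("100 Hz interpolated", 100),
                          ("200 Hz interpolated", 200)]:
        hold = 1.0 / rate_hz  # full-frame hold time, assuming instant switching
        print(f"{name:20s} -> ~{smear_px(speed, hold):.0f} px of smear")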
From: Arfa Daily on 27 Feb 2010 08:12

<snip>

"Sylvia Else" <sylvia(a)not.at.this.address> wrote in message
news:4b888aaa$0$32078$c3e8da3(a)news.astraweb.com...
>
> Many years ago (using a Sinclair Spectrum, no less) I noticed an effect
> whereby if a small character-sized square was moved across the screen in
> character-sized steps, the eye perceived phantom squares at intervening
> positions. Since the computer was not displaying these intermediate
> squares, their appearance must have been due to the eye. The likely
> explanation was that the eye was traversing the screen smoothly to follow
> the square, but the square itself was moving in discrete steps. So the
> eye was causing the image of the square to be smeared across the retina.
> I was seeing this effect on a CRT screen, but the longer the persistence
> of the image on the screen, the worse the effect would be. Interpolating
> the position of the image on the screen would reduce that effect.
>
> However, I can't explain why this would be less pronounced on a plasma
> screen.

Because LCD cells are painfully slow at switching, which equates to a long
persistence phosphor on a CRT, which, as you say yourself, makes the effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent
of having a very short persistence phosphor on a CRT. If you arrange for
the drive electronics to be able to deliver the cell drives with no more
delay than the cells themselves are contributing, then the result will be
smooth motion without any perceivable blur, which is pretty much how it was
with a standard domestic CRT-based CTV.

Arfa

>>> It doesn't help that much TV material that was recorded on film is
>>> transmitted with odd and even interlaced frames that are scans of
>>> the same underlying image (or some variation thereon), so that the
>>> effective refresh rate is considerably lower than the interlaced rate.
>>
>> Interlaced images can be de-interlaced. Note that most product reviews
>> test displays for how well they do this.
>
> They have to be deinterlaced for display on any screen with significant
> persistence, but deinterlacing doesn't increase the underlying frame
> rate.
>
> Sylvia.
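To put the point about cell speed into the same rough model: a slow LCD
cell keeps showing a mixture of the old and new image while it switches,
which behaves much like extra hold time, whereas a fast (pre-fired) plasma
cell adds almost nothing. The response-time figures below are assumptions
chosen only to illustrate the comparison:

    def total_smear_px(speed_px_per_s, hold_s, response_s):
        # Crude model: smear grows with the frame hold time plus the time
        # the cell spends switching between states.
        return speed_px_per_s * (hold_s + response_s)

    speed = 960.0             # px/s, as in the earlier sketch
    hold = 1.0 / 50           # 50 Hz frame hold

    for panel, response_ms in [("slow LCD cell", 8.0),
                               ("faster LCD cell", 4.0),
                               ("plasma cell, pre-fired", 0.5)]:
        smear = total_smear_px(speed, hold, response_ms / 1000)
        print(f"{panel:24s} -> ~{smear:.0f} px of smear")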
From: Sylvia Else on 27 Feb 2010 22:18
On 28/02/2010 12:12 AM, Arfa Daily wrote:
> <snip>
>
> "Sylvia Else" <sylvia(a)not.at.this.address> wrote in message
> news:4b888aaa$0$32078$c3e8da3(a)news.astraweb.com...
>>
>> Many years ago (using a Sinclair Spectrum, no less) I noticed an effect
>> whereby if a small character-sized square was moved across the screen in
>> character-sized steps, the eye perceived phantom squares at intervening
>> positions. Since the computer was not displaying these intermediate
>> squares, their appearance must have been due to the eye. The likely
>> explanation was that the eye was traversing the screen smoothly to
>> follow the square, but the square itself was moving in discrete steps.
>> So the eye was causing the image of the square to be smeared across the
>> retina. I was seeing this effect on a CRT screen, but the longer the
>> persistence of the image on the screen, the worse the effect would be.
>> Interpolating the position of the image on the screen would reduce that
>> effect.
>>
>> However, I can't explain why this would be less pronounced on a plasma
>> screen.
>
> Because LCD cells are painfully slow at switching, which equates to a
> long persistence phosphor on a CRT, which, as you say yourself, makes the
> effect worse. Plasma cells are very fast, particularly now that
> 'pre-fire' techniques are used to 'ready' them for switching. This is the
> equivalent of having a very short persistence phosphor on a CRT. If you
> arrange for the drive electronics to be able to deliver the cell drives
> with no more delay than the cells themselves are contributing, then the
> result will be smooth motion without any perceivable blur, which is
> pretty much how it was with a standard domestic CRT-based CTV.
>
> Arfa

It seems to me that the effect would be visible on any display that has any
degree of persistence. Even if LCDs switched instantaneously, they'd still
be displaying the image for the full frame time and then instantaneously
switching to the next image. This would produce the smearing effect in the
way I've described. To avoid it, one needs a display that produces a short
bright flash at, say, the beginning of the display period, and remains dark
for the rest of the time. As I understand plasma displays, that's not how
they work.

Sylvia.
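The "short bright flash" suggestion can be put into the same rough numbers:
if the image is lit for only a fraction of each frame (as with a
short-persistence CRT phosphor, or black-frame insertion on an LCD), the
tracked-eye smear shrinks roughly in proportion to that fraction. The
duty-cycle values here are assumptions for illustration only:

    def flash_smear_px(speed_px_per_s, frame_s, duty_cycle):
        # Only the fraction of the frame during which the image is actually
        # lit gets dragged across the retina by the tracking eye.
        return speed_px_per_s * frame_s * duty_cycle

    speed, frame = 960.0, 1.0 / 50   # same assumed speed, 50 Hz frame

    for label, duty in [("full-frame hold (sample-and-hold)", 1.0),
                        ("half-frame black insertion", 0.5),
                        ("short CRT-like flash", 0.1)]:
        print(f"{label:34s} -> ~{flash_smear_px(speed, frame, duty):.0f} px")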