From: Arfa Daily on 28 Feb 2010 20:17

"Sylvia Else" <sylvia(a)not.at.this.address> wrote in message
news:006912a3$0$2891$c3e8da3(a)news.astraweb.com...
> On 28/02/2010 12:12 AM, Arfa Daily wrote:
>> <snip>
>>
>> "Sylvia Else"<sylvia(a)not.at.this.address> wrote in message
>> news:4b888aaa$0$32078$c3e8da3(a)news.astraweb.com...
>>>
>>> Many years ago (using a Sinclair Spectrum no less) I noticed an effect
>>> whereby if a small character-sized square was moved across the screen
>>> in character-sized steps, the eye perceived phantom squares at
>>> intervening positions. Since the computer was not displaying these
>>> intermediate squares, their appearance must have been due to the eye.
>>> The likely explanation was that the eye was traversing the screen
>>> smoothly to follow the square, but the square itself was moving in
>>> discrete steps. So the eye was causing the image of the square to be
>>> smeared across the retina. I was seeing this effect on a CRT screen,
>>> but the longer the persistence of the image on the screen, the worse
>>> the effect would be. Interpolating the position of the image on the
>>> screen would reduce that effect.
>>>
>>> However, I can't explain why this would be less pronounced on a plasma
>>> screen.
>>
>> Because LCD cells are painfully slow at switching, which equates to a
>> long persistence phosphor on a CRT, which, as you say yourself, makes
>> the effect worse. Plasma cells are very fast, particularly now that
>> 'pre-fire' techniques are used to 'ready' them for switching. This is
>> the equivalent of having a very short persistence phosphor on a CRT. If
>> you arrange for the drive electronics to be able to deliver the cell
>> drives with no more delay than the cells themselves are contributing,
>> then the result will be smooth motion without any perceivable blur,
>> which is pretty much how it was with a standard domestic CRT-based CTV.
>>
>> Arfa
>
> It seems to me that the effect would be visible on any display that has
> any degree of persistence. Even if LCDs switched instantaneously, they'd
> still be displaying the image for the full frame time and then
> instantaneously switching to the next image. This would produce the
> smearing effect in the way I've described. To avoid it, one needs a
> display that produces a short bright flash at, say, the beginning of the
> display period, and remains dark for the rest of the time. As I
> understand plasma displays, that's not how they work.
>
> Sylvia.

I think you are misunderstanding the principles involved here in producing
a picture perceived to have smooth, smear-free movement from a sequence of
still images. Any medium which does this needs to get the image in place
as quickly as possible, and for a time shorter than the period required to
get the next picture in place. This is true of a cinema picture, a CRT
television picture, an LCD television picture, or a plasma or OLED
picture. Making these still images into a perceived moving image has
nothing to do with the persistence of the phosphor, but is a function of
retinal retention, or 'persistence of vision'.

Black and white TV CRTs used a phosphor blend known as 'P4', and tricolour
CRTs typically used 'P22'. Both of these are designated as short
persistence types. The green and blue phosphors used in a colour CRT have
persistence times of typically less than 100 us, and the red around
200 - 300 us. The switching time of modern LCD cells is around 1 - 2 ms,
whereas plasma cells can switch in around 1 us. This means that the plasma
cell can be switched very quickly, and then allowed to 'burn' for as long
or short a period as the designer of the TV decides is appropriate -
typically, I would think, of the same order of time as the persistence of
a P22 phosphor, thus allowing the plasma panel to closely match the
fundamental display characteristics of a typical P22 CRT.
A good description of why the slow switching time of LCD cells is still a
problem in terms of motion blur, and what the manufacturers do to try to
overcome this, can be found at
http://en.wikipedia.org/wiki/LCD_television#Response_time

Arfa
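To put the timing figures in the post above on a common scale, here is a
small Python sketch (my own illustration, not from the thread; the response
times are the ones quoted above, and the 50 Hz frame rate is an assumption):

```python
# Rough scale comparison: device response times vs. one frame period.
# Figures are the ones quoted in the post above; 50 Hz frame rate assumed.
FRAME_PERIOD_S = 1 / 50  # 20 ms per frame at 50 Hz

response_times_s = {
    "P22 green/blue phosphor": 100e-6,  # < 100 us persistence
    "P22 red phosphor":        300e-6,  # 200 - 300 us persistence
    "LCD cell":                2e-3,    # 1 - 2 ms switching
    "plasma cell":             1e-6,    # ~1 us switching
}

for name, t in response_times_s.items():
    share = 100 * t / FRAME_PERIOD_S
    print(f"{name:>24}: {t * 1e6:8.1f} us = {share:7.3f}% of a frame")
```

On these (assumed) numbers the LCD cell spends about 10% of every frame in
transition, while the plasma cell's switching time is a negligible fraction.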
From: Sylvia Else on 28 Feb 2010 20:43

On 1/03/2010 12:17 PM, Arfa Daily wrote:
> <snip>
>
> I think you are misunderstanding the principles involved here in
> producing a picture perceived to have smooth, smear-free movement from a
> sequence of still images. Any medium which does this needs to get the
> image in place as quickly as possible, and for a time shorter than the
> period required to get the next picture in place. This is true of a
> cinema picture, a CRT television picture, an LCD television picture, or
> a plasma or OLED picture. Making these still images into a perceived
> moving image has nothing to do with the persistence of the phosphor, but
> is a function of retinal retention, or 'persistence of vision'.

The fact that a sequence of still images is perceived as a moving picture
is clearly a consequence of visual persistence. And it's obvious that
things will look bad if the images actually overlap. But that's not what
we're discussing. We're discussing why certain types of display don't do
such a good job despite having a reasonably sharp transition from one
image to the next.

The Wikipedia article you cited says that even LCD switching times of 2 ms
are not good enough "because the pixel will still be switching while the
frame is being displayed." I find this less than convincing as an
explanation. So what if the pixel is switching while the frame is being
displayed? It's not as if the eye has a shutter, and the transition time
is much less than the eye's persistence time anyway.

Sylvia.
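Sylvia's point can be made quantitative with a toy model: if the eye tracks
a moving object smoothly while each frame is held static on screen, the
smear dragged across the retina scales with the hold time, not the
transition time. A minimal Python sketch (my own idealisation, not from the
thread; speeds and times are arbitrary examples):

```python
# Idealised eye-tracking smear on a sample-and-hold display.
# While the eye pans smoothly after a moving object, a frame held static
# is dragged across the retina by (object speed) x (hold time); any
# transition time between frames simply adds to the effective hold.
def smear_px(speed_px_per_s, hold_s, transition_s=0.0):
    """Approximate retinal smear width, in pixels."""
    return speed_px_per_s * (hold_s + transition_s)

FRAME_S = 1 / 50   # 50 Hz frame period (assumed)
SPEED = 600        # object crossing the screen at 600 px/s (assumed)

full_hold = smear_px(SPEED, FRAME_S)   # hold for the whole frame: ~12 px
short_flash = smear_px(SPEED, 1e-3)    # 1 ms impulse-style flash: ~0.6 px

print(f"full-period hold: {full_hold:.1f} px of smear")
print(f"1 ms flash:       {short_flash:.1f} px of smear")
```

In this model an instantaneously switching but full-period-hold display
still smears ~12 px, which is Sylvia's argument for a short bright flash.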
From: Geoffrey S. Mendelson on 1 Mar 2010 03:49

Arfa Daily wrote:
> Making these still images into a perceived moving image, has nothing to
> do with the persistence of the phosphor, but is a function of retinal
> retention, or 'persistence of vision'. Black and white TV CRTs used a
> phosphor blend known as 'P4', and tricolour CRTs typically used 'P22'.
> Both of these are designated as being short persistence types. The green
> and blue phosphors used in a colour CRT, have persistence times of
> typically less than 100uS, and the red around 2 - 300uS.

Short and long persistence are relative terms. Compared to the P1
phosphors of radar screens and oscilloscopes, P4 phosphors are relatively
short persistence. Compared to an LED they are long persistence.

Note that there is a lot of "wiggle room" in there: supposedly the human
eye can only resolve about 24 frames per second, which is roughly 42 ms
per frame.

Also note that there are relatively few frame rates in source material.
NTSC TV is 30000/1001 (about 29.97) frames per second; PAL TV is 25. Film
is 24, which was stretched to 25 for PAL TV and slowed to 24000/1001
(about 23.976) for NTSC TV. Film shot for direct TV distribution (MTV
really did have some technological impact) was shot at 30000/1001 frames
per second. Digital TV could be any frame rate, but they have stuck with
the old standards: US digital TV is still the same frame rate as NTSC, and
EU etc. digital TV is still 25 fps.

Lots of video files online are compressed at lower frame rates because of
the way they are shown. The screens still operate at their regular frame
rate; the computer decoding them just repeats frames as necessary.

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel gsm(a)mendelson.com N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in "he has a sub-Wikipedia understanding of the
situation", i.e. possessing fewer facts or less information than can be
found in the Wikipedia.
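The broadcast rates mentioned above are exact rational numbers, and
Python's fractions module makes the relationships easy to verify (a quick
sketch using the conventional 1000/1001 "NTSC colour" factor):

```python
from fractions import Fraction

# Exact broadcast frame rates (colour NTSC carries a 1000/1001 factor).
NTSC = Fraction(30000, 1001)   # ~29.97 fps
PAL = Fraction(25)             # 25 fps
FILM = Fraction(24)            # 24 fps cinema

# Film on NTSC is slowed by 1000/1001 (3:2 pulldown then fills the fields).
FILM_ON_NTSC = FILM * Fraction(1000, 1001)   # = 24000/1001, ~23.976 fps

print(float(NTSC))           # ~29.97003
print(float(FILM_ON_NTSC))   # ~23.97602
print(float(PAL / FILM))     # ~1.04167 - PAL transfers run film ~4% fast
```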
From: Arfa Daily on 1 Mar 2010 04:44

"Sylvia Else" <sylvia(a)not.at.this.address> wrote in message
news:4b8b1bc5$0$5339$c3e8da3(a)news.astraweb.com...
> On 1/03/2010 12:17 PM, Arfa Daily wrote:
> <snip>
>
> The fact that a sequence of still images is perceived as a moving
> picture is clearly a consequence of visual persistence. And it's obvious
> that things will look bad if the images actually overlap. But that's not
> what we're discussing. We're discussing why certain types of display
> don't do such a good job despite having a reasonably sharp transition
> from one image to the next.
>
> The Wikipedia article you cited says that even LCD switching times of
> 2 ms are not good enough "because the pixel will still be switching
> while the frame is being displayed." I find this less than convincing as
> an explanation. So what if the pixel is switching while the frame is
> being displayed? It's not as if the eye has a shutter, and the
> transition time is much less than the eye's persistence time anyway.
>
> Sylvia.

Well, I dunno how else to put it. I'm just telling it as I was taught it
many years ago at college. I was taught that short persistence phosphors
were used on TV display CRTs to prevent motion blur, and that the
individual images were integrated into a moving image by persistence of
vision, not phosphor. I was also taught that the combination of POV and
the short decay time of the phosphor led to a perceived flicker in the low
frame-rate images, so the technique of splitting each frame into two
interlaced fields, transmitted sequentially, was born, which totally
overcame this shortcoming of the system. Always made sense to me. Always
made sense also that any replacement technology had to obey the same
'rule' of putting the still image up very quickly, and not leaving it
there long, to achieve the same result.

If you think about it, the only 'real' difference between an LCD panel and
a plasma panel is the switching time of the individual elements. On the
LCD panel this is relatively long, whereas on the plasma panel it is
short. The LCD panel suffers from motion blur, but the plasma panel
doesn't. Ergo, it's the element switching time which causes this
effect ... ??

Actually, looking into this a bit further, it seems that POV is nothing
like as simple as it would seem. I've just found another Wiki article

http://en.wikipedia.org/wiki/Persistence_of_vision

which would seem to imply that flat panel display techniques leave the
first image in place until just before the second image is ready to be put
up, and then the third, fourth and so on. If this is the case, and it is
the reasoning behind the fast 'frame rates' that are now being used by
manufacturers, then it would seem reasonable (to me at least) that in
order for this technique to work correctly, the element switching time
would have to be as short as possible, which would explain why a device
with a switching time 1000 times faster than another would produce
'sharper' pictures with smoother, blur-free motion.

Arfa
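The fast 'frame rates' mentioned above usually mean motion interpolation:
the set synthesises in-between images so each unique image is held for a
shorter time. A toy Python sketch of the idea (positions only, my own
simplification; real sets estimate motion vectors per block of pixels):

```python
def interpolate_frames(positions, factor=2):
    """Linearly interpolate object positions up to a higher frame rate.

    positions: object x-coordinates sampled at the source frame rate.
    Returns positions at factor x the source rate.
    """
    out = []
    for a, b in zip(positions, positions[1:]):
        for k in range(factor):
            out.append(a + (b - a) * k / factor)
    out.append(positions[-1])
    return out

# A square moving in coarse 8-px jumps (the Spectrum effect described
# earlier in the thread) becomes a sequence of 4-px steps at double the
# rate, halving the positional error the tracking eye sees.
print(interpolate_frames([0, 8, 16, 24]))
```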
From: Geoffrey S. Mendelson on 1 Mar 2010 05:49
Arfa Daily wrote:
> Well, I dunno how else to put it. I'm just telling it as I was taught it
> many years ago at college. I was taught that short persistence phosphors
> were used on TV display CRTs, to prevent motion blur, and that the
> individual images were integrated into a moving image, by persistence of
> vision, not phosphor.
> <snip>

It's more complicated than that. You only see one image, which has been
created in your brain from several sources.

The most information comes from the rods in your eyes; they are light
level (monochromatic) sensors, as it were, and they are the most
prevalent. This means most of what you see comes from the combination of
two sets of monochrome images with slightly to wildly different
information. Then there are the cones, or colour sensors. There are far
fewer of them and they are less sensitive to light, which is why night
vision is black and white. There are also blind spots where the optic
nerves attach to the retina.

None of these show up on their own; they are all integrated into the one
image you see. You never notice that you have two blind spots, you don't
notice the lack of clarity in colours (due to the smaller number of
sensors), and rarely, if ever, do you notice the difference between your
eyes. If, for example, you were to need glasses in one eye and not the
other, or have not quite properly prescribed lenses, the fused image will
still appear sharp, rather than blurred on one side and sharp on the
other.

Lots of tricks have been used over the years to take advantage of the
limitations of the "equipment" and the process. For example, anything
faster than about 24 frames a second is not perceived as discrete images,
but as one smooth moving image. The 50 and 60 fields per second (a field
being half an interlaced frame) were chosen not because they needed to be
that fast (48 would have done), but to eliminate interference effects from
electric lighting.

Colour is another issue. The NTSC committee determined (and the approach
was later adopted for PAL) that a 4:1 colour system was good enough, i.e.
colour information only needed to be changed (and recorded) at 1/4 the
rate of the light level. In modern terms, it means that for every 4
pixels, you only have to have colour information once. Your eye can
resolve the difference in light levels, but not in colours. This persists
to this day: MPEG-type encoding is based on it. It's not the redgreenblue,
redgreenblue, redgreenblue, redgreenblue of a still picture or a computer
screen; it's the lightlevel, lightlevel, lightlevel, lightlevel,
colorforallfour encoding that was used by NTSC and PAL.

In the end, IMHO, it's not frame rates or colour encoding methods at all,
as they were fixed around 1960 and not changed, but display technology as
your brain perceives it. No matter what anyone says here, it's the
combination of the exact implementation of display technology and your
brain that matters. If the combination looks good and you are comfortable
watching it, then a 25 fps CRT, a 100 fps LED screen, or even a 1000 fps
display, if there were such a thing, would look good, provided everything
combined produces good images in YOUR brain, and bad if some combination
produces something "wrong".

> If you think about it, the only 'real' difference between an LCD panel,
> and a plasma panel, is the switching time of the individual elements. On
> the LCD panel, this is relatively long, whereas on the plasma panel, it
> is short. The LCD panel suffers from motion blur, but the plasma panel
> doesn't. Ergo, it's the element switching time which causes this
> effect ... ??

There is more to it than that. An LCD cell is like a shutter: it pivots on
its axis and is either open or closed. Except not really - there is a
discrete time from closed (black) to open (lit), and therefore a build-up
of brightness, which depends on the viscosity of the medium the crystals
are in, temperature, etc. Plasma cells are gas discharge devices: they
glow only while there is enough voltage to "fire" them, and stop once it
drops below the level needed to sustain the glow, so their timing depends
more on the speed of the control electronics than on any (other) laws of
physics.

That's the aim of LED-backlit TV screens (besides less power consumption,
heat, etc.): the backlight is only lit when the crystals are "open", so
there is no time where you see partially lit "pixels".

Geoff.
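Geoff's "colour once per four pixels" idea can be sketched in a few lines
of Python (a deliberately simplified 4:1 model of my own, not the actual
NTSC/PAL bandwidth split or the MPEG 4:2:0 chroma layout):

```python
# Simplified 4:1 chroma sharing: keep every brightness (luma) sample,
# but store only one averaged colour (chroma) value per group of 4 pixels.
def subsample_chroma(pixels):
    """pixels: list of (luma, chroma) tuples at full resolution."""
    luma = [y for y, _ in pixels]
    chroma = [
        sum(c for _, c in pixels[i:i + 4]) / len(pixels[i:i + 4])
        for i in range(0, len(pixels), 4)
    ]
    return luma, chroma

# 8 pixels: a dark bluish patch followed by a bright reddish one
# (values are arbitrary illustrative numbers).
px = [(16, 100), (18, 102), (20, 98), (22, 100),
      (200, 30), (198, 32), (202, 28), (200, 30)]

luma, chroma = subsample_chroma(px)
print(len(luma), "luma samples,", len(chroma), "chroma samples")
```

Eight brightness samples survive but only two colour samples, trading
colour resolution (which the eye resolves poorly) for bandwidth.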