Flicker is a visible change in brightness between cycles displayed on video displays, especially the refresh interval on cathode ray tube (CRT) based computer screens. Flicker occurs on CRTs when they are driven at a low refresh rate, allowing the brightness to drop for intervals long enough to be noticed by the human eye – see persistence of vision and flicker fusion threshold. On most devices, the screen's phosphors quickly lose their excitation between sweeps of the electron gun, and the afterglow is unable to bridge such gaps – see phosphor persistence. A similar effect occurs in plasma display panels (PDPs) during their refresh cycles.
For example, if a cathode ray tube's vertical refresh rate is set to 60 Hz, most screens produce a visible "flickering" effect unless they use phosphors with long afterglow. Most people find that refresh rates of 70–90 Hz and above enable flicker-free viewing on CRTs. Refresh rates above 120 Hz are uncommon, as they provide little further flicker reduction and limit the available resolution.
Since the liquid-crystal shutter for each pixel holds a steady opacity between refreshes, LCDs do not flicker when the image is refreshed, and their backlights typically operate in the range of 150–250 Hz. However, to keep the crystals from deteriorating under a constant applied voltage, the drive polarity is continually reversed, which can itself cause flicker. "In a pixel on an LCD monitor, the amount of light that is transmitted from the backlight depends on the voltage applied to the pixel. For the amount of light, it doesn't matter whether that voltage is negative or positive. However, applying the same voltage for a long period would damage the pixel. For example, electricity decomposes water into oxygen and hydrogen gas. A similar effect could happen inside the liquid crystals that are in the pixels. In order to prevent damage, LCD displays quickly alternate the voltage between positive and negative for each pixel, which is called 'polarity inversion'. Ideally, the rapid polarity inversion wouldn't be noticeable because every pixel has the same brightness whether a positive or a negative voltage is applied. However, in practice, there is a small difference, which means that every pixel flickers at about 30 hertz."
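The arithmetic behind that 30 Hz figure can be sketched in a few lines. This is a minimal illustration, not a model of any real panel: the 2% brightness imbalance between polarities is a hypothetical value, and frame-by-frame polarity inversion is assumed.

```python
# Sketch: why per-pixel polarity inversion can appear as ~30 Hz flicker.
# Assumes a 60 Hz refresh rate and a hypothetical small luminance
# imbalance between positive- and negative-polarity drive.

REFRESH_HZ = 60
IMBALANCE = 0.02  # hypothetical 2% brightness difference between polarities

def pixel_brightness(frame: int, target: float) -> float:
    """Brightness of one pixel on a given frame under frame-inversion drive."""
    polarity = 1 if frame % 2 == 0 else -1   # polarity flips every frame
    return target * (1 + polarity * IMBALANCE / 2)

# The brightness sequence repeats every 2 frames, so the flicker
# fundamental is REFRESH_HZ / 2 = 30 Hz.
samples = [pixel_brightness(f, 0.5) for f in range(4)]
flicker_hz = REFRESH_HZ / 2
print(samples, flicker_hz)
```

Because the two polarities yield slightly different brightness for the same pixel value, the pattern repeats every two refreshes, halving the 60 Hz refresh rate to a 30 Hz flicker component.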
The lighting used in film projectors is typically an incandescent or arc lamp, which does not itself flicker. A shutter is nonetheless needed to blank the light while each frame is pulled into place, and by opening two or three times per frame it makes the light strobe at a multiple of film's typical 24 fps framerate – most often 48 or 72 Hz – pushing the apparent flash rate comfortably above the flicker fusion threshold.
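The multiplication above is simple enough to state directly; this sketch just makes the blade-count arithmetic explicit (the function name is illustrative):

```python
# Sketch: projector flash rate as a multiple of the film frame rate.
# Each shutter blade interrupts the light once per frame, so a
# two-blade shutter flashes each frame twice, a three-blade shutter
# three times.

FRAME_RATE = 24  # frames per second

def flash_rate(blades: int) -> int:
    return FRAME_RATE * blades

print(flash_rate(2), flash_rate(3))  # 48 and 72 Hz
```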
The exact refresh rate necessary to prevent the perception of flicker varies greatly with the viewing environment. In a completely dark room, a sufficiently dim display can run as low as 30 Hz without visible flicker; at normal room and TV brightness, the same refresh rate would produce flicker severe enough to be unwatchable.
Another factor in detecting flicker is peripheral vision. The human eye is most sensitive to flicker at the edges of the field of view, and least sensitive at the center of gaze (the area being focused on). As a result, the greater the portion of the field of view occupied by a display, the greater the need for a high refresh rate. This is why computer monitor CRTs usually run at 70 to 90 Hz, while TVs, which are viewed from further away, are considered acceptable at 50 or 60 Hz (see analog television standards).
Flicker, a flashing effect displeasing to the eye, often arises from flaws in software, with no hardware fault involved. Software flicker is caused by a program's failure to maintain a consistent graphical state: for example, blanking an area directly in the frame buffer and then drawing 'on top' of it makes it possible for the blank region to appear on screen momentarily.
When it is not feasible to set each pixel only once, double buffering can be used: the program draws the complete frame to an off-screen surface, then copies it to the screen all at once. While this technique eliminates most software flicker, the extra buffer and the per-frame copy carry a memory and performance cost.
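The technique can be sketched with a toy text-mode "framebuffer"; the buffer layout and sprite here are purely illustrative, not any real graphics API:

```python
# Minimal sketch of double buffering. Drawing (clear + draw) happens on
# an off-screen buffer; only the finished image is copied to the
# "screen", so no intermediate blank state is ever visible.

WIDTH, HEIGHT = 8, 4
screen = [["." for _ in range(WIDTH)] for _ in range(HEIGHT)]

def render_frame(sprite_x: int) -> None:
    # 1. Compose the whole frame off-screen: clear, then draw on top.
    back = [["." for _ in range(WIDTH)] for _ in range(HEIGHT)]
    back[1][sprite_x] = "#"          # draw a one-pixel "sprite"
    # 2. Copy the completed buffer to the screen in one step.
    for y in range(HEIGHT):
        screen[y][:] = back[y]

render_frame(3)
print("".join(screen[1]))
```

Had `render_frame` cleared and drawn directly into `screen`, a refresh falling between the two steps could briefly show the blanked row – exactly the software flicker described above.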
Flicker is also used intentionally by developers on low-end systems to create the illusion of more objects, colors, or shades than the hardware can actually display, or as a fast way of simulating transparency. While typically thought of as a mark of older systems such as 16-bit game consoles, the technique continues to be used on new hardware – for example, the temporal dithering used to approximate true color on many LCD monitors.
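Temporal dithering can be sketched as follows. This is a simplified illustration, assuming a panel with only two displayable levels per channel; the selection rule is one plausible scheme, not any vendor's actual algorithm:

```python
# Sketch of temporal dithering: alternate two displayable levels across
# frames so their time-average approximates a level the panel cannot
# show directly. Levels and target are illustrative.

def dithered_frames(target: float, n_frames: int,
                    lo: float = 0.0, hi: float = 1.0) -> list:
    """Choose lo or hi each frame so the running sum tracks `target`."""
    frames, total = [], 0.0
    for i in range(n_frames):
        # Pick hi only when it keeps the running average at or below target.
        level = hi if total + hi <= target * (i + 1) + 1e-9 else lo
        frames.append(level)
        total += level
    return frames

seq = dithered_frames(0.5, 8)
print(seq, sum(seq) / len(seq))  # average is 0.5
```

Viewed quickly enough, the eye averages the alternating levels into the intermediate shade; viewed slowly (or on a sensitive eye), the alternation is perceived as flicker.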
The constant refreshing of a CRT monitor can cause various symptoms in those sensitive to it, such as headaches in migraine sufferers and seizures in photosensitive epileptics. Screen filters are available to reduce these effects, and a high refresh rate (above 75 Hz) also helps. Flat-screen monitors do not have this problem: their pixels change directly from one color to the next without going dark, and the backlight behind a TFT monitor is usually driven at 40–50 kHz, far too fast for the human eye to notice.
As flicker is most clearly seen at the edge of vision, there is no obvious risk in using a CRT, but prolonged use can cause a kind of retinal after-effect in which the flickering is perceived even when looking away from the monitor. This can induce a form of motion sickness – a mismatch between the movement detected by the fluid of the inner ear and the motion the eyes see. Symptoms include dizziness, fatigue, headaches and (sometimes extreme) nausea; they usually last only a few hours unless exposure has been prolonged, and disappear within a week without CRT use.