Toby Hema-Taylor, of Ardgowan School, asks:-

Why does the computer screen appear to flicker when filmed on video or TV?

Phil Bones, an Electronic Engineer at the University of Canterbury, responded.

In order to understand the reason, you need to know a little about how pictures are formed on screens and how video cameras take pictures. All television screens and most computer screens use a scanning beam of electrons to make a layer of phosphor (on the inside of the glass surface of the screen) glow brightly. The beam is made to go dim and bright as it scans to form the dim and bright parts of the picture (various schemes are used to produce the colour). Each spot in a bright part of the picture is therefore 'fired up' every time the beam passes over it.
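If you enjoy computers, a tiny sketch may help you picture this. The little Python program below is only an illustration; the fade rate and the time step are made-up numbers, not measurements of a real screen. It follows one bright spot being 'fired up' each time the beam passes over it and fading a little in between passes:

    REFRESH_HZ = 50        # the beam revisits the spot 50 times a second
    STEP_MS = 2            # simulation time step, in milliseconds
    STEPS_PER_SCAN = 1000 // REFRESH_HZ // STEP_MS    # 10 steps between beam passes
    FADE_PER_STEP = 0.7    # assumed fraction of the glow kept after each step

    brightness = 0.0
    for step in range(21):
        if step % STEPS_PER_SCAN == 0:
            brightness = 1.0               # the beam passes: full glow again
        else:
            brightness *= FADE_PER_STEP    # between passes the glow fades
        print(f"t = {step * STEP_MS:3d} ms   brightness = {brightness:.2f}")

Because the beam comes back before the glow has faded very far, our eyes see a steady picture.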

TV screens complete 50 scans every second, just fast enough that we don't notice the dimming of the picture spots in between scans. Video cameras have shutters which also open 50 times per second. Computer screens, however, tend to have faster scanning rates (so that they can display finer detail). So when a video camera is pointed at a computer screen, every time the camera shutter opens it will only 'see' part of the computer screen's scan. Because the part of the scan seen is different each time, the picture appears to flicker.
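Here is a small Python sketch of that mismatch. The camera's 50 pictures per second comes from the explanation above; the computer screen's 72 scans per second is just an assumed example, since different monitors use different rates. Notice how each camera picture catches the screen at a different point of its scan:

    CAMERA_HZ = 50    # camera shutter openings per second, as described above
    SCREEN_HZ = 72    # assumed computer-screen scan rate; real monitors vary

    for frame in range(8):
        t = frame / CAMERA_HZ         # the moment this camera picture is taken
        scans_done = t * SCREEN_HZ    # how many screen scans have happened by then
        phase = scans_done % 1.0      # how far through its current scan the screen is
        print(f"camera picture {frame}: screen is {phase * 100:3.0f}% of the way through a scan")

That ever-changing point of the scan is what shows up as flicker on the recording.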

The flicker can be prevented by the use of special equipment. In effect, such equipment synchronises the camera shutter with the computer screen's scan. The process is known as 'scan conversion'.
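To round that off, here is the same sort of sketch with the two rates locked together. Again, this is only an illustration of the synchronising idea, not of how real scan-conversion equipment works inside: when the screen's scan rate is a whole multiple of the camera's 50 openings per second, every camera picture catches the screen at exactly the same point of its scan, so nothing appears to flicker.

    CAMERA_HZ = 50
    SCREEN_HZ = 100   # assumed: a scan rate locked to a whole multiple of the camera rate

    for frame in range(5):
        t = frame / CAMERA_HZ
        phase = (t * SCREEN_HZ) % 1.0
        print(f"camera picture {frame}: screen is {phase * 100:3.0f}% of the way through a scan")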