Originally Posted by Supamax
I have a question, though. Some years ago I recorded, with a common Panasonic VHS VCR, the output from an Amiga game (lores), and it was recorded fine (I don't remember whether the scan lines were visible on the recording too, though)! Some of you said that a 288p signal cannot be broadcast without first being converted to interlaced, suffering a quality loss, but this shouldn't be true, since I was perfectly able to record that game (using the A520 modulator)...
The video recorder just takes in each field and stores it on tape, then replays it. So it is capable of recording and playing back 288p.
Broadcast is a different story: the problem there is that most of the processors in the signal path do not accept non-interlaced signals. The transmitter itself could probably broadcast 288p just fine, but all of the hardware in between is very fussy about receiving standards-compliant signals.
Also, am I right in saying that the Amiga's interlaced (hires) modes flicker so badly not only because they really are - well - interlaced, but also because the Amiga is not fast enough to produce them at the correct PAL display rate?
No, it's not about that. The Amiga is fast enough; the problem is that the picture is too sharp for the technology. You can have a thin black horizontal line on only one scanline in the odd field and then a grey line on the adjacent scanline in the even field, so the detail flashes at the field rate. A broadcast TV frame is blurrier, so the contents blend between adjacent lines better. The slow frame rate of PAL naturally doesn't help either. NTSC is a bit better, and interlaced SVGA screens were almost usable in the PC world. :-)
There were little hacks that blurred the graphics by adding some anti-aliasing to the screen (and more bitplanes to your Workbench), which alleviated the interlace flicker a lot at the cost of some speed and sharpness.
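The idea behind those hacks is simple vertical blending: mix each scanline with its neighbours so that single-line detail (which would otherwise flash between the odd and even fields) gets spread across both. Here's a minimal sketch of that technique in Python; the weights (1/4, 1/2, 1/4) and the function name are my own illustration, not the code any actual Amiga hack used:

```python
# Vertical "scanline blending" to tame interlace flicker
# (hypothetical illustration, not an actual Amiga hack's code).
# Each output line is a weighted mix of a line and its vertical
# neighbours (1/4, 1/2, 1/4), so detail confined to one scanline
# gets smeared into the adjacent lines of the other field.

def blend_lines(frame):
    """frame: list of scanlines, each a list of pixel intensities 0-255."""
    h = len(frame)
    out = []
    for y in range(h):
        above = frame[max(y - 1, 0)]       # clamp at top edge
        below = frame[min(y + 1, h - 1)]   # clamp at bottom edge
        out.append([
            (a + 2 * b + c) // 4
            for a, b, c in zip(above, frame[y], below)
        ])
    return out

# A one-scanline-high black detail on a white background:
frame = [[255] * 4, [0] * 4, [255] * 4]
print(blend_lines(frame))  # the black line softens to mid-grey
```

After blending, the black line no longer alternates between full black and full white on adjacent lines, so the 25 Hz flicker between fields is much less visible; the price, as noted above, is lost sharpness (and, on the real hardware, CPU time).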
Commodore also made (well, Philips did ;-) a high-persistence monitor (the 2080) for interlaced use, on which the flicker wasn't as bad, but all moving objects left a trail.