Originally Posted by Mr B
So some screens are more picky than others, and show these lines, while others don't?
Yup, it's all to do with pixel clocks and uniformity of signal. In a perfect world the signal on the RGB lines would transition cleanly from one pixel to the next, and remain perfectly uniform for the duration of each pixel. In reality the signal doesn't behave quite like that, and a badly-constructed DVI-I->VGA adapter makes it significantly worse.
My understanding is that CRT monitors pretty much display the signal as is, so you don't notice such effects on a CRT, but TFTs sample the incoming VGA signal and break it up into a specific number of pixels. (They tend to guess how many pixels to expect, based on the scanrates, and since a lot of Amiga modes are non-standard, even if they're displayable the monitor will sometimes assume a weird number of pixels.)
If you can get the monitor's pixel clock perfectly in sync with the output pixel clock, you won't see the stripes. If they're not in sync, the position of the sampling window relative to the incoming data varies from pixel to pixel, producing an interference pattern. If the signal were perfectly uniform for the duration of each incoming pixel (which of course is impossible in the real world), that wouldn't matter - though you can sometimes get a drifting effect from sharp to fuzzy to sharp to fuzzy across the width of the screen. But if the transitions from pixel to pixel aren't clean - messed up by a bad DVI->VGA converter, for instance - those interference patterns manifest as stripes. (You can test this by fiddling with the monitor's pixel clock setting - you'll probably see the pitch of the stripes change.)
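If anyone's curious, the effect is easy enough to simulate. Here's a little toy sketch of the idea (all the names and numbers are made up for illustration, not taken from any real monitor): a scanline of alternating black/white pixels, modelled with either clean or slow edges, then resampled by a "monitor" that assumes its own pixel count per line.

```python
# Toy model of a TFT resampling an analog VGA scanline.
# All parameters are hypothetical, chosen just to show the effect.

SRC_PIXELS = 640    # pixels the source actually outputs per line
OVERSAMPLE = 100    # sub-samples per source pixel in our "analog" model

def analog_line(edge_fraction):
    """One scanline of alternating black (0.0) / white (1.0) pixels.
    edge_fraction: portion of each pixel spent slewing to the new level
    (0.0 = ideal square edges; larger = slower edges, e.g. from a bad
    DVI->VGA adapter)."""
    signal = []
    for px in range(SRC_PIXELS):
        target = 1.0 if px % 2 else 0.0
        prev = 1.0 - target
        for s in range(OVERSAMPLE):
            t = s / OVERSAMPLE  # position within this pixel period
            if t < edge_fraction:
                # still mid-transition between the old and new level
                level = prev + (target - prev) * (t / edge_fraction)
            else:
                level = target
            signal.append(level)
    return signal

def monitor_sample(signal, monitor_pixels):
    """Sample the line as a monitor that assumes `monitor_pixels` pixels
    per line: one sample at the centre of each assumed pixel."""
    n = len(signal)
    return [signal[int((px + 0.5) * n / monitor_pixels)]
            for px in range(monitor_pixels)]

def smear(samples):
    """Worst-case distance of any sample from a clean black/white level.
    0.0 means every sample landed on a valid pixel value; anything larger
    means some samples caught a transition - the stuff stripes are made of."""
    return max(min(v, 1.0 - v) for v in samples)
```

With clean edges, a mismatched clock samples wrong positions but still lands on valid levels, so `smear(monitor_sample(analog_line(0.0), 652))` is 0. A matched clock forgives slow edges too, since centre-of-pixel samples miss the transitions: `smear(monitor_sample(analog_line(0.3), 640))` is also 0. But slow edges plus a mismatched clock (monitor guessing 652 pixels instead of 640) puts some samples mid-transition, and `smear` comes back nonzero - the sampling position drifts in and out of the transition region across the line, which is exactly the periodic striping.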
That's my understanding anyway - I might well have oversimplified - or be flat-out wrong!