So what exactly is this issue to which you refer?
Indeed. I will explain my beef in as few words as possible, but will assume everybody has some familiarity with the following keywords:
NTSC: (AKA 480i) TV standard for North America and Japan - 480 visible lines at 30 interlaced frames (i.e. 60 fields) per second.
PAL: (AKA 576i) Predominantly European TV standard - 576 lines at 25 interlaced frames (i.e. 50 fields) per second.
720p: International high definition TV standard #1: 720 lines at various progressive frame rates (commonly 50 or 60 per second).
1080i: International high definition TV standard #2: 1080 lines at various interlaced frame rates (commonly 25 or 30 frames, i.e. 50 or 60 fields, per second).
Lines: The horizontal scan lines that make up a TV picture.
Interlacing: Separation of the odd and even scan lines of a TV picture into two fields that are stored/transmitted/displayed consecutively.
Progressive: Not interlaced, i.e. all scan lines are stored/transmitted/displayed together, in a single pass from top to bottom (see the sketch just below).
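To nail down the vocabulary before the ranting starts, here's a minimal sketch in Python with numpy (my choice for illustration throughout - nothing in the standards mandates it) of what splitting a frame into fields actually means:

    import numpy as np

    # A toy greyscale "frame": 480 lines tall, 640 pixels wide.
    frame = np.zeros((480, 640), dtype=np.uint8)

    # Interlacing: separate the scan lines into two half-height fields.
    even_field = frame[0::2]  # lines 0, 2, 4, ... (240 lines)
    odd_field  = frame[1::2]  # lines 1, 3, 5, ... (240 lines)

    # A progressive display shows `frame` as-is, top to bottom.
    # An interlaced display shows even_field, then odd_field, consecutively.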
The common misconception
Interlacing is often described as having been essential, to an extent, given the technological restrictions of old TV sets and the limits of transmission bandwidth. This much is widely known, but it isn't in itself a problem so long as interlacing merely means breaking a complete frame into two separate fields. For example, TV broadcasts of material shot on film at 30 complete frames per second do just that: 30 complete frames, each broken into two halves for transmission and display, with the TV stitching them back together with a little help from your own perception.

Images captured on video, however, go way beyond this, and into the realm of the largely overlooked. Put bluntly, NTSC video sources do not capture 30 frames per second. In fact they never capture even one complete frame. What they actually capture is alternate lines from 60 distinct moments of time per second, which basically screws up everything that comes after. To elaborate: if the camera or the subject is moving (and a subject that never moves rather defeats the object of not using a photograph), then the even lines of each "frame" are captured a full 60th of a second after the odd lines of what is supposedly the same frame, during which time things have moved. The result is a ghastly comb effect, where two distinct images clearly overlap.
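If that's hard to picture, here's a little sketch of how the comb gets there - Python with numpy again, with a made-up white bar standing in for a moving subject:

    import numpy as np

    def scene_at(t):
        # A white vertical bar on black, 8 pixels further right each field.
        img = np.zeros((480, 640), dtype=np.uint8)
        x = 100 + 8 * t
        img[:, x:x + 40] = 255
        return img

    # An interlaced camera never captures a whole frame: the even lines
    # come from one moment, the odd lines from 1/60th of a second later.
    frame = np.empty((480, 640), dtype=np.uint8)
    frame[0::2] = scene_at(0)[0::2]  # even lines: time 0
    frame[1::2] = scene_at(1)[1::2]  # odd lines: 1/60 s later

    # The bar's edges now sit at different positions on alternate lines:
    # that jagged overlap of two distinct images is the comb effect.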
How this is a good thing
When displayed at full size and normal speed on an interlaced cathode ray television that isn't too big, you get, amongst the flickering, a smoother sense of movement.
How this screws us right up
There are no complete frames. None. When you pause a video or DVD, the player has to either throw away one of the fields and up-sample the other (i.e. invent new in-between lines based on the surrounding ones) to fill the screen, which looks crap, or up-sample both fields and merge them into a slightly blurry mess, which looks crap. If you have a modern LCD, plasma or projector TV, all of which are progressive by design, then interlaced TV looks crap. If you want to resize the picture for picture-in-picture etc. to anything other than an exact half, quarter and so on, it looks crap. You get the picture. Interlacing doesn't get the picture. Its pictures are left wanting.
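For the record, here's roughly what those two crap options amount to - a sketch, not any particular player's actual code:

    import numpy as np

    def pause_by_discarding(frame):
        # Option 1: throw away the odd field, stretch the even one to
        # full height by inventing the in-between lines.
        field = frame[0::2].astype(np.float32)
        out = np.repeat(field, 2, axis=0)
        out[1:-1:2] = (out[:-2:2] + out[2::2]) / 2  # interpolate the gaps
        return out.astype(np.uint8)                 # half the detail is gone

    def pause_by_blending(frame):
        # Option 2: up-sample both fields to full height, then average
        # them together (the roll just lines the odd field up first;
        # its wrap at the bottom line is a sketch-level shortcut).
        even = pause_by_discarding(frame)
        odd = pause_by_discarding(np.roll(frame, -1, axis=0))
        return ((even.astype(np.uint16) + odd) // 2).astype(np.uint8)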
Crap crap crap
Crap crap crap crap crap
Crap crap crap crap crap crap crap
And then along comes High Def
A new global TV standard, with ultra-modern technology used in capture, storage, transmission and display, surely means we can finally put to rest the ridiculously outdated practice of interlacing at source, right? Right? Sadly not. Of the two officially recognised HDTV standards, one still requires sodding interlacing to squeeze into a completely new set of transmission bandwidth limitations. Ironically, the one they call "full high-def" is actually of lower quality than the other for this very reason. Here in Japan, most of the regular channels are broadcast in HDTV, all using the interlaced standard, and while it's not as bad as NTSC or PAL, it's still noticeable close up or during fast action such as the women's volleyball, which Japan didn't win.
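The bandwidth excuse boils down to simple arithmetic. These raw pixel rates ignore compression and blanking, so treat them as illustrative rather than what actually goes over the air, but they show the trade-off that was made:

    # Raw luma pixel rates (pixels per second) - illustrative only.
    formats = {
        "720p60  (progressive)": 1280 * 720 * 60,
        "1080i60 (interlaced)":  1920 * 1080 * 30,  # 60 fields = 30 frames
        "1080p60 (the dream)":   1920 * 1080 * 60,  # roughly double the rate
    }
    for name, rate in formats.items():
        print(f"{name}: {rate / 1e6:.1f} Mpixels/s")

    # 720p60: 55.3, 1080i60: 62.2, 1080p60: 124.4
    # 1080i exists because 1080p60 wouldn't fit the pipe; interlacing buys
    # the extra lines by halving the temporal resolution.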
Why is everybody ignoring this?
This whole issue seems to be ignored to the point that even some experts have no idea it exists. Trying to find out about it is near impossible, as there doesn't seem to be any terminological distinction between the interlaced transmission of a progressive frame and video that is interlaced at source, and next to nothing is written about the latter. Non-HD video cameras don't offer the option of progressively segmenting the frames, even though doing so would be very easy, while NTSC/PAL-quality video from digital still cameras is progressive by default, with no interlace option and no attention drawn to this either way.

I have DVDs and want to know whether they were captured from an interlaced or a progressive source. The DVD software can't tell me, and the box frequently LIES, claiming a film source when the material was actually captured on interlaced video. Some DVDs supposedly from a film source are even stored pre-interlaced, complete with 3:2 pulldown, so they'll only ever look right on a non-widescreen cathode ray television (film-sourced DVDs are supposed to be stored at 24 frames per second, with the player applying the pulldown to 30 frames per second on the fly, so that they can look great on newer progressive displays).
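For anyone unfamiliar with 3:2 pulldown, the cadence is simple enough to show in a few lines - a sketch with letters standing in for film frames:

    # 3:2 pulldown: 24 film frames/s become 60 fields/s (30 video "frames"/s).
    # Four film frames A, B, C, D are dealt out as 3, 2, 3, 2 fields.
    film = ["A", "B", "C", "D"]
    fields = []
    for i, f in enumerate(film):
        fields += [f] * (3 if i % 2 == 0 else 2)  # A A A B B C C C D D

    # Pair consecutive fields into video frames:
    video_frames = [fields[i] + fields[i + 1] for i in range(0, len(fields), 2)]
    print(video_frames)  # ['AA', 'AB', 'BC', 'CC', 'DD']

    # 'AB' and 'BC' mix two different film frames - harmless on an
    # interlaced CRT, but baked-in comb everywhere else unless the player
    # can reverse the pulldown.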
Then there are relatively simple things I wish to do, like create a 320 by 240, 60 fps video from an NTSC source. All the material is there - 60 moments in time per second, captured as 240 lines each - so each field would just need to be halved horizontally and we'd be done. But no software I have will do that. It'll happily make 30 frames per second at that size after blending each pair of fields and down-sampling the result, and if I try to set the output to 60 fps, it just gives me each blended frame twice, one after the other. Maybe I'm asking too much of my professional software, but the fact that it doesn't even ASK what I want done about interlacing - beyond whether to blur the whole thing or cut half of it off, in the case of the higher-end software - is very telling indeed. Trying to make a DVD from a 1080i high-def source? It'll happily anamorphise it for me, but only to 30 fps progressive. If I want it interlaced so it plays nicer on a cathode ray tube TV, I'm out of luck, because the option just doesn't appear to have been thought of.
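Just to show how little is being asked here, the whole conversion fits in a sketch. It assumes 640 by 480 greyscale frames already decoded into numpy arrays (real footage is usually 720 wide with colour, and may be bottom-field-first, in which case the field order needs swapping):

    import numpy as np

    def field_to_frame(frame, parity):
        # Pull out one field (240 lines from a single moment in time)
        # and halve it horizontally by averaging pairs of pixels.
        field = frame[parity::2].astype(np.uint16)
        return ((field[:, 0::2] + field[:, 1::2]) // 2).astype(np.uint8)

    def ntsc_to_60fps(frames):
        # 30 interlaced "frames" in, 60 true 320x240 frames out.
        # No blending, no discarded fields, no lost moments in time.
        out = []
        for frame in frames:
            out.append(field_to_frame(frame, 0))  # field captured first
            out.append(field_to_frame(frame, 1))  # field from 1/60 s later
        return out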
Interlacing makes good video bad, and stabs you in the back when you want to use it for good.