Wednesday, November 22, 2006

The Abomination of Interlacing

It never ceases to astound me that so few people with an interest (professional or otherwise) in television or film-making ever want to talk about what could well be the single most important factor in lowering the overall quality of TV images. Seeing that other phenomena, such as the 3:2 pulldown trick used to fit 24 frame per second film onto a 30 frame per second display device, are well known and documented, I wonder if the reason interlacing is ignored is simply a case of the Emperor's new clothes?

So what exactly is this issue to which you refer?
Indeed. I will explain my beef in as few words as possible, but will assume everybody has some familiarity with the following keywords:

NTSC: (AKA 480i) TV standard for North America and Japan - 480 lines at 30 interlaced frames per second.
PAL: (AKA 576i) Predominantly European TV standard - 576 lines at 25 interlaced frames per second.
720p: International high definition TV standard #1: 720 lines, progressive, at a range of frame rates.
1080i: International high definition TV standard #2: 1080 lines, interlaced, at a range of frame rates.
Lines: The horizontal scan lines that make up a TV picture.
Interlacing: Separation of the odd and even scan lines of a TV picture into two fields that are stored/transmitted/displayed consecutively.
Progressive: Not interlaced, i.e. all scan lines are stored/transmitted/displayed from top to bottom/all at once.
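The split described in those last two definitions is easy to picture in code. Here's a minimal sketch (Python with NumPy, names my own) of breaking a progressive frame into its two fields and weaving them back together:

```python
import numpy as np

# A fake 480-line, 640-pixel-wide "frame": each pixel just stores its
# own line number, so the split is easy to see.
frame = np.repeat(np.arange(480)[:, None], 640, axis=1)

top_field = frame[0::2]      # even lines (0, 2, 4...) - 240 lines
bottom_field = frame[1::2]   # odd lines (1, 3, 5...)  - 240 lines

# Weaving the two fields back together recovers the frame exactly -
# but only because both fields came from the same instant in time.
rebuilt = np.empty_like(frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field
```

The whole argument below hinges on that last comment: weaving is only lossless when both fields belong to the same moment.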

The common misconception
Interlacing is often described as having been to some extent essential, given the technological restrictions of old TV sets and the limits of transmission bandwidth. This much is widely known, and it isn't in itself a problem as long as interlacing merely means breaking a complete frame into two separate fields. TV films, where the action is captured on film, do just that: complete frames broken into two halves for transmission and display, with the TV stitching them back together with a little help from your own perception.

Images captured by video cameras, however, go way beyond this, and into largely overlooked territory. Put bluntly, NTSC video sources do not capture 30 frames per second. In fact they never capture even one complete frame. They actually capture alternate lines from 60 distinct moments of time per second, which basically screws up everything that comes after. To elaborate: if the camera or subject is moving (and not moving rather defeats the object of not using a photograph), then the even lines of each "frame" are captured a full 60th of a second after the odd lines of what is supposedly the same frame, during which time things have moved. The result is a ghastly comb effect, where two distinct images clearly overlap.

How this is a good thing
When displayed at full size and normal speed on an interlaced cathode ray television that isn't too big, you get, amongst the flicker, a smoother sense of movement.

How this screws us right up
There are no complete frames. None. When you pause an image on a video or DVD, the player has to either cut one of the fields and up-sample the other (i.e. invent new in-between lines based on the surrounding ones) to fill the screen, which looks crap, or merge the two fields into a slightly blurry mess while up-sampling both to fill the screen, which also looks crap. If you have a modern LCD, plasma or projector TV, all of which are progressive by design, then interlaced TV looks crap. If you want to resize the picture for picture-in-picture etc. to anything other than an exact half, quarter and so on, it looks crap. You get the picture. Interlacing doesn't get the picture. Its pictures are left wanting.
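Those two pause-time options are easy to sketch. A toy illustration in Python/NumPy (my own simplification, not any actual player's algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
interlaced = rng.integers(0, 256, (480, 640)).astype(float)
top = interlaced[0::2]      # field 1: the 240 even lines
bottom = interlaced[1::2]   # field 2: the 240 odd lines, 1/60s later

# Option 1 ("cut one field"): throw the bottom field away and up-sample
# the top one, inventing each missing line by averaging its neighbours.
bob = np.empty_like(interlaced)
bob[0::2] = top
bob[1:-1:2] = (top[:-1] + top[1:]) / 2
bob[-1] = top[-1]           # the last line has no neighbour below it

# Option 2 ("blend"): merge the two fields into one slightly blurry
# frame, mixing two different moments in time on every line.
merged = (top + bottom) / 2
blend = np.empty_like(interlaced)
blend[0::2] = merged
blend[1::2] = merged
```

Option 1 discards half the vertical resolution; option 2 keeps it all but smears two moments together. Neither can give you back a frame that was never captured.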

Left as is
Crap crap crap

Two fields blended
Crap crap crap crap crap

One field cut altogether
Crap crap crap crap crap crap crap

And then along comes High Def
A new global TV standard with ultra-modern technology used in capture, storage, transmission and display surely means we can finally retire the ridiculously outdated practice of interlacing at source, right? Right? Sadly not. Of the two officially recognised HDTV standards, one requires sodding interlacing to fit a completely new set of transmission bandwidth limitations. Ironically, the one they call "full high-def" is actually of lower quality than the other for this very reason. Here in Japan, most of the regular channels are broadcast in HDTV, all using the interlaced standard, and while it's not as bad as NTSC or PAL, it's still noticeable close up or during fast action, such as the women's volleyball which Japan didn't win.

Why is everybody ignoring this?
This whole issue seems to be ignored to the point that even some experts have no idea it exists. Trying to find out about it is nigh on impossible, as there doesn't seem to be any terminological distinction between an interlaced transmission of a progressive frame and video that's interlaced at source, and next to nothing is written about the latter. Non-HD video cameras don't offer the option to capture progressively segmented frames, even though doing so would be very easy. NTSC/PAL-resolution video taken from digital still cameras is progressive by default with no interlace option, nor any attention drawn to the fact.

I have DVDs and want to know whether they were captured from an interlaced or a progressive source. I can't find out via DVD software, and the box frequently LIES, claiming a film source when the material was actually captured on interlaced video. Some DVDs supposedly from a film source are even stored pre-interlaced, complete with 3:2 pulldown baked in, so they'll only ever look right on a non-widescreen cathode ray television (film-source DVDs are supposed to be stored at 24 frames per second, with the player applying the pulldown to 30 frames per second on the fly, so that the same disc can look great on newer progressive displays).
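For reference, 3:2 pulldown itself is nothing more than a repeating field pattern. A quick sketch of how four film frames become ten video fields (five interlaced frames):

```python
# 3:2 pulldown: 4 film frames (24fps) -> 10 fields (60Hz), i.e. 5
# interlaced frames (30fps). Letters name the source film frame.
film = ["A", "B", "C", "D"]
fields = []
for i, source in enumerate(film):
    fields += [source] * (3 if i % 2 == 0 else 2)   # 3, 2, 3, 2...

# Pair consecutive fields into interlaced frames.
frames = [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]
# -> [('A','A'), ('A','B'), ('B','C'), ('C','C'), ('D','D')]
# The second and third frames each mix two different film frames -
# exactly what a progressive display has to detect and undo
# ("inverse telecine") to recover the original 24 frames.
```

A disc stored with this pattern baked in forces every player to reverse it; a disc stored at 24fps lets the player apply it only when an interlaced display actually needs it.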

Then there are relatively simple things I wish to do, like create a 320 by 240, 60 fps video from an NTSC source. All the material is there - 60 moments in time captured as 240 lines each - it just needs to be halved horizontally and we're done. But no software I have will do it. It'll happily make 30 frames at that size after blending the two fields and down-sampling, and if I set the output to 60, it just gives me each blended frame twice, one after the other. Maybe I'm asking too much of my professional software, but the fact that it doesn't even ASK what I want done about interlacing, beyond whether to blur the whole thing or cut half of it off in the case of the higher-end packages, is very telling indeed. Trying to make a DVD from a 1080i high-def source? It'll happily anamorphise it for me, but only to 30fps progressive. If I want it interlaced so it plays nicer on a cathode ray tube TV, I'm out of luck, because the option just doesn't appear to have been thought of.
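For what it's worth, the conversion I'm after takes about a dozen lines. A sketch in Python/NumPy (assuming each interlaced frame arrives as a 480x640 array; real NTSC capture is usually 720 wide, and field order varies):

```python
import numpy as np

def bob_to_60fps(interlaced_frames):
    """Turn 30 interlaced 480x640 frames into 60 progressive 240x320
    frames: each field becomes a frame of its own, and the width is
    halved (by averaging pixel pairs) to restore the aspect ratio."""
    out = []
    for frame in interlaced_frames:
        # Two fields = two distinct moments in time, ~1/60s apart.
        for field in (frame[0::2], frame[1::2]):
            out.append((field[:, 0::2] + field[:, 1::2]) / 2)
    return out

clip = [np.random.rand(480, 640) for _ in range(30)]   # one second
sixty = bob_to_60fps(clip)
```

Every field ends up as a full frame at its own moment in time; nothing is blended, nothing thrown away.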

Interlace makes good video bad, and stabs you in the back when you want to use it for good.

Friday, November 17, 2006

PLAYSTATION®3 - Warning! Inconvenience!

Update As most of the people coming to this page seem to be after the same answer I was initially looking for, here it is answered by a kind poster below:

Hold the power button for 5 or more seconds when starting the PS3 and it will reset the system to use the default resolution.

Now this of course does not excuse Sony for this frankly lousy implementation. How can the system not know what cable is plugged in, especially when (with the exception of HDMI) they're special Sony cables with connector interfaces designed exclusively for use with the PS3?! The elaborate cable-select screen, with its detailed diagrams of each possible cable you could have plugged in, was probably intended to be "helpful", but the user should never need to be presented with such a screen.

Nintendo figured it out. If the cable plugged in is a composite or S-video type, then Progressive is greyed out leaving the interlace option selected. If the cable is component or D-Terminal, then the Progressive option becomes selectable. You can only choose possible and meaningful options. This is how it should be.
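Nintendo's rule amounts to a few lines of logic. A sketch of that Wii-style behaviour (the names here are my own invention, not any console's actual API):

```python
# Only offer the output modes the connected cable can actually carry.
def selectable_modes(cable):
    if cable in ("composite", "s-video"):
        return ["interlaced"]                  # progressive greyed out
    if cable in ("component", "d-terminal"):
        return ["interlaced", "progressive"]
    raise ValueError(f"unknown cable: {cable}")
```

Impossible combinations never reach the user, so there is nothing to misconfigure.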

The other issue that keeps coming up is "why won't my PS3 output PAL?" It won't unless it's a) a debug kit for developers, or b) it was made for release in a PAL region. The latter will not be available until March at current forecast, and the former you can pretty much forget about, so if you only have a PAL TV then DON'T BOTHER GETTING A PS3 YET! WAIT FOR THE OFFICIAL RELEASE. Or better still, get an Xbox 360 and Wii combination - it'll cost the same, and you'll thank me for it later!

Update ends

Although this is the first time I've mentioned the PS3, it's not as though this is the first time I've discovered problems with it. It's simply that my dislike for Sony is such that I'd never stop if I started, so I normally start from the assumption that Sony and their products will be problematic, and only blog when I'm proven wrong. Which is why Sony has yet to be mentioned.

Nonetheless, I shall make an exception here with a warning of dire inconvenience to all potential PS3 owners.

For clarification of how things should be done, let's take the scenario that you're in a PAL region and have a PS2 with a game disc that allows 60Hz play. For the unfamiliar, 50Hz (i.e. the screen refreshes 50 times per second) is part of the European PAL standard, but some PAL TVs can also display 60Hz (part of the Japanese and American NTSC standard), which arguably allows smoother animation, and some game discs accommodate these TVs. Invariably, when you put the disc in, it defaults to the 50Hz PAL standard but asks if you wish to try out 60Hz play. If you can't see the picture, it reverts to 50Hz without causing any problems. Everybody's happy!

The PS3 of course takes all this into a whole new dimension. Not only can it theoretically handle both PAL (50Hz) and NTSC (60Hz) (untested), it also supports a number of high definition TV standards that are all completely backwards-incompatible and require separate cables. Out of the box, the PS3 is set to NTSC or PAL as standard depending on your region. If you want to connect it to an HDTV set, the HOME menu is happy to talk you through the process - make sure it's connected via the correct cable, choose a setting (PAL/NTSC, 750p/1125i etc.), test, confirm. It's actually more complicated than it needs to be (after all, there's no reason it shouldn't detect for itself what cable you've plugged in...), but once done it works rather well. The problem comes later.

Let's say that I was behind the times enough to be using an old regular NTSC cathode ray tube based TV like so many people born in the 20th century still are, but for some reason have been futuristic enough to have invested in a PS3. I take it round to a mate's house who has a super 65" full high def plasma display, we connect via HDMI and set the output to hi-def 1080P, and it is glorious! We can really see the sweat on Mario's face as he leaps over that rotting toadstool! That evening, my mate leaves the country forever, and I take my PS3 back home where I switch it back to the regular AV cable and plug it into my NTSC monitor. Hang on?! Why's there no picture? Cables plugged in, check! TV switched on, check! Ah... I remember, I set the output to Hi-Def via the HDMI cable, and now there is NO OTHER OUTPUT AVAILABLE. I could try to bluff my way through the HOME menu to switch it back to NTSC, but it was so damned convoluted that it's hard work doing so when you CAN see it! So not only did Sony lack the foresight to use a system of auto-detection to see what cable is connected (after all, at the PS3 side, all the cables share the same socket, and it would be astoundingly easy to do), they also failed to provide any kind of simple auto-reset, or a constant NTSC output via composite cable (the yellow one).

I'm sure other products such as digital cable boxes share similar faults, but I'm picking on Sony specifically because I don't like them, and I just had to walk all around my company offices several times carrying the 5 kilo beast in one hand and wasted 30 minutes waiting for an HD set to free up just so I could enable the damned thing to work with the industry standard NTSC box on my own desk. Shame on you Sony! Sort it out!

Update Just to add the following taken from the new Playstation website...

Copyright-protected Blu-ray video discs can only be output at 1080p using an HDMI cable connected to a device that is compatible with the HDCP (High-bandwidth Digital Content Protection) standard.

Your eyes could almost skip right over that, sitting there so innocent at the bottom of the page, looking all common sense and sweet. On the contrary, what this sentence really means is "WE DO NOT TRUST YOU!"

"... can only be output at 1080p using an HDMI cable ...(and a compatible telly)" seems innocuous enough - if you want da higher def, ya gotta get da higher def cables! But look again at the chart. Component cables too are capable of outputting 1080p, so why is it that certain kinds of discs (i.e. all retail movies in the Blu-ray format) will not output at HD definition without you purchasing an [b]expensive cable[/b] (and expensive compatible TV) that utilises a secret encryption protocol in an attempt to [b]prevent people from casually taking high quality digital copies[/b]? The mind boggles. It certainly isn't one of those issues of technology that will sort itself out over time. This is absolutely intentional and here to stay, because as far as they're concerned if you have the ability to even read a disc, then you will steal the movie you already bought! Makes you wonder what you're paying for when you buy these things, because it used to be that being able to play the movie was kind of the reason - and now they've made that impossible without first chaining you to your armchair. What's next? An optional disable on the fast forward function? A watermarking system to prevent you taking video captures of the screen? A system for making your movies completely unwatchable by remote on the off-chance a machine from the same batch gets hacked? Meantime, all the poor sods who invested in the earlier pre-HDMI HDTVs are left out in the rain, as Blu-ray movies played on their new PS3s will look no better than a regular DVD.

Just in case you think I'm exaggerating a bit here, take a look at the Boycott HD-DVD link further up the page. For those that have been following this, none of it is news I'm afraid.

Saturday, November 11, 2006

New Apple Commercials

Featuring my good mates the Rahmens! I'm not sure Jin Katagiri really comes across as a Windows box - more Linux, I would have said. Kentaro Kobayashi does an interesting Uniqlo Mac too... Still, an interesting take on the series nonetheless. NOW with subtitles for the hard of Japanese.

In other Rahmens related news, a new Japanese Tradition was broadcast this week in Hi-Def on BS Japan. It's called "Utage" or "The art of after hours office parties" and no doubt coming to a YouTube near you soon, no thanks to me. No, I'm not putting it up.

UPDATE See also the new year special, another three, and the latest one.