I forget exactly when it was, but I once alluded to the fact that “seeing” a film and really seeing it are two different things. Because the push to keep improving the quality of visual presentation is under ever-growing threat, now is a good time to explain what that difference is.
I grew up in a family that wavered up and down between the middle three income brackets. As was often the case with such middle-class families, ours had a VCR or three in the home. Laserdisc notwithstanding, such was the only real means available to see a film in the convenience of one’s home. But as with early essays in any craft, it came with profound limitations.
Anyone who knows what low-level noise is (basically, random noise riding on the black portions of the picture signal) knows that true black is not possible on VHS because of it. And then there’s the fact that widescreen pictures tend to look like ass on VHS because of how poor the format’s base resolution is (roughly 240 lines of horizontal luminance resolution).
Contrary to what those who wish to keep us stuck in the twentieth century would like you to think, DVD is not the optimum home video experience, either. Whilst its limitations are less profound than those of its predecessor, there are limitations, and some of them are quite severe.
The first example is the MPEG-2 video compression codec that is used to make the video data fit onto the disc. Yes, boys and girls, they have to compress the video data in order to make it fit. Even when you leave aside the fact that DVD has a maximum resolution of 720 by 576 and is interlaced, the reduction rate and inefficiency of MPEG-2 compression make it impossible for the film being presented on it to look exactly like the master. One of the primary complaints about the MPEG-2 codec that led to it being sidelined on the Blu-ray Disc specification is that it does not handle noise or grain well. And as anyone who knows anything about film can tell you, feature films can be very noisy, grainy things. Noise can come from using film stock with too high an ISO rating, and grain is basically a film’s equivalent of beard stubble: a freshly-twenty man might show only a slight shadow of it, but an eighty-year-old will never be able to get rid of it. Features that were shot on film simply cannot be transferred to video without exhibiting at least some degree of grain. This rant piece by Bill Hunt explains, in terms anyone can understand, why indiscriminately scrubbing grain out of your image is a bad thing. The executive summary is that indiscriminately scrubbing grain out of your image also scrubs detail out of your image, and on a format that aspires to show more detail than any home format that has come before, that is bad.
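For the sceptical, here is a tiny Python sketch of what “scrubbing detail along with the grain” means. It is nothing like the DNR tools an actual studio would use; it just blurs a made-up scanline with a crude averaging filter and measures how much of the random grain, and how much of the genuine fine detail, survives the treatment. All of the signals and numbers are invented purely for illustration.

```python
# A toy sketch (nothing like a studio DNR pipeline): a crude noise filter
# that simply averages neighbouring pixels does reduce the grain, but it
# flattens the fine, legitimate picture detail right along with it.
import numpy as np

rng = np.random.default_rng(0)

x = np.arange(720)                       # one scanline's worth of samples
detail = np.sin(x / 2.0)                 # fine picture detail we want to keep
grain = rng.normal(0.0, 0.3, x.size)     # random grain we would like to lose

kernel = np.ones(9) / 9.0                # "indiscriminate" 9-tap blur

def rms(signal):
    """Root-mean-square amplitude: how much of a signal is left."""
    return float(np.sqrt(np.mean(signal ** 2)))

# Because the blur is linear, we can filter the grain and the detail
# separately and see what it does to each of them.
print("grain  before/after blur:", round(rms(grain), 3),
      round(rms(np.convolve(grain, kernel, mode="same")), 3))
print("detail before/after blur:", round(rms(detail), 3),
      round(rms(np.convolve(detail, kernel, mode="same")), 3))
```

The blur has no idea which fluctuations are grain and which are picture, so it attenuates both by roughly the same amount. That is the whole problem in one toy example.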
One of the reasons why digital noise reduction has been applied so excessively in the past is that MPEG-2 compression cannot handle grain very well. MPEG-2 encodes most frames in terms of how they differ from the frames that came before. If the images being compressed basically consist of a person talking in front of a static background, then MPEG-2 squeals in delight, because the changes from one image to the next tend to be very subtle. Even if the person speaking waves their hands a lot, it is still pretty easy to track those changes, unless the hand movements are happening at Kung Fu film speeds. But the thing about film grain, video noise, high-ISO noise, and other random patterns in the picture is that not only do they move unpredictably, they amount to a large number of unsubtle changes, and those changes are as quick as they are large. MPEG-2 simply cannot cope with that. This is just one reason why DVD has to have so much of the detail scrubbed out of the source material before it can be put into production.
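To put some toy numbers on that, here is a small Python sketch. It is not an MPEG-2 encoder; it simply measures the frame-to-frame difference, which is the part a temporal encoder actually has to spend bits on, for a static scene versus the same scene carrying a fresh layer of random grain on every frame. The frame size matches PAL DVD; everything else is made up.

```python
# Toy model of why random grain hurts temporal compression: an encoder that
# predicts each frame from the previous one only has to code the residual
# (the difference).  A static scene leaves almost nothing to code; fresh
# random grain on every frame leaves a large residual.
import numpy as np

rng = np.random.default_rng(1)
height, width = 576, 720                    # PAL DVD frame size

scene = rng.integers(0, 256, (height, width)).astype(np.float64)

# Clean case: the next frame is the same scene (a talking head in front of
# a static background, say).
clean_prev, clean_next = scene, scene.copy()

# Grainy case: the same scene, but each frame carries its own random grain.
grainy_prev = scene + rng.normal(0, 8, scene.shape)
grainy_next = scene + rng.normal(0, 8, scene.shape)

def residual_energy(prev, nxt):
    """Mean absolute difference the encoder would have to spend bits on."""
    return float(np.mean(np.abs(nxt - prev)))

print("static scene residual:", round(residual_energy(clean_prev, clean_next), 2))
print("grainy scene residual:", round(residual_energy(grainy_prev, grainy_next), 2))
```

The static scene leaves a residual of zero; the grainy one leaves a residual on every single pixel, every single frame, and none of it can be predicted from what came before.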
Another serious problem, even more detrimental than the grain intolerance, is called interlacing. Wind the clock back to the early decades of the twentieth century, when experiments began in broadcasting and receiving messages combining picture with sound. Two things became apparent to the engineers who were developing what would become the NTSC standard (if you ever wondered what that acronym stands for, here it is: National Television System Committee). One, the resolution of the broadcast signal would be very low; originally, it was not even a tenth of what could be achieved with 35mm film. Two, and more detrimental to the format, the bandwidth was simply not available to broadcast one whole image at a time. So the powers that be decided that the image should be broken up into lines and broadcast in a fashion that defies rational explanation. First, the odd-numbered lines (1, 3, 5, all the way up to 575 in PAL) are shown. Then the even-numbered lines (2, 4, 6, up to 576) are shown.
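For anyone who wants to see the split spelled out, here is a minimal Python sketch using the PAL line counts above. The frame contents are invented; the only point is that each field carries half the lines and never a complete picture.

```python
# A minimal sketch of what interlacing does to a frame: the 576 picture lines
# are split into two fields that are transmitted (and displayed) one after
# the other, never as one complete image.
import numpy as np

frame = np.arange(1, 577).reshape(576, 1) * np.ones((1, 720))  # line N holds value N

odd_field  = frame[0::2]   # lines 1, 3, 5, ... 575 (array indices 0, 2, 4, ...)
even_field = frame[1::2]   # lines 2, 4, 6, ... 576

print(odd_field.shape, even_field.shape)        # (288, 720) each: half a picture
print(odd_field[:3, 0], even_field[:3, 0])      # [1. 3. 5.] [2. 4. 6.]
```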
So what’s the problem, I hear you say? Well, I am glad you asked. This is the problem: when a formatting decision gives rise to an entire artefact all by itself, and one of the most annoying (possibly medically deleterious, too) artefacts at that, that is bad. That is a “wish I could go back and undo this” kind of mistake. (While we are on the subject, the Human eye is not designed to see pictures as a pair of unconnected slats that change out of sync with each other. And if your brain happens to be very visually-inclined, watching a video signal that has had excessive levels of interlacing introduced at the transfer stage can induce seizures. This, by the way, is what medically deleterious means.) DVD is interlaced. That anyone thinks we should keep it around in preference to a format that is progressive is yet another indication that our planet has too many Humans on it.
I will be brief about this one. In the context of video, “progressive” means that rather than splitting the image into halves and updating the picture one half at a time, the picture updates one whole frame at a time. This means that when the transfer is truly progressive, you can kiss goodbye to most of the single most annoying family of artefacts in home video.
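And here is a tiny Python sketch of why that matters the moment anything on screen moves. The frame size and the moving edge are invented for the example; the point is that the two fields of an interlaced frame are snapshots of two different instants, so weaving them back together smears a moving edge into a comb, while a progressive frame is one consistent moment in time.

```python
# Sketch of why progressive beats interlaced when something moves: the two
# fields of an interlaced frame are captured a fraction of a second apart,
# so reassembling them produces a comb pattern on any moving edge, while a
# progressive frame is a single consistent snapshot.
import numpy as np

def frame_with_edge(edge_column, height=8, width=16):
    """A tiny test frame: black to the left of the edge, white to the right."""
    frame = np.zeros((height, width), dtype=int)
    frame[:, edge_column:] = 1
    return frame

progressive = frame_with_edge(8)               # one instant in time

field_a = frame_with_edge(8)[0::2]             # odd lines, captured first
field_b = frame_with_edge(10)[1::2]            # even lines, a moment later: the edge has moved

woven = np.empty((8, 16), dtype=int)           # interlaced frame, reassembled
woven[0::2] = field_a
woven[1::2] = field_b

print("progressive edge columns:", np.argmax(progressive, axis=1))  # all 8
print("interlaced edge columns: ", np.argmax(woven, axis=1))        # alternates 8, 10, 8, 10...
```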
You will notice I have not said anything about resolutions, or dot-counts yet. It is worth addressing this because during some conversations, people have asked me about how they “upconvert” “old movies” for BD. Are you fukking serious, people?
Although the methodology of making films has changed dramatically, and in large steps, over the past century, there is one truth of shooting with celluloid that remains in effect today. Actual film, celluloid, has far more resolution than (almost) any digital solution that has been thrown at it. To match the resolution of 35mm film, experts generally agree that a digital image has to be around 4000 pixels tall. Assuming a 1.33:1 aspect ratio for a second, this means a pixel count of about 21,332,000, or in other words 21.332 megapixels. Although still cameras now exist that can capture this resolution and more, such image quality is still an impossible dream for digital video cameras that have to capture twenty-four whole frames every second. And given how much data that many pixels being updated every 24th or 48th of a second would entail, it is not going to happen anytime soon.
Although Blu-ray Disc is an amazing medium in terms of visual (and aural) quality, 1920 by 1080 only adds up to 2,073,600 pixels. This is not even a tenth of the rough calculation for 35mm film. Which brings me to the most fundamental truth of images, whether still or moving. You can convert them downwards to your heart’s content. But if you seriously expect to be able to convert something up from a lower-quality source and say to people “this is it, this is the same quality”, you are in for a serious rebuke.
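For anyone who wants to check the arithmetic in the last two paragraphs, here it is in a few lines of Python. The 4000-line scan is the working assumption stated above, not a measured value.

```python
# Checking the arithmetic from the last two paragraphs: a notional 4000-line
# scan of 1.33:1 (i.e. 4:3) 35mm film versus the Blu-ray and DVD rasters.
# The 4000-line figure is the article's own working assumption.
film_height = 4000
film_width = round(film_height * 4 / 3)         # 5,333 pixels across
film_pixels = film_width * film_height          # 21,332,000

bd_pixels  = 1920 * 1080                        # 2,073,600
dvd_pixels = 720 * 576                          # 414,720

print(f"35mm (assumed 4000-line scan): {film_pixels:,} pixels")
print(f"Blu-ray 1080p:                 {bd_pixels:,} pixels "
      f"({bd_pixels / film_pixels:.1%} of the film figure)")
print(f"DVD (PAL):                     {dvd_pixels:,} pixels "
      f"({dvd_pixels / film_pixels:.1%} of the film figure)")
```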
I also have… acquaintances, I will just say for now, who have poor vision. So I pose to them the following: if you filter a fluid through a series of hoses that each strip a little more of its flavour out, do you expect that fluid to taste exactly the same when it comes out the other end? This is no different. If you start with a poor-quality source, the picture your eyes filter into your brain cannot look any better than that source allows, no matter how good the rest of the chain is. People who have worked in the visual sciences of all sorts have known this for eons.
And then there is the simple fact that even if you were completely (i.e. not merely legally) blind, you would also have to be stone deaf in order not to hear the difference. The highest-quality audio signal that can fit on a DVD is the DTS codec. DTS, an acronym for Digital Theatre Systems, emphasise quality over economy in their audio codecs. Unfortunately, DVD was not designed with quality in mind. The maximum total bitrate of a DVD-Video stream is 10.08 Mbit/s, and every audio stream on the disc subtracts from this. The two versions of the DTS codec that can be placed on a DVD have bitrates of 0.768 Mbit/s and 1.536 Mbit/s, respectively. If the film being compressed is a mindless actioner lasting a total of 90 minutes or less, this is not too much of a problem. Attempt to use this codec with a film that is around three hours in length, however, and you have a problem. And do not get me started on the standard Dolby Digital codec, please. There is not a codec in existence that can achieve a 12:1 compression ratio and have the reconstructed audio signal sound exactly the same as it did before it was compressed.
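To make the budget concrete, here is a quick back-of-the-envelope calculation in Python. The three-hour running time is just the example from above, and the 8.54 GB figure is the nominal capacity of a dual-layer DVD-9.

```python
# Rough numbers behind the paragraph above: how much of a DVD one DTS track
# eats over a three-hour feature, and how much of the 10.08 Mbit/s multiplex
# it claims on top of the video.
MUX_LIMIT_MBPS   = 10.08          # DVD-Video maximum total bitrate
DTS_FULL_MBPS    = 1.536          # full-rate DTS on DVD
DTS_HALF_MBPS    = 0.768          # half-rate DTS on DVD
DVD9_CAPACITY_GB = 8.54           # dual-layer DVD-9, decimal gigabytes

runtime_seconds = 3 * 60 * 60     # a three-hour feature

for name, rate in (("full-rate DTS", DTS_FULL_MBPS), ("half-rate DTS", DTS_HALF_MBPS)):
    size_gb = rate * runtime_seconds / 8 / 1000          # Mbit -> MB -> GB (decimal)
    print(f"{name}: {size_gb:.2f} GB over three hours, "
          f"{rate / MUX_LIMIT_MBPS:.0%} of the multiplex, "
          f"{size_gb / DVD9_CAPACITY_GB:.0%} of a DVD-9")
```

A single full-rate DTS track works out at roughly two gigabytes over three hours, which is space and bitrate the video encoder no longer has to play with.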
On Blu-ray Disc, the option exists to use what is known as a lossless audio codec. Lossless, in very simple terms, means that when you reconstruct the audio signal from the compressed data, you get back exactly what went in, bit for bit, not merely something close enough to pass. When dealing with films that have an overabundance of sound effects and people attempting to talk over them, that makes a very big difference.
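If “lossless” sounds like a marketing word, here is the property demonstrated in a few lines of Python using zlib, which is a general-purpose lossless compressor rather than a Blu-ray audio codec, but the defining guarantee is the same: decode the compressed data and you get back every last bit, not merely something that sounds close.

```python
# "Lossless" in one small demonstration.  zlib is a general-purpose lossless
# compressor, not an audio codec, but the defining property is identical:
# decompressing gives back every bit that went in.
import zlib
import numpy as np

rng = np.random.default_rng(2)
# Fake one second of 48 kHz 16-bit stereo PCM: a tone with a little noise,
# which gives the compressor some redundancy to work with.
t = np.arange(48000)
samples = (8000 * np.sin(t / 30.0) + rng.normal(0, 50, t.size)).astype(np.int16)
pcm = np.repeat(samples, 2).tobytes()          # duplicate each sample into two channels

compressed = zlib.compress(pcm, level=9)
restored = zlib.decompress(compressed)

print("compressed to", f"{len(compressed) / len(pcm):.0%}", "of the original size")
print("bit-for-bit identical after decoding:", restored == pcm)
```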
For these reasons, and more, I submit that there is a very big difference between having “seen” a film and having really seen it. An idiot who tells me that the 1982 production of The Thing is no good is not going to get taken very seriously if he has only seen one of the aliasing-ridden DVDs that were graced with recycled laserdisc transfers. In essence, because he has only seen the film at poor resolution and with an artefact that has been known to make serious videophiles turn the playback off in disgust, he has not really seen the film at all. No amount of babyish whining that the DVD is “good enough” will change that. Recent television series like Game Of Thrones or True Blood are no different. Progressive video, lossless audio… let me put it this way: in season two of True Blood, when the primary antagonist is using her powers to torment her moronic slaves for one of their spectacular failures, the lossless audio really lets one in on the slaves’ discomfort. That is something both film and television producers have been crying out for a way to achieve for years now. Lossy audio clues the listener in too much that the sound is entirely artificial.
This is the difference between “seeing” and really seeing a film on home video. Hopefully, this gives some idea as to why those who are helping to condemn us to inferior formats should go to hell. If not, go to hell.