If you are serious about audio and video, know this: The BETTER the quality of your audio and video setup the LESS forgiving it will be of crappy content! ALL of the defects in your poorer quality content WILL now be seen and heard.
Upgrade your audio system? You may find some of your favorite music tracks are no longer listenable. Upgrade your video system? You may find some of your favorite movie discs are no longer watchable. And no amount of "enhancement" features in your electronics can "fix" such problems. The creation of poor quality content results in a loss of information; a PERMANENT loss of information. The most you can do for it now is "blur" the defects so they become less annoying -- at the risk of also blurring whatever's left that's GOOD in the content.
Things that strike you as odd or wrong in a piece may, of course, just be due to mistakes in the performance, in how it was captured, or in the authoring of the version you are playing. Such mistakes do happen. This is one of the reasons you want to confirm your setup using content of known correctness. E.g., Calibration Discs.
You will also, from time to time, discover cases where the "Artistic Intent" of the material is simply not to your taste. Just as with the story itself, and the acting, you may not like how the art and sound design of a piece are used to tell that story.
But that's not our topic for today. Today's post is about trained professionals, intentionally doing stupid things because they thought they were a good idea! This is Art from the School of Shoddy. Let's check out the Curriculum, shall we?
Louder is... I say LOUDER IS... I SAY *LOUDER* *IS* *BETTER* !!!
Anyone here remember Compact Discs? They were all the rage a few weeks ago. What you MIGHT not remember is that CDs were once considered a wonderful medium for delivering Audiophile quality music! Indeed there were labels, like Telarc, which worked to push the limits of high quality on CDs.
You don't hear much about CDs these days, and if you do it's most likely a comment to the effect they don't sound all that great. Might as well just play an MP3 file! Sounds just as good, and so much more convenient. What the heck happened?
Well, a lot of what happened is the result of what's been dubbed "The Loudness Wars".
As with many concepts adopted by The School of Shoddy, this too starts with a kernel of truth. If you are trying to compare audio systems -- each playing the same track -- you must take particular care that both systems are matched in Volume. Why? Because the brain has a built-in bias in favor of the one that is louder. This is not purely psychological. The human ear is most sensitive in the mid range frequencies. A bass or treble sound has to be played louder to be perceived as having the same level as a mid range sound. But this bias in favor of the mid range diminishes the louder you play! So if one system plays the same track louder than the other, the ear will pick up MORE bass and treble components from it (in comparison to its mid range). The brain will then declare that audio has more complexity and sounds "better". This bias is astoundingly sensitive! Folks who compare systems professionally know they must be Volume matched to within a fraction of 1dB for a fair comparison.
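If you're curious how that matching is even done, here's a bare-bones Python sketch using plain RMS levels. (Real comparisons lean on loudness-weighted measures like LUFS, but the arithmetic is the same idea; the function names are mine, and the synthetic tones just stand in for two decoded tracks.)

```python
import numpy as np

def rms_db(samples: np.ndarray) -> float:
    """RMS level of a block of PCM samples, in dB relative to full scale."""
    rms = np.sqrt(np.mean(np.square(samples.astype(np.float64))))
    return 20.0 * np.log10(rms + 1e-12)  # tiny offset avoids log(0) on silence

def matching_gain_db(reference: np.ndarray, candidate: np.ndarray) -> float:
    """Gain (in dB) to apply to `candidate` so its level matches `reference`."""
    return rms_db(reference) - rms_db(candidate)

# Synthetic stand-ins for two decoded tracks:
t = np.linspace(0.0, 1.0, 48000, endpoint=False)
track_a = 0.50 * np.sin(2 * np.pi * 1000 * t)   # a louder 1 kHz tone
track_b = 0.25 * np.sin(2 * np.pi * 1000 * t)   # the same tone, 6 dB quieter
print(f"Apply {matching_gain_db(track_a, track_b):+.2f} dB to track B")   # ~ +6.02 dB
```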
Well what happened next is some artist got the bright idea, "If we mix my next song louder, it will sound better. And more people will buy it!"
And then some other artist said, "Two can play at this game!" (Or perhaps, more profane words to the same effect.) And mixed HIS next song EVEN LOUDER.
And thus the Loudness Wars began.
The PROBLEM is, no matter how you record your music, there's going to be some upper limit on just how loud it can get. This is not the playback volume. It's the upper limit of the recording itself. Beyond some level, the recording just can't GET any louder. And so what you are forced to do is raise the AVERAGE volume of the music, so that even its softer parts sound loud! Technically, this is known as Compressing Dynamic Range, or Compression, for short.
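To make the mechanics concrete, here's a deliberately crude, toy compressor in Python -- none of the attack or release timing the real tools have, just a static gain curve plus makeup gain. Watch what it does to the balance between a soft passage and a loud one:

```python
import numpy as np

def crude_compress(x: np.ndarray, threshold: float = 0.25, ratio: float = 4.0) -> np.ndarray:
    """Toy static compressor: above `threshold`, level growth is divided by `ratio`,
    then makeup gain pushes the (now smaller) peaks back toward full scale."""
    mag = np.abs(x)
    over = mag > threshold
    out = x.astype(np.float64)  # astype copies, so the input is untouched
    out[over] = np.sign(x[over]) * (threshold + (mag[over] - threshold) / ratio)
    peak = np.max(np.abs(out))
    return out * (0.99 / peak) if peak > 0 else out

t = np.linspace(0.0, 1.0, 48000, endpoint=False)
soft = 0.05 * np.sin(2 * np.pi * 440 * t)    # quiet passage
loud = 0.90 * np.sin(2 * np.pi * 440 * t)    # loud passage
after = crude_compress(np.concatenate([soft, loud]))
ratio_before = 0.05 / 0.90
ratio_after = np.max(np.abs(after[:48000])) / np.max(np.abs(after[48000:]))
print("soft/loud peak ratio before:", round(ratio_before, 3))   # ~0.056
print("soft/loud peak ratio after :", round(ratio_after, 3))    # ~0.121
```

The soft passage ends up roughly twice as loud relative to the loud one. Scale that mentality up across an entire album and you've got a modern, Loudness Wars CD.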
And Boy Howdy, did CDs get Compressed!
Now keep in mind there's no logical reason to do this. If the user wants to play his music louder he can just turn up the playback Volume. Compressed CDs were simply competing against EACH OTHER for attention. If you played tracks from two CDs, the theory was the one which was more Compressed would be more appealing. It would sound louder -- and thus better -- at the same playback Volume level.
The one sure upside was that, if you don't record any soft passages, the music works better when played in a noisy environment, such as during a party. I.e., any environment where you couldn't possibly hear soft passages in the first place.
The downsides, on the other hand, should have been obvious to everyone. But War is Hell and all that:
- There are NO soft passages. All of your Art MUST be loud
- Listening at high volume is tiring
- Listening at high volume leads to permanent loss of hearing over time
- The quality of the music is reduced
That last one needs some explanation. If you play one of these highly Compressed tracks at REDUCED Volume -- because otherwise it is too dang loud -- that frequency bias of the human ear comes into play. The mix of bass, mid range, and treble that was constructed for extremely high Volume loses its perceived bass and treble when played back at a more normal (even if still loud) volume. The track now sounds "flat" and "muddy".
Which put CDs at a disadvantage compared to what should have been seen as inferior recording formats -- like MP3. When BOTH sound bad, the convenience advantages of the MP3 win out.
It took almost no time at all for the Loudness is Better trope to spill over to films. I won't name any names, but it is well known certain Directors just LOVE making their films REALLY loud. Action flicks, in particular, seem to be in their own version of the Loudness Wars. And TRAILERS? Yikes! I recently attended a screening at an IMAX theater which required sitting through EIGHT Trailers for upcoming flicks. And each and every one of them was PAINFULLY loud! (I'd guess they were calibrated at 95dB.) That's about 20+ minutes of near ear-bleed territory.
Sometimes you really have to wonder what the heck these folks are thinking! For example, Warner's Blu-ray of "Edge of Tomorrow: Live. Die. Repeat." (2014 -- both the 2D and 3D version), includes a ten second blast at time code 0:00:22. Yes, during the Opening Credits! There is NOTHING going on, on-screen, to prepare the viewer for this. That blast has its dominant tone at 10Hz. At, and I kid you not, 125dB! It includes components up above 20Hz, so just about any Subwoofer is going to get hit with this. What's more, the blast is authored in BOTH the LFE and Center channels! If you have been so foolish as to configure your Center speaker for "Full Frequency Range" playback, THIS is the sort of stuff which can fry that speaker.
But hey! It's The School of Shoddy! Crank it Up to 11!
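By the way, if you ever want to verify a blast like that for yourself, the dominant tone is easy to find with an FFT. Here's a minimal Python sketch; the "LFE channel" below is a synthetic stand-in, not the actual disc audio (decoding the real thing is left to your tool of choice):

```python
import numpy as np

def dominant_frequency(channel: np.ndarray, sample_rate: int) -> float:
    """Frequency with the most energy in one decoded audio channel."""
    spectrum = np.abs(np.fft.rfft(channel * np.hanning(len(channel))))
    freqs = np.fft.rfftfreq(len(channel), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

# Stand-in for a ten second LFE blast: a big 10 Hz tone plus some 25 Hz content.
sr = 48000
t = np.arange(10 * sr) / sr
lfe = np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 25 * t)
print(f"Dominant tone: {dominant_frequency(lfe, sr):.1f} Hz")   # ~10.0 Hz
```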
Auto-Tune Your Fingernails!
Suppose you've just recorded what you hope will be your new, hit single, but the singer was just a little off key. You COULD just live with it, but if you've got pride in your work, most likely you'll try another take. And another. Until you get it right!
Then, in 1997, Antares Audio Technologies introduced their first "Auto-Tune" product. This was a system which could analyze and CORRECT pitch errors in instrumental and vocal tracks. Used judiciously, you could get everything pitch-perfect and nobody listening would know the original recording had minor pitch errors. "Judiciously" was the key word here, since pushing Auto-Tune too far could result in some distinctly odd sounds!
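The core arithmetic behind pitch correction is simple. (This is NOT Antares' actual algorithm, of course -- just a sketch of the basic idea: snap a detected pitch to the nearest equal-tempered note and compute the shift needed to get there.)

```python
import math

A4 = 440.0  # reference pitch, in Hz

def nearest_note_hz(detected_hz: float) -> float:
    """Snap a detected fundamental frequency to the nearest equal-tempered semitone."""
    semitones_from_a4 = 12.0 * math.log2(detected_hz / A4)
    return A4 * 2.0 ** (round(semitones_from_a4) / 12.0)

def correction_ratio(detected_hz: float) -> float:
    """Pitch-shift ratio a corrector would apply (1.0 means already in tune)."""
    return nearest_note_hz(detected_hz) / detected_hz

# The singer lands slightly flat of A4:
print(nearest_note_hz(432.0))    # 440.0 Hz
print(correction_ratio(432.0))   # ~1.0185 -- shift up about 1.9%
```

Apply that shift gradually and nobody notices. Apply it instantaneously, note after note, and you get the robotic warble everyone now recognizes.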
No sooner was it out than pop musicians started experimenting with exactly those odd sounds! Letting Auto-Tune loose to do abrupt pitch corrections produced interesting vocal effects, which became a feature of new, hit tracks. This happened so quickly the term "Auto-Tune" started to be used as a generic for any sort of digital vocal manipulation, regardless of whose tools were actually used.
So far so good. Whether you like those sorts of effects in your music or not, we were still in the realm of artistic intent. But that's when the School of Shoddy took an interest! Because if exaggerated use of Auto-Tune was now "acceptable", that meant you could record tracks using bad singers! The resulting corrections -- now ALL too audible -- would be touted as art rather than incompetence! So what if it sounded like fingernails on a chalkboard? Everybody was doing it. Right?
There was backlash of course. Quality singers bemoaned the least common denominator effect of over-dependence on such tools. Everybody sounded the same. And critical listeners wondered whether there was anyone left out there who could actually still sing.
So when you listen to an annoyingly over-processed audio track, don't assume the problem is in your ears or in your system. It may just be The School of Shoddy, cutting corners again.
Orange Aliens from the Planet Teal
Most movie viewers don't realize just how much sophisticated, technical work goes on AFTER a film is shot and BEFORE it shows up on your screen. They've probably at least heard of "Editing", as a thing. And of course, somebody has to go record the music, I guess. But if you ask most of them what they thought of the Color Grading, they'd likely get a puzzled look on their face, and say something like, "Oh, I'd give it maybe a B+ ?"
From the very beginning, filmmakers had to deal with variations in what they photographed. Different batches of film stock might react slightly differently during development, for example. And when shooting all the elements of a scene outdoors -- taking an establishing shot, then maybe a two person shot, then individual closeups -- you'd have to deal with the fact that you only have so much control over the lighting. A passing cloud shadow might only happen during one of those. Some of those shots might not even happen on the same DAY. With the advent of COLOR filmmaking, all these problems only got worse.
And so developed the professional specialty of the Color Timer, or Colorist. This person did arcane, photochemical magic during the developing and printing of the film elements at the photographic laboratory. This was not something that could be tweaked in real time, so substantial skill was needed to predict just how much adjustment was appropriate at each processing step. Which, of course, limited the types of adjustments that could be attempted.
Then, in 2000, the Coen Brothers made "O Brother, Where Art Thou?" with the aid of a Digital Intermediate. For the first time (in Hollywood; it had already been tried overseas) an entire feature film was scanned into a computer and color adjustments -- now called Color Grading -- were performed DIGITALLY. The result was then put BACK on film with the aid of a laser recorder.
This gave them a unique level of control over the result! Think Photoshop, but for a Movie. They could not only balance scenes (as with photochemical Color Timing). They could now, also, tweak them into something which never existed in real life, or on the film negative. They could even adjust PORTIONS of a Frame. And repeat this, Frame by Frame!
Digital Intermediate Color Grading took Hollywood by storm -- pretty much limited only by the need to develop sufficiently powerful processing tools and people who knew how to use them.
Now we need to talk Color for a moment. Orange and Teal are what are called Complementary Colors. If you lay all the colors out around a circle, they sit on opposite sides of that circle -- opposite in Hue. Complementary colors are interesting because when you put any pair of them side by side the human eye perceives "color pop": Their contrast in color really stands out. Indeed, some people will describe paired Complementary Colors as "vibrating"! So if you want a dramatic contrast in color -- something that will really catch the eye -- you turn to Complementary Colors.
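In digital terms, finding a complement is just a 180 degree rotation of Hue. (Hedging a bit: painters' color wheels and the RGB/HSV wheel don't agree exactly on where "teal" lives, but the idea is the same.) A quick Python sketch:

```python
import colorsys

def complement(rgb):
    """Rotate a color's Hue 180 degrees around the HSV color wheel."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb((h + 0.5) % 1.0, s, v)

orange = (1.0, 0.5, 0.0)     # RGB, values 0..1
print(complement(orange))    # (0.0, 0.5, 1.0) -- a teal-leaning azure blue
```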
The pairing of Orange and Teal is important in another way. Since the beginning of color photography, camera operators have doted on that period known as "The Golden Hour" just before Sunset or after Sunrise. The sun, filtered through the atmosphere low on the horizon, casts an orange glow on anything directly lit. Meanwhile, anything shaded from direct sunlight gets ITS light from the rest of the sky. THAT light has a bluish tint due to the way light scatters in the atmosphere -- the same thing that makes the sky Blue in the first place. So you have AUTOMATIC Complementary Colors. The resulting vistas look fabulous; mythical even. It is magic time. Now add in the fact that skin tones lie in the oranges and you've got the perfect scenario for deeply emotional character photography. Just make sure your actors are directly lit and that you have plenty of gorgeous scenery in shade around them! Orange and Teal. A match made in film heaven!
All of this has been known forever, and filmmakers, even in the photochemical Color Timing era, have worked to create their own Orange and Teal scenes elsewhere in their filmmaking -- i.e., when The Golden Hour was not readily at hand. Used judiciously, and with an artistic eye, this natural combo of flesh tones and their Complement can add wonderful pop to key scenes.
But then, The School of Shoddy got their hands on it! With the aid of Digital Intermediate Color Grading, they could now impose Orange and Teal WHEREVER and WHENEVER they wanted in their films. And that's just what they did! ALL THROUGH ENTIRE FILMS! So what if all your actors now looked Oompa-Loompa orange? So what if your backgrounds were completely devoid of Reds? And Greens? That Orange and Teal combo REALLY popped! You know, Crank it Up to 11!
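And mechanically, slapping that look onto everything is trivially easy. Here's a toy Python version -- nothing like the lift/gamma/gain or curve controls a real colorist would use, but it makes the point: push the shadows toward teal, push the highlights toward orange, done.

```python
import numpy as np

TEAL   = np.array([0.0, 0.5, 0.5])
ORANGE = np.array([1.0, 0.5, 0.0])

def shoddy_grade(image: np.ndarray, strength: float = 0.25) -> np.ndarray:
    """image: float RGB array of shape (H, W, 3), values 0..1.
    Shadows get nudged toward teal, highlights toward orange, weighted by luminance."""
    luma = (0.2126 * image[..., 0] + 0.7152 * image[..., 1]
            + 0.0722 * image[..., 2])[..., None]            # Rec. 709 luma weights
    tint = (1.0 - luma) * TEAL + luma * ORANGE
    return np.clip(image * (1.0 - strength) + tint * strength, 0.0, 1.0)

# A neutral gray ramp comes out teal at the dark end, orange at the bright end:
ramp = np.repeat(np.linspace(0.0, 1.0, 5)[:, None], 3, axis=1)[None, ...]   # shape (1, 5, 3)
print(shoddy_grade(ramp).round(2))
```

Run every frame of a two hour movie through something like that, and congratulations: Orange Aliens from the Planet Teal.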
The backlash was muted because even blockbuster films were doing this. Hollywood dotes on "whatever works", almost to the level of superstition. If a hit film happened to be all Orange and Teal, then the next five films ALSO had to be Orange and Teal. It was in vogue, in a huge way.
The professionals knew it was weird. The professionals even made fun of it. If you go out on the Internet you can find examples of famous works of art -- converted to this Orange/Teal color palette. The smart ones started to try taking advantage of it. You need a shock effect in your horror movie? Try a splash of saturated Red (maybe blood, even though fully saturated Red isn't really its natural color, maybe something else). That will stun the audience because they haven't SEEN any Red for the past hour.
Eventually even audiences started to notice all these darn films looked alike! It's that lowest common denominator phenomenon, just as with Auto-Tune.
So in your Home Theater, if you suddenly notice the film you are watching is populated with Orange Aliens from the Planet Teal, do NOT adjust your TV. It's just The School of Shoddy, showing off!
Catering to Torch Mode TVs
In my post on Turning Off the Torch Mode Settings in Your New TV, I pointed out that the Factory Default, out-of-box settings in just about every TV ever sold are flat-out wrong for best quality viewing. It's not all that hard to take the basic steps to deal with this problem. But the reality is, MOST TV buyers don't ever get around to doing that. Partly that's due to lack of understanding of what those settings actually mean, but largely it's due to simple fear: They don't know what they can change in the TV without breaking it!
The folks who author content for home viewing -- whether via broadcast TV, or delivered on physical media like Blu-ray discs, or played via Internet streaming -- know, of course, how it is SUPPOSED to look when viewed. What they CAN'T know is the viewing conditions in each viewer's room, nor how each viewer has configured his TV!
If the home viewing content is authored well -- retaining the full quality of the original -- it should look fantastic in a properly set up Home Theater. But what about folks who've still got their TVs on those Torch Mode settings?
How about making a version FOR THEM! One that won't look as good as the "correct" content if played in a proper Home Theater setup, but which MIGHT look better on a misconfigured TV -- i.e., by COUNTERING some of those incorrect settings in the TV!
Do you see the typical thinking of The School of Shoddy creeping in here? Yep! Let's deliberately screw up our product to cater to people who don't know how to use their TV!
It's about this point most movie Studios would put their foot down. You are going to do WHAT to our movie?
But as with most things from The School of Shoddy, there's enough insane logic to this suggestion that there are actually cases where you might want to go that way! For example, suppose you have a flop on your hands. Rather than release it to movie theaters, maybe the best approach is to release it Direct to Disc. This is the sort of retail release that's destined for a quick trip to the Remainder Bins. So it is going to be bought by folks more interested in a bargain than in quality. Now THERE's a likely demographic for folks with improperly set up TVs!
Or perhaps you have a deal with Netflix to sell them rental discs at a bargain price, so long as they let you author a reduced quality version of the disc -- a "Rental Special". That's the ticket! And besides, you want to convince those renters to go BUY retail discs, don't you?
However this Marketing discussion evolves, the upshot is that discs like this DO GET MADE. And if you view one in a properly set up Home Theater, it is going to look Godawful.
Sometimes these show up in truly unlikely places. A case in point is the retail Blu-ray release of Paramount's "Rango" (2011). "Rango" is not only a highly enjoyable film, it is a flat-out masterpiece of computer animation! It was recognized immediately for having raised the bar significantly. Not only is the "acting" of these animated characters terrific, the characters also fully "inhabit" the created world of the animation. If you've not yet looked at this film just focusing on the quality of the animation itself, I would highly recommend you do so. This film won the Oscar for Best Animated Feature, and it wasn't even close.
And the Blu-ray disc in the retail package is equally fantastic! Reference Quality both for Picture and Audio. This was the first feature animation produced by Industrial Light & Magic, and they and Paramount clearly spared no expense making sure its Blu-ray release was top notch.
BUT, the retail package ALSO includes the SD-DVD of the film. I was particularly looking forward to that -- expecting the SD-DVD also to be top notch -- as it would have made a wonderful test bed for exploring the REAL differences between what the SD-DVD and Blu-ray formats could accomplish when matched with first rate material and expert authoring of the transfers to disc.
Alas!
The SD-DVD in this retail package is actually a mess! Now there are LOTS of ways to screw up SD-DVD authoring, but in this case I think it's simply that Paramount assumed the only people playing this SD-DVD would be folks using Torch Mode TVs. Chalk up another one for The School of Shoddy!
New Frontiers in Shoddy: Misuse of HDR and WCG!
Advanced devotees of The School of Shoddy are always on the lookout for new things to get wrong. The most fertile ground for such research is always at the introduction of new filmmaking technologies.
Good filmmakers look at a new technology and ask, "How can I use this to tell my story better?"
The School of Shoddy says, "Hot puppies! New toys! How can we Crank it Up to 11 and show what this baby can REALLY do?"
The most recent case in point is the introduction of High Dynamic Range (HDR) and Wide Color Gamut (WCG) as part of UHD (4K) media for Home Theaters. There are two separate parts to this problem: how to use these tools in the creation of NEW films, and how to use them to make more money off of old, catalog titles by re-issuing them as UHD discs.
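(Aside: to get a feel for just how much headroom HDR hands to the mastering suite, UHD discs encode brightness on the SMPTE ST 2084 "PQ" curve, which maps code values to absolute luminance all the way up to 10,000 nits. Here's a minimal sketch of that curve -- the constants come from the standard; the helper name is mine.)

```python
# Minimal sketch of the SMPTE ST 2084 "PQ" EOTF used for HDR on UHD discs.
M1, M2 = 0.1593017578125, 78.84375
C1, C2, C3 = 0.8359375, 18.8515625, 18.6875

def pq_to_nits(code_value: float) -> float:
    """Map a normalized PQ code value (0..1) to absolute luminance in cd/m^2 (nits)."""
    e = code_value ** (1.0 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)
    return 10000.0 * y

for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"PQ {v:4.2f} -> {pq_to_nits(v):8.1f} nits")   # 0.75 already lands near 1,000 nits
```

Three quarters of the code range already gets you to roughly 1,000 nits. Plenty of rope to hang yourself with, in either of those two cases.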
You would THINK the second case would be the one more fraught with chances to screw things up. These films were never INTENDED to be shown in HDR/WCG. (Think of the mess that happened when technology was first introduced to "colorize" Black & White films!) Their composition, lighting, and color palette were all optimized to tell the story a particular way. If you gussy up the colors or add random, distracting highlights you stand a significant chance of steering viewers away from what they NEED to see for the story to work.
There are already some examples of bad catalog conversions to UHD, but surprisingly, not as many as I had feared! Studios have, for the most part, taken a pretty conservative approach to what gets done to these catalog jewels.
The problem seems to be much more prevalent at this point in NEW films, created in the knowledge they are going to be shown in HDR/WCG, and with the original filmmakers likely supervising those transfers!
What's the problem? The PROBLEM is whenever the use of HDR and WCG gets so distracting, you end up seeing only the TOYS instead of the STORY! "Look what WE can do!" * half of the screen starts strobing full brightness white * "You ain't seen nothin' yet!" * a forest glade turns into fully saturated neon colors *
Yes, and it's probably chockablock with neon teal shrubs (NEVER Green!), being tended by orange, Auto-Tuned, crooning Oompa-Loompas! Meanwhile the plaster is cracking off the ceiling from the ridiculously loud sound effects of this supposedly gentle gardening.
Seriously. These guys aren't going to stop until half the audience goes blind, and the other half deaf!
All seriousness aside, the point of this post is: Poor content DOES get made for Home Theater viewing -- yes, even today. So if you SEE problems like this, have FAITH in the work you've done setting up your Home Theater correctly. The problem is not in your TV set. The problem really is in the content.
Oh, and consider never ever buying movies from The School of Shoddy ever again!
--Bob