The combination of highly capable 4K TVs and projectors with an over-reliance on second-rate special effects in movies and series often makes for a poor home theater experience.
If you’re wondering why some movies don’t deliver a groundbreaking visual experience on your brand-new TV or home theater projector, it could be because your display device is simply ahead of its time. Lane Brown from Vulture argues, for example, that 4K televisions deliver such a detailed image that it can – but doesn’t have to – ruin a film’s visuals. As positive examples, Brown cites film classics such as 2001: A Space Odyssey and The Blues Brothers (finally being released in Germany on September 15, 2022), whose 4K restorations look very good on a new TV or projector. Newer films that rely heavily on CGI, such as The Revenant (the bear-attack scene) or Avengers: Endgame, on the other hand, look a bit flat, almost cheap. What could be the reason?
According to Lane Brown, many blockbuster productions, including those in the Marvel Cinematic Universe (MCU), have problems with their visual effects. Special effects created for the cinema simply don’t look as detailed and realistic on a home TV. Most CGI (computer-generated imagery) is rendered at 2K resolution, which is only a quarter of the pixel count of 4K (4096 x 2160 in cinema, 3840 x 2160 in home theater). Live-action footage, on the other hand, is often recorded in 4K, 5K, 6K, or sometimes even 8K.
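The “quarter of the resolution” claim is easy to verify with simple arithmetic. A minimal sketch (the resolution figures are the standard DCI and UHD values referenced above; the code is purely illustrative):

```python
# Pixel counts of the mastering/display resolutions mentioned in the article.
resolutions = {
    "2K (DCI)": (2048, 1080),   # typical CGI render resolution
    "4K (DCI)": (4096, 2160),   # cinema 4K
    "4K (UHD)": (3840, 2160),   # home theater 4K
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

# DCI 4K doubles both axes of DCI 2K, so it has exactly 4x the pixels:
print(pixels["4K (DCI)"] / pixels["2K (DCI)"])  # 4.0
```

In other words, a 2K effects shot has to fill four times as many screen pixels when shown on a 4K display, which is why the gap to native 4K live-action footage becomes visible.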
Real shots vs. CGI effects
Exactly this discrepancy between what we now call “real” footage and computer-generated special effects is often clearly visible. Older productions such as Star Wars Episodes I and II, most of whose scenes were shot using the blue-screen method, are a good negative example: the result is an absolutely unrealistic look, regardless of whether the film is set in the present day or in a science-fiction world. One would think that a lot has changed for the better over the years. And yes, there are promising new approaches, such as virtual film studios built around micro-LED display walls. Unfortunately, the industry still relies on the tried and tested – even more so than at the beginning of the CGI revolution.
Hollywood film productions use blue/green-screen processes, which in itself is not objectionable. But when the result is second-rate CGI – due to a tight schedule or a limited budget – the composite of live-action footage and visual effects simply cannot look good. Even after so many years, such effects still fail to compete with real production sets and models. A textbook case of “we’re running out of budget toward the end of the film” is Deadpool (2016), whose CGI visibly deteriorated toward the finale.
Do filmmakers cede too much creative responsibility?
It’s relatively convenient for filmmakers to shoot scenes in front of a blue/green screen and then hand off much of the creative responsibility (though often still overseen by directors and producers) to the CGI artists. There is already strong criticism from employees in the VFX (visual effects) industry, who are handed workloads that cannot possibly be completed in the agreed time. Marvel productions, whether films or live-action series, are most often cited as negative examples. Yet the specifications, the budgets, and the way of working have still not changed. So what can be done?
Solutions to this problem are already being discussed. To achieve consistent quality for moviegoers and home cinema fans, a 2K intermediate workflow has been proposed: all scenes are scaled down to 2K and then scaled back up to 4K, which makes the 2K CGI look better (relative to the live-action footage) than it actually is. This also saves time and money, because rendering effects at 4K takes much longer – convincingly rendering rain or skin textures, for example, is expensive. One doesn’t even want to think about the render times for an 8K production like Das Boot Season 3. But this doesn’t solve the underlying problem, and it rightly annoys those who paid for an expensive television with the latest picture and sound technology whose capabilities are never fully exploited – at least not by Hollywood.
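The downscale-then-upscale workflow can be illustrated with a toy example. This is a minimal sketch, not a real mastering pipeline: it uses random pixel data as a stand-in for a UHD frame, and simple block averaging / pixel duplication in place of professional scalers, just to show that the round trip discards detail that cannot be recovered:

```python
import numpy as np

# Stand-in for a 4K UHD frame (2160 x 3840, RGB), filled with random values.
frame_4k = np.random.rand(2160, 3840, 3)

# Downscale to half resolution (1080 x 1920) via 2x2 block averaging.
frame_2k = frame_4k.reshape(1080, 2, 1920, 2, 3).mean(axis=(1, 3))

# Upscale back to 4K by nearest-neighbor pixel duplication.
frame_up = frame_2k.repeat(2, axis=0).repeat(2, axis=1)

print(frame_up.shape)  # (2160, 3840, 3) -- same size as the original...
# ...but the fine detail averaged away in the downscale is gone for good:
print(np.allclose(frame_up, frame_4k))
```

The upscaled frame has the right dimensions, yet every 2x2 block now holds a single averaged value – which is exactly why this trick masks the 2K/4K quality gap rather than closing it.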
Angela Barson, co-founder of VFX studio BlueBolt, told Vulture: “It takes experts, and there aren’t enough of them in the world.” And that is exactly where things are likely to stall in the coming years. There are experts, just not enough of them – and the expertise that does exist is being worn down by impossible targets and working conditions.
Hollywood thinks in terms of “quantity”, not “quality”
With new technology and new features, the chicken-and-egg problem comes up again and again: without consumer products that support new standards and features, there is no point in the industry offering them. In this case there are plenty of eggs (TVs, projectors), but the right chickens (films, series, cinema productions) are nowhere to be found. There are certainly positive examples that show what home cinema technology is capable of, but we usually find them on the streaming services (e.g. The Mandalorian, a live-action series that deliberately used many real sets) rather than in the big Hollywood productions. Those responsible still think largely in terms of “cinema” and the “mass market”, even though far more attention should be paid to the home cinema segment.
What do you think about this topic? We appreciate your comments!