I didn’t see this article make any mention of the most common criticism of this notion, which is that you’re comparing the films of the past which have stood the test of time and are widely considered among the greatest films ever made with just any old blockbuster film released today. The same thing comes up all the time with pop music. If you really want to do this analysis you should have some mechanism of selecting historical films randomly instead of just considering the masterpieces that you remember off the top of your head.
The author is not saying that modern movies are artistically worse than old movies. He's saying that CGI, despite decades of progress, seems _fake_ in a way that subtly destroys our sense of immersion:
"Consider Ridley Scott’s 1985 Legend, wherein Tim Curry’s onscreen presence as a horned devil is heavy, weighty—despite playing a fantastical devil, his clopping around in hooves comes across as ontologically solid. Your eyes are not deceived. And moreover, under the slathered makeup and prothestics, he is still acting—when his lip curls, that’s a real lip. Compare that to Steppenwolf, from the 2017 Justice League. After almost a quarter-century of “progress” in cinema he comes across as digital dust, a void where physicality should be. His superstrength is unthreatening, because deep in our occipital lobes we know Steppenwolf weighs nothing, impacts nothing. Therefore the entire movie has no stakes. Of course the bad guy will be defeated, he’s a hologram."
Yes, of course the article ignores the hundreds of old movies with terrible prosthetics and costumes. I think the author tries his best to compare similar things. For example, when he compares the latest Dune with Lawrence of Arabia - both acclaimed movies. Specifically, he discusses how the desert scenes _look_ to our eyes:
"Lawrence of Arabia from 1962 looks so much more real than Dune, even when just comparing the shots of people in Bedouin costumes hanging out in a desert ... For instance, the shadows both characters cast to their right. In the 2021 Dune, it is a monochrome chunk, despite the billowing fabric around the character. In the 1962 Lawrence of Arabia, the shadow of the fabric is actually see-through. It has dimensionality, thinness and thickness. Because it’s a real shadow. So despite both being filmed in similar locations, one scene unconsciously triggers more belief in the ontology of the fictional world—the directionality of the light, the blue above the dune—while the other comes across as an ersatz reality."
I felt this coming out of the latest Star Wars movies: the space battle scenes felt so... flat, despite being technically amazing. There's something to be said for actually shooting on location and using physical props.
I haven’t seen the 2017 Justice League, but I have seen enough modern-day superhero films to be sceptical that that’s why his strength is unthreatening.
Of course he’s going to be defeated, he’s the villain. Heroes don’t get injured. Every strength-based character is infinitely strong.
Superman could leap tall buildings in a single bound, then fly, then fly in space, then lift a continent made of kryptonite which is “Superman’s Kryptonite”, if you will.
Tobey Maguire’s Spider-Man hurt his knee trying to slow down a train; Tom Holland held a flying aircraft carrier together with his hands. Yesterday’s Doctor Octopus would be no match for today’s Spider-Man, and his small-scale plots of no interest. Even so, Alfred Molina’s Doc Ock was a good and threatening character despite having CGI-assisted arms.
Hulk used to throw tanks about, now he punches alien spaceships into the ground.
Iron Man was a guy in a cyborg suit; with the Hulkbuster he’s Hulk-strong.
Magneto was a strategist who could push away Wolverine; now he can lift sunken battleships and stadiums.
Wolverine was an eternal soldier; now he can survive a hit from a nuclear bomb(!)
Batman was a strategist who worked out; now he’s strong enough to fight Superman hand to hand.
Nobody gets injured unless they can walk it off, nobody dies unless they’re about to come back to life [mostly]. Nothing is damaged unless it’s background scenery, buildings and cars.
Of course people aren’t thinking it’s high stakes; it isn’t. That’s not CGI’s fault, it’s poor storytelling papered over with CGI, bad character writing with CGI energy-beam weapons instead of personality.
I think the really interesting counterpart to this is the anime One Punch Man. The main character, Saitama, can defeat any opponent, typically with one punch. It's the title for a reason; you know he'll always win, and it'll be disappointingly easy pretty much all of the time.
The point of the series is that you can take such an uninteresting overpowered character, in a superheroes vs supervillains setting, and still end up with an interesting plot. Of course none of the plot hinges on the question of whether Saitama will be powerful enough to win, that'd be stupidly bad writing.
Mainstream US superhero movies are mostly plots about whether their equivalent of Saitama will win. The action & graphics are the point, the plot is an afterthought.
I haven't seen any One Punch Man, but I don't doubt good writers can make any character interesting. Doctor Manhattan in Watchmen was basically a deity and didn't ruin the film with the snap of his fingers. Dr Who never loses and can still have good stories.
I was more going for: with Tobey Maguire's Spider-Man, Doctor Octopus was holding Aunt May from a building and taunting Spider-Man with dropping her; the world was not at stake, the city was not at stake, almost nobody else in the city knows or cares who she is. I didn't think Spider-Man would lose, but I felt like it mattered and that it was in some way difficult for Peter Parker to deal with; he had to take a chance, risk something, and "only just" caught her. The CGI tentacles didn't look realistic, because they aren't a real thing, and that wasn't a dealbreaker. Nowadays a cloud opens up into an interdimensional portal and a flood of metal-living-skeleton-alien invasion spaceships fly through, and it couldn't matter less if it tried; they are a swarm of whocares for the Avengers to punch through, which they do easily and uninterestingly, and they didn't really miss out on anything happening. CGI isn't ruining it, bad plot is ruining it. A building turning to dust as a thing crashes and explodes is so overdone that it's not even interesting by itself.
Compare to early Hulk who was on his knees in a field against two(!) sonic beam tanks owned by the US Army, with only Banner's love of Liv Tyler to make him stand up and fight. Small scale and he was never going to lose, but this was the US Army against the unknown, Banner and the General's daughter, and it mattered. When Hulk threw the tanks, they were CGI and a bit unrealistic, but so what, that isn't immersion breaking. Watching Hulk fight now is like seeing Star Trek's Lt Worf punch a Romulan spaceship while it explodes over The Fonz jumping a shark on a surfboard. It's not the CGI which makes it bad.
On the YouTube Pitch Meetings channel, one of the recurring jokes critiquing films is the 'writer' describing the plot of a film and getting to a point where a character dies, and the 'executive' does a "who was that? Should I care, because I don't feel anything right now" reply. I don't need the hero to lose, or the CGI to be unnoticeable. I want to feel like something is at stake which matters. CGI punch-ups may as well be watching a 2-hour demo of Street Fighter II.
I think we're sort of aggressively agreeing with each other. It's not the CGI (or lack thereof) that makes the modern superhero films bad (except as a sort of demoscene-with-infinite-budget visual spectacle where they're great). It's the writing. To a lesser extent the cinematography is poor, which makes it hard to really feel what's going on. There's lots of unnecessary camera movement, frenetic jump cuts during fights, and all the colors are dull and muted.
Every Frame a Painting has a good video on Kurosawa's use of movement in his films, which actually uses The Avengers as a contrasting example, showing how the cinematography & direction make the scene in question feel meaningless.[1]
So I agree that it's not the CGI that's the problem. It's the writing, the direction, the cinematography, the editing, the music[2] (though they improved since the linked video came out, directly in response to it), etc. The films are bad in many ways, but the CGI isn't really one of them.
I've been trying to figure out why I don't like the new Star Trek shows and I think your thoughts explain it. The old shows had plots. The new ones are just pretentious and try to get us to like them because "look, I mentioned Picard and Voyager, so I'm cool, right?".
Spiderman: Aunt May is at risk
Avengers: New York is at risk
Avengers 2: The Earth is at risk
Avengers 3: The Universe is at risk (The Snap)
Current Marvel era: The MULTIVERSE is at risk (ie all possible universes)
One Punch Man originated as a very poorly drawn web comic (as in, the author had no training in arts whatsoever[0]). Its claim to fame is that it's a parody of the superhero genre, in which most characters worry about status, except the overpowered-but-low-key main character.
I'd argue the plot of One Punch Man is about as predictable as it gets (it's basically a series of boss arcs where side characters struggle and Saitama saves the day but somehow doesn't get the credit). Notably, Saitama has just about zero character development throughout the series. What makes it interesting is the comic relief provided by the constant social commentary on the notion of "strength = recognition".
Interestingly, among superhero movie fans, it is often said that DC movies are worse than Marvel movies because DC movies lack character development. Infinity War notoriously doesn't even have a happy ending. So I think saying that American superhero movies lack substance isn't necessarily accurate.
> Nobody gets injured unless they can walk it off, nobody dies unless they’re about to come back to life [mostly]. Nothing is damaged unless it’s background scenery, buildings and cars.
This is probably the biggest thing. The biggest mistake of hero films is that characters who are strong never feel strong. They do big hits but it's all clearly irrelevant; it doesn't really matter how strong they are in absolute terms. Getting thrown into walls is my particular pet peeve of damaging moves that never matter. There's an over-reliance on suspense where it's not really deserved. I feel Captain Marvel and Thor: Ragnarok did this decently. They just shrugged, made it explicitly clear that the heroes were 100x stronger than the trash mobs, and released any tension in the fights to just have a good time with them.
I will grant some of what you've said, and so I will take a different tack.
I have not seen a Daniel Craig Bond film that didn't have noticeably CGI elements bringing me out of the moment. Some of the old (especially Roger Moore) Bond films had pretty dire effects where it is obviously a man in front of a video screen, but I don't remember finding them as immersion-breaking at the time.
But it's hard to say what is recency bias. I will say that Bourne stands out as having no such issues where CGI was obviously used... but maybe it was just done better.
It reminds me of the old E.E. 'Doc' Smith (science fiction) set of seven Lensman books. The last is an outlier, but each of the first six books feels like a similar tale to the previous one, with the scales, powers, and threat levels turned up a notch.
Almost as if, having used up the available ideas, the only place left to go is incrementally growing the constituent parts. And that's how I feel about most modern action/adventure/fantasy/scifi stuff - to quote another genre: "this one goes up to 11".
But I don't want it going to 11. Even for my SF/Fantasy I want it to feel real. For me (and I know it is subjective) the author makes a good point.
The term for this trope is “power spiral” and Gurren Lagann is a fantastic satire of it. The main character’s power is literally “spiral power” (a drill) and the plot revolves around the fundamental danger of exponential growth. Spoilers: the series ends with the main character piloting a mech that’s piloting a mech that’s piloting a mech that’s piloting a mech that’s throwing galaxies at the bad guy.
The last of these recent years’ superhero movies I saw had the protagonist fly from the ground on Earth into space and through a fleet of supposedly extremely advanced, capable and enormous spaceships in order to win the fight. Literally, simply flying through them, as if playing a connect-the-dots game.
One was left wondering how there could be any further movies, with this person available to simply fly through anything.
They are typically just written out of the conflict. In this case I assume that you are talking about Captain Marvel, who I believe is always needed on other planets in other galaxies except for the rare occasion when the writers need a deus ex machina.
Same for the Green Lantern Corps. They are always needed, but there are far too few of them, so 99.999% of the time the heroes in these stories kind of hold the front until a GL shows up and beats the threat relatively easily.
I quite enjoyed that universe for a bit of time but the ever-inflating (and absurd) power levels and lack of actual stakes eventually ruined it for me. As it did with most mainstream superhero movies.
This is cherry-picking. There are countless characters from comic books who are so powerful that they would be considered laughable in modern superhero movies the same way you seem to think a few cherry-picked modern superhero movie characters are laughably overpowered compared to a few cherry-picked characters from older superhero movies.
That misses the point of what I was saying. I was replying to a claim, "X character is so strong they should be threatening, and it's CGI's fault that they aren't", with a counterclaim: "they aren't threatening because so many characters are so strong and their strength is written to be meaningless and inconsequential; bad writing, not CGI, is why X character is unthreatening". Powerful characters are not what I objected to, as evidenced by my comment mentioning Dr Manhattan. Poorly written characters whose strength is incredible but also uninteresting are what I object to.
In Game of Thrones, The Hound outpowers all normal people and can take damage no ordinary human can, and is still threatening. He has his own demons and is outpowered by The Mountain. The Mountain is taken down by poison and becomes Cersei's slave; she uses him to demonstrate that strength isn't power, "power is power". The Hound sacrifices himself to kill The Mountain. What would be bad - and more like some of the stories I cherry-pick from - is if The Hound had been blessed by the Lord of Light followers and was now borderline invincible, had taken out The Mountain in a 2-minute flashy punch-up, then one-man-army'd his way through Ramsay Bolton's army, then Daenerys flies in and inexplicably the Hound is now immune to dragon fire and punches a dragon to death, then goes to the thousand-foot-high magically reinforced ice Wall and pushes it over, and then is inexplicably immune to the zombies and...
In one of Christian Bale's Batman films, there is a setup of two passenger ferries full of civilians who are stranded and rigged with explosives. Each boat has the detonator to the other boat. The civilians are put into a Prisoner's Dilemma situation by The Joker. The options: do nothing and the Joker kills everyone; blow up the other boat and the Joker lets your boat live; both boats rush to do it and everyone dies. We don't know these people, neither the Joker nor Batman is enormously strong, but the scene is good. Now Affleck's Batman fights and guns down a stream of generic villains like any scene from The Matrix or John Wick or a Jason Statham movie, and then fights Superman, and the scenes aren't good. It's not his increased strength which makes them bad. Dr Strange is hugely powerful; I've only seen one movie with him in it, but it was decent - he was struggling against his demons after losing his great surgeon's skills, humbling himself, rising to the challenge, etc., and he defeats Dormammu in a way that is more clever than forceful (yet requires a high power level to even try it).
Speaking of The Matrix: Morpheus vs Agent Smith and Neo vs Agent Smith were good scenes with inhumanly powerful characters and CGI. Later on, a CGI Neo fighting a zillion CGI Agent Smiths made for dreadful scenes with boringly powerful characters, where nothing felt tense, nothing mattered, and CGI.
A common and worthwhile counterpoint to that is that people only notice bad or rushed CGI. Properly good CGI is never criticized because nobody can tell it's CGI. Digital artists have the same problem as IT staff: people only notice you when something goes wrong. When everything is going right, you might as well not exist.
A timely example is Disney's Book of Boba Fett. The show is almost entirely CGI except for the actors. There have been some criticisms about particular assets like the biker gang's speeder bikes. But I've yet to hear anyone complain about the environments looking fake. Mos Espa is almost 100% CGI and it looks like a real set.
When it works, your brain just accepts it. CGI is incredibly good at environments; I'd go as far as to say that it has been indistinguishable from reality for the better part of a decade now. Where it gets noticed is in places where it doesn't do so well, or when there were time or budget limitations. Fully CGI humans are still not believable and will probably be the last thing to get right. CGI non-human creatures have varying success. Compositing can still be difficult. Often it's better to replace an actor's entire body, save for the face, and create a CGI version in a fully CGI environment. Compositing a real person into a CGI scene is prone to floating-body syndrome and lighting issues.
The real answer isn't necessarily that one is better than the other; these are tools which should be used together to make a better product. And I think you'll find that modern films with the best overall effects do blend the two. Mad Max: Fury Road is a good example: the vehicles are real, with real stuntmen climbing on them, but the environment is artificial. Shooting a film in the middle of a desert with nothing in sight is not practical. But shooting the scene on a sandy path and compositing in the CGI desert is.
You don’t have to notice the CGI for it to make the film worse.
You could certainly, for example, film Once Upon a Time in the West on a green-screen stage and overlay the actors onto a Western-themed CGI environment, and have it look believably realistic.
But are the actors going to act as if they were under the burning sun as believably? Will they avoid looking at the sun? Will they tend to seek the shadows? Will they sweat? Probably not, and the feeling you’ll get from the film will be different, without you being able to know why.
Not to defend the practice of "CGI everything," but there's a new-ish technique which I first became aware of from the Mandalorian, where the actors actually see the rendered CGI on massive screens around them - think the holodeck from Star Trek TNG. The physical props exist between the characters and the huge display, and the projection is even bright enough to (partially) light the scene. You can see this technique in the "Making of the Mandalorian" documentary series.
I do think CGI is often overutilized, much to the detriment of many shows and films, especially in genres where practical effects get the job done (i.e., not sci-fi). However, these techniques are evolving, and the actors can at least start to see visual cues and interact with the same digital elements that the viewer will ultimately be shown.
It's called "The Volume" and it's a true game-changer.
Because the environment projects its own lights, reflections are automatically correct and don't need to be added in post. It also gives huge freedom to the DP for shots because they don't really have to light anything specifically.
And the best thing is that it's comparatively cheap so they can punch way over their budget effects-wise.
To be fair, the volume looks extremely goofy if you're there. The perspective shifts with the camera, and apparently can be close to nauseating if the camera moves too fast, and doesn't feel like a holodeck but instead a room covered in screens. It's game changing for being able to integrate vfx, but it's arguably more distracting for actors than a green screen.
I think you are confusing CGI with post effects. You can have CGI "in camera", as they call it. The Mandalorian pioneered this. The system uses massive LED displays and the Unreal Engine. They call it "The Volume". https://ascmag.com/articles/the-mandalorian
On the other hand, you can do non-CGI effects in post as well. In fact it's super common. Bluescreens, compositing, rotoscoping, etc. were all common techniques well before computers had anything to do with filmmaking, and they can all introduce the problems you describe. It was actually a common problem: people would spend a lot of time trying to get lighting to match and to line up composited elements.
So it's not a CGI problem at all. It's an in-camera versus post-effects problem.
Arrakis was mostly filmed in a desert, how did it not feel real? The CGI in Dune was pretty decent IMO, and the highlight for me was the ornithopters' wing flapping. The motion blur of the wings was simply amazing.
From the making-of material I can see online, it seems most of it was actually filmed in a studio, especially when actors are involved up close.
It is noticeable that it’s not filmed in a hot place: the actors are never sweating, they move too fast, their skin doesn’t have the color and texture of a strong tan, they don’t seem affected by transitioning from hot to cold, they touch sand with naked hands, etc. Compare with Western movies that are actually filmed on location; they somehow feel like they’re filmed on a hotter planet than Arrakis.
EDIT: it seems that most of the desert scenes filmed in situ were also in relatively cold deserts, as the filming crew is wearing down jackets.
Would you sweat on Arrakis? I've never been to a dry desert, but I've heard that sweat can evaporate there very quickly and it's very hard to become sweaty.
Yes! I remember a Game Of Thrones CGI breakdown reel, can't find it now, which showed the Lannister camp and it was very interesting how much of it was actually real but at the same time how much was augmented by CGI. You absolutely couldn't tell.
> A common and worthwhile counterpoint to that is that people only notice bad or rushed CGI.
And similarly, people don't bring up the vast amounts of bad or rushed practical effects being produced at the same time films like Star Wars and Lawrence of Arabia were produced.
There are also far less obvious examples of CGI not being noticeable. These days a filmmaker, especially a highly opinionated one like David Fincher with the necessary resources, can decide that they want something like an extra car parked on a street and add it in post with CGI. The audience never even considers the car wasn't there the day of filming.
So, you're saying that there is tons and tons of CGI that never gets noticed. I can totally see that. Something like replacing a missing button on a character's coat, or adding a clock on a wall in the background. As CGI gets better, bad CGI stands out more.
That's not necessarily a counterpoint: it doesn't matter how much great CGI doesn't destroy the reality of a film, so long as there is any bad CGI which does. It's like how nobody talks about all the oysters that didn't make them sick.
I don't really think that bad CGI ruins films in most cases, but I think the author of this article does.
Lawrence of Arabia is also one of the most expensive and most insanely massively epic films ever shot - I am not sure it is a fair comparison. Dune is a massive movie by today's standards, but I am not sure I agree with the comparison.
Realistically, yes, we should just be doing more practical SFX, but you have to realize it is not just some big Hollywood exec deciding they want to ONLY do VFX - the available skill market for these things has drastically changed. In the 1980s there were lots of very skilled SFX people and studios that could do some incredible things. The number of people with those skills has dwindled.
I think the best example of this sort of thing is the OG Jurassic Park. Everyone has an armchair evaluation of why it holds up so well, and many are correct, but there is a subtle difference most people never bring up. Years ago I worked with a guy much more experienced than me who had originally been a talented stop-motion artist and then, as the industry progressed, moved on to doing animation for VFX. He did a lot of animation for the CGI dinosaurs - imagine the scene where the T-Rex is chasing the jeep, for example. The big difference is that because his background was in stop motion, he had a much more direct, real-world sense of how things and people move and how their weight is thrown around. One of the underappreciated things about Jurassic Park, and why it seems to hold up so well today, is simply that the dinosaurs move like they should as huge, lumbering creatures - a subtle difference that makes a huge impact on what we unconsciously accept as "real".
Also, Dune is a science fiction movie. Looking as real as possible was not necessarily the goal, but rather to maintain a stylized gleam appropriate for a movie with hunter-seekers, force fields, and sandworms.
I think the VFX approach worked well for the movie, and movie critics seem to universally agree that the cinematography and effects were magnificent.
A better comparison would be a modern period epic like Dunkirk.
There is a great YouTube video breaking down how Dune's special effects were created to look as real as possible, as opposed to what you get in the average Marvel movie:
Thanks for the link! I didn't mean Dune looked intentionally fake, but I think there was some cohesiveness in the creative decisions behind the VFX, the cinematography, the set and costumes designs, the editing, etc. that intentionally creates an enhanced dramatic aesthetic, along the lines of what you might see in a Baz Luhrmann movie.
As per the original author's point, the shot included from Dune looks like something out of a fantasy world, whereas the shot from Lawrence of Arabia just looks like a dude walking around in the desert. But I think this difference was intentional and goes deeper than just CGI.
Anyone who's interested in this topic should be watching Corridor Crew. They spend a lot of time analyzing why good CGI is good and bad CGI is bad, how great CGI can be invisible, and how, yes, practical effects are grounded in a way that can be difficult (but not impossible) to replicate with computer imaging.
Not sure how much VFX Dunkirk had, though - I have not looked into it - I know they did a ton of it practically, with an IMAX camera on the actual planes. Impressive as well, but maybe not the best comparison for VFX-heavy vs SFX-heavy.
> Lawrence of Arabia is also one of the most expensive and most insanely massively epic films ever shot
Yes, Lawrence of Arabia was one of the most expensive at the time. The budget was $15 million, which adjusted for inflation would be $138 million today. The most expensive movie was released the same year, Mutiny on the Bounty, which had a budget of $17 million.
Cleopatra blew that away the following year, with an estimated cost of $35-44 million. Which goes to show you money doesn't buy you a great looking movie.
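(Back-of-envelope, using the multiplier implied by the $15M → $138M conversion above rather than an official CPI table - so treat the Cleopatra figure, a 1963 budget, as approximate:)

    \frac{138}{15} \approx 9.2, \qquad 17 \times 9.2 \approx 156, \qquad 35\text{--}44 \times 9.2 \approx 322\text{--}405 \quad (\text{all in }\$\text{M})

So Mutiny on the Bounty lands around $156M today and Cleopatra somewhere north of $320M, which underlines the point that money alone doesn't buy a great-looking movie.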
I'd argue it's CGI's ambitions. A lot of old practical effect and green screen work doesn't hold up at all, either. CGI just made it easy to completely overdo it.
Also, the only reason we think of recent ambitious CGI attempts as "overdoing it" is that they are by definition at the bleeding edge of our abilities now. Old practical effects and green-screen work were probably "overdoing it" at the time too, because they were also at the bleeding edge of our abilities then.
IMO, CGI is no different than any other effects: the purpose is to enhance “suspension of disbelief” to advance some plot line.
For example, look at the original Star Trek. The effects were very unrealistic, everyone knew they filmed planet scenes in the desert (not on a different planet), etc. But nobody cared, it’s still a cult classic.
Star Trek TNG is pretty much the same. Despite more advanced effects, everyone knows Data is just a guy with makeup, but it was much harder to think the Enterprise was not a real ship.
At the end of the day, suspension of disbelief depends on delivery of lines, acting, etc., and much less on CGI or “classic” effects. CGI just makes it (arguably) easier to achieve the effects.
The difference I see is that before, when they had to use practical effects, they had a more earnest appreciation of their limitations. The toy planes and ships in old movies appeared only for a few seconds, since the filmmakers knew they didn't fool anybody. In the first days of CGI they used to do the same, keeping the effects short. Then... somehow someone lost perspective and thought scenes like this one didn't look like a joke[1].
What's worse, the abuse of retouching is making real things look fake. I remember watching the ending of Indiana Jones and the Crystal Skull at the cinema and thinking "why did they choose to make the top of the pyramid CGI? It's so cheap." Later I discovered it was a practical effect, color-"corrected" to the point of removing any resemblance to reality.
> the purpose is to enhance “suspension of disbelief” to advance some plot line.
I think that's overly restrictive. There's nothing inherently wrong with flashy CG environments, elaborate action sequences, etc. even if they're largely "for their own sake." When conceived and executed masterfully, I want to see the flashy stuff too! Of course, it's also true that sometimes very rudimentary effects can work great to advance themes or plots (like in the Star Trek series)!
Agree very much, and that's really the triumph of a movie like Jurassic Park— judicious, minimal use of CGI supplemented by lots of practical puppet work, miniatures, etc.
Hmm, I will check it out, but I don't think my take is an uncommon one, for example:
"Spielberg and his effects teams succeeded in creating convincing visual effects because they didn't rely on CGI as their go-to tool. In fact, only four to five minutes of the 14-15 total minutes of dinosaur scenes were entirely computer generated. All the other visual effects were created using Stan Wintson's various physical dinosaur models. The legendary technician, responsible for creating practical models for Predator and Terminator, built everything from animatronic puppets, to to a full-size animatronic T-Rex, and even Velociraptor suits that could be worn by stuntmen. CGI was then used to build on top of what the production team had already created on-set. In other words, Jurassic Park's CGI still looks so good today because there isn't much of it in the film, which makes sense considering the limitations of the technology at the time.
...
But perhaps even more important than the breakthrough graphics was Spielberg's decision to still keep the CGI to a minimum. According to Dennis Muren, visual effects supervisor at ILM, once the director saw how convincing the digital dinosaurs looked, he actually rewrote the ending of the film to include the now-famous full-body shot of the T-Rex letting out a final roar in the Jurassic Park visitor center rotunda. But despite clearly being impressed by the capabilities of ILM, Spielberg still limited the CGI elements and retained the use of Stan Winston's puppets for much of the memorable dinosaur scenes viewers still love today. In fact, there's said to be a total of just 63 visual effects shots in the whole film, a tiny number compared to Jurassic World's more than 2,000 VFX shots."
You should check it out, and watch the Netflix series about it. Of course today's movies have way more VFX shots than before. The technology isn't as expensive and is much more available. They were just inventing it back then.
And the point is that perhaps CGI being a rarity back then, and in the hands of the best production studios, meant that it was used more judiciously and to higher quality.
This is an excellent point.
There are films where the CGI was intentionally kept limited - the film equivalent to someone sticking to a healthy diet and not overeating.
I'm struggling to think of really recent examples, but one I recall from when CGI was starting to have this problem is What Lies Beneath, which was pretty subtle about it - to the point that it's easy to watch and not realise how many small CGI effects are used, along with some physical ones. The CGI is mainly there to cover up the impossible camera angles, and you'd hardly know.
There's so much CGI nowadays that even things you don't think are CGI, are actually CGI. Like buildings, glasses, doors, the most mundane things are done with CGI in many TV shows and movies.
Jon Favreau was very much against CGI, until he was reviewing some of the first Iron Man work and couldn't actually differentiate the CGI from the non-CGI shots; after that, he gave up on the objection.
There's bad CGI and good CGI, but 99% of movies nowadays have a lot of CGI that you don't even notice, because it's good CGI :)
I enjoy this example from television: a name change for a popular baking show when it was brought to the USA led to editing the logo on a glass plate in many shots.
Yep. Basically any time you think "it must have been difficult or expensive to shoot in that location," you're probably right, and it's probably CG. And I'm not just talking about obvious examples like "abandoned Times Square." I mean pretty much any public area (especially a popular recognizable public area) where it would be logistically tricky to shoot.
I don't know if it was just me or the theater I was in but I felt like some of the effects in Dune were second rate. Was a bit disappointed, especially during the desert scenes. I wish I could think of a counterexample of good CGI from modern cinema. Actually, I remember certain parts of Interstellar, for example the black hole and the five-dimensional bookshelf, were really well done. I think I was able to suspend my disbelief and just have my mind blown by them.
It is hard to think of a good example of CGI from modern films because good CGI is invisible. There are an incredible number of effects shots in modern films and TV, everything from removing backgrounds to replacing stuntpeople's faces with the stars'. Most of it you will never even notice unless you look.
I don't disagree with the article entirely but I think the landscape of film has changed so much over the years that it is hard to make a comparison. The rise of superhero films has meant that blockbusters these days have to include comic-book style CGI characters, spaceships, and citywide destruction that is always going to look fake because it is literally incredible (as in non-credible) to our eyes.
Dune is an interesting case because we can directly compare Dune 2021 to Dune 1984. Both SFX powerhouses of their times. Dune 1984 does contain some very impressive shots but nobody is going to say that it looks better than 2021.
> It is hard to think of a good example of CGI from modern films because good CGI is invisible. There are an incredible number of effects shots in modern films and TV, everything from removing backgrounds to replacing stuntpeople's faces with the stars'. Most of it you will never even notice unless you look.
But there is a distinction between CGI that is truly invisible to the average viewer, and CGI that is extremely photorealistic but is depicting events that very clearly could not be shot practically. I think there's a lot of scenes set in space, or in extremely bizarre environments, that simply must be CGI but which looks as photorealistic as I can imagine (but of course, there are plenty of funny quotes of people saying that about imagery from the 90s that looks laughable now).
Funnily enough, I felt some of the practical effects in Dune were lackluster relative to its acclaim and high production values, as well. The ornithopter in the sandstorm looked too much like a child's toy. The aerial shots of Arrakeen looked neat as a display model, but not enough like an actual city.
The blue-eye effect looked to me like the first colourised movies where they hand-painted the actual film. It really put me off. I don't know if they were meant to be "glowing" or if the colour just "bled" outside the contours of the eyes. Either way, I wasn't a fan.
The rest was fine though. I've often thought that it's silly to spend so much time creating a huge digital scene with lots of detail only to pan by it in less than a second. Dune got that part right. Linger!
> The blue-eye effect looked to me like the first colourised movies where they hand-painted the actual film.
That's how they did the blue-eye effect on David Lynch's 1984 version (with hand-painting, probably not drawing directly on the film). Don't know if that maybe influenced Villeneuve's version.
Having seen that movie and a bunch of similar movies with well-done effects and a lot of color work, I'd say the quality of the theater is extremely important. Dune, like Mad Max, shines on high-quality projectors with lots of illumination, and both fall flat if the contrast levels aren't high enough. In theaters with modern digital projectors, it's a feast.
I don't want to act all offended or anything, but I honestly don't understand why a list of awards is a counterpoint to my personal opinion. I'm aware that everyone loved the movie. I don't care, and I'm not just trying to be a troll.
Update: Actually, your response makes sense if you assumed, reasonably, that I actually wasn't sure how I felt and wanted expert opinions. Left some other similar comments further down in the thread. Sorry to snap back like I did.
> I don't know if it was just me or the theater I was in but I felt like some of the effects in Dune were second rate.
Your comment starts with a statement that seems to imply that you doubt your own assessment. The list of awards is a strong signal that the effects were not "second rate". Hence my comment confirming that yours seems to be an unpopular opinion.
[This was a bunch of needless snark so I voluntarily removed it]
Update: Actually, I think you're right. I misrepresented my opinion as weakly held as though I was asking for a more authoritative source on the matter. So your response was appropriate and appreciated. I'm realizing now that I feel more strongly about it; like it wasn't just a matter of the theater having a bad projector.
The strength of the new Dune was in its sound design. At least that was the decisive factor that gave me goosebumps (I think). But the CGI deserves a shoutout with respect to the design of the flying vehicles.
On the whole I agree with the article, earlier movies (and Tarantino movies) felt more real due to them being filmed in real life.
I hope the sound design you're referring to doesn't include the dialogue, because me and the two other folks I watched it with in theaters couldn't understand half the dialogue. It was like The Dark Knight Rises all over again.
My favourite point of comparison is car chase scenes.
To this day, it's just so blatantly obvious when they're CGI cars that don't actually exist. Though the action is pumped up to 11 (not to mention the occasional Michael Bay syndrome of fast zooms, a whole separate gripe), it's not as believable and impactful as physical cars at their actual limit.
That being said, that's me, and a few fellow car chase geeks and possibly film nerds.
I know that at least one vehicle in this[1] car chase scene does not exist, but only because I've heard a director's commentary. I'm guessing most people wouldn't spot it either. And it wouldn't surprise me at all if there were others. I don't think it's a problem.
I mean, it's a 6-min clip, and the red Subaru definitely exists (they went on sale after production ended, if I recall - a bit out of my price range :)... although why they took a canonical AWD rally car and made it RWD is beyond my understanding.
There are what - a hundred or more individual shots linked in there? So we'd have to narrow down which one is CGI-augmented and how much. But it's a lot more "weighty" a setup either way than most car chase scenes these days, and it received praise at the time accordingly :). I just watched Red Notice last night, and beyond all the other horribleness, the car chase scenes had the classic shiny, unbelievable Star Trek warp-space feel to them that's common in so many movies today - and with 5-6 times the budget of Baby Driver, it's not that it's a "cheap movie with cheap effects".
edit after a brief Google: indeed, Edgar Wright takes tremendous pride in the car stunts and practical effects in the movie:
Legend and Justice League are both mediocre movies, but Legend won a ton of technical recognition, including an Oscar nomination for Makeup, a BAFTA nomination for visual effects, and a win for cinematography [0]. Justice League's list of awards is much more meager, with nothing for technical achievements. [1]
The Dune and Lawrence of Arabia comparison is also flawed. Sure, they're both "acclaimed", but Dune is acclaimed as "One of the best films of the year", and Lawrence of Arabia is acclaimed as "One of the best films OF ALL TIME."
Also, Lawrence of Arabia is a real story of a real person set in the real world. So realism is bound to be important. Dune is a fictional story set thousands of years in the future in a different part of the universe. And specifically deals with a bit of mysticism and fantastical elements closer to fantasy than science fiction. Comparing the "reality" of their effects is apples and oranges.
I'll give you Star Wars for sure. Star Wars most assuredly did NOT benefit from "improved" technology.
I think another great example is Jackson’s LotR vs. The Hobbit films. The first trilogy was way more immersive, primarily because of the different balance of practical and special effects.
To be fair, though, the Hobbit had a lot more issues than just that, chief among them being a ridiculously large and undifferentiated ensemble cast + the fact that there was nowhere near enough main story material to fill a 10 hour runtime.
It would have been better to make one film, and treat it like the fantasy it was (more geared towards children). Have a framing sequence where Bilbo is telling a very young Frodo about his adventure "there and back again." But alas, that's not the world we live in.
What?! You should check your facts. It was way more immersive because they had way more time to do it and didn't have turmoil in the direction of the movie. LOTR is an example of great CGI in almost every shot of the movie.
That is really one of the best examples there is for this particular topic.
Same director, same setting and lore, and presumably a lot of the same people worked on both sets of films. And yet for many people the Hobbit trilogy falls flat. The CGI often seems at fault here; for me the immersion was lost in the larger battle scenes, where the LOTR trilogy rarely interrupted the suspension of disbelief.
I recently watched Knowing (2009) based on Ebert's review. One of the things he wrote in that review was: "The film has sensational special effects, which again I won't describe."
I found myself wondering if we watched the same movie. I thought the special effects were pretty awful. Comparing the subway crash scene to, say, the train crash in Silver Streak (1976)[1] or the subway crash in Speed (1994)[2], I thought Knowing[3] was inferior and just looks terribly fake to me.
Effects that looked amazing in the past often no longer look amazing. I remember thinking the dinosaurs in Jurassic Park looked real; now they look like bad CGI with bad compositing. Part of that is that the first time I see something, especially in the theater, if the story is good I get sucked in and am not looking for flaws. Whereas years later, when it's no longer exciting because I already know the story, or because someone said "go watch these effects", those issues stick out. Knowing's effects were good when it came out, and good enough if you were into the story and not trying to look for flaws. I agree though that watching them now they look subpar, especially the subway.
> Fun fact: this car scene was kinda an actual car chase. The director made the stunt man drive the car in the middle of the street in Brooklyn without any acknowledgement from the people and drivers there, so they could capture real reactions. (It's illegal but apparently nothing happened to them)
(!) If that's true, that's definitely why that scene is so good but hard to replicate. Though there are some props getting destroyed, so maybe it's only true for part of the scene.
It was shot illegally (er, without permits). It really could not be done today:
“The car chase was filmed without obtaining the proper permits from the city. Members of the NYPD's tactical force helped control traffic. But most of the control was achieved by the assistant directors with the help of off-duty NYPD officers, many of whom had been involved in the actual case. The assistant directors, under the supervision of Terence A. Donnelly, cleared traffic for approximately five blocks in each direction. Permission was given to literally control the traffic signals on those streets where they ran the chase car. Even so, in many instances, they illegally continued the chase into sections with no traffic control, where they actually had to evade real traffic and pedestrians. Many of the (near) collisions in the movie were therefore real and not planned (with the exception of the near-miss of the lady with the baby carriage, which was carefully rehearsed). A flashing police light was placed on top of the car to warn bystanders. A camera was mounted on the car's bumper for the shots from the car's point-of-view. Hackman did some of the driving but the extremely dangerous stunts were performed by Bill Hickman, with Friedkin filming from the backseat. Friedkin operated the camera himself because the other camera operators were married with children and he was not.”
> Comparing the subway crash scene to, say, the train crash in Silver Streak (1976)[1] or the subway crash in Speed (1994)[2], I thought Knowing[3] was inferior and just looks terribly fake to me.
Speed probably looks best of the three. Knowing's train doesn't look real - maybe too fast, over-blurred, missing reflections, whatever. Silver Streak looks terrible though, and is poorly directed.
Note that Ebert might be using "sensational" with either the first definition ("causing great surprise or excitement") or second definition ("presenting information with the intent to provoke intense excitement at the expense of other goals") given in most English dictionaries, rather than the third definition ("very good").
Interesting— I actually found Dune 2021 extremely immersive and weighty. Though that was without any knowledge of which parts were practical vs CGI. Hopefully it holds up to multiple viewings.
I had a different experience: everything was there (checkbox style) but seemed to miss the gravitas. Or maybe I was just mad that they didn't include the dinner party scene... which was pivotal to the story because it gave the reader the sense that there WAS hope, that the game was just starting... and only THEN betrayal. But that's what you get for being a fan.
Yeah, I don’t think anybody except the OP has ever complained about Dune not feeling visually weighty and powerful, so it almost comes off as a disingenuous criticism of the film. The OP is comparing stills from two movies and essentially pointing out that one is obviously digitally color-graded and the other is obviously filmed on celluloid and not color-graded. The difference is obvious in a side-by-side comparison. However, if you actually watch Dune beginning to end, your brain accepts the color palette as part of the visual language; you don’t even notice it.
OP didn't complain about it not feeling "weighty and powerful", but feeling less real:
> digital post-processing seems to have ruined Villeuneve’s more realistic vision
> the complex colors of the world have been bleached out
And I agree – Dune definitely feels less real, less believable. Dreamy, in fact. And I think it plays into that, they use the blandness as a strength, creating a feeling of bleak otherness.
> He's saying that CGI, despite decades of progress, seems _fake_ in a way that subtly destroys our sense of immersion
True, though many older movies without CGI also had terrible special effects that killed "suspension of disbelief". Comparing to just the best isn't really fair, unless the assumption is that all CGI is done poorly.
> For example, when he compares the latest Dune with Lawrence of Arabia - both acclaimed movies.
But Dune has barely been out for a few months - it's a blockbuster, for sure, with an enormous production budget, lots of public interest, etc. But so were Jurassic World, the Star Wars sequels and the Hobbit trilogy. Whether Dune will become an all-time classic like Lawrence of Arabia remains to be seen.
Maybe a fairer comparison would be something like Lord of the Rings vs Dragonheart or Conan the Barbarian or whatever else iconic fantasy movie there was before the CGI era?
Or for Scifi, maybe the old Star Trek movies vs Avatar? (Blue cat people, not the other Avatar)
> But Dune has barely been out for a few months - it's a blockbuster, for sure, with an enormous production budget, lots of public interest, etc. But so were Jurassic World, the Star Wars sequels and the Hobbit trilogy. Whether Dune will become an all-time classic like Lawrence of Arabia remains to be seen.
The weight of special effects is also lowered by the sheer quantity of them. I would like to see a comparison of how much time is spent on battle scenes in Star Wars 4-6 vs 7-9. I turned 9 off because it felt like a 10-minute CGI fighting scene was needed between each sparse bit of plot-forwarding dialogue. It may be a bit overdone when people fast-forward through the action to try and find some kind of meaningful character interaction.
> Lawrence of Arabia from 1962 looks so much more real than Dune, even when just comparing the shots of people in Bedouin costumes hanging out in a desert
I don't quite feel that's a like-for-like comparison. All of Denis Villeneuve's films I have watched have a surreal/unreal quality to them: off the top of my head, Sicario, Arrival, and especially the one with the weird deer, whose name escapes me. That's just his "style" - I suppose it's how scenes are framed, the color grading, and how his shots tend to linger; there's a dream-like quality to it, even for the "grittier" scenes - Sicario has a lot of those.
I don't necessarily disagree with any of the few specific comparisons the author makes. Maybe Tim Curry's devil does look better than Steppenwolf. But this small set of specific comparisons, even if they're uncontroversial, doesn't provide evidence that a CGI effect on average will look worse than a practical effect. I can think of plenty of examples of practical effects that look terrible and CGI effects that look great (and, crucially, aren't even noticeably CGI to most viewers).
> He's saying that CGI, despite decades of progress, seems _fake_ in a way that subtly destroys our sense of immersion
Other than Black Panther, I can't think of any Marvel movie that destroyed my sense of immersion. Maybe I have become used to it, so I don't even see it anymore unless it's really, really bad (like Black Panther)?
> I felt this coming out of the latest Star Wars movies: the space battle scenes felt so... flat, despite being technically amazing. There's something to be said for actually shooting on location and using physical props.
I think that's a fair criticism for music where there is so much of it that the filtering of time makes a real difference to our perception of past music.
But cinema is a smaller world. Yes, there are plenty of independent films, but most audience attention has always been focused on the small number of tentpole films. And if you look at just those over time, everything the author says here still holds up.
Watch some of the top ten box office grossing films from 30, 20, 10 years ago, and today. The change in style, tone, and quality is enormous.
Pick any random comedy or drama from the 80s or 90s and it will have more realness, more sense of being there than movies made today for 10x the budget. This being despite the fact that the older films were shot literally on film and the grain is visible.
Digital color, an obsession with CGI, and a fixation on storytelling that plays well with an international audience (which means violence, action, physical comedy, and simplistic morality tales) have made today's movies a shadow of what they used to be.
I watched Voyage of the Rock Aliens the other day which didn't stand the test of 1984, let alone the test of time. It's a great example of practical effects that can stand up to modern scrutiny... and those that can't. The flying V spaceship looks better than CGI in B-movies today, and the talking fire-hydrant robot had more character than BB-8 in the most recent Star Wars films.
But the sheet of plexiglass used as a "force field" was comically terrible, even for 1984.
Agreed, but to be fair, the comparison is also with high profile "modern" movies.
"The Thing" to most people today is not "one of the greatest movies" but a horror classic and -- visual quality wise -- on par with "Phantom Menace" or "Jurassic Park", if not worse.
I think the criticism is valid: that the physicality of classic SFX is still mostly unmatched by CGI, and the eye and mind notice this.
You're not under the impression that most big budget movies nowadays are just attempts at milking the same character or universe to the maximum possible extent?
'The Thing' wasn't greenlit because it used existing IP with a guaranteed audience. It's just a short story that received two very different adaptations 30 years apart.
I am, but at the same time I'm reminded it's always been like this.
But isn't it an entirely separate conversation? Of course CGI hasn't "ruined the movies", that part of the title is hyperbole. But the actual conversation about practical SFX vs CGI doesn't have much to do with originality, I think.
I don't think these hold up as well as nostalgia would suggest, the exception being the Alien chestburster scene, which is amazing. The others are much less ambitious and simply don't look as good as modern movies. Many of the practical effects look small or toy-like. Maybe there's a sweet spot in the '90s?
The chestburster scene holds up, up until the baby alien scurries off at the end. But there's a hilariously terrible effect later on where it cuts from a model of John Hurt's head to the real thing.
Using the low-quality trailer for Star Wars is pretty disingenuous, since it was intended to be viewed on a small TV at 480i, compared to 35mm (or 70mm) in the theater.
Sci-fi is a weird genre because there was arguably more interest in it during the space race--Star Trek happened at the same time as early Apollo missions. There was just a lot more content and a lot more imaginative content.
Are you controlling for things like, say, the age you were when you first saw a certain movie? It's kinda unfair to rank a brand new movie against the older movie that you watched a hundred times because it was one of the few VHS tapes at your house when you were growing up. There could be thousands of kids now watching that brand new movie hundreds of times on their iPads who are going to have similar attachments to that movie. Some of those kids will grow up to be filmmakers, movie critics, pop culture writers, etc.
I think that’s missing the point. Most movies are mostly forgotten, and it's the merits of the story and the storytelling that carry a movie.
If you pick a mediocre movie in 2019 and compare to a mediocre movie in 1959, the garbage storyline will be the thing you will turn the movie off for.
In my mind, the weakness of CGI overuse is that it enables lazy/sloppy practices that don't scale - similar to graphics work, where Photoshop is abused to create impossible edits for expediency.
My family has a tradition of watching Bing Crosby’s “White Christmas” every holiday season. It’s quaint how you can see the painted backgrounds and stages… the opening scenes feature cardboard bricks and cheesy props. The point of the article is that this era of filming is bland and will look cheesy in the future because of the reliance on tools. You can already see it today - compare the Sony Spider-Man movies to the last iteration - the age is apparent.
An easy litmus test to show it's not just the test of time is to plot the number of exceptional-quality films made per year.
In any form of art, there have always been pieces motivated by artistic drive (i.e. a work of art like Seven Samurai or Logan's Run) and pieces motivated by profit (i.e. a product like all of the capeshit movies or the Transformers movies). What we're seeing in film and other similar industries is a cabal of gatekeepers who don't fund things unless they fit the mold of proven, profit-driven products. Remove the Larry Finks of the world from power, and you remove the powerwhore gatekeepers that have no concept of beauty.
The problem with that is that you also need to consider how much time has passed since each film was released. In many cases, if it's been 10 years since a film was released, it would be difficult to know how people will view that film after another 10 years have passed.
You make an interesting point - the article is not comparing apples to apples - but I think that the issue is different. It's not that CGI looks fake. The latest Dune is impressive in every sense and I don't think the CGI looks fake.
I think the toughest thing about VFX facing a director is the temptation to abuse it. They have to remember the effects are there to tell the story. They should resist making everything spectacular/shiny/three-dimensional/hectic where not everything needs to be. Compare the dark Los Angeles of Blade Runner with the lower levels of Coruscant.
And yes. Darkseid is a ridiculous villain, with a lazy design and even lazier lines. No amount of CGI could save it. Not even if he were blue and lived in a Roger Dean album cover.
I think the other thing that is missing from this conversation is the bar of entry has lowered so much with modern CGI. Look at a YouTube channel like Dust[0], which has literally hundreds of sci-fi short films using CGI to tell stories. Sure, they aren't 2022 blockbuster level, but they are 2010 TV level. There is simply no way practical effects would be able to scale the way that CGI has.
Even selecting randomly wouldn't be good enough: there are many more movies made nowadays. But if you look only at movies rated 8+ on IMDB, the count per year is about the same.
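If you wanted to actually run that check, here's a minimal sketch using IMDb's publicly published datasets (https://datasets.imdbws.com/). The 8.0 cutoff and the 1000-vote floor are arbitrary choices of mine, not anything official:

    import pandas as pd

    # IMDb's files are tab-separated and use "\N" as the null marker.
    basics = pd.read_csv("title.basics.tsv.gz", sep="\t", na_values="\\N",
                         usecols=["tconst", "titleType", "startYear"],
                         low_memory=False)
    ratings = pd.read_csv("title.ratings.tsv.gz", sep="\t")

    # Keep feature films only, join on the title id, drop obscure titles.
    films = basics[basics["titleType"] == "movie"].merge(ratings, on="tconst")
    films = films[films["numVotes"] >= 1000]

    # Count "exceptional" (8.0+) films per release year.
    per_year = films[films["averageRating"] >= 8.0].groupby("startYear").size()
    print(per_year.tail(40))

Whether the counts actually come out flat per year is exactly the empirical question; the vote floor matters a lot, since old films accumulate votes from self-selected fans.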
The linked article about architecture was just embarrassing. A classic case of someone desperately trying to universalize their personal taste (to assume survey respondents have a consistent idea of what constitutes "classical" or "modern" is a stretch). It doesn't engage remotely with arguments from the modernists who drove this change, or with the scholarship on aesthetics... very little substance for the length.
It sort of read like a sanitized version of Charles Murray’s “Human Accomplishment”, which is a “quantitative” “argument” for why Western culture is superior that commits severe epistemic mistakes [1]. Scott Alexander has some good takes, but this is not one of them.
And to give a positive example: I challenge anyone to visit the Barbican Estate in London on a clear summer day, to take in the hanging gardens and constructed river, and not be awed. Modern and contemporary architecture can absolutely move the spirit.
If he's saying "everything is worse now that buildings don't look like the Taj Mahal, and clothes don't look like traditional Chinese ceremonial robes", how does this translate into support for Western exceptionalism?
It seems to me like your positive example is a non sequitur. I don't think Scott is arguing that it's impossible to make architecture that "moves the spirit" using modern design/construction. He's arguing that, for the vast majority of modern buildings, nobody is even trying to move the spirit. Even in cases where historically we would expect them to (major institutions, public art installations, large companies, rich people with large houses).
The question is, why do people with almost unlimited resources choose to build rectangular prisms of largely unadorned glass or concrete?
And your answer is (please correct me if I'm wrong), because modernists think it's better, and people generally agree even though they can't express this agreement in surveys?
What is your basis for believing this? And even if you engage with the literature on aesthetics in architecture, how do you find yourself siding with the modernists instead of the critics of modernism like Christopher Alexander and Nikos Salingaros? This is a serious question, I've been trying for a while to find a defense of modernist architecture that I can actually understand.
I should have been more clear---I just meant that his argument had a similar kind of structure. I don't think he's engaging in Western exceptionalism (nor do I think he's a closet racist).
To answer the question, my point is that there are countless examples of powerful, inventive, playful, varied works of modernist and contemporary architecture. I simply reject the premise. I brought up the Barbican because the core of his argument is "nobody likes these things because they're ugly," and I'm saying: I like them. They move me. Some of my most enduring early memories are of the Barcelona Pavilion and the Getty Center. And to read these architects, that was often the point. I guess I'm asking, why should I give a shit about this guy's taste? Don't yuck my yum.
He's just not putting forward a very strong case, nor invoking others who have studied these questions before him. I'm not claiming that there aren't any valid critiques, just that his isn't one of them. If he's only stating that middling traditionalist buildings look worse than middling modern ones: I buy that, and I think it's probably a question of economics, as he says. But to me, when recent architects set out to create the monolithic and meaningful, it's not clear they're less successful than their predecessors.
Regarding public tastes, a mea culpa: I had dismissed the survey too quickly, and on a deeper look it's reasonable (although it includes few instances of what I'd call great 20th- and 21st-century examples). But, as the Civic Art Society says, it's hard to disentangle Americans' aesthetic preferences from their normative conception of what a federal building (or any important building) should look like. That will naturally tend toward traditional styles. I'm also not convinced that today's choice expression of populist preferences---the McMansion---is going to stand any test of time. Whereas I do think currently dispreferred styles will endure and come to signal beauty.
As for writing, I guess I found Le Corbusier's treatises convincing enough? But for defenses that post-date the modernist era, I'm not sure I have an answer for you. To me, the best defense is the work itself.
As someone really frustrated trying to buy a home that isn't soulless, this still raises the question of why we seemingly can't build buildings that are of exceptional quality by the standards of 100 years ago -- at any price! For fun I've toured $5+ million mansions for sale and they're just as shitty as any mass-produced suburban neighborhood, they're just bigger. What was even the point of technological advancement, better construction equipment, logistics, and materials science, just to make worse houses?
That's because all that technological advancement went into making houses cheaper, not better. Add to that the financialization of everything, but especially housing, over the last half-century (things that are the same as each other are easier to price and finance), and you get a country that builds mostly cheap, low-quality housing that's all the same.

Add to that the fact that the labor force and the construction industry in general have coalesced around the new normal (which is natural), and you have a knowledge gap too, where most people and companies in the industry only know how to build the fast, cheap, easy-to-finance way. Anything other than the fast, cheap, easy-to-finance way is now niche, and therefore even more expensive than it would otherwise be.

I think that this knowledge gap is the biggest reason why even the richest people live in what is essentially just a bigger breadbox. If those rich people were to get anything other than a big breadbox, they'd have to really know what they wanted and seek out niche professionals who can build to their exacting tastes. And some do. But most rich people are just like anyone else: they pick their new house from what essentially amounts to a catalog, and the builders put in the catalog what they already know how to build and like building - which is fast, cheaply built, easy-to-finance, disposable housing.
I have some insight into the high-end architectural world. You (in the general sense) absolutely can get houses of exceptional quality. You (as in Spivak) probably cannot. Because they are insanely expensive today. But these are bespoke, 'if you have to ask you can't afford', old-money type expenses.
When your estate/architecture/construction costs start to exceed $20-50 million, you can get basically whatever you can think up. A giant multistory library, entirely of solid old hardwoods, with a glass walkway suspended from the ceiling? Done. 18-car garage? Done. Pool and minigolf on the roof? Done. Commercial-level home gyms? Done. The client wants the absurdly large kitchen's cabinetry done without fasteners, just joinery? Done. When the client says "I want a much higher-PSI concrete for the foundation," even though it literally won't matter: done. $25k for a coffee table? Basically the same as an Ikea table to these clients.
A few examples: a home being built for a client of a friend had a 26-ft-tall marble fireplace from a castle/villa in Italy (which the client also purchased for this purpose) disassembled, shipped to the US, and reassembled in his CT home. This was 1 of 4 fireplaces in the residence. Hell, I've seen a property in the Hamptons where the guest house would have been the envy of Newport, RI; even the full-time help lived in a beautiful home on the estate that, at 3,600 sq ft, exceeded most normal people's homes.
Humans can build nice things. It's just a matter of how much money you're willing to spend. To build great things, it's a matter of how much money you're willing to throw away.
Indeed, this is completely correct. In my hobby woodworking business in 2019, I made a $45k wall, all out of CNC-machined mahogany. It was patterned, and 2" thick. The design, machining, install, and finishing took months. It was >$5k in raw wood alone, not counting the costs of fasteners, sub-wall, finishes, etc. Labor made up the bulk of the costs, though.
Every part of that house in Seacliff followed that mantra. Exceptional attention to detail, independent craftsmen contributing special pieces, and carefully selected design language. It was cool to work on.
Even as the lead person on creating that, I can't afford to replicate it in my own house. And my house isn't cheap. It's just not practical, and short of doing the whole house as an exercise in design and craftsmanship, you just can't get the return on those things vs drywall.
What do you think it would cost to get a 750-square-foot home built to exceptional quality in the old style? I don't mean fireplaces from Italy, I just mean stuff like real brick masonry, super good insulation, a slate roof, hardwood floors, a real fireplace. Basically I want to build a house that looks like a brick house from 1875 but is energy efficient and structurally sound, with modern electricity, plumbing, and climate control. Is that a thing any builder would do?
You would probably want an architecture firm to put it all together. They have the right connections to the right people to make it work. You just listed off several different contractors: a roofer to do slate, a mason to do the brickwork, carpenters for the interior/exterior woodwork, an electrician for the modern electrical runs, a plumber for water, an architect for a nice plan for everyone to follow, an AC/heating specialist for climate control, and a general contractor for all the general stuff -- keeping things going to plan and holding people to task for not doing the job right, since many will do things badly and argue they did them right. I had to argue with an AC guy once for an hour because he installed the unit backwards, until the foreman showed up and made him "put the unit in correctly". A friend is currently dealing with a roofing guy whose crew did a seriously bad job (as in scrape-it-off-and-do-it-again bad). Add in something like "smart home/IoT" and you will need a firm that does that sort of thing.
Doable? Very much so. But it definitely is a learning experience. Also keep in mind that what you put into it may not translate into what it will sell for. That will depend on the area you build in and similar-sized homes nearby, no matter how nice it is.
To be quite honest, I have no clue. Your comment was what I'd actually wanted to say, but you found the words instead of me, haha.
I essentially only have been exposed to the architecture and construction of homes for the ultra-wealthy. Numbers weren't that important other than there were more zeroes than I'll ever have.
Things have weird values and a number would be hard to give. I rent and am not super knowledgeable about 'regular people' housing, oddly enough.
Biltmore House [1], the largest private home in the United States, cost $5,000,000 to build in 1895-1900 (I think it's a beautiful home, and I would love to have its library). Adjusted for inflation, it would cost around $160,000,000 [2] (maybe more, depending upon how you figure inflation). In fact, I'm not sure you could build that today for any amount of money.
>exceed $20-50 million, you can get basically whatever you can think up.
Does that money give you access to people who can build a home with soul?
An 18-car garage on its own is just a flex.
>had a 26ft-tall marble fireplace from a castle/villa in Italy
To me, this is a hint that the client couldn't find somebody to build something soulful anew. Instead, he had to take something exquisite from somewhere else to fill the void in his new house.
One example? In Canada we have a standard for air tightness, and more, called R-2000.
Houses built to this standard have to have heat exchangers, for they are so airtight that they require fresh air to be circulated in from outside.
When it is -40C, this change can mean heating costs of $200/month vs $2,000!
I assure you, houses from 100 years ago were horrid with air gaps, poor insulation, and so on. All such houses have long since been retrofitted, rebuilt, otherwise the heating and cooling bills would be ludicrous.
Old houses may look nice, but there are many things like this which have been improved a hundredfold.
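To put rough numbers on it, here's a back-of-the-envelope sketch. The house volume, air-change rates, and heat-recovery efficiency are my own illustrative assumptions, not R-2000 figures; only the 0.33 constant (the volumetric heat capacity of air, ~1200 J/m^3/K, divided by 3600 s) is physics:

    # Infiltration heat load (W) ~= 0.33 * ACH * volume(m^3) * delta_T(K)
    volume_m3 = 500        # assumed interior volume of the house
    delta_t = 60           # 20 C inside vs -40 C outside

    def infiltration_watts(ach, hrv_efficiency=0.0):
        return 0.33 * ach * volume_m3 * delta_t * (1 - hrv_efficiency)

    leaky = infiltration_watts(ach=5.0)                      # drafty old house
    tight = infiltration_watts(ach=0.5, hrv_efficiency=0.8)  # airtight + HRV

    print(f"leaky: {leaky/1000:.1f} kW")   # ~49.5 kW, continuously
    print(f"tight: {tight/1000:.1f} kW")   # ~1.0 kW

Even if the real air-change numbers are gentler than my guesses, the ratio is the point: ventilation losses alone can differ by an order of magnitude, which is the shape of the $200 vs $2,000 heating-bill claim.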
One thing I really like about staying in old places in my hometown (Europe, not US) is the tall ceilings, and the windows not being airtight means there's always fresh air in the room, and the coastal air is wonderful.
The comment you are responding to discusses air-exchange systems that supply more fresh air than simply living with a leaky old house. So the old approach is both absolutely more wasteful and less enjoyable than a modern, well-designed system.
> One example? In Canada we have a standard for air tightness, and more, called R-2000. Houses built to this standard have to have heat exchangers, for they are so airtight that they require fresh air to be circulated in from outside. When it is -40C, this change can mean heating costs of $200/month vs $2,000!
This *is* what we call a modern and well-designed system.
I don't think this is correct at all, and this perception is probably largely due to survivorship bias.
I have several relatives who are home builders and have acted as my own general contractor in building a home, so this is an area of interest to me.
Comparing the average home today (yes, that shitty McMansion included) to the average home 100 years ago, today's homes are way more efficient and comfortable in terms of amenities and climate control, and are more draft-free, vermin-free, and safe.
I’m guessing it’s just a race to the bottom? I suspect if you were willing to pay significantly more you could still get a house constructed according to whatever your specifications are.
If you have money, you can absolutely get a house that will stand the test of time.
But corporate builders aren't concerned about that. They do shoddy work because they know they can get away with it. It's just good enough that an overworked inspector will pass it, but by the time the buyer moves in and the honeymoon period has ended, the corporation which built it has been dissolved and its profits reabsorbed into the parent company. Good luck going after them at that point.
Right, that's what I'm saying about the race to the bottom. I'm sure you could find a corporate builder (perhaps even the same corporate builder!) to build whatever project you're talking about for 3 times the price and it will be significantly better quality. It's easy to lament the way the market has trended, but are you willing to pay 3 times more? Probably not.
We have the option of selling a home we own (but don't currently live in). It was built by my grandfather-in-law in the 1950s, so it's over 70 years old. We could sell it for market rate (about $1.5M) or renovate it (making it much larger, with a better layout, etc.) for about $600K.
I kind of rationalized that the house is still solid after 70 years, I know pretty much every inch of the superstructure now, so why not just renovate?
It’s exactly the same as movies - what’s missing is mass. Old houses made of stone and dense wood have mass and warmth and stability. Modern houses in the US are light and hollow. Even the walls are literally hollow.
I own a 200 year old cottage. It has a LOT of mass. The walls are a meter of stone, and the mortar is lime. The thatched roof is about 1.5 meters of straw.
It's REALLY COLD. Keeping it warm takes a huge amount of energy. I've removed some of the modern Portland-cement-based render and replaced it with lime mortar, and that at least helped with damp (it breathes a bit), but it's still really cold.
Once you _do_ get it warm, though, it does take a while to cool down. Similarly, it can be quite warm outside and still take a couple days for the cottage to properly warm up - there's a lot of thermal inertia.
The extension to it, built last year, uses 70x200mm (about 3"x8") studs, and has 200mm-thick Rockwool insulation throughout the walls and in the roof. The windows are triple glazed. I heat the extension with less energy than the cottage, and it's 3 times the floor area and 5-6 times the volume, due to the higher ceilings.
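The conduction numbers bear this out. As a rough sketch -- ballpark conductivities only (solid stone around 2 W/m·K, mineral wool around 0.035), and each wall simplified to a single layer:

    # Steady-state conduction: R = thickness / conductivity,
    # heat flux per square meter of wall = delta_T / R.
    stone_R = 1.0 / 2.0       # 1 m of stone     -> ~0.5 m^2*K/W
    rockwool_R = 0.2 / 0.035  # 200 mm Rockwool  -> ~5.7 m^2*K/W

    delta_t = 25  # e.g. 20 C inside, -5 C outside
    for name, r in [("stone wall", stone_R), ("stud + Rockwool", rockwool_R)]:
        print(f"{name}: {delta_t / r:.0f} W lost per m^2")
    # stone wall: ~50 W/m^2, stud + Rockwool: ~4 W/m^2

So the thin modern wall leaks roughly a tenth of the heat per square meter. The meter of stone is mostly thermal mass, not insulation -- which is exactly the slow-to-warm, slow-to-cool behavior described above.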
I think this criticism undoes itself. The thesis is that CGI visual effects are implausible, not "weighty", unconvincing. But there's no mention of CGI being pervasively used outside of blockbusters! I'm not enough of a film buff to say for certain, but I'd guess it's in a majority of non-indie releases. It goes unnoticed because it's seamless and convincing. It's enormously helpful for avoiding an expensive location shoot or reshoot, fixing an error, adding setting, integrating actors and stunt doubles, all kinds of things. The article doesn't even know to complain about it because it's so dang good.
For an example off the top of my head, I know Wolf of Wall Street had a lot: https://www.youtube.com/watch?v=bP2sJqoZD7g You can search youtube for the title of any movie and "cgi" or "visual effects" to turn up tons of examples. David Fincher's movies have some great examples.
And to hop on my bar stool and opine, I think CGI is going to lead to a revolution in filmmaking over the next decade or two as it continues to get better and cheaper. There's a ubiquitous criticism that movies of the last couple decades have been so expensive that studios only put money into blockbusters, or meddle in the creators' vision, or only fund simple action movies that are easily translated, etc. When low-budget indie films get access to this stuff, they're going to have the ability to make all kinds of movies, rather than just contemporary character studies with limited sets and props. (Digital distribution will help, too.) CGI will be a landmark revitalizing force for movies, much bigger and stranger and better than the New Hollywood movement in the 70s.
I think in these kinds of discussions, when we say "CGI visual effects" we mostly implicitly also mean "...as used in fantasy/sci-fi movies". You know, the high-profile, in-your-face effects. We don't mind the subtler effects and enhancements done to non-fantastical movies, which go unnoticed and are therefore by definition suitably done :)
Wow, I had no idea a movie like Wolf of Wall Street had so much CGI. This makes your point perfectly, that it truly is not noticeable and seamless in a lot of movies. CGI is just a tool, and it's not to blame for so many mediocre movies being made. This reminds me so much of the constant "JavaScript ruined the web" arguments I see on HN.
Pretty much every shot that looks out a window in pretty much every movie since just about forever has been an effect of one kind or another, simply because of the problems of lighting the scene and reconciling the time of day being "shot for" with what time it actually is.
Most modern prime-time television uses compositing for pretty much any shot on a street; they aren't standing on the street, they are standing on a green stage.
This article is just so much 'the good ol' days bullshit...'
I agree that the criticism undoes itself. The negative CGI we talk about is the negative CGI we notice; we don't talk about all the CGI that we never notice, of which there is a huge amount. Take for example the movie 1917[0], a WWI film that's extremely practical. They actually dug trenches for the actors to run/walk through, they actually built a bombed-out town for a central set piece, and they actually had huge numbers of men running across a battlefield.
Yet it's also the case that the movie seamlessly transitions from filmed shots of actors to complete CGI replacements for multiple seconds before transitioning back to filmed shots of actors, without a single cut or transition, and no one would know that unless you watch the effects breakdown.[0]
I agree that CGI is almost certainly a massive vitalizing force. CGI is making it possible for creators, Hollywood or not, to build far beyond what was ever possible in decades past. Take the tiny short film "Royal Dumplings"[1], a student-made short film made for no budget[2]. Made with a camera, a green cloth backdrop, and hardware-store lights. What we get to see on screen, even if "un-real", is so vastly more interesting than what we'd have gotten 30 years ago at this price that I don't care.
The article does complain about it. The author says today's films have an overall level of unreality. Even when the CGI is good enough that you don't actively notice it, every scene emits a subconscious level of fakeness that pervades the whole film.
On one level this is subjective, but even to the extent that I might grant the author to have a point about current film having a “subconscious level of fakeness”, it is only in contrast to older film persistently hitting you over the head with fakeness.
If you grew up with it, sure, you learned to expect and accept it and so maybe it didn't break immersion most of the time, but not because it wasn't obviously unreal, but because it was obviously unreal in a way that you expected in cinema from experience.
If you asked me to name a movie that had a clear pervasive fakeness, I would never have thought "The Wolf of Wall Street". Conversely, I routinely feel like MCU movies are guilty of this. So it seems to be less "CGI always reads out as fake" and more how you're using it.
> I would never have thought "The Wolf of Wall Street".
For what it's worth, I would have.
Part of the reason you don't notice it is that they all have this sheen of unreality, so it's hard for any particular film to stand out. They're all color corrected to hell.
Most of the author's complaints are centered around CGI used for moving objects, especially in focus. So humans, vehicles, robots, anything like that. CGI does seem perfectly suitable to model backgrounds, but the moment it's used to show movement, the difficulty of doing it right goes way up.
Yea those are all background replacements. When watching the movie two scenes stood out to me as very unreal-feeling CGI: the helicopter landing and the boat during the storm. And indeed, it's all CGI: https://www.youtube.com/watch?v=2D7MqlNWUEs. It's awful - just look at that piece breaking off at 0:26. The scene at 2:30 is egregious as well.
Modeling the physics of such a complex scene well enough to be convincing to humans is essentially impossible. We are far too attuned to how our physical world works to be fooled by anything less than perfect.
Use CGI where it makes sense - for mostly static backgrounds. For movement, use practical effects. Let the universe do the physics simulations for you.
To me it’s not that they are not realistic, it is that I know everything is CGI and I have the feeling I am watching a cartoon. It isn’t convincing anymore not because of the rendering itself, but because I know it is all fake. I like the roughness of 90s action movies. Think Die Hard 3 for instance.
I truly hope there is a shift back to practical FX and 2d animation.
Classic movies from the late 70s and 80s like Alien, The Thing, My Neighbor Totoro, etc. look so much better to me than the latest blockbuster Marvel movie.
There is something about actors acting in the same set you see, with the actual monsters you see, and the fog and lighting and everything else being real. And shooting on film just looks better to me.
Hand drawn 2d animation looks so much better too. I much prefer its soft and imperfect look to the HD perfection of modern computerized animation. 3d animation while great in some movies, I feel is an overused default animation style today.
I feel like we reached the perfect balance between visual art, technology and original stories back in the 80s, and have been steadily implementing cost/time saving optimizations ever since.
I think so much of what made The Thing great is what they couldn't show because of their limitations, not their great special effects. It's something the great digital effects artists have realized and integrated into their movies.
When you look at movies like Arrival, which is not considered a great special-effects film because the effects are understated and get out of the way of the story, the effects artists plop a digital ship into the middle of a field and leave it there. Understated digital effects, in conjunction with practical effects, are often the most effective and least appreciated.
Villeneuve has a tendency to direct movies that do this well (and I apologize for being a bit of a fan boy), and I hope that more movies take this direction.
I enjoyed the Marvel films, but at times they looked a bit like digital vomit on screen and became hard to follow. I couldn't even finish Aquaman, TBH, as it was just over the top with the amount of digital noise. Though Justice League attempted to do better than this, the Darkseid fight was a bit nauseating =(
Again, though, I think the Villeneuve films will stand the test of time, whereas the blockbuster comic films will go the way of films like Predator. I enjoyed it at the time, but it doesn't really hold up.
I don't think shooting on film is particularly necessary; it's possible to match that look pretty convincingly. Though attention to why the film look appeals to us is always valuable.
Practical effects are probably getting easier, thanks to various 3D printing techniques, the expansion of low-cost microcontrollers, the low costs of servos, and the proliferation of small cameras and lens adaptations for rostrum/running shots.
But saving money isn't the main reason we don't have practical effects. Saving time is the main reason we don't have practical effects.
Apparently Dune was transferred to film and back again to try and get the look. I'm not sure how much it really mattered in the end -- I'm sure there are digital methods -- but it did work.
If your highlights have already been clipped because of digital capture, you're not going to magically get the benefits of film's roll-off. There's a lot more going on with capture in an analog/physical medium like film than just the grain that most people associate with it.
Modern digital cinema capture has nearly film-like dynamic range. It's gotten pretty good. Film is the gold standard but it's not always worth the trade-off in cost and complexity. And ultimately a lot of it comes down to good grading, not the DR of the original shot.
>Modern digital cinema capture has nearly film-like dynamic range.
NEARLY is not EQUAL. The dynamic range that you are touting as nearly the same is the specific feature of film that I was referring to in how it handles highlights. It's taken 25 years to get to where we are nearly there, and I understand why some purists still prefer acquisition on film. That said, "nearly" is good enough for all but the purists.
You do realize that all this range and depth that purists talk about is just like what audiophiles do when they talk about the "warmth" of a vinyl recording?
So film has a huge resolution advantage, especially when you use 70mm. Theoretically capable of 16K. Yet by the time you've developed the negs, created interpositives, edited, created masters, created internegatives, and finally created prints for the theaters, that resolution, that dynamic range, those "highlights" (could you be more vague?) get lower and lower. Then you send it to a theater with a crappy projector that's out of calibration, or has a bulb out of spec. Then you get crap. Same with when you have this pristine film scanned for DVD/Blu-ray/4K, whatever.
I'm talking about the ability to expose what would appear to be overexposed details in clouds while letting shadows be exposed enough to avoid details lost in the blacks. When color corrected, the highlights can be recovered to show the detail back in the clouds. Just an example of a shot. And if the exposure was too much for the clouds to be recovered, the roll-off in film looks much cleaner than the clipping that occurs in digital.
Your reply comes off as if I don't know what I'm talking about which just isn't the case. I've already stipulated in a previous comment that modern digital is good enough for most everyone one but purists. So your audiophile comment implies to me you missed some salient details.
Man, you're just missing the point. It's about the look of what happens when it reaches the end of that range. It's not how much range it has.
For example, I've seen footage shot on a Red (I don't remember the specific body/chip) of tabletop food products with fun issues. Specifically, it was a chocolate shake with whipped cream on top. The sensor had issues with the white, so that when the exposure was pushed, the white became magenta. The green data was not hitting at the same levels as the red and blue. To get white, the green channel needed to be pushed independently, but it was already clipped digitally even though the R/B channels were not.
Scanning that same image from film would not have exhibited this behavior.
> You do realize that all this range and depth that purists talk about is just like what audiophiles do when they talk about the "warmth" of a vinyl recording?
Yes, and no. GP is referring to a specific fundamental difference between digital capture and film negative capture that is not vague or woo or wine tasting.
Highlights record differently on film negatives, because they do not clip; they roll off progressively as the negative image gets denser and denser. This is why wedding photographers kept shooting black and white film alongside digital colour for a long time, for instance.
You retain that benefit when you then scan the negative back into a digital form (because the scanner's dynamic range significantly outdoes the range of the developed negative). You can significantly "redevelop" that negative digitally.
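A schematic way to see the difference (toy curves only -- neither is any real sensor's or stock's measured response; the soft shoulder here is just the simple x/(1+x) tone-mapping curve):

    import numpy as np

    # Scene exposure in linear units, where 1.0 = digital sensor saturation.
    exposure = np.array([0.25, 0.5, 1.0, 2.0, 4.0])

    digital = np.clip(exposure, 0, 1.0)    # hard clip: 2.0 and 4.0 both read 1.0
    filmlike = exposure / (1 + exposure)   # soft shoulder: highlights compress
                                           # but stay distinguishable

    for e, d, f in zip(exposure, digital, filmlike):
        print(f"scene {e:4.2f} -> digital {d:4.2f}, film-like {f:4.2f}")

Once the digital values hit 1.0 they're identical, and no amount of grading can tell them apart. The film-like shoulder still encodes 2.0 vs 4.0 as different densities, which is what "recovering the highlights" exploits.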
I don't know if that specific fundamental difference was what the Dune production was aiming at -- I am guessing that if it was, they exposed their digital files very carefully to maximise the chance of being able to use it the way they wanted to.
According to Kodak[1], most film has 13 stops of dynamic range; today's digital cameras average 14 stops. An Arri Alexa has 14+ stops of range[2]. Discussing what wedding photographers used to do for a long time has no relevance to today's technology.
I haven't been making the case that analogue is superlative in any sense, though, so I am not sure you're particularly on target with the "warmth of vinyl" snark.
I'm very far from a purist. I do shoot film occasionally (stills, not motion film), but 99.5% digital, and I'm absolutely not coming at this from the LP-lover's perspective.
Not least because the "warmth" thing is purely subjective, when there are fundamental characteristic differences between film and digital regarding the way that highlights are captured by the sensitive medium, and those differences have actual creative impacts that can be exploited.
That was the point about wedding photographers. (Incidentally it's not just that the technological gap has mostly closed where black and white is concerned; it's also that the cost and supply models have changed. People don't shoot film for jobs like that now, because the professional infrastructure does not exist anymore.)
Where the cinema-oriented, digital-printing-to-colour-negative-motion-stock situation is concerned, I am not convinced that the digital source yet has enough extra headroom to really take advantage qualitatively without very careful capture exposing for the highlights; it's certainly not going to be an automatic, magical conversion.
I don't think people know why they "like" or "dislike" a certain look. Take a look at cinematography, it's all "anamorphic this"¹, "24 fps that", "digital is too harsh" etc. The article:
> But it gave us the feeling we had been picturing a certain texture that’s painterly but feels timeless…The film has softened the edges of the digital. It gave us something that film acquisition couldn’t give us, and it gave us something that digital acquisition couldn’t give us
Cinematography still has clearly a long way to go. It's currently in the "we don't know what perspective is" phase of painting.
¹ anamorphic lenses are, generally speaking, horrible. It's extremely obvious when they're used, which is probably the main reason why there has been a huge anamorphic filmmaking craze going on for a few years.
>But saving money isn't the main reason we don't have practical effects. Saving time is the main reason we don't have practical effects.
And not just time in general. The CGI could take more time to produce than the practical effects, but it's in post. The practical effects have to be ready for the shot.
More than that: it can be done in parallel through the entire production, since interaction with the actual physical prop on set isn't a synchronization point -- a "memory barrier", if you will -- for the rest of production.
> But saving money isn't the main reason we don't have practical effects. Saving time is the main reason we don't have practical effects.
Absolutely. There are a lot of reasons this is happening.
Prior to the pandemic (which will be seen as a kinetic accelerant to a thermodynamic inevitability), screen space was limited, meaning the biggest tentpoles would eat up a somewhat predictable lion's share of box-office revenue and thus command a large percentage of screen real estate. If you churn out big things fast, you can fill the calendar year with your films, monopolizing total screen space/time. You can bargain with theaters for placement and advantageous timing, maximizing your own ROI and preventing other studios from outcompeting you. This is more a function of pipeline maturity and scale than development time, but efficiency absolutely helps.
It's also true that viewers only have so much time (and therefore money). You want them to spend more time watching your content than they do on alternatives (other streamers, video games, social media). Quick delivery of successful series is essential. Netflix is trying to fill its slate to compensate for media losses, but also to run quick experiments to find things that maximize for view time. Content doesn't necessarily have to look good to fit the need.
The dust may look like it's settling, but it's not. Increasing efficiency will lead to lots of new competition in the space.
You have to remember that 2D animation has been around for over a hundred years and a lot of the bad old stuff has been filtered out by history.
I think 3D animation is just starting to hit its stride. The Netflix series Arcane is really beautiful, Spider-Man: Into the Spider-Verse was incredible, and I was even blown away by Hotel Transylvania 4, which I was ready to write off as corny kids' dreck, but the manic slapstick stuff they did in it is outrageously good. This isn't even getting into the epic-budget Pixar stuff...
Also, if Ghibli is your bar, I'd say 99% of 2D animation isn't even on par.
I don't know if Arcane should be included in a list supporting the advancements of 3D animation. While the characters are animated, and I do mean that in the technical sense (i.e. actually moving the bodies), in 3D as meshes of polygons, a vast percentage of the effects are 2D vector/bitmap animations and all of the backgrounds are traditional static digital paintings.
I am in no way saying Arcane does not 'look good' to me or in any objective way; in fact, I really enjoy the overall look of the show. But I think it is a bit disingenuous to use Arcane to prop up the advancements of 3D animation compared to 2D.
Is it though? Without advancement in 3D animation Arcane wouldn't have been possible. It's a fairly unique 3D/2D hybrid and that merging of the mediums is apparent in how it's styled.
I stopped watching Into the Spider-Verse due to the bad CGI.
Plenty of people liked it, but I think it's more people getting used to the style than it actually being good. Most modern animators and many viewers grew up with bad 90's CGI; worse, the old guard of computer animators were the ones making it.
Talking with some of these animators, they still recognize quality, but it's like their uncanny valley has moved.
At this point the animation is so widely lauded that I'd say it's objectively well executed. I can certainly understand not enjoying the style though, I don't like the style of some of the classic Disney animations... but still can appreciate the effort and skill behind them.
I don't really buy the idea that animators grew up with bad animation and have somehow lower standards... in the case of the handful of pro animators I know, they're absolutely massive animation nerds and their taste extends far beyond what they grew up with.
Clicking through the reviews on Rotten Tomatoes, it got quite a lot of hate despite ending up with a high score. It was novel, which attracts praise, but again, people genuinely seem to like the look of CGI.
Anyway, as long as we are talking animation: I asked one of the Frozen animators why the faces looked so freaky and they basically said meh, they need to be distinct to sell toys. Serious props to their skills and the massive R&D involved, but that right there was the core issue. Moana's water does a lot of cool animations but looks vastly less "real" than the objectively worse CGI water tentacle in The Abyss. Again, same deal: it's aiming for a style.
Ok fine you say what about CGI in blockbuster movies, except the article again points out how fake they look for various reasons.
That's what people do. They remember the good and forget the bad. In 10-20 years people will look back and remember the "good" CGI movies from today and forget all the shitty ones that came out along side them. Rinse and repeat.
Black Panther is the #3 highest-rated superhero movie on Rotten Tomatoes. Sounds like it's even higher in its genre than The Thing (#138 in horror) and Alien (#5). Black Panther is not a bad movie in other respects - but the parts that use (and overuse) CGI look as fake as any other Marvel production. So how good the movie is isn't necessarily a free pass for CGI abuse.
The superhero genre doesn't lend itself to the "don't actually show much to build suspense" style of film-making that horror could lean on back in the day to make up for how corny practical effects would look.
It's not really a rousing endorsement that practical's greatest achievements are cases when you have the thematic luxury of not actually showing anything.
Ok here’s an example on a more visual genre. Practical Yoda is more believable than CGI Yoda. (Practical Yoda can’t do the “jumping pea” stuff but that was entirely laughable and defied suspension of disbelief entirely. )
There's quite a bit CGI Grogu in The Mandalorian. You just don't notice it because it's done well.
And remember, the prequels were among the first movies to have major CGI characters. Basing the effectiveness of the technique on them would be like basing all of practical effects on the first movie to have puppets.
> Black Panther is the #3 highest-rated superhero movie on Rotten Tomatoes. Sounds like it's even higher in its genre than The Thing (#138 in horror) and Alien (#5).
Not surprising, given that there aren't many triple-A movies with Black lead actors around in the first place, much less movies that are predominantly Black starred - it was labelled as "revolutionary" [1] for a reason and so generated a lot of (well-deserved) hype. That it turned out to be a damn good movie too only helped the matter along.
Well, compare the number of films that Ghibli puts out compared to Marvel. Marvel pumps movies out at a furious pace in comparison; I'm not sure you even could sustain a Marvel-style tempo while remotely keeping up the same quality as Ghibli.
I've never heard of a single movie made by Ghibli.
Not that I'm a fan of Marvel, far from it! Apart from a few exceptions, they seem mostly geared towards kids (and adults who still inhabit fantasy worlds).
Fury Road also makes extensive use of insane practical stunts that other features would have done 100% CGI. There's a difference between enhancing with CGI and doing it entirely in CGI. The stunt peeps on the motorcycles, the insane pendulum thingymabobbers, the cars themselves, etc. are all real, and very noticeably better looking than CGI.
It feels like Hollywood has had a practical effects craze for a while in the past fifteen years thanks to The Dark Knight. Or maybe it's just been Christopher Nolan (and some Villeneuve) keeping that flame alive. There certainly has been no shortage of public gushing over Nolan's use of practical effects, even if some of his films didn't really live up to the hype.
I wouldn't expect this to happen. It's like hoping software will shift back to being written in C++. Everyone moved on for a reason, and the people who make up the industry as a whole simply don't have the skill set (or willingness) for that kind of work. You'll see the occasional auteur/indie production, but at blockbuster scale? Never again.
In anime at least, CGI cuts costs a lot, especially for 3D scenes. On the other hand, people are getting better and better at doing CGI these days. In general I feel like current animation has more movement overall, but it's not budgeted well. For example, in the introduction of Patlabor 2 https://www.youtube.com/watch?v=ydCLeclaKik, there are lots of still shots, but some movements are impressive, and the backgrounds are gorgeous. It is one of the best anime movies ever made, though, so it's not like everything at that time looked like that.
There's certainly a lot of use of it, but it's painfully obvious when they do so. I think with some more time and practice many studios could make it harder to tell the difference - but it might negate a lot of the cost savings (making more fluid animations rather than the very stiff ones commonly seen, taking advantage of squash/stretch so the movements don't seem so perfect like they were done using computer animation, etc.).
Some shows/movies make it work well enough because there are things other than the animation that keep you watching; some use it for walking/transition scenes, where it's easier to "blend" in since it's background to the dialogue most of the time. But I really wouldn't say the "2D animation look" is even close to being achieved in many cases right now. Again, not because I feel the tooling is incapable of it, but because doing it right would nullify the cost savings and defeat the purpose.
Have a look at the remakes of Berserk or Sailor Moon. These are two of the most high-profile anime and manga franchises out there and should have plenty of financial backing, yet they look WAY worse than their cel animated counterparts from the 90s.
> I truly hope there is a shift back to practical FX and 2d animation.
Most 2D animation these days is just CGI dressed up to look like 2D hand-drawn animation. Actually, the reduced production cost is why we are seeing so many manga/light novels being adapted into anime series, even if the underlying content isn't that great.
It's because you're very nostalgic for the movies of your childhood. And you listed three classics, when that era produced a lot of trash as well. That's all fine, certainly not a bad thing, but that's where the bias comes from.
I think the point about balance is valuable. In the 1980's the digital transition had just started and mostly contributed to cleaner control of analog processes like computer-controlled tracking shots over models(all the "swooping spaceship shots" were done this way). But most of the things one could do to add visual interest to a shot, regardless of the particular tech, were fundamentally pre-production, not post: Storyboarding, staging, lighting, prop and costume work; well-rehearsed and directed performances. The parts that were meant to be special effects were still often obviously abstract in nature, outside of the biggest productions: e.g. a character walking away from a burning airplane wreck might just be composited on top of a fireball in a very basic way, if the production couldn't afford to destroy an actual airplane. Even just doing any compositing was not an easy task and involved a few specialty roles to create mattes etc. So there were some natural constraints that pushed everything towards building up the parts of the scene that were actually "live-action".
Once CGI rolled in - and really, this was true of all graphics when you look at things like 90's magazines overusing Photoshop filters - we got a taste for adding multiple layers of production on top of everything, which made the final image less "tied together" and more of a chimera.
I do think animation has gotten better overall, even 2D, because animation has always embraced being chimerical and covering that up by finding a consistent approach to the final image. All that's changed is that there's a lower reliance on large numbers of drawings. In the 80's, very few animation productions could really achieve their full vision - with few corrective tools available, you had to have some excellent draftsmanship just to get a product out the door. While there were opportunities to use that to expressive effect, most of TV animation stuck to simple shots and reused as much as they could get away with. This also really pushed the default look of animation towards whatever was easy to draw, which dampened the tone of works trying to feel serious.
Now we have digital puppets that still are fundamentally 2D but allow the animator to spend more time on creating a performance that reuses snippets of pre-drawn material, which is overall a good thing. If they want to push a scene further and add additional camera angles or poses, they have the option to do so and can apply CGI to create the necessary references too - you aren't bottlenecked so much by drawing, but neither are you stuck trying to composite in a CG image, because you can paint over it and get whatever look you want, as long as that look isn't live action.
I consider this to be the greatest scene of modern superhero movies (Amazing Spiderman 2 - Electro first fight). The movie itself had a lot of problems and Sony is entirely to blame, but I digress: https://www.youtube.com/watch?v=PWNbliU6-LQ
This scene does pretty much everything marvel movies are criticized for. Lots of CGI. Lots of explosions. Banter in dangerous moments. But I think it works extremely well.
The use of bright color contrast between red and blue, and then later the darkness, is really sharp. The sound design of the moment highlights the villain's confusion, and Spidey comes off as really distraught that this guy is in this situation. You don't really have reason to think that Spidey will die here, but it does set up a level of belief that some civilians could get hurt, and it does come off as pretty exciting to see him need to immediately pull off a difficult slo-mo maneuver to save them. The fighting is incredibly quick, with both players only doing heavy fight-ending moves. It's stylistic as fuck.
tl;dr the biggest problem with superhero movies is the awkward uncanny valley of trying to be both realistic and absurd with CGI. Go big and stylistic.
This scene does the same thing well in that the danger and excitement is not about spiderman being harmed, it's the civilians.
I don't know that the special effects here held up especially well though. The machine gun and rocket are dorky. There's some irrelevant punching as well, but not too much. It could be argued just enough to establish GG as physically strong. The fight conclusion feels a bit weak though. I'd say it's a pretty solid scene despite this stuff.
The Mandalorian's filming approach is an interesting new hybrid: using CGI backdrops while shooting the cast live inside 360-degree screens. This gives interesting lighting while also allowing more use of practical FX.
Notably, The Mandalorian used puppetry for all the aliens. Using CGI for backgrounds but practical effects for stuff the cast actually interacts with seemed to work really well.
Not all, but a lot of them. They also couch this specifically as a reference to... mostly Return of the Jedi; many shots contain puppets or stop-motion animation that is actually shot at a lower frame rate to play up the unreality.
The Volume is an impressive feat of cinema technology, but once you notice how the camera never goes long, everything is clearly shot within a small soundstage area. The new Star Wars series The Book of Boba Fett suffers from this limitation more than its predecessor.
I don't think that is necessarily a limitation of this at all - more of an aesthetic choice. They could have easily composited long shots.
The bigger thing I notice today with these productions is that everything is just too damn CLEAN. When you have a bunch of real people shooting on what is supposed to be "Tatooine", there is real dust flying everywhere and sand getting on their shoes and whatnot. On a light stage, unless you have people specifically making the outfits dirty and ragged, everything is going to start and stay perfectly shiny and new. It seems ridiculous, honestly.
I have not actually looked up whether the new sci-fi show "Foundation" was shot mostly on light stages, but I suspect it was, as the supposedly rugged colony - so limited in supplies, with extremely harsh weather - ends up being a bunch of colonists with perfectly done hair and perfectly clean sci-fi getups, and a total lack of weather or clouds. It really took me out of it.
Yes, I was going to say the same thing as a general response to the article in the OP. Even apart from the suspension-of-disbelief thing, CGI is very constraining in terms of the shots you can make; it all has to fit on a soundstage. It's paradoxical, but while you can create a fantastic setting, it is also very constraining on the cinematography itself.
Sadly, I think most people don't really care about any of this, they just want to switch off for an hour and a half. I always feel isolated about this among my friends and family, I already get lots of time mindlessly staring at a screen reading some HN article or Reddit or clicking aimlessly through wikipedia, if I'm gonna watch TV or a movie then I want it to be something highbrow, not the latest MCU trash or whatever. If I'm gonna sit on the couch and scroll through my phone, I'd rather be scrolling on my PC instead, thanks.
I think a lot of what people are complaining about is that the third act of a lot of big movies is just "...and then they all have a big fight". I've never heard anybody complain about the extensive VFX in The Wolf of Wall Street[1].
One of the most damning comparisons to me are the orcs in the LOTR movies vs. the ones in the Hobbit movies, the former being practical make-up and the latter being almost entirely CGI. It's night and day, the LOTR orcs are a million times more believable and look like real organic living things, whereas the Hobbit movies end up looking like sterile video game cutscenes at times
I think some of these problems are solvable, if someone worked at it. The lack of gravitas of these creatures isn't in the pixels themselves, so to speak. They're in the fact that large creatures fundamentally move differently than small ones. For a sample version of this, compare the "walk cycle" of a small dog to a large one. They aren't just scaled up.
The problem is, you motion-capture a normal-sized person and then try to linearly project their motion up to a 10-foot-tall behemoth, and we can't help but see the normal-sized person underneath. The motion needs to change.
Possibly something as simple as putting weights of the right size in the right places on the motion-captured actor would do the job.
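As a toy sketch of the retiming half of that idea: under dynamic similarity, pendulum-like limbs swing with a period proportional to the square root of their length, so a creature k times larger should move roughly sqrt(k) times slower. Everything else here (the function name, the frame layout, the numbers) is an illustrative assumption, not how any real retargeting pipeline works:

    import numpy as np

    def retime_for_scale(frames, fps, scale):
        """Stretch a (num_frames, num_channels) mocap clip in time by sqrt(scale)."""
        stretch = np.sqrt(scale)              # 2x-size creature -> ~1.41x slower
        n_in = frames.shape[0]
        n_out = int(round(n_in * stretch))
        t_in = np.arange(n_in) / fps
        t_out = np.linspace(0.0, t_in[-1], n_out)
        # Linearly resample every channel onto the stretched timeline.
        return np.stack([np.interp(t_out, t_in, frames[:, c])
                         for c in range(frames.shape[1])], axis=1)

    # 1 second of 2-channel capture at 120 fps, retimed for a ~1.7x-scale giant:
    clip = np.zeros((120, 2))
    print(retime_for_scale(clip, fps=120.0, scale=1.7).shape)  # (156, 2)

Real retargeting would presumably also rescale joint travel and add mass-appropriate follow-through; uniform time-stretching alone just gets you out of "person in a suit" territory.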
Well, and if you bring up Lord of the Rings, you have to reckon with the beloved and acclaimed motion-capture performance of Gollum by Andy Serkis.
Obviously the earlier Lord of the Rings movies did have lots of practical effects and models, but also major use of CGI. It's not exactly a slam dunk for anti-CGI, as an example.
The main problem with bad CGI is that it's bad, not that it's CGI.
Agreed. But The Hobbit was a very rushed film, and rushed CGI can suffer more than other crafts do when rushed. More time and more money could have achieved better results. That being said, practical FX will always have an edge, because you don't have to spend extra effort making lights and materials look real.
Agreed agreed. The problem with The Hobbit was not the visual effects. The problem was that it was made without a plan, under duress. Lindsay Ellis has a three-part video essay series (https://www.youtube.com/watch?v=uTRUQ-RKfUs; yes, I know it says 1/2, I promise there is a 2/3) exploring this.
Midway through Battle of the Five Armies, when CGI dwarf Dain emerges, it’s difficult to not think “wow, they really wanted to get this movie done with - couldn’t even bother to cast a human for this character.”
LOTR had the occasional bad VFX. One that stood out to me, in the theaters, was when Treebeard carries Merry and Pippin through Fangorn Forest: the compositing on the background is sloppy and the lighting on the hobbits doesn't go with it.
Gollum is the reason I can't watch those movies anymore. Any scene he is in has him bouncing around like a piece of rubber. It doesn't look like the ring made him ill at all; on the contrary, he has the energy and exuberance of Bugs Bunny. It's eye roll material.
I think a large chunk of the problem is that we've been concentrating on improving the rendering, which is clearly now phenomenal, and seemingly totally ignoring the animation, which is still abysmal.
CGI characters still move too smoothly. It's all spline curves and jelly physics. CGI is instantly recognisably unreal due to the overly fluid motions and it seems remarkable to me how little progress has been made on this.
Animated characters still look animated, except they dive straight into the uncanny valley because of the incredible rendering. I guess mo-cap only gets you so far before you have to smooth out the noise of the motion signal.
Anyone with actual knowledge of CGI got anything to say about this? It seems weird that everything else moved on while this got neglected.
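Not a CGI professional, but here's a toy illustration of the clean-up trade-off described above (all signal parameters invented for the example): the low-pass filter that removes rig noise can't help also removing the natural high-frequency micro-movement.

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 2.0, 240)                  # 2 s of motion at 120 fps

    gait = np.sin(2 * np.pi * 1.5 * t)              # broad limb swing
    tremor = 0.05 * np.sin(2 * np.pi * 11 * t)      # lifelike micro-movement
    noise = 0.05 * rng.standard_normal(t.size)      # capture-rig jitter

    raw = gait + tremor + noise
    kernel = np.ones(15) / 15                       # ~0.125 s moving average
    smooth = np.convolve(raw, kernel, mode="same")

    # Tremor and noise both live at high frequencies, so the filter removes
    # them together -- leaving only the overly fluid low-frequency motion.
    kept = np.std(smooth - np.convolve(gait, kernel, mode="same"))
    print(f"high-frequency motion kept: {kept:.3f} "
          f"(tremor alone was {np.std(tremor):.3f})")

The filter can't distinguish "alive" jitter from sensor jitter by frequency alone, so the cleaned result loses most of both; maybe that's part of why everything reads as spline-smooth.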
Do we? Or is that just nostalgia? I'm in the younger generation, but as a kid watching Star Wars, I didn't care. Now watching it I see janky models on sticks and wires, and a puppet Yoda barely moving his arms.
I think it's suspension of disbelief. Star Wars in its time was phenomenal for its VFX and SFX, but movies nowadays are saturated with CGI, so it feels less impactful to us.
I don't think there's anything wrong with how he looks; it's the subtle over-fluidity of his movement. It's especially bad in the first shot, where he's just swaying around most unnaturally. He's completely out of focus, so there's nothing the rendering is doing wrong, but the motion immediately alerts the brain that this isn't real.
An interesting thing I've noticed recently is that CGI both opens up possibilities and closes them down, and it's in the closing that it becomes painful. As the author points out, in older practical effects movies, you knew that you wouldn't get many big macro shots of a lot of things happening at once. In Alien, you weren't going to see the alien walking around in natural light with camera angles all around it, because it would quickly become obvious that it's someone in a suit or an animatronic. In CGI you get the opposite scenario: you'll see big believable macro shots of fleets of spaceships, but when a human actor is interacting with a CGI actor, you know that they aren't going to believably inhabit the same small space: e.g. hugs, kisses, punches, etc., are probably not going to happen, or, if they do, will involve some predictable trickery. For example if a menacing CGI robot strides threateningly towards a human actor, you know they won't believably grab the actor and go face to face, not without some cunning cuts. This is especially apparent in television shows with lower budgets. The space scenes in Battlestar Galactica were consistently incredible, but I was never once convinced by the non-human Cylons when they were in the same room as humans. I watch newer CGI movies with an eye towards how artists are addressing these issues.
As a big BSG fan, I have to disagree with the last part of your comment. The reboot’s CGI Cylons were much more menacing than the original’s actors clunking around in robot suits. (But it was still awesome to see the occasional old-school Cylon in the reboot. They did a great job calling back to the original show when appropriate.)
Man, I miss BSG. Every sci-fi show featuring androids since then owes it a huge debt. (Raised By Wolves on HBO jumps to mind as a current example.)
What a weird take. For one thing, maybe an alien world like Arrakis shouldn't look identical to the deserts of this world. For another, lots of movies pre-CGI looked terrible, so the idea that the past was some kind of practical utopia is idiotic. People need to stop mistaking personal taste for objective fact.
Why go to all the trouble of filming in the UAE and then just wash it out? In LoA, the desert is part of the story. In Dune 2021, it's just background for the action.
I have to agree. I don't know what it was, but it never felt real to me. Actually, the Frank Herbert's Dune miniseries on the Sci-Fi Channel felt more real to me, despite very obviously being filmed cheaply in a studio. Dune 2021 looks too clean, maybe.
One thing that stands out to me is the scene in Dune 2021 where Paul is outside on Arrakis talking with a gardener who is watering palm trees. The conversation is about how many people could live off the water used for the trees, and consequently about how atrociously hot Arrakis is. Paul is standing there, in the full sun, in a uniform obviously meant for colder environments. Yet there's not a single indication he's feeling any discomfort from the heat and the sun. Not a drop of sweat, nothing. Just standing there like it's nothing.
Paul is used to a moderate climate, not (yet) accustomed at all to this grueling heat. Don't just tell us that Arrakis is hot. Show us that he's suffering! Show us how unbearably hot his new environment is!
Using CGI leads to so many "unrealistic" mistakes like your example. The devil is in the details, and they quickly make something otherwise good unenjoyable.
Another thing that bugged me about that scene. If they were watering those plants with water so precious, they would not be pouring it on the open ground for the sun to dry up.
Jordan, not UAE. Both Lawrence of Arabia and Dune 2021 were (largely) shot in the Wadi Rum in Jordan (and likewise desert scenes in many other movies).
My favorite example of this is the fact that in the film "Knives Out", the looming oil painting of Harlan (the murdered patriarch) was actually just a green square all throughout filming - the oil painting itself was digitally comped in during post.
Is Knives Out still a good movie? Yes. Would it have been possibly, perceptibly better if the actors had a real eyeline to work with on that painting, or the camera work was done with the cinematographer being able to see the real painting through the lens? Maybe.
Just seems lazy in that case really, what was the excuse? They didn't commission the artwork soon enough? Couldn't come to a decision on what the painting should look like soon enough?
I think filmmakers are just too used to large amounts of post work, so you end up with this lack of decision-making, and things become bland and diluted.
If you didn't read this post, you would have had no idea the painting was CGI. Calling it lazy is probably as justified as if your boss called you lazy for using a third party library like Python Requests instead of writing your own http library.
The fact is there is more CGI in modern films than you know. There are a million videos like this, but this [1] video on CGI in David Fincher films is really worth a watch to get a small understanding of how modern CGI is used in films in ways you might not have noticed or thought about.
We don’t consciously notice it. But it may make the film worse because we subconsciously notice something wrong, or because the actors don’t act as well without the real props and setting to help them.
Well I consciously notice that Knives Out is a pretty good movie, so I think the conversation of what we subconsciously notice or not in this case is pretty silly.
> This isn't an especially compelling theory without some data to back it up.
It's not difficult to reason how a real painting could plausibly incite a more realistic performance out of actors than a green screen. This trope of asking for "data to back it up" for the most obvious things is tiring. This is an internet message board, not a top tier scientific journal. You can ask for evidence if you want, but do you really need to do that in this case?
> You can ask for evidence if you want, but do you really need to do that in this case?
I mean, the job is called "acting" and I'd think someone is a pretty bad actor if they can't pretend it's a real painting instead of a fake. They're already pretending to be an entirely different person, in an entirely different setting, dealing with a reality that often has extremely different rules than ours.
So, yeah, it seems like a pretty extraordinary claim to insist that they can handle stunt doubles, fake props, and painted backgrounds... but a single green screen painting is just too much for them to handle.
> So, yeah, it seems like a pretty extraordinary claim to insist that they can handle stunt doubles, fake props, and painted backgrounds... but a single green screen painting is just too much for them to handle.
Nobody said that the problem is a single green screen painting. Since you read the OP, you know that the OP laid an argument against the prevalence of CGI, specifically how overt use of CGI leads to movies feeling less realistic. This single green screen painting is merely one small thing contributing to the overall problem. But now we're asking for scientific studies to conclusively prove at p<0.05 that having a single green screen painting on set reduces actor performance? If you genuinely want that level of proof, why are you reading anything on the internet at all? Nothing you read should be convincing to you of opinions you don't already hold.
The painting also changes over the course of the movie... Of course they could have just made more than one painting.
I do think it probably just wasn't ready, but you should probably see this as a win for quality: they got to take more time to get the painting they wanted without delaying production, and paid no visible price for it.
I would never have known that the painting was pasted in had I not read it here. I have been contemplating watching the movie again, maybe I should wait another year or two first. Now I want to pay more attention to this detail.
I’m not sure if this is the reason, but the lighting on the painting, and the expression and pose of the subject, change subtly from scene to scene, reflecting the story as it unfolds.
I think this is correct, but it's also what the article says between the lines.
A problem with a lot of movies these days is that they got this great new toy, CGI, and then they don't know when to stop. Whether the CGI is good or bad, perhaps because you don't have to work around limitations, you follow the formula: whenever you want to add some action, or don't know what to say, or don't know how to get the main protagonist out of a sticky situation, you just add a bunch of CGI and solve it that way. It ends up spilling into lazy storytelling and lazy filmmaking in general, as opposed to when you had to work alongside limitations to tell a story.
To me this seems most evident in the Star Wars films, where, as technology progressed, the creators' lack of imagination when no longer forced to work within limitations became increasingly evident.
There are a lot of bad uninformed takes in this thread, but I think this is the worst of them all.
First, CG has been part of films for more than 25 years now and it isn't a "great new toy".
More than that though, movies just aren't made like this. People don't go out with a camera and try to make something sequentially; movies are planned from start to finish and refined for years before actual production.
I remember watching Terminator 2 when I was in film school (1991). It was shot in 1990, and the tech for the T1000 seemed incredible. Then a friend showed me how much CGI was used to clean up simple stuff that you would never have thought of. Watch the T800 (Arnold/Stuntrider) jump his Harley into the river bed. That was wire work cleaned up via CGI. Not really necessary, they could have shot a diff angle to hide it, but that's a tell as well. Movies since then have always used low-key FX and CGI for this type of journeyman stuff.
The thing that makes this great is that it increases a movie's verisimilitude. Your shots don't need to mask or hide things. They can be more organic and real.
In all honesty, I think you're right. But I do feel something has happened. I definitely enjoy new blockbuster films a lot less than older ones. I still think fantastic films are made, but they are usually more independent films on smaller budgets, and some of them do have CGI, often obviously so, but it seldom matters.
Is it a problem with how the modern film industry puts together movies? I don't know.
I agree with this. I agree with the author of the post that there is something wonderfully authentic about movies with PFX before the rise of CGI, but then again I am noticing a lot of CGI; most of my family can't stop gushing about how unbelievably real most modern movies look ~ they're not seeing the CGI.
They also love the Soap Opera Effect [1] of modern TVs, which I can't stand.
So while I agree with the post's author, I think my agreement is one of a matter of personal preference, not objective evaluation. Side note: As a nice "stay at home" hobby during the pandemic, I've gotten really into film photography; shooting it, developing it at home, learning as much as I can about the history and science of it ~ one consequence is that I view older movies shot on film in a completely different light now (no pun intended).
After reading all that, I can't help but feel that when movies first came out, there were plenty of people complaining about how live theatre plays were obviously better and everything is getting worse.
The article even admits that CG is a lot better today than it was a while back.
Yes, they make mistakes and things could be better. But that's because nobody is perfect and everyone is making it up as they go along. When nobody has done it before, it's impossible to copy them.
Personally, I love today's movies and their CG, especially when I can't even tell it was CG. It used to be incredibly obvious that CG was CG. And it used to be incredibly obvious that puppets were puppets. Everything gets better as people do it more.
I don't think we need a shift back to practical FX or 2d animation in general. Only for those situations CGI can't handle yet.
What we need is good stories and unambitious effects that support those stories.
The author touched on several key problems:
* Color Grading:
> Notice the color differences. Every movie now takes place as if it were in The Matrix, appearing in deep greens or blues or in gray shades, like the complex colors of the world have been bleached out.
* Shadows:
> In the 2021 Dune, it is a monochrome chunk, despite the billowing fabric around the character. In the 1962 Lawrence of Arabia, the shadow of the fabric is actually see-through. It has dimensionality, thinness and thickness. Because it’s a real shadow.
* Over-ambition:
> how could the same filmmaker go from making a movie as good as The Fellowship of the Ring in 2001 to a movie as bad as The Hobbit in 2012? It’s not budgetary. The Fellowship cost 93 million to make, while The Hobbit cost 180 million.
Jurassic Park (1993) was so good and stands the test of time because the primitive CGI was hidden by night-time darkness. The filmmakers knew what was achievable and believable relative to the technology of the time.
If in 2022 you make a movie with flat colors, fake shadows, and 3D characters (like Maz Kanata in Star Wars) who need to interact in scenes with human characters, then yeah: it's going to be bad.
Modern movies fail because they're too ambitious. The battle of the five armies in The Hobbit was too ambitious: our technology can't make it look as good as live action yet.
The answer is to make movies we can make with a mix of practical + CGI, and wait for the technology to catch up. And to fix the damn color grading. Songs used to sound good. Movies used to look good.
You don't get better by waiting until you can be better, you get better by doing your best, falling short and discovering how to overcome your shortfalls.
You don't get a budget to 'try to build the battle of five armies perfectly and only then will you get to put it in a movie'.
Overambition is frankly BAKED into the movie industry and has been from the beginning.
So yes, cutting-edge effects will fall short; they have since the beginning of film. But the techniques are always getting better. When Gollum was introduced in LotR he was not perfect, but you can see the improvement just within that series from movie to movie; he's always looking better. 'Touch' interaction between CGI and human actors also improves quite a bit but still lacks: the wrestling match Frodo and Gollum have at the Cracks of Doom has... cracks.
Yes, each movie should aim to push the boundary out a bit. But you don't try to film Avatar in 1978 (without completely changing the scenes), because with puppets and matte paintings it'll be relatively unwatchable.
2D Lord of the Rings as an animated film in 1978? Sure.
Live Action Lord of the Rings? Wait till 2001 for it to be good.
Color grading has nothing to do with CGI. That's a production choice not a technical limitation. DPs and directors have been favoring a limited or neutral color palette for the last two decades. Like any fashion, this will change.
CGI is to movies as autotune is to music or makeup is to beauty. When you have a solid base, and merely use these things to clean up a few rough edges, it works wonderfully. When you destroy any sense of realism or humanity under an entirely synthetic facade... yeah, nobody's buying it, it destroys the immersion, and it all looks / sounds the same after a while.
The top grossing films of the last decade disagree with this take. Many people are, in fact, buying it. We might not like it for aesthetic reasons, but you can't argue that it doesn't sell.
When I first watched Star Wars in 1977, I knew that the spaceships weren't real, that Luke didn't live on Tatooine with a landspeeder, and that lightsabers didn't work. I didn't care, because of the "willing suspension of disbelief." We all know that the movies we watch are fake, Potemkin villages. We don't care if they use the film-making arts to help us suspend our disbelief. Doesn't matter if they use CGI, stop-motion animation, matte paintings, shoot on a soundstage or in real locations.
The examples in this article are surprising because I don't think they look very good.
While the CGI in the Star Wars example isn't amazing, it's no worse than the Ewok that has fabric folds showing on its costume.
In the second example, the devil is too human-looking, so all I see is a person in makeup (good makeup), whereas Steppenwolf looks completely alien.
Finally, the Dune example gives a feeling of being truly out in the desert, whereas Lawrence of Arabia looks like they're filming in a yard, and the colors look oversaturated compared to what I've seen in real-life deserts.
My understanding is there are some 'matte paintings' (CGI or paint) in the distance (to get that awesome horizon), there is some crazy color grading happening (flattening the colorspace considerably, I'm sure), and there are some CGI overlays happening (whenever you see spice in the air or any kind of dust storm). So yes, those shots, even though they were shot in an actual desert, are doctored heavily and perhaps can be considered CGI on the whole.
That said the whole article reads as 'old art better than new art' 'old man yells at cloud'.
Yes, exactly. This is just another essay that misses the forest for the trees on this stuff, unfortunately. Not only does this have negative aesthetic side effects, it is produced by what is widely reported to be an overworked, underpaid labor force.
The rhetorical lengths some people go through to avoid talking about the actual, material reasons for these things. We are in general much more interested in just lamenting society than in confronting those reasons.
It's fascinating to me that a movie production with a budget of 10^8 dollars can't slice off a tiny bit of that for the people that literally make the movie.
Not for nothing, I wish more people here would pay attention to forces like this in IT. I feel like there's a lot of people not following the money when we discuss things like, e.g. Wayland.
That might only be a second or third-order cause. The real one is cost. I doubt unions are what drives the difference; it's that an animator can do what a crew of 10 did before.
Now, that only covers a small number of studios. So, of course, not every studio can be part of it. But it is a union and it does drive some of the industry standards and expectations.
They don't have the collective bargaining power of a union and are paid much less/exploited, resulting in better profit margins for the owners of the film.
Cost is the only thing that matters. I don't believe the budgets of today's movies are any smaller than they were before CGI; CGI artists are very expensive, and there are tens of thousands of them working on a movie.
22-year veteran of CGI here. It's not less work. It's certainly different work. In fact it probably takes more time, because directors think they can (and studios let them) do do-over after do-over. But once a giant set is blown up, that's it. Once you've flooded the set and soaked your actors, it's prohibitive to do it again.
Nowadays all movies are, to some degree, CG'd. Color grading (the final stage of the post-production process) is entirely digital, and things are being done with digital color grading which would have been undreamable in analog days.
Also, let’s not forget that lots of VFX address minor things like (from experience) removing an umbrella that the director’s daughter accidentally left on set.
Finally, practical fx frequently require a digital helping hand. Like removing visible wires from fly by wire scenes.
Yes but it's clear that the article is not talking about this. It's talking about things like The Hobbit compared to LOTR. The latter is a classic, the former has aged terribly, in a big part due to awful CGI
Yes. I wrote that without looking at the article. However, having now done so, I see nothing there of substance. The author is claiming that non-CG films look more ‘real’. Amazing…. Apparently reality can now be measured in degrees of intensity.
The author buries their best point: the merit of non-CG is the merit of performance. We may be able to see through the smoke and mirrors of the transformation scene in An American Werewolf, but it has the appeal of the stage magician, an appeal rooted in immediacy.
Here is a paper I wrote on the subject… a bit dense but I believe I made a few good points…
I disagree. Our tastes migrate, and what satisfied us yesterday may look bogus today. I remember being blown away by the CGI of Terminator 2, which looks very pale nowadays. Still a good film though.
So much of this is just how _much_ work had to be done to get practical effects to work, and how practical and CGI are often good for different things. For an insane example of this, check out this video of how they did the ship landing in the original Dune movie (of which step 1 was "rent a stadium").
Also, I think there's a very selective bias against CGI where when it looks _real_ we just assume it's real and don't notice so we're only criticizing the portion where it failed to live up to reality.
I think there's a lost magic to practical SFX, but nostalgia aside, the single most important flaw I see in CGI that still remains unfixed to this day is...
... bad lights/contrast.
In almost every CGI scene, barring honorable exceptions, you see the CGI portions are fake because the lights are all bad. The comparison shot in the article between Steppenwolf and Tim Curry's Darkness from "Legend" is telling: while Darkness is shrouded in, well, shadows with sharp contrast, it looks as if Steppenwolf is brightly and uniformly lit.
Things that lack contrast and are uniformly lit are less visually interesting, especially when they don't match the environment. I notice this flaw in everything: Jurassic Park's dinosaurs, Star Wars ships, monsters in Lord of the Rings, mostly every modern creature effect. I know CGI artists must know this, so I wonder... why is it so difficult to fix?
(The honorable exceptions are mostly due to showing the CGI creature more sparingly, or where the light conditions by coincidence really match the creature).
Lighting in general seems to be underappreciated, even without CGI.
I'm reminded of Fifty Shades Freed, which starts with a wedding scene that's lit with green. Green lighting is basically nonexistent in nature, so when you see it, it's not an accident - someone had to mess with it, and it's usually to indicate something is wrong/unnatural. You can really see the care put into films like The Matrix with how they use green lighting, which is in stark contrast with films like Fifty Shades Freed which use it mindlessly.
But when the author says
> Every movie now takes place as if it were in The Matrix, appearing in deep greens or blues or in gray shades, like the complex colors of the world have been bleached out.
I don't think he really gets it. Sure, unnecessary color grading is easier than ever to accomplish, but intentional use of lighting to help tell the story is probably going over his head.
Everything has CGI from the most average TV show to biggest blockbuster and, for a lot of it, you don't even notice. What people complain about is the CGI that they do notice.
I think the issue is not, so much, that there is too much CGI. It's that a lot of movies are effectively hybrid live action and animated movies. And the animated parts tend to be way over the top to the point of being boring.
In Black Widow, the big 3rd act CGI explosion-fest was all planned out before the director even got involved with the movie.
I will note you are picking some of the best effects movies of all time and comparing them with everything else.
On the other hand, compare "The Expanse" (Prime Video) with B movies of the Alien era, and the level of storytelling it is able to do in a long-form TV show. Then compare it to TV shows of the Alien era. The storytelling there is much aided by CGI, though the show uses a good mix of practical and fake: lots of small sets for ships, lots of fake displays, many fantastical worlds that they can pump out with CG.
Alien's horror genre gets away with not actually showing that much in order to build suspense. Few genres are afforded that luxury, particularly not super hero movies. I don't see many people citing Superman 2 as the kinds of effects we should go back to.
And those aren't the points that people generally point to as failures of CGI. You can have goofy CGI with those elements, but you can definitely have goofy-looking practical effects too.
Really? The little xenomorph skittering off after it burst out from the chest literally made me laugh out loud with how silly it looked, it was so incredibly stiff and fake.
That example of practical effects looked quite stiff and fake like you said. But practical effects also continued to improve over the years even though CGI came to dominate the film industry.
Alien was released in 1979, but a few years later came An American Werewolf in London (1981), which featured a memorable werewolf transformation (all practical effects). Then 1982 saw the release of The Thing, which took practical effects a notch higher.
Also, today filmmakers sometimes use a hybrid approach: combining practical and digital effects.
I do wonder how much further practical effects would have developed if CGI hadn't overtaken the film industry.
There are of course some minor problems, but the overall environment creates such a "real" feeling. It is not perfect, but somehow it makes one believe this could be real. I don't really know how to describe it. It is missing from modern CGI movies.
I just watched Alien again recently and had the same thought. It looks infinitely better and more realistic than anything made in the last few years. We should go back to using models and miniatures and restrict the computer to things like automated camera control.
Why would we stop progress on a technology just because it’s not perfect? Particularly on something as subjective as this. This attitude is prevalent in some of the weirdest fields but oddly enough I don’t see anyone saying we should stop flight because the leg room in modern planes isn’t great.
Although I’m curious what your thoughts would’ve been had you been around to see the Wright brothers on their first flight. Or the first cell phone that came in a suitcase. Or the first camera that was just a photo sensitive plate and a hole for a lens.
How would any of this have been improved if people just gave up the second they realized something about the technology wasn't perfect?
Hmph. Isn't CGI just a different path? We could have improved miniatures as well. Maybe the technology to produce them, or to use them alongside CGI. It is not so binary. Computers are not the only technology we know.
Okay but I've never said we should stop using practical effects. I don't see any point in trying to limit artists in what technology they can or can't use.
I don't think people generally recognize the absolute ubiquity of CGI these days. You only notice when it doesn't land, but there's an order of magnitude more you don't notice. Judging CGI on what doesn't land is as much of a fallacy as judging practical solely on what doesn't land.
> I think there’s a genuine mystery to be explained here: if people prefer traditional architecture by a large margin, how come we’ve stopped producing it?
It takes time—decades, perhaps centuries—to develop the techniques and traditions that produce beautiful architecture. When new materials like glass, steel, and plastic become the de facto standards in construction in a relatively short period of time, we can expect construction to be ugly or unrefined as we discover those techniques and establish those traditions.
Not even closely. You couldn't make the Matrix without CGI and the CGI in something like Foundation looks so absolutely gorgeous that I wouldn't mind if I could just fly around in the world and gape at it.
No matter how much I love TOS almost all the aliens were just a guy in a fursuit and all the rocks are shiny...
Now of course you can make terrible movies and try to cover it up with CGI, but back in the day they just made terrible movies and tried to cover it up with boobs, guns and explosions. Not really that I mind either.
There's bad, poor, ok, good, and great CGI, just like every other aspect of filmmaking. There's probably far more CGI in movies than the poster thinks. There is weightiness in plenty of CGI. Thanos was one of the best villains and he was basically all CGI. Tons of weight and feeling in almost all of the MCU movies. There were a lot of cheesy, bad puppets in old movies - I was there, and it's usually pretty painful to rewatch them. I used to love watching the Star Wars trilogy every year, but hadn't done so for about 7 years. I watched it this year with my son and ... it's a bit rough at times. Some parts are brilliant, others don't hold the shine they used to.
There was a period up to, maybe 5 years ago or so, where CGI was generally pretty awful, as it was just actors pretending in front of nothing but green screens, and the CGI was crazily complex and perfect. That was my least favorite era.
But filmmakers have learned to do clever things with CGI.
If you haven't watched the series by Corridor Crew about good and bad CGI, I highly recommend it:
I disagree with the author's definition of "real". I see two ways it can be defined in the context of movies: real could mean that the objects portrayed look like they were actually filmed through a camera, but it could also mean that the objects portrayed look like the objects they're intended to portray. The former is easily achievable with practical effects. But the latter is in many cases ONLY achievable with CGI, which in many cases already starkly outperforms any practical effects (admittedly, to the detriment of realness under the former definition in some of them).
Which of the two you prefer depends on what it is you're looking for in movies. One motivation might be to see humans acting realistically, reacting to their environment and struggles. If that's all you're looking for, practical effects almost always win, because they allow budget to be allocated to actors and scripts, and actors have an easier time imagining being in the situation they're acting out. Many blockbuster movies, however, are made with a different goal in mind. Marvel, Jurassic Park, Star Wars all don't have realistic human interactions and struggles as their primary value proposition, but interesting fantasy worlds and fights, with "sufficiently" realistic characters. I for one enjoy both types of movies, deeply, and I don't think that CGI ruins movies today. In particular, I don't think that CGI ruins movies like Jurassic Park or Star Wars, which in my view are great movies for the year they came out in, but by today's standards are clearly objectively worse in almost every respect. Imagine something like the first Star Wars movie coming out today! Nobody would know about it, even fewer would care. Nostalgia is a powerful force.
I'm surprised there was no mention of Spielberg's challenges in making Jaws (longer story here - https://www.mentalfloss.com/article/31105/how-steven-spielbe...). For those not familiar with the story, in the 70s, they couldn't make a "realistic" mechanical shark for the film which Spielberg had been relying on. The shark didn't work most of the time, it looked weird, so he couldn't show it. Which meant he used the lack of being able to see the shark as a tool which improved the movie.
There is currently an over-reliance on CGI; it's a cheap trick that the industry is getting used to. They'll eventually learn to use it properly AND the quality of the CGI will improve. These two factors together give me hope that movies will become good again.
Of course, getting excellent writers creating new unique content and getting that content produced instead of more Marvel/DC/rehashes is a bigger issue I think.
I haven't found any sci-fi movie interesting or entertaining for the last decade or so because of all the CGI. It feels like it takes the human elements out of storytelling. Which is why I find Wes Anderson's movies such a refreshing take. They use nearly all practical effects and it is human and fun. I would recommend seeing the French Dispatch, his latest movie.
Give Villeneuve’s Dune a chance if you haven’t already. I don’t agree with the post on this one; they went to great lengths trying to make it feel real.
The comparison of the shadows in the article feels a bit weird, considering it's comparing shadows under a bright clear sky vs a sky with a lot of dust and debris. Especially weird considering Dune does a lot to make its CGI feel more grounded than the competition.
All of Dune's outdoor scenes were filmed outdoors, to capture brighter light and sharper shadows. Instead of green screens, sand colored screens were used, to eliminate the subtle green glow from the light reflected off the screen. Even subtle things like having actual glass in the windows helps a lot to make a scene more grounded.
Is Dune's CGI perfect? Obviously not. But it does look pretty damn good. And I'd like to see more filmmakers use it as an inspiration when producing other movies.
Sci-fi suffers a lot from this. Some scenes in Ad Astra, The Martian, Interstellar or Gravity scream "I copy-pasted the actor's head into a 3D scene". It just doesn't work at all for me; even on a technical level you feel something is off in the movements/lighting.
I recently watched Prometheus right after watching the original Alien, and it's just painful to watch.
Moon falls just outside your "last decade or so" but would have been the exception that proves the rule. Deeply moving and intriguing sci-fi, all practical effects. I heard somewhere they used some models that were originally made for a Red Dwarf movie.
You should check out Prospect (Trailer: [0]). It's an indie sci-fi that I believe is still on Netflix. It uses a minimal amount of CGI and a lot of props and practical effects. The director did a few AMAs on reddit. [1]
It stars Pedro Pascal before he made it big in The Mandalorian.
Reminds me of the project that used convolutional networks to make artificial images look more realistic by mapping the style of real images onto them [0]. Just slap that on top of your CGI and call it a day.
CGI worked pretty well in the original Matrix and Terminator 2. When CGI is innovative and fits the plot it's fine. When it's used in the same formulaic superhero movie 30-minute final battle scene we've seen a dozen times--with the same big boss and hundreds of disposable minions--it gets old.
While I agree with his premise, the author's entire argument is: <amazing old film> is better than <terrible new film>, so CGI is bad.
- Is there any reason to believe that the Lord of the Rings trilogy used less CGI than the Hobbit trilogy?
- Same for the Matrix trilogy vs Matrix Revolutions?
- The CGI in the newer Star Wars films is actually fantastic. It was everything else that was a letdown and caused it to be an inferior product over the originals.
What we really should be blaming is the paint-by-numbers formula for film production used by all big studios nowadays. There are no original properties, no risks taken. Using practical effects vs CGI isn't going to change this fact or dramatically make a bad movie good.
> the author's entire argument is: <amazing old film> is better than <terrible new film>
I think that's unfair. Villeneuve's Dune is not a terrible film, and in fact its visuals are pretty nice -- though I agree they suffer in the comparison shot with Lawrence of Arabia.
The original Lord of the Rings trilogy by Peter Jackson, also mentioned in the article (as a "good movie"), is a special case: its visuals are both terrible and good. They are good when they are practical effects, e.g. the makeup of the Orcs looks terrific, and they are terrible when they are CGI, e.g. the mines of Moria have some of the worst CGI ever, and so is the embarrassingly bad troll.
YMMV. The LotR troll was fine for its time and holds up okay IMO, though maybe that's also because it was released the same year as the horrid-looking troll from Harry Potter.
Obviously this is subjective, but to me it was pretty bad at the time.
But disregard the troll: take a look at the scenes with the Fellowship running through Moria. They are horrid, laughably bad. They are painful to watch, and were bad back then as well. They were worse than video games at the time!
I feel like these points land harder when talking about animation rather than live action films.
Consider that when Disney made Bambi 2 in the 2000s, they came to the realization that their animators did not have the skill required to animate deer antlers and then basically had their hand forced into using CGI. In the 1930s this was not a problem: the film contains not just hand-animated antlers but beautifully painted backgrounds, an incredible score and some of the most expressive character animation ever produced.
It hardly seems to be a stretch to call animation done during Disney's golden age objectively better than what is being done today.
I don't believe there are no artists today who could animate deer antlers or paint beautiful backgrounds. Disney just did not want to pay for them, or to wait the time it would take to animate a film by hand.
Bambi lost $200k USD on its 1942 release (a result of coming out during WW2)[0], the equivalent of ~$3.4 million USD today, so I'm sure they were cutting costs wherever they could. It only recouped the losses with subsequent re-releases, which had the benefit of access to the European market.
Christopher Nolan (the director of Interstellar) has notably not jumped on the modern film making bandwagon. Nolan still shoots on film and eschews CGI whenever possible. The authors of this article might see Nolan as a lone wolf standing against the forces of CGI that have ruined film over the last few decades.
Uh no. Nolan used a ton of CGI in all the Batman movies he made, in Interstellar, in Dunkirk, Tenet and especially in Inception. Yes he likes to use practical when possible, but the idea that he doesn't use CGI is ridiculous. Everyone uses CGI. Even when it's simple compositing stuff. Everyone conflates CGI with doing a shot/scene entirely digital. Some films do that, but most don't.
I did not say that "no CGI was used". Rather, I stated that Nolan uses practical effects whenever possible. The author of the article was arguing that modern movies (since approximately the Star Wars prequels) have been ruined by too much CGI. I was pointing out that Nolan has not jumped on the trend described in the article and rather only uses CGI when necessary (some would say he doesn't use enough CGI).
To learn more about Nolan's film making process, I recommend you watch some interviews with him.
> Everyone conflates CGI with doing a shot/scene entirely digital. Some films do that, but most don't.
The article does not conflate these things. Rather the author decries "excessive" CGI use in shots that are not entirely digital.
As someone who grew up with movies with CGI, older movies look fake to me.
Puppet Yoda from the 1980s looks extremely cheesy and fake and really takes me out of the movie. You can tell he's a puppet so he doesn't look like he fits in the real world.
Similar with other practical effects. At the beginning of Star Wars Episode I, a ship gets blown up and it looks like it's made out of cardboard. To me that looks very fake. Metal doesn't explode the way cardboard does.
Some years ago, I was at my cousin's house watching some movies on his HDTV. The first movie was "Star Trek II: The Wrath of Khan," which I recall seeing in movie theaters when it was released in 1982 (as I date myself here). I think STII is the best of the Star Trek movies and still holds up to this day, but watching on that HDTV was painful! I could see the costumes. I could see the cheapness of the set. It just looked cheap, and not at all like the incredible spectacle I remember from the movie theater (or on repeated rewatchings on TV).
The next movie was "Ronin" (1998). Great movie, but on HDTV, it looked like a cheap TV soap opera. Again, it was painful to watch. On the movie screen? Or a normal TV? Incredible. HDTV presents a level of fidelity that older movies can't properly deal with, which you may be picking up on (as someone who not only saw "Empire Strikes Back" in theaters, but still think the puppet looks much better than the CGI version).
"As you listen to that gentle creaking, or that low-tone burbling, what you feel at that moment is the deep satisfaction that comes from witnessing Being, in all its irreducible and incomputable eddies and swirls. Your eyes are satiated. There’s probably a German word for it, and if not, there should be."
Anyone know the German word for this? There's gotta be one, right? Something to express the zeitgeist of CGI-laden current day cinema?
I've arrived in an interesting place I didn't expect.
Mostly, I don't care!! First and foremost, is the story good, compelling?
If so, I'm in! I see some of the effects, there are others I know I don't see, and good on those crews. Did their job. :D
I like to see them, when I see them too. Used to be, looking back, poor effects would stand right out, and it took me out of the experience. But, there was one exception, and that was the original "Star Trek" shows. For some reason, I didn't care on those and all these years later, I think I know why, and it's the story, the people, all of it being compelling.
Also, on that program the music is just awesome! I still use episodes as background when working. The sounds are just great!
Now, I super appreciate real, filmed things. The lack of effects is the special deal now. Cool!!
Finally, makeup:
I love good costumes, makeup, all that tech theatre stuff. The only time it's objectionable is when I do see it, and am not really supposed to, and usually that comes down to some environmental thing, the actor struggling (like sweat or something crappy for them and the makeup), or what seems like excess, like caked on.
Otherwise! Yeah. Decorate the people. Let's check 'em out.
CGI just adds way too much. Now that it's so (relatively) inexpensive for even modest productions, films can rely on spectacle to cater to the lowest common denominator, which is apparently far more lucrative than making actual art.
As a side note, I'm partly ruined by having a father who has worked in VFX and CGI animation, and by dabbling in those fields myself. While CGI has improved in my lifetime, I can't help but see areas where big-budget FX are obviously flawed, and I find it very distracting. As an example: for some reason, studios believe that fully CGI animals must be animated with exaggerated motion and gestures that most animals aren't making at any given time; this leads to CGI creatures that, to me, are not believable. Animal animation also tends to be overly smooth and floaty. Of course these things are often true of animated humanoids as well, but animals should be easier. I sucked ass at character animation, but my less-bad animations were the ones involving animals. Animals tend to only be doing one thing at a time, so it's really not that hard to work off reference material. This also isn't to say that animals should be portrayed 100% realistically in film; my point is that the physicality of CGI animals has barely improved, and it ruins the experience.
But maybe I'm the only one? Sometimes CGI animals in films are so badly animated that it's hard for me to believe that even the average person doesn't lose some of their immersion when watching a movie.
Now that I think of it, perhaps animals aren't used as extensively with mo-cap because of animal rights issues? I dunno... whatever the cause, there's something that I don't think is being done right most of the time.
There are good digital effects and bad digital effects. Sadly, most digital effects are in the bad category. Compare, say, Pleasantville's use of digital color changes over time to make the story come alive with the changes the main characters experience. It really sells the story properly without looking out of place. Whereas stuff like what's listed from Justice League and The Force Awakens fails because it doesn't fit well with the story. Cyborg wouldn't need to be a CGI-overladen presentation if they did subtle practical effects backed up with equally subtle digital effects. Steppenwolf is a whole separate problem of visualizing the character; making him completely CGI was a mistake. For The Force Awakens, the same issue is apparent with Maz Kanata: she should've been partially a practical prop, such as what Jon Favreau did with Baby Yoda on The Mandalorian, and then augmented with digital effects to make it pop.
Also, I'll say that there's one area where digital effects beat the old ones: matte paintings. Digital matte paintings are far better than what came before and allow for a greater degree of flexibility in shots. But they've been abused to replace location shoots entirely.
Edit: I would also put Joseph Kosinski's Oblivion in the list of good digital and practical effects blending along with some location shots. It's really one of those things that once you notice who does it better you begin to realize that CGI and green screen alone isn't sufficient. You have to mix all the methods together (practical, digital, matte, etc) to achieve the look you're after rather than pretending someone using a 3d modeling tool can magically whip up the look in post-production.
>Notice the color differences. Every movie now takes place as if it were in The Matrix, appearing in deep greens or blues or in gray shades, like the complex colors of the world have been bleached out.
Could this reflect that we as a society have progressed towards a more advanced state? Maybe we have reached a more 'real' reality, a bit like Quentin Tarantino movie characters. As a consequence, the movies that we are watching now look different.
>There's the realer than real universe, alright, and all the characters inhabit that one. But then there's this movie universe. And so From Dusk Till Dawn, Kill Bill, they all take place in this special movie universe. So basically when the characters of Reservoir Dogs or Pulp Fiction, when they go to the movies, Kill Bill is what they go to see. From Dusk Till Dawn is what they see. [1]
I think this is an instance of the symptoms being mistaken for the disease. CGI did not ruin movies, it was the demand for shallow, detached, visual excess that has pushed aside story telling altogether in favor of cheap tricks.
There are still good stories, but they earn nothing like the movies with big explosions and apocalyptic visions do ... insert Michael Bay joke here ... but it's really also just a set of people taking advantage of, and serving, a systemic, cultural, and civilizational degradation and contraction, or even possibly an implosion. Michael Bay got rich/famous off spamming explosions because that appealed to the masses. People of the past would likely have found all these movies extremely boring and repetitive, because they are. For example, I can recall exactly one movie I have ever seen that did not have an "all ends well" ending. Essentially every single movie today is guaranteed to end a single way. There are no tragedies anymore; it's essentially all comedies (in the classical, not the modern sense). It's this weird mental drug that has been created that feeds only endorphins. It's boring and stunting.
Just like movies did not ruin books, CGI did not ruin movies. They are merely phases in a clear degradation of our society and civilization where people keep making poorer and poorer decisions. It's really not any different than the self-reinforcing feedback loop of weight gain: more weight means you're less likely to work out, which leads to physiological changes that make overeating more likely and has a negative impact on mental state, which acts as a barrier to engaging in rigorous physical activity ... which starts the cycle over.
"CGI ruined movies" because our society has been getting mentally less healthy and our standards, expectations, discipline, and principles have been dropping like rocks. It's the symptom, not the cause.
I was on the side of modern music against autotune critics in the last thread, and find myself against the pro-CGI "modern movies are fine" narrative in this one, interesting.
For me it feels like visual effects keep setting the bar higher for spectacle. It's easy to become jaded by spectacular effects and to grow accustomed to them. (I'm also pickier about films, often imagining how the imagery could have looked better or the story could have been told differently.) It also feels like we're in an era where many Hollywood films have easy access to great cinematography, stunning aerial shots of natural vistas, etc. - and then it just ends up looking like Mac wallpapers. So many films can have that National Geographic documentary look, even arthouse pictures like, say, The Green Knight. Is it just easier to film on location than it used to be?
I wouldn't compare CGI, which is obviously very weak when placed next to reality, with real scenes. I'd compare it with the matte-painting backdrops of the 30s, 40s, and 50s, in movies like Forbidden Planet. If you go to see a CGI movie expecting to be "fooled", then you are certain to end up disappointed.
A dimension I didn't see mentioned in the article is the volume of production and the changing dynamics of the VFX industry. Back in the 70s/80s/90s there was dramatically less special-effects-driven content being produced. The community making those films was so small that you could have minor celebrities in people like Stan Winston. As demand for the resource has increased, the talent in production teams has not kept up. This seems to me a key factor in the decreasing average level of CGI quality. Also keep in mind that VFX is a generally bad business. While more is being spent on production, VFX shops are extremely margin-driven, and when you combine tight margins with tough (global) competition you end up making tradeoffs.
The problem I have now is that after reading about how Mandalorian is filmed, I simply cannot unsee the set. Every time somebody walks around you can tell by subtle signs that they are on a small set surrounded by CGI. These days I yearn for a true exterior shot.
One thing that bothers me about a lot of new movies and TV shows is that the actors just look digitally inserted. Even when it’s a static shot and there’s no interaction with the background, it just looks wrong. I’m not sure how I identify it, but I suspect it has something to do with the way the actors are lit. I remember watching a Marvel movie where all the heroes were shown as a group at the end and I felt like I was looking at a high school yearbook because it seemed like they weren’t all present in the same space. The production values are high for a movie like that and you’d think they’d be using state of the art techniques.
A part of this that's missing is the effects of HD filming on practical effects.
Back in ye old days of the 90's, props didn't need to be super good looking unless there was a close-up. Now in the days of 4k, every prop needs to look gorgeous or the suspension of disbelief gets pinged. Same reason why you have people going through the Marvel weightlifting program and shaving _everything_.
The increase in film detail happened from the same technological push that made CGI. Now folk are trying to figure out what to do with this hugely different film-making world, and you can see the cracks.
That said, I agree with a lot of the author's ideas, particularly about Dune.
35mm film comes out to about the same sharpness as a 1.5k CG image. Even with the best lenses and scanning, 35mm scanned at any resolution still doesn't get sharper than a crisp 1.5k upscaled to be at the same resolution. Even middle of the road RED cameras have exceeded large format film in their sharpness and sensitivity.
Not really. I watched "Star Trek II" and "Ronin" on HDTV and found it painful. The sets on STII looked cheap and the uniforms looked like costumes. "Ronin" looked like a soap opera. I found it hard to watch old movies on HD.
Not quite. They once did a measurement of the effective resolution of actual analog movie theaters (sometime in the 2000’s IIRC), and the result was in the ballpark of 720p, so halfway between (anamorphic) DVD and Blu-ray.
The cost of labor will continue to rise while the cost of CGI will continue to go down. We will see more and more CGI until the cost of labor decreases relative to the cost of CGI.
One thing that isn't accounted for is that as far as digital assets go, they are, for all intents and purposes, immutable, immortal, and more liquid (and counted as CapEx rather than OpEx). A prop from Alien might still be around in basically the same condition as it was made; the little fake CGI gun model and CGI Yoda will always be there; it doesn't matter what the actors or prop-makers want or don't want. It gives the studios more leverage.
In the same way that frameworks lead to bloated, more poorly performing applications than we had in the past. Recently, backend engineers asked me to send two requests to upload just a single file because their framework somehow made it difficult. They didn't want to have to write a SQL query. So instead, to keep the code "clean", every upload requires two requests instead of one, objectively providing something crappier than what we've had for over a decade. The youthful bent of software dev only exacerbates this.
While I agree that there is somewhat of an over-reliance on CGI in movies nowadays, this post is cherry-picking its examples. Comparing bad CGI shots to a few of the greatest films ever made (with high budgets for their time as well) just seems odd. I wouldn't say CGI itself ruined the movies, but how it is now implemented. There are films like Gravity which, in my opinion, blended the CGI perfectly with what was happening in the scene, and that's where CGI enhances the experience rather than otherwise.
The biggest problem I have with CGI is that the speed of motion does not match what is physically possible - especially in superhero movies. It's like the motion you get when watching "Family Guy" stunts.
One interesting comparison, I would like to read about, would be between "Blade Runner" and "Blade Runner 2049". The original used models and moved cameras around the models for effect, whereas the sequel, I'm sure, used CGI but to good effect. I'm a fan of both movies.
While I think the article is something to consider -- this could be because I'm reading David Mamet stuff just because -- I think a lot of this falls away if you have a good story, which perhaps matters more than whatever CGI or not you're using.
E.g. I personally find myself dissatisfied with a lot of Marvel, probably because most of the movies are "cocky guy gets taken down a peg by someone who's kind of in the family and kind of not, but definitely wins in the end anyway."
I agree - then I think of quality period pieces that use CGI with great artistry ("Zodiac" and "Hail, Caesar!" come to mind) and I think again.
Check out, for example, this VFX before-and-after clip for the show "Babylon Berlin". In my view the CGI recreation of Weimar Berlin added enormously to the reality and presence of the story and characters.
CGI is there for set extensions, and is really common. It's often cheaper to do a virtual set extension than it is to build it for real.
In short, the author is pining for the "good old days" which were never there. Yes, there are some standout films that look great from the last 60 years. But we are forgetting the sheer amount of dross.
Of _course_ Lawrence of Arabia is gorgeous; it's shot in Technicolor with a fucking massive budget. It's an adventure movie; like Jumanji, it's going to have bright colours.
Dune is a dystopian rebel movie about complex and grubby power struggles. Just as it always rains when the main character is sad, or there's a storm when things go wrong, colour is a tool to signpost emotion.
The author is a typical film theory fan, all talk and no experience.
TLDR: the author is comparing the cream of 50 years of film to generic movies of today, and asserting things that are demonstrably untrue, since 95% of all big-budget films have digital set extensions and other CGI elements.
Not sure if it is caused by CGI or by me getting older, but I've noticed the following: in the past I would see a movie landscape (say, a new planet in a sci-fi movie, or adventurers discovering a lost place) and would feel crushed by emotion at its beauty, darkness, or emptiness, and its massive size.
I cannot feel anything like that anymore. I just think "oh, cool CGI", and that's it, no emotion whatsoever. And that kind of ruins a lot of movies I would previously have enjoyed.
It has definitely, entirely ruined effects-focused making-of videos. How'd they do it? Computers. What did it look like? Someone using a computer. How can we show you the "behind the scenes" of that shot? Disable part of the rendering.
All identical, all boring.
The only interesting part is "how much of this shot was CG?" but a still conveys that. Extensive making-ofs are practically dead, except for people in the industry who want to know exactly which commercial plugin they used or whatever.
Counterpoint: CGI allows you to tell stories that would look like total shit without it. Maybe compare apples to apples and look at the 1984 Dune vs. the new one.
I also think a lot of CGI has hurt the directing side of things. Once you virtualize the camera, the same sort of thing happens: there's no gravity to a situation. I first started noticing it in Fellowship of the Ring, where many establishing shots had a roller-coaster of a camera floating through the air. Those weren't egregious, but now it's everywhere and poorly done. Give the camera some weight; let things be still.
I'm not sure I 100% buy this argument. Yes, objectively the creature work in the new Star Wars lacks the weight and realness of the original puppets, or even of those used in The Mandalorian.
But we've also gotten to see the world of Star Wars in absolutely incredible new ways! X-wings skimming over the water from incredible camera angles, spectacular space explosions, realistic lightsabers that spark and have volume (sword-fighting choreography aside).
I recently watched Project X, a campy Matthew Broderick movie from 1987, and I was just blown away. It was incredible to watch real chimpanzees interacting with real human beings in real environments.
I had that elusive sense of wonder that CGI is always striving for and failing to reach. As a consequence, the movie almost felt like it was from a future where CGI had finally achieved its goals... except it was a campy movie from 1987.
These aren't the kind of people I would normally expect to care about things like CGI being used in movies. Apparently this is hitting a nerve.
Is CGI considered an automation that reduces overall production cost?
I am not sure whether it is or not, but I wonder if it has become much cheaper over time. I remember a time when CGI in movies was a thing of wonder, at least to me in the '90s.
Nowadays I definitely see the blah-ness of it. But then, I find action movies extremely boring and unrealistic, so I'm out of the loop.
>Let me offer a comparable conspiracy theory. If you go back in filmmaking prior to somewhere around 2000, give or take a decade depending on the title and genre, almost all films look so much realer.
Shall we call the idea that films that look realer are better "realism"? At any rate, there was no argument as to why looking realer makes a film better; it seems to just be assumed.
CGI should be a tool for telling a well-crafted and well-narrated story. Unfortunately, CGI has become the driver of how a movie gets made, and once it becomes the only tool for telling the story, everything else falters. CGI should absolutely be used to support making a good movie, but it should not be defining the scenes, or the movie itself.
> As you listen to that gentle creaking, or that low-tone burbling, what you feel at that moment is the deep satisfaction that comes from witnessing Being, in all its irreducible and incomputable eddies and swirls. Your eyes are satiated. There’s probably a German word for it, and if not, there should be.
Without even mentioning CGI, just modern digital color grading is wide open to abuse.
"Harry Potter and the Goblet of Fire" was absolutely mangled by the director. The colors are really distorted and awful. It looks nothing like the other movies in the series. Worse, it is otherwise a good movie.
>Even accounting for inflation, Jurassic World in 2018 cost about 25% more to produce than Jurassic Park in 1993, and all the dinosaurs in the original feel ten times as tactile and threatening.
But, but, dinosaurs in the original were famously CGI!
Digital cameras replacing film have also ruined it. The dynamic range and depth of the images that come from film blow all digital images away... the same goes for photography.
Much of the romance and beauty seen in old pictures and films is due to this difference.
The problem isn't CGI per se (hell, even David Lynch uses digital blood), but the pervasive "we'll fix it in post-production". This makes directors and cinematographers extremely lazy.
Looking at how good practical effects and modeling were getting in the '70s and '80s, it's hard not to wonder how much more technical progress would have happened if CGI hadn't taken over.
By far the best action movie of the last decade, "Mad Max: Fury Road", looks so great because most of its special effects are practical and only enhanced with CGI.
I think that while Fury Road had a lot of practical effects, to call the CGI mere enhancement does it a disservice. CGI was used in almost every single scene and shot.
Fury Road is a bizarre choice to try to make this argument. It is soaked with CG from start to finish, including CG cars, CG head replacements, CG body parts, completely CG shots, CG weather, CG sets, CG crowds, CG ground, sky replacements, CG environments, CG set extensions, green screen stunts and pretty much everything else you can think of.
It is wall-to-wall computer effects.
This is like someone saying they don't like sugar or carbonation, so they only drink Coke.
Have you read the post we're discussing? The weightlessness part? Of course Fury Road is a modern movie, and of course it absolutely takes advantage of modern special effects technology. But nothing in Fury Road feels weightless (except maybe the flying cars in the storm), because the core objects we see are shot in an old-school way, with stuntmen and practical effects.
While I am 100% in the "more practical effects please" camp, I would also wager that criticism of the state of CGI in films today is in large part a matter of survivorship bias: we only notice the bad CGI, and don't appreciate just how much of what we see happens to be CGI. Case in point: no one has probably seen even half as many real animals in films in the past ten years as we think we have.
So, there's a mistake the author makes here: he assumes that since film budgets have gone up, the increased cost must be going to more VFX, and thus that the reason CGI ruins movies isn't so much that CGI is cheap, but that directors are unconstrained by it. There is definitely a kernel of truth in there, but I want to point out that VFX is actually still comparatively cheap.
VFX artists are unusual in that they are one of the few parts of Hollywood that isn't unionized in any meaningful way. This is because of outsourcing. On paper, this is all above-board because the artists being hired live in lower-cost-of-living cities. However, in practice, this also means the artists do not have the option of unionizing. So the studios can browbeat their artists into churning out uninspired work quickly[0].
The article brings up a few more technical problems, such as CGI characters not fitting into practical sets all that well, or not having much visual weight when interacting with live actors. However, that's something that can be worked around and planned for with creative set or prop design. That workaround doesn't happen because the people involved don't care, or don't have the time to care, whether their bad guy looks like a hologram.
That's really the core problem. Bad CGI work in an otherwise good movie can be excused, and it is possible to make good CGI that holds up over time. But uninspired movies are bad whether they use digital or practical effects. The problem is not so much that CGI is a cheap filler material for movies, or that directors don't want to work under constraints, but that studios are running the show and asking for safe, uninspired work. Everything else is just a symptom.
The traditional Hollywood business model was that creatives would option off novel projects to the suits. This has now changed to a model in which the suits ask creatives to create derivatives of works they already own. It's why every studio was falling over itself to create a "cinematic universe" about a decade ago. Why pick through a bunch of projects that might succeed or fail on their own merits when you can roll out a franchise with a built-in audience[1][2]?
[0] This is also a problem in 2d animation, specifically Japanese animation studios that churn out lots of light novel adaptations.
[1] Yes, your mind turned to Marvel at this point, but I don't think we should blame the MCU for this. The trend towards franchise films started long before we started making true-to-the-comics superhero movies.
[2] China is also a big factor here. If you can please the PRC censors, big-budget Hollywood blockbusters do very well there, even if they get panned for being too derivative in the States.
Absolutely. Creativity requires constraints even if they are self-chosen constraints, and necessity is the mother of invention.
Perhaps the most interesting thing to come from the most recent Star Wars trilogy is that fans and an entire industry can get behind the idea of choosing retro creative constraints.
And I often think that the good old centralized systems were a mostly agreed-upon set of constraints: whoever managed to reach the top and produce things (books, movies, whatever) was granted a fair amount of value, naturally, because it required stamina and grit to keep churning through the layers and be recognized as worthy. The problem with that system is that you could get gatekept at the bottom by bad luck or perverse agents, but it did provide some kind of filter.
Yes -- and you can definitely see evidence of greater upward mobility in the current cinema system; more directors who move quickly from indie films to major movies etc.
I've seen this take too many times. CGI can have the same level of artisanship as practical effects, you just need to put in the work. The Lord of the Rings trilogy is a great example: Weta Digital was part of the entire filmmaking process, worked extremely closely with the art department, and was part of the same entity that made most of the props and practical effects. The digital effects in the trilogy hold up, even the ones that look a little janky today, not because of some amazing rendering software, but because they were integrated into the vision of the films from day 1.
I think this can be observed to a lesser extent in many more recent movies with a strong vision behind them. For example, Sony's Amazing Spider-Man movies had generally generic CGI, but the Spider-Man model and sequences were phenomenal. The studio and effects shops had a really strong idea of how Spidey should feel and move, and that drove his visualization.
tl;dr it's not the effects, it's the creative bankruptcy.