The contagious visual blandness of Netflix (haleynahman.substack.com)
471 points by _gfwu on Jan 22, 2023 | 460 comments



Another aspect of this is staffing. It doesn't immediately seem tech related, but it is.

There are fewer people in the background as extras, and fewer people making background scenes or setting up. It can mostly be filled in by production technology if necessary.

Some movies just seem so empty, and they look "cheap" seen from this angle. Some big Hollywood productions, like Passengers for example, have barely more than three actors visible in the whole film. I've noticed there are a lot of these hollow, empty productions with very few actual actors visible, and they feel rather lifeless.


IMO, that's part of what made Game of Thrones fall off a quality cliff. It wasn't just subpar writing on the main plot lines when the source material ran out. The world itself became hollowed out, and the main characters stopped existing within an organic world. It turned into an Avengers movie, where the leads were largely alone on screen only interacting with each other (except for designated Crowd Scenes).


GoT has the same dynamic as Lord of the Rings, where the world itself is arguably the protagonist, and the characters are there to explore it for you.

The GoT showrunners appear to have totally misunderstood that, and threw the suspense and world-building under the bus in favor of creating moments of interpersonal drama. But the drama and actions increasingly made no sense against the buildup of the previous seasons.

Who are the white walkers? What is the rest of their history? What are their goals? This was the big driving mystery of the show, and they simply decided to wrap it up in a single big battle with no further revelations.

That totally removed the payoff a lot of people were waiting for. Much like Lost, it became clear there was no actual coherent mystery.

Maybe GRRM deserves some criticism for not having things wrapped up himself. But mostly I think the show runners just totally misunderstood what was driving the earlier seasons.


From episode one it was a telenovela with more nudity, wigs, and CGI. There was one good scene between Cersei and her husband, though.


Ugh the Lost thing was so infuriating.


The Lost finale was definitely disappointing, but by then it was expected. At some point it became clear that the mystery was just a huge MacGuffin, and the writers had painted themselves into a corner.

I think that taught newer generations of Showrunners to not make the same mistake. Modern shows like Breaking Bad and Dark managed to finish right on time with perfect endings.

(GoT however...)


> I think that taught newer generations of Showrunners to not make the same mistake

I'm worried the opposite lesson is being learned here. Most shows "build up" and then don't worry whether it's actually going somewhere or can even have a good ending.

Nowadays the exit strategy is first and foremost a cancellation, or it's the next writer's problem.


Yeah, that's hard to disagree with. That's definitely true for a lot of Netflix series.


New generations, maybe. JJ Abrams himself still seems to love a mystery box, to the detriment of more than one major franchise.


What happened to the polar bears?


> where the leads were largely alone on screen only

I feel like that is many movies now. The worlds are small and the consequences / events are just there for the primary actors to respond to and seem entirely disconnected from any kind of universe.

Massive events happen just for the sake of having a hero mug for the camera. But no consideration otherwise.

At this point in the Star Wars universe I expect the regular people of the universe would realize that ALL force-sensitive people are more trouble than they are worth… but no, the people of the universe show up, say dumb things, and die (or don't show up… literally nobody at times) at the appointed time, for really no reason other than to give the heroes or villains something to do.


There does seem to be an absence of "angry villagers raid the wizard's tower" stories in the Star Wars universe. Missed opportunity, probably has something to do with Star Wars primarily being escapist fantasy for people who wish to be the wizard.

Maybe the EU novels have such stories, but I haven't read those.


There's a scene in the original movie that kind-of explains this: https://www.youtube.com/watch?v=Zzs-OvfG8tE

> "Don't try to frighten us with your sorcerer's ways, Lord Vader. Your sad devotion to that ancient religion has not helped you conjure up the stolen data tapes, or given you clairvoyance enough to find the rebels' hidden fortr-"

From what I remember from wikis the EU novels expanded on this as: Jedi/Sith/force-sensitives are rare enough that even at the height of the Jedi's power, most people believed them to be a myth, that the Jedi Order was more like a powerful religious group of regular people.


This is reinforced in The Mandalorian, where (to memory) the protagonist hadn't even heard of the Force or Jedi. And Mando was much better-traveled than most of the Star Wars universe population.


It works there, but in the later trilogy it just falls apart; it makes no sense how invested everyone is in Jedi-ism...


To be fair, that was also a problem with the books: the scale was completely off in quite noticeable ways – huge armies but no mention of how they’re supplied, cities which are referred to as large but feel small in terms of logistics, etc. We’re repeatedly told winter is coming but there doesn’t seem to be much in the way of stockpiling supplies, building greenhouses, etc.?


Stocking supplies is much better conveyed in the books. Also notable that although the main map is shaped somewhat like England+Wales+Scotland, it seems to be continent scale, allowing southern parts to seemingly not even stock winter gear.

Greenhouses… would be a very rare luxury for a rich few, and would offer only a minimal supplement for them; also, heating them (IRL, with coal) would be prohibitively expensive in labour, fuel, and fuel storage.


I was referring to the books – my primary memories of ASoIaF were that the level of violence should mean more desertion and economic collapse back home, and unless they just magicked it away there'd be nowhere near enough food to survive the promised winter. It's not like the books didn't have good parts, but I had to consciously suspend disbelief a lot while reading it.

Greenhouses don’t require heating but it’s also a proxy for the questions about urgency: if you knew this was coming, shouldn’t you have people working on every aspect of storage, stretching growing seasons, migrating south, etc.?


Is there a particular season when this happened?


Season 2.

Everyone was too busy suspending disbelief or distracted by their phones/laptops to really notice. I gave the show the benefit of the doubt until the end of Season 4, when Tywin Lannister would have come in and eliminated Joffrey - his family's success relied on installing a competent king to keep the throne, not a useless brat coddled by his incompetent kids.

But then rewatching it for the second time, the lack of causality was apparent in Season 2. There's a reason stories don't generally kill main characters, and that's because viewers invest in them. If they do kill a main character, there has to be a strong point that leads to even more character development. But once you see through the charade, the whole show is revealed as a series of vignettes that imply awesome character development, without ever actually doing the work to bring it to fruition. The show was able to coast in pop culture until season 8, when it was no longer possible to ignore that the characters weren't going anywhere.

I thought they would have taken a step back and applied a more carefully massaged production to HoD. After all, this franchise had an outsized amount of upmarket interest, only to crater at the end to the point of completely falling out of the zeitgeist. But rather, it seems like they continued on the same tack of not really having a good story and filling in gaps of time in place of showing character development - like the episode where all the actors switched out and everyone is just older. Classic GoT!


Yet apparently you still paid (at least with your attention) to give House of the Dragon a try, reinforcing this disgusting habit of making only remakes and childish franchise bollocks. We must punish them.


This is why I avoid dreck like Velma, even to hatewatch or to see if "it's as bad as everyone says". A view is a view, and unlike terrestrial television (in the absence of a Nielsen People Meter), watching is no longer passive, it's a signal to the network who is watching back.

Remember the lyrics of the musician whose wisdom resonates to this day, "Weird Al" Yankovic: "But I only watched Will & Grace one time, one day... wish I hadn't 'cause TiVo now thinks I'm gay!"


Pirate life! I don't even know what silo had HoD.

But yes, guilty as charged. The worst part is I'm the type of person that doesn't just give up, but rather insists on finishing a show.

And regardless of formulaic production, I get disappointed by many series that start out great in the first season or two, and then the writing just craters. The best defense I've found is to stop binge watching. Watch one episode to relax after dinner, and then go do something else.


Seasons 7/8.


I often think about Die Hard With a Vengeance. The city of New York is a huge part of the movie's texture. The movie actively engages with all of the people on the street and in the cars, and who are watching from their office windows. Characters who could have very easily been left out of the movie, but their inclusion make it so much richer. You see and feel their days get ruined because of the carnage. It actually feels like New York City.

There are so many different side characters who only have two lines, but each of them was cast with care and feels like a real person living through their day and intersecting with John McClane. I love this movie for it.

https://www.youtube.com/watch?v=8RVVJmuoAQ4

On the other hand, I just watched Michael Bay's Ambulance, which is a prolonged police chase through downtown Los Angeles, and the only people who seem to be on the street are criminals and dozens of police cars. At no point are we in touch with anyone who isn't directly contributing to the story. The reality is that downtown LA is packed full of traffic and the story would have taken a very different (and interesting) angle if it reckoned with the fact that getting through the traffic is its own game.

Die Hard With a Vengeance, however, acknowledges this reality and comes up with some very creative scenarios for our protagonists to overcome, and it makes for a far more compelling movie as a result. It feels far more realistic, even in a movie as heightened as this.


This reminded me of the Netflix movie Bright (a modern-day Lord of the Rings kind of setting), which, while focusing on the main characters, still manages to incorporate the realness of NPCs going about their lives.


People used to make movies to make a point. Now people make data-driven drivel.


> LA is packed full of traffic and the story would have taken a very different (and interesting) angle if it reckoned with the fact that getting through the traffic is its own game.

You might like “To Live and Die in LA”


>Die Hard With a Vengeance, however, acknowledges this reality and comes up with some very creative scenarios for our protagonists to overcome

I hadn't thought about this before but you're correct and it even extends to the antagonist. As they are unravelling his plot, they come up with the question of "how do you drive that much gold through New York without anyone noticing?".

It's a very interesting perspective.


Reading this, and looking at the example photos in the article, it's the Nighthawks aesthetic (https://www.artic.edu/artworks/111628/nighthawks). It has a place in cinema and art, and always has. But when any aesthetic becomes dominant, people notice the lack of the other aesthetics they are used to experiencing.


I noticed the increasing "emptiness" as well, but my initial thought was that it's a COVID thing. Fewer people on set, less risk.


It's definitely a COVID thing, especially things like the lack of populated restaurant scenes, as pointed out in the article.


This is one of the things that made The Breakfast Club such a great example for low budget filmmakers. Very small number of people on camera, low number of locations, yet it does not feel weird within the context of the script.

There have been others that are great examples of it being odd. The TV show The Good Guys was shot on empty streets in a dead part of town because of low budgets, and it comes across that way. Was the crew so small that they couldn't send a couple of the grips or PAs to walk down the sidewalk? Another of the low-budget examples I like is in Queen of the South's first season, where they used the production's grip truck as the bad guy vehicle in an episode. Maybe it was the pilot? It was just one of those hang-your-head-in-shame moments for me.


Clerks and Cube are other movies where this happens.


This is quite apparent in the most recent Law & Order season.

Compare/contrast an episode from season 22 which aired in 2022 vs season 1 which aired in 1990.

There are scenes in the original that have like 15-20 people on screen doing highly complex choreography for a crime scene, with different actors playing uniformed cops, detectives, evidence specialists, witnesses, bystanders, just everyone you would expect to be in such a scenario in real life.

And that's just one scene. There are many more with unique locations, with remarkable shots on location outside and inside buildings -- all with many more people on screen.

Now they barely have 20 people in the final courtroom scene, when the original seasons used to have 50+ people just for the background, let alone the 12 jurors, 3 lawyers, judge, and defendant.


If it was released in 2022, it was probably filmed at the height of COVID. So that would explain fewer people on screen. Working around restrictions was exceedingly difficult. But if they saved money, and there were no big complaints, you can bet they will stick with it. Just like all the hotels that now don't clean your room during your stay unless you request it specifically every day.


Kind of ironic also that Law & Order was the place to get your start in acting in NY. There are so many famous actors who had an early role in the show.


> Some big Hollywood productions, like Passengers for example, have barely more than three actors visible in the whole film.

I agree with your general comment but in Passengers it was really central to the overall plot and premise. It's not a good example of this phenomenon.

I don't think this was driven by budgetary reasons; that film clearly had a huge budget, judging by the visuals.

As for money, Hollywood should be much less dependent on 'A-listers' and tap unknown talent. This will also make the 'A-listers' cheaper and stop this annoying thing where you see the same faces in every movie, often cast more for the name than for best fit to the role.


I disagree here about the premise; the premise is part of the problem. These films are hollow.


It's about 1 guy waking up on a spaceship with 80 years to go :) It's just really part of the story.

You might not like the premise but the film would lose its impact without it.

It's not a typical "because cheap" scenario where other characters are cut, here it's an integral part of the plot.


I've seen some things on Netflix that look like they're optimising for phone screen viewing.

I wonder if they simply think background details aren't important when viewing on a 6" screen.


Another movie that does this is Tenet. There is this massive action set piece at the end where people are literally fighting for the fate of the world. And then you have people running around, shooting at nothing.

Are massive set pieces with hundreds of extras just not economically feasible anymore?


I remember that. There were so few people in that scene, it felt uncanny.


I hadn’t thought of this at all but you’re totally right. Movies and shows have way fewer people on the screen just in the background.


That is a great observation, and something that has put me off a lot of movies too. It’s an uncanny valley type of thing.


The Matrix 4 felt so empty because of this. It felt like a small group of actors together on a cramped stage.


That film is so meta and self-critical that you could argue it was on purpose.


> Some movies just seem so empty, and they look "cheap" seen from this angle. Some big Hollywood productions, like Passengers for example, have barely more than three actors visible in the whole film.

Maybe that's true, but for Passengers, having no extras is an integral part of the movie's plot. Anything else would make zero sense.

And they pulled it off rather well.


> There are fewer people in the background as extras, and fewer people making background scenes or setting up.

This isn’t actually a new problem. I guess the new part is this:

> It can mostly be filled in by production technology if necessary.

I saw an interview with the director of the 5th (I think?) of the original Planet of the Apes movies, Battle for the Planet of the Apes. (It might have been a DVD extra, I don't recall.) It's supposed to be a war between the humans and the apes, but they didn't have the budget to hire enough actors, so they had to rely on particular camera angles and various props to make it less obvious how few people there actually were in the movie.

Later I saw a review of the movie Jumper which is also supposed to have a war between normal humans and humans that can teleport, basically. The review was talking about how there didn’t really seem to be that many people involved in this supposedly worldwide war.


I think that Stalker (the movie) [1] does not feature more than 3 people in a scene.

[1] https://en.wikipedia.org/wiki/Stalker_(1979_film)

Is it empty? Is it cheap?


This can be done, and has been done since early cinema, by making the film essentially a stage play. This definitely can make for some excellent films! It focuses the production on the story and, most of the time, on the evocation of out-of-scene activity.

But your point highlights the opposite: merely having fewer people in the scene to save money without considering what it actually takes to make it work. Even Blum gets this right; these bigger productions have no excuse except laziness.

And in my case it's led to me simply watching less video, because it's boring.


To be completely fair to Netflix, Amazon spent like $80-100M releasing season 1 of Wheel of Time, and those 8 episodes also look like lifeless garbage.


That can't be the result of using LED lighting, because modern LEDs are really good and don't require much correction in post, yet movies still look like that. In fact, I don't think you can pinpoint this to a single technical reason.

It's just that the industry converged on the same recipe book - some of that is cost cutting, some is just massive amounts of A/B testing throughout the years.

Demand leads to optimization, which in turn leads to convergence and less varied supply. Which is kind of paradoxical, considering the low barrier and creative freedom available with today's tools. And yes, it really started to happen somewhere in the early 2000s, not just with movies and their look but with everything related to visual entertainment, from games to illustrations.


It's nothing to do with LEDs.

All shots are conformed before they are graded. That means they shoot a Macbeth chart every time they change the lighting, allowing people to make all the shots look the same before they start to fiddle with the colours.


This is only partially true. The problem with LED lights is that they deviate quite a bit from a Planckian (blackbody) and/or daylight spectral curve. So you can end up with colors looking different even if the color chart looks right. And even the best Arri LED lights have SSI scores of only ~75. I wouldn't be surprised if this is part of the effect the author describes.


If that were the case then teal/orange grades would be "impossible", as most lights don't kick out that much teal (~510 nm).

Not only that, but you can micro adjust the colour of these lights to suit your needs, which means you can have much richer colours on set and not worry about colour cast.

If it was such a burden on the filmmaker, and limited artistic expression, then we'd see holdouts for tungsten lights (in filmmaking). When was the last time you heard someone in film say: "yeah, these filament lamps are so much greater than LEDs, so much more real"?

If you're on location then you can have >5 times the amount of light for the same power/heat budget. No one is going to sniff at that.

Anyway, the first thing you do when you get film/footage in through the door is conform it. You need to have everything at a base grade before you can add "artistic colour" to it, so subtle differences in colour are always obliterated.


Whenever I've used the local rental house, it's the massive ARRI tungstens that are being loaded in the production trucks.

Although I guess it's their size that makes them stand out; maybe I just don't notice the LEDs.


Yeah, another example is NBA shooting style, which converged to hard 3s plus layups for a time

https://www.thespax.com/nba/the-nbas-ongoing-three-point-rev...


It's a decision with the grading and it's a zeitgeist.

https://www.vox.com/culture/22840526/colors-movies-tv-gray-d...

There's another article I can't find on it.

It's got nothing to do with Netflix. Digital is not to blame either, but it's certainly a catalyst.

It mostly has to do with insecurity surrounding digital and the bizarre idea that saturation is just trying to 'emulate' film (when color timing film was always just about trying to emulate reality).

You can correct most of this problem with a decent media player. Just play a film in VLC and touch up the saturation and possibly the contrast in the effects settings.

It really doesn't take much adjustment to make a film that once had a depressing palette suddenly look half decent. I can't tolerate watching films through sunglasses, so I do this pretty often.

It really is a fashion. I've seen an advert for grading software that had a famous photographer casually showing the viewer how he ruins his pictures with this process of desaturation. I've no idea why people think this looks good, but it is deliberate.

I asked the cinematography reddit once why people do this and got a lot of incoherent responses about how saturation is just trying to emulate film and that it's just how 'new' things should look. Apparently many new filmmakers' experience of reality has been through tea-shades on an overcast day. It beggars belief.


Colors, yes - but the article discusses more than just grading; it's the entire visual language that is optimized and inevitably lacks individuality. I don't think it's just fashion; more likely it's the result of convergence in a mature industry under market pressure. The same trend is also clearly visible in everything visual in the last two decades, not just cinematography.


The number one thing that bothers me the most in the "Netflix aesthetic" is the weird-looking background bokeh that every scene seems to have.

It looks like a cheap Gaussian blur filter applied in post-production and adds to the fakeness. This is probably due to the optics in some of the latest Red digital cameras, but it's still not convincing. Kinda like 48 vs 24 FPS.


I will never accept the hate for 48 FPS. 24 FPS looks like garbage. Panning shots are a stuttery mess. 24 FPS looks somewhat okay when the camera is still and only the people in it are moving, but the ridiculous choppiness ruins any moment with camera movement.

A higher refresh rate is simply an objective improvement. I don't care that you associate it with bad soap operas.


What you're saying is equivalent to:

"Impressionism is objectively worse than realism. Look at a Van Gogh painting, there's hardly any detail. It looks coarse, and grainy and the brush strokes are so wide I can hard tell what's what. Starry Starry Night would be objectively such a better painting if he had just reach for a finer brush and included more detail. Adding more detail would have made it far more realistic, and thus objectively better"

You see, your premise is founded on the false pretense that realism is the goal. The cinema frame rate of 24fps (while not chosen to achieve this, but arrived at by happy accident) happens to sit at that perfect balance between too choppy and too real. It's just off from reality enough to trigger that "impressionistic response" in our brain, the same thing that happens when we look at a Monet painting. Our brains find impressionism appealing because it's specifically not what our eyes see in the real world. It's an interpretation. It's someone else's view of the world. More detail is simply not the goal.

24fps Cinema is impressionism.


It’s more like complaining about higher resolution reproductions of a Van Gogh, because you grew up with dial-up internet and could only view highly compressed JPEG reproductions of Van Gogh so “that’s just how Van Gogh is supposed to look, these high fidelity digital reproductions look like desktop wallpaper.”

No one is saying that filmmakers couldn’t use lower frame rates as a deliberate stylistic device, similar to how 12 FPS can be used in films now (or black and white, or out of focus images, etc.), or even that all films previously shot in 24 FPS would be better if the frame rate was doubled with no other stylistic changes.


The PlayStation 2 and its games were designed to look their best on Sony Trinitron TVs. When viewed on modern TVs, the games don't look like they used to, and they can seem kinda dull in color but jagged in detail (anti-aliasing seems to have been in part the responsibility of the CRT itself). If the console were redesigned for today's TVs, they'd look better.

I think the problem today is that the art of cinematography and technology evolved together up until ultra high-def digital came about, and now the art isn't keeping up with the advances in tech.


> anti-aliasing seems to have been in part the responsibility of the CRT itself

Yep, and it goes much further back than the PS2. The original example I saw was of Zelda II on the NES, but here's a whole bunch of examples in a whole bunch of games/systems: https://www.youtube.com/watch?v=ao-1uCCXBwc


The problem with higher frame rates is the same as higher resolution. It emphasizes even more the poor quality of the content. The low fidelity masks some of this, leaving more to your imagination (which is how impressionism works). If the content were of higher quality then it wouldn't be as much of an issue.


On the other hand, low frame rates put a ceiling on some measures of quality, just as low resolution does. A fast-paced fight scene or a panning shot over a field of grass in the wind is very limited by what 24fps can show.


Gemini Man was a poor movie, but it was really interesting how it tried to use and explore the opportunities offered by high frame rate. I'm 100% in the club of people who can barely look at 24fps pans on the big screen, and wish 48fps and higher became the norm and were explored more deeply, so filmmakers could find solutions for the "soap opera" look - most likely by figuring out a new look.


Throw in as much garbage as you like; the human brain is only wired to appreciate a certain degree of information before it all gets washed out as noise. This is why, imho, the last 20 years in cinema have been a slow skid toward mediocrity.

The race to shove more crap at the screen, to the appreciation of a younger and younger demographic, to chase tertiary markets like toys, collectables, games, etc., means that the films are almost unwatchable by older people. I guess they're making their money, so whatever, but it means that films feel more dead than ever before to me. Admittedly, this may be more biased by aging out of existing target demos anyway, so shrugs.


We need to go back to 480i to increase immersion.


What I find funny is looking at older TV shows that were clearly meant to be shown on old analog 4:3 CRTs but were shot on film anyway. Now these shows show up in syndication, and all that film got scanned as 16:9 HD. Some examples include Seinfeld, Friends, even The Wizard of Oz. Many times it feels like you are watching a play instead of an immersive show. All the props look like props instead of something "real".


I had similar moments when watching the remastered Star Trek TNG. Some of the sets - the bridge set for one - had been built so that they would just barely hold up when viewed on analog video / SD TV. With the film scans, all the messy shortcuts and crimes are easily visible. The set builders must have had some deep knowledge of what shows up on camera and what doesn't to cut the corners they did.


The lofi tv scene is waiting to explode.


[flagged]


They are addressing the point directly. Could you be more specific with your criticism?


I have no problems with non-realistic art forms. I think 12 FPS hand animation can look amazing, and the 12 FPS is part of the artistic expression. You're right, realism is absolutely not always the goal, and being unrealistic is sometimes the point.

But for the kind of movie that's supposed to take place in a world very much like ours? The kind where it's just a camera filming actors in a street or whatever, and there's no intentional impressionistic art style? An "impressionist" frame rate is totally inappropriate in such a context.

EDIT: I'd actually be very interested to know why people are downvoting this comment. I'm accepting that there can be artistic merit to low frame rates, but I'm rejecting the idea that impressionism should be forced on art which isn't trying to be impressionist. What's objectionable? (I see the comment is in the positive now, it was at -1 when I wrote the first edit)


It's like people loving combustion cars because you get high on petroleum fumes, and rejecting electric cars because they don't stink the same way and don't make as much noise.

It's a cult.


An "impressionist" frame rate is totally inappropriate in such a context.

Start with the shaky-cam effect, often added in post-production, which does what no human head/eyes do during fast movement.

(And often used to cover up poor quality fight scenes)

Second, using an incapable camera person, the camera jumping around like my grandma doing video on her phone. Steadicams, and gear, and booms, are a century old!

While I am open to looking at all modern issues, I'd prefer getting rid of purposeful garbage first.


> The kind where it's just a camera filming actors in a street or whatever,

Hopefully this image search [1] answers your question. Each of these photographs came from "simply taking a photo of a street, a common real-life thing". Yet, when you look at them, almost every single one is highly stylized, whether it's simply by being B&W (yes, that's not reality), having pronounced bokeh, strong silhouetting, or even vignetting, etc. They're all absolutely not what I'd call realism.

Realism is just boring. And artists don't find much appeal in producing it.

[1] https://duckduckgo.com/?q=street+photography&t=osx&iar=image...


That's also an opinion, and you are stretching the art comparison quite a bit. 24fps is a standard and a limitation, and photo/film is most often meant to capture some reality, not so much an interpretation of someone else's view in the Monet sense. Most often the cinematic art comes in on other dimensions, even if for a few it is those 24fps and other limitations.


It’s hard to argue that it’s an intentional artistic choice if the overwhelming majority of compositions use it. It would be like saying a Monet being viewed through oxygen/nitrogen atmosphere is an artistic choice.


And even more so when it was 100% driven by technological limitations. The original Hollywood filmmakers of the early 1900s didn't choose not to film at 1000 FPS. There was simply no feasible way at the time to get above 20-30 FPS. The current standard was never chosen.


Many paintings were also "never chosen" and were made in that medium because photography didn't exist back then; Henry VIII probably would have hired a professional photographer instead of a professional painter if it had been possible. Despite all that, painters still exist alongside photographers in modern times.


Not necessarily, because art does happen in eras. So maybe everyone was an impressionist for a stretch of years when it was the de rigueur way to paint, but in 100 years it moved on and we still appreciate them. Sure, 48 FPS may be the way of the future, but that doesn't mean the flavor of 24 FPS won't be appreciated in the future.


No it doesn’t. Lots of different artistic schools have existed at once all through history. Just because some style was invented doesn’t mean that everyone dropped what they were doing and became surrealists.

There are people painting what would be considered impressionist paintings now (literally now; I have a friend who frequently paints on Sundays, and I texted to ask before writing this).


My understanding is that once photography was invented, painters somewhat abandoned realism in favor of impressionism and other more experimental forms, since painting was no longer needed to fulfill the boring need of reproducing reality. Not to say that realism doesn't continue to this day, but honestly, most painters don't aspire to it even now.


I never said intentional. Up until the last decade, filmmakers had little choice but to distribute films in 24fps. I'm ignoring the few titles where they did indeed experiment with higher ones, like Oklahoma at 30fps and Brainstorm.


Yes, and I did indeed state that it was a "happy accident". 24 was chosen for pragmatic reasons but happened to, simultaneously, result in an amazing impressionistic effect. Both are possible.


> What you're saying is equivalent to: "Impressionism is objectively worse than realism

Type 'art' into Google: how many pictures are impressionism? 1 in 10?

Pick a movie at random, what are the chances it's made in 24 fps? 99.9%?

Why do you think TVs have motion smoothing? Because we, normal people, are sick of you impressionist-obsessed people controlling everything.


TV as a medium is different from the big screen. People watch sports, talk shows and live events on TV, rarely at the movies. And the convention is for those types of things to be viewed at higher frame rates. Films can be viewed at higher frame rates, but then they tend to have a TV-ified look. Which is ok, if you personally dislike the film look and prefer movies to look like daytime television. But for many people, film shot at a high frame rate, particularly in slower, less action-oriented scenes, looks too much like a broadcast experience and too little like a cinematic experience.

On a recent episode of The Filmcast, David Chen speculates that there's something of a generational divide to the preference [0]. He thinks that younger people, brought up on big screen TVs, 120 fps oled smartphones, and 240 fps gaming action, may just come to prefer high frame rate for everything, whereas older folk might be more nostalgically drawn to the traditional slow frame rate for movies.

[0] https://podcasts.apple.com/us/podcast/ep-700-avatar-the-way-...


>He thinks that younger people, brought up on big screen TVs, 120 fps oled smartphones, and 240 fps gaming action

This might be true farther in the future. But the vast majority of kids now were certainly not brought up on 240fps games and 120hz smartphones.

The iPhone 14 is still 60hz and the Galaxy s21 is the first Samsung phone to have 120hz and it was released in 2021.

And 240fps gaming monitors are a thing, but not really mainstream yet.


The only pictures of realism a Google image search for "art" returns in the top ten are the Mona Lisa, The Birth of Venus by Botticelli, and The Consequences of War by Rubens just outside the top 15.

Compare that to two Van Gogh paintings, a Hugo Simberg painting, an anime mural, a digitally rendered skeleton covered in flowers, abstract stock photos of a colorful eye, and two impressionist paintings.

So to answer your question, 3/10 are impressionist; 4 if you have a liberal definition of Impressionism and include the skeleton.


So all cinema and all producers are targeting precisely the same level of impressionism? None desire any more or less realism?


Well, a few experimented with slightly more realism. Todd-AO was a film format that ran at 30fps; 20th Century Fox shot Oklahoma! in that format. I've seen it projected at 30fps and it certainly has one foot in the "soap opera effect" door.

The movie Brainstorm also experimented with using the HFR as part of the story telling. But being in the film days, it was difficult to project/distribute.

Avatar 2 is the first modern equivalent to Brainstorm, where the filmmaker opted to "turn a knob" to digitally change the frame rate in a variable fashion throughout the film, depending on what the content needed. (Note: the final content appeared to be 48fps, but the apparent frame rate within that container varies smoothly from 24 to 48fps.)


I agree with your point on realism vs impressionism, but I still buy the explanation that we prefer 24 fps movies because we're conditioned to, not because of a preference for impressionism.

I also like my wine expensive and with nice labels.


I think I reject the "conditioned" argument. If I saw this [1] painting and compared it to a hyper-realistic painting (or photograph for that matter) I suspect the photograph would be boring in comparison. Certainly not world famous.

No one would say, "well, you're just conditioned to like the impressionistic form better". I think it's just objectively more interesting. An iPhone snapshot of the same scene is just boring. If anything, my point is proven by what people attempt in their photographs. They add vignetting, color treatment, bokeh, etc. All these things are forms of impressionism. No one wants a perfectly realistic photo.

[1] http://4.bp.blogspot.com/-YiHuMtP-658/Usi_fBZsLSI/AAAAAAAADA...


Cool, so people that want that effect on their movies can use it, it hardly sounds like something everyone would want every time though.

Or is impressionism the only valid art form?

Also -

> happens to be at that perfect balance between ...

Yeah na, it's just what you're used to.


> Cool, so people that want that effect on their movies can use it, it hardly sounds like something everyone would want every time though.

That's not what GP said. GP was arguing that 24fps is a genuine artistic choice, and that an argument that 48fps is intrinsically superior is absurd. GP was arguing that frame rate is a choice artists can use to tell their stories, that 24fps is Impressionism and 48fps is realism.

Also, cross examining (e.g. "so, you're saying...") is against HN policies. Misrepresenting what someone said is also not great.


> GP was arguing that frame rate is a choice artists can use to tell their stories, that 24fps is Impressionism and 48fps is realism

Okay, where is the 77 FPS movie for surrealism, the 12 FPS movie for abstractionism, the 174 FPS one for cubism?

The whole argument is fictitious


Well, that's just not how the physics works. It's a single dimension that slides from hyper-impressionism (at very low frame rates) to hyper-realism at high ones. And BTW, it's been shown to top out around 75-80fps; any higher and there's no additional perception of realism. Realism is just the single axis.


> GP was arguing that 24fps is a genuine artistic choice

Yes, and I'm saying that's a spurious argument, because if it were an artistic choice we'd see a lot more variation in said choice.

> cross examining (e.g. "so, you're saying...")

Where did I do that? I was making the point that impressionism is not the only art form. I know the GP wasn't arguing it should be, but again, that's where their analogy breaks down.


See my other sibling reply regarding the "conditioning" argument.


I'm afraid I don't really buy that. I know, having lived through the transition away from PAL/NTSC and (worse) VHS, that people will make arguments as to how the old tech softened things, made them look nicer, softer, etc. How such and such an existing standard is perfect, and does this, this and this to the brain. How HD is unnecessary and just shows off actors' bad skin, and who wants that?

Habituation is powerful, and even if this effect you're talking about does exist, well firstly it's not going to be appropriate for every subject matter. Secondly the technological limitations which led the industry to arrive at that standardised framerate, the chance of that landing on some sort of optimum spot... it's just not believable.


I watched the new Avatar movie with HFR. Some scenes were in 24fps, some in 48. Every time it switched to 24fps it became unpleasant for a few seconds, like playing a video game and getting a frame drop. Sometimes they switched like 5 times a minute. The effect going from 24fps to 48fps was very satisfying, though. I don't understand how people can think 48fps looks bad or "fake".


To me, 48 fps per se doesn't look fake -- it's the fake stuff in the shots that 48 fps reveals to be fake.

The first Hobbit movie set me against 48 fps for this reason. The weapons on the characters' backs bounced around as if they were made of foam -- because they presumably were made of foam. 24 fps hides this imperfection and lets your brain fill in with a more appropriate interpretation.

The same bugs me about rerenderings of 90s television at higher resolutions. The sets suddenly look fake -- because they are -- but in standard definition you simply can't tell.

Any improvement in media fidelity must be accompanied by a complementary improvement in set/makeup/prop design to avoid this problem. Problem is, at a certain point -- this extra work simply surpasses what is relevant to tell a good story, and is left undone -- and high fidelity reproductions expose this shortcoming.


That could actually make sense. I remember I was never a fan of high frame rates, also due to the Hobbit.

But I found myself filming (as in home videos) mostly in 4K/60fps now, because it just looks more realistic.


I suspect that sitcoms, YouTube stuff, home videos, etc. work better as 60 fps because the content in them is real. There aren't props, fantastical settings, or other elements requiring suspension of disbelief.

I wonder if likewise, 48 fps would work well for drama/comedy movies, where the sets and settings are realistic, vs. action/fantasy films it tends to be advertised on. I haven't knowingly seen such a movie.


I was thinking of asking in the comments if a single film could be mixed, so thanks for mentioning this. Though I will say I didn't notice it in Avatar!


They also showed the film without HFR (completely 24fps); maybe you watched that.


It was projected in constant 48fps. They doubled the 24fps frames.


A higher refresh rate is objectively an improvement in the sense that there are definitely shots that would look like a stuttery mess in 24fps which they can pull off in 48fps. I mean, they can pan twice as fast, clearly!

But 24fps restricts the director to certain types of shots, and I think the statement of preference for 24fps is mostly a result of that. Scenes where the camera is mostly still are the ones that focus on the actors and their acting. This is well known enough, I think, that it is basically an implicit statement of preference toward the acting and shot-framing aspects, and away from the technology/action/sfx aspects of film-making.

The former focus is generally seen as more prestigious and artistic so it isn’t surprising that there’s a subset of the population that prefers it. It also isn’t that surprising that that population is over-represented in the movie-discussion space. However it seems that the latter group buys more tickets!


How a low frame rate looks depends on the display technology. It looks good when the frame or pixels are displayed for a very short amount of time (like on a CRT or a film projector) or when the transition between frames is gradual (like on TFT and IPS panels). It looks not so great on an OLED screen, where the pixels are continuously lit up and the transition is near instantaneous.


You'd hope that cinemas use a display technology which is optimized for looking good at low refresh rates, _especially_ those big expensive IMAX cinemas. Yet big panning shots at 24 FPS look like a choppy mess there too.


Newer OLEDs can be configured to do Black Frame Insertion (BFI), sold under various confusing marketing names.


Only problem is it massively reduces perceived brightness—an area where OLED already lags behind LCD.


As someone who very much likes how 24fps looks, it’s still worth noting that 24 was only chosen to balance film usage with smoothness.

I’m willing to buy into the idea that lower framerates cause our brains to “fill in” details and thereby improve immersion by some impossible-to-measure metric, but let’s not pretend that’s why 24 was chosen. It was chosen because 30 or 48 or 60 would cost more to shoot.

Display tech is also a big factor. It took me a little while to adjust when I first got an OLED TV. The lower response times of LCD give it a sort of natural interpolation between frames, but OLEDs are fast as hell and you can absolutely see the difference.


Considering the heat generated by lighting, the relatively low sensitivity of film stocks, and the relatively small-aperture, uncoated lenses available at the time 24 fps was standardized (1926), I suspect exposure requirements may have also played a role in preferring one of the lowest feasible frame rates.

It may or may not have been a coincidence that 24 divides 120, potentially simplifying the design of electrical devices (lighting, motors) powered by 60 Hz AC; I don't know enough about the technologies involved to say.


I also believe 24 has more "mental penetration" (excuse the wankiest term to ever be ill-conceived) by being in between a series of still images and something fluid. Maybe the same reason we were so absorbed by lower resolution games.


The benefit for 48fps is mainly for smoother special effects. People are so used to 24fps now that the stuttering you're talking about is part of the collection of characteristics that people call "cinematic".

It's not a video game, you're not in control, so the standards are different.

In a high-CGI film, like Avatar 2 for instance, where the entire film is mainly CGI, shooting in 48fps has an advantage and looks much better. You simply have more clear information on the screen per second that way, which helps for 3D and for CGI detail.

The only issue really is that camera movement needs some experimentation to get right. Some movements in 48fps are too quick or too smooth and this makes them jarring to general audiences who are used to 24fps. Actors also move a certain way for 24fps (they slow down certain motions for instance), so they'll need to fiddle with that as well in 48fps.


I'm currently watching through House of Cards. Not exactly a high CGI series full of special effects which need high FPS to look smooth. Yet the incredibly low FPS is jarring at times. Turns out people like to put big sweeping camera movements even in non-CGI movies.


On the contrary, I think HFR can make improvements (or unlock new possibilities) for very routine shots, like panning and zooming.


> Panning shots are a stuttery mess.

It's worse in some movies than others. I don't know all the photography lingo, but from what I understand it looks worse when the shutter speed is high, giving each frame a very short exposure and therefore little motion blur in each frame. With longer shutter times, each frame has more motion blur and therefore a 24fps pan doesn't look nearly as bad.


> I don't know all the photography lingo,

It's called "shutter angle"; see e.g. [https://www.red.com/red-101/shutter-angle-tutorial]. The angle, as a fraction of a full circle, is the fraction of each frame's duration during which the frame is exposed to light. Larger shutter angles give more motion blur, and smaller angles give a more stroboscopic effect.

When shooting on real film, an obvious complication is that the set lighting has to be well-balanced for the chosen shutter angle. A dim scene can't easily be shot with a small shutter angle, and bright daylight would overexpose at a large angle.

Special effects, including postprocessing trickery, are almost the opposite. The default state of a render is to be perfectly sharp and to exist outside the flow of real time, and both focal and motion blur are deliberate additions.

Digital sensors are in theory compatible with arbitrary "shutter angle" equivalents, but your phone's video camera is more likely to set the shutter speed as necessary for auto-exposure.
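
To make the relationship concrete, here's a rough sketch in Python of the textbook shutter-angle arithmetic (illustrative only, not tied to any particular camera):

    def exposure_time(fps, shutter_angle_deg):
        # Rotary-shutter model: the shutter is open for (angle / 360)
        # of each frame period, so the per-frame exposure is
        # (angle / 360) / fps seconds.
        return (shutter_angle_deg / 360.0) / fps

    # The classic "cinematic" setting: 24 fps with a 180-degree shutter
    # gives a 1/48 s exposure and noticeable motion blur on pans.
    print(exposure_time(24, 180))  # 0.0208... ~ 1/48 s

    # A small shutter angle at the same frame rate gives a much shorter
    # exposure and the choppier, more stroboscopic look described above.
    print(exposure_time(24, 45))   # 0.0052... ~ 1/192 s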


At the end of the day I can’t really fault someone for having a genuine aesthetic preference one way or the other. And yet it’s difficult for me to understand complaints about HFR as a genuine aesthetic preference distinct from the complaints that likely existed about talkies, color film, HD video, etc.


Same. I'll say, with no disrespect for those who prefer it, that when I was a kid and super into videogames, 60 fps seemed cool because it was new. But it always feels slightly hyperreal, kinda in that uncanny valley.

24 fps feels best to me. Better than 30, 15, 16, 12, 6, 4. Better than 60+. Even slightly better than 25.

Those numbers may seem odd and random, but doing a lot of animation using various framerates, 24 matches up with reality more closely than other framerates to me. Might be subjective, who knows. But when I'm noticing persistence of vision, for running, cycling, driving, I like 24 fps.

Pans & scans & tracking shots & dolly moves & steadicams, and how they convey motion via the aperture & ISO speeds & f-stops, will determine if the images will have more blur, or be sharper.

In Saving Private Ryan, Spielberg made the settings super sharp so each image has no blur and is in great detail, to give the sense of, like, kinda being on edge, with your focus super sharp on your surroundings. I think Chris Doyle in Chungking Express does some crazy open-aperture stuff with super low frame rates, creating a kinda trippy, surreal staccato blur, sorta kinda on the opposite end of the spectrum of what Spielberg did.

I dunno. I'd only go for other framerates if projects demand it. But 24 feels great to my eyes because it seems to feel the most natural in persistence of vision in motion.


I agree that higher is objectively better and more realistic. I think in a few years we will start seeing media catch up with this.

The current trend is higher refresh screens. All new phones support 120Hz, and next generation MicroLED TVs will support at least 240Hz. MicroLED TV and phone screens have practically no refresh rate limit (nanosecond response times[1]), so they can easily be made to update at the limit of human perception (1000Hz or more[2]). The only bottleneck is that we are waiting for better HDMI or DisplayPort bandwidth.

Another trend we are seeing is vastly improving AI upscaling (and interpolation). We are not necessarily far from the point where the filmmaker can no longer make the choice for you, and if you want to watch a Nolan classic at hundreds of FPS, you can simply run an AI frame interpolator to do so with very few visual artifacts. See NVIDIA DLSS for a current real-time example of both AI upscaling and interpolation for video games[3].

Related to the AI improvements are improvements in compression. This is not really something that is seen yet but the level of semantic understanding that AI algorithms have should let them far surpass traditional "dumb" (manually human-programmed) compression algorithms. This will make you realistically able to store many 16k 12-16bit 1000FPS+ HDR films, and can also be used for increased transfer bandwidth efficiency.

Lastly is the impact of VR technology. Immersive viewing experiences, like watching the Avatar movies in 3D with VR goggles, are vastly more immersive at higher frame rates. As VR tech becomes better and more widespread this is another incentive to provide high frame rate media to the public.

[1] https://www.pcgamer.com/samsungs-new-microled-tvs-are-five-m...

[2] https://blurbusters.com/blur-busters-law-amazing-journey-to-...

[3] https://www.nvidia.com/en-us/geforce/news/dlss3-ai-powered-n...


In the old PAL/NTSC days we already had 50/60 fields per second in terms of refresh rate. Those were only half frames, alternating between displaying only the even and only the odd line numbers (interlacing). This was actually very clever: movement looked fluid but at lower resolution, and that disadvantage was not very noticeable precisely because of the fast movement, which is where the field rate matters. Still pictures or slow movement provided more detail (25/30 full-resolution pictures per second). There was a rather long analog/digital transition era where I liked analog better.
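
For anyone who hasn't dealt with interlaced video, a minimal sketch of the field/frame relationship (illustrative only; in a real broadcast the two fields are also sampled at different instants in time, which is exactly what made motion look fluid):

    import numpy as np

    def split_fields(frame):
        # Even rows form one field, odd rows the other. Each field has
        # half the vertical resolution and, in a real broadcast, would
        # be captured and displayed at twice the full-frame rate.
        return frame[0::2], frame[1::2]

    def weave(even_field, odd_field):
        # "Weave" deinterlacing: interleave the two fields back into a
        # progressive frame (only artifact-free when nothing moved
        # between the two fields).
        height = even_field.shape[0] + odd_field.shape[0]
        frame = np.empty((height,) + even_field.shape[1:], dtype=even_field.dtype)
        frame[0::2] = even_field
        frame[1::2] = odd_field
        return frame

    # Toy example: a 6x4 "frame" whose pixels are just their index.
    frame = np.arange(6 * 4).reshape(6, 4)
    even, odd = split_fields(frame)
    assert np.array_equal(weave(even, odd), frame)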


> 24 FPS looks like garbage. Panning shots are a stuttery mess.

For what it's worth, I think 24 fps looks a lot better on my DLP projector—which has basically zero motion smearing—than on a standard LCD. (This is also assuming I've set my projector's refresh rate to a matching 24 hz.)

But I do agree higher frame rates are better.


Even 48 is barely enough for panning shots to not cause headaches and nausea, I want at least 60. It’s only at 120 that diminishing returns really begin.


Headaches and nausea?

Panning shots typically only last a few seconds.


That’s all it takes, at least for me. Instant (minor) headache at low frame rate panning. Same in games running under 30 fps. I get migraines and nausea from longer exposure.


I'm somewhere in the middle. I'd like to see movies use HFR like some movies use the IMAX aspect ratio.

Imagine Spider-Man: Far From Home in 24 fps, except it transitions to 48 fps in the Mysterio illusion scenes. Granted, this would spoil some moments in the movie, so it's not the perfect example.

I wish Avatar: TWoW was entirely 48 fps, OR it handled transitions better, like keeping it consistent across entire scenes.


> 24 FPS looks like garbage. Panning shots are a stuttery mess.

This depends on the TV, of course, but I prefer 24 FPS over the soap opera effect.


That's the thing I'm talking about though. I don't care that you associate better motion with bad soap operas.


I don't associate it with bad anything. I just don't like the way it looks.


I agree wildly and I'm seriously shocked that the guy above is arguing in favor of it. I can't stand the weird motion smoothing on modern TVs and would not buy any TV where it can't be turned off.


I'm not arguing in favour of frame interpolation at all. Footage that was shot at 24 FPS should be displayed at 24 FPS. But footage should be shot at a higher FPS.


I could probably get behind that, I'm actually somewhat curious now if it's the higher framerate or only the algorithmic "motion smoothing" that causes the famous soap opera effect.


I get it, maybe, if the source material was filmed at that rate. Interpolation of 24 FPS content to higher rates just looks unnatural.


Higher frame rates and resolution is great for real things like sports and news. For fiction I much prefer lower, it allows for a greater suspension of disbelief. When I watch less than perfectly shot movies (almost all of them) in high res/fps I'm aware that they are actors and of their less than perfect acting.


Motion perception is also affected by exposure time, not just framerate. Adding motion blur in post or intentionally increasing shutter duration to induce blur is still somewhat of a work in progress, so maybe you’ll eventually get what you want.


A higher framerate increases the chance of experiencing "uncanny valley", because it starts to look too real, but clearly isn't. For example, you see more subtlety in actors' motions (the movements that prove "they are a real living human" to your lifelong-trained eye), and it also makes the animatronic or computer-generated characters/elements that much more distracting (they are much more obvious when compared to the observable organic properties of living actors).

The more granularity you are provided, the more accurately you can discern the natural or unnatural properties of the things you're looking at. The movement of a lifeless prop goes from "oh wow that was creepy" to "oh that is a piece of plastic with some paint on it". Our ability to discern what things are is pretty strong.

When everything is rendered down to (or shot at) 24fps, the film gains a slightly more dreamlike or fantastical feel. This supports the ability to suspend disbelief, because your imagination fills in the rest, and the aforementioned granularity is not there to betray how much fake stuff there is on set.

Increasing the framerate is not an objective improvement, because the factors I mention are factual, and will not be considered an improvement for all types of films. The more the film experience matches reality, the more glaring any deviation from that becomes, thus breaking the suspension of disbelief.

Put more frankly, when I watch a film, I do not want to see how the sausage is made. I want to be "tricked" into believing what I'm seeing is plausible, and high frame-rate (alongside super-high resolution) is a factor that makes that less likely to succeed.

Like, high frame rate is fine if I'm watching, say, a documentary. It will look utterly stupid and uncanny-valley-triggering when I watch some sci-fi space adventure with tons of computer-generated elements. It was horrible watching The Hobbit at 48fps, and made it WAY harder to see it as a fantasy world. It looked more like "these are actors walking around on a set", and I couldn't shake that perception for the entire film. Again, this is demonstrably not an "objective improvement", it basically sabotaged the experience for me.


> The movement of a lifeless prop goes from "oh wow that was creepy" to "oh that is a piece of plastic with some paint on it". Our ability to discern what things are is pretty strong.

You can really start to see this when watching older movies (say, 30 or 50 years old) in HD. These movies were never intended to be viewed on a 16:9 1080p or 4k screen. Good examples are The Wizard of Oz (1939) or White Christmas (1954) -- the whole movie just feels like a set. You can clearly see the makeup they are wearing.

Even older shows like Seinfeld or Friends can have this kind of feel. These were always intended to be shown in 4:3 at 480i on some crappy CRT. Not 16:9 at 1080p...


Oh yeah, I have a family member who watches these old TV series, and the increased fidelity of TV broadcasts today makes so many fake props blatantly obvious. For example, at about 24 seconds into the intro to "Green Acres", you can see quite clearly that the background is a painted backdrop: https://www.youtube.com/watch?v=DrbPAt1_vc4 ... If you watch an actual episode of the series they use these painted backdrops with quite some frequency.


Yeah or Hitchcock's Rear Window. Even though they built actual buildings apparently, the backdrop of the sky is clearly a painting. These movies were not meant for modern resolutions.

Still, I don't think it really takes away from the movie. But with others of the same era that weren't such masterpieces it's more annoying.


It sure would have been nice if people who disagree with what I wrote actually say something, rather than just downvoting my thought-out good-faith post and burying it out of sight. I have thought a lot about the subject of framerate in cinema and came to the above conclusions/thoughts, and if someone disagreed with that I'd be interested to hear about it (as long as it is similarly thought-out and not some kneejerk response), rather than just having my contributions to the conversation buried.


Before I ever saw it, I thought that too.

But damn HFR really looks terrible. Maybe 30fps or something might be optimal.

The Hobbit and Gemini Man looked awful to me.


24 FPS looks like garbage. Panning shots are a stuttery mess.

100% that's your TV, or media player making the mess, not 24fps.


My TV or media player is making a mess when I'm at the cinema?


It's probably your theater then, many converted to digital instead of film.


Set the refresh rate of your display to 24 Hz; that should completely eliminate the stutter you see with 24 FPS.

https://www.youtube.com/watch?v=5SSU-s0AUH0

This is an example of judder; matching the refresh rate to the frame rate gets rid of it.
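
A rough sketch of the mismatch being removed, assuming a 60 Hz panel (the classic 3:2 pulldown case):

    # 24 fps on a 60 Hz display: each film frame is held for alternately
    # 3 and 2 refreshes, so motion advances in an uneven 50 ms / 33 ms rhythm
    # instead of a steady 41.7 ms -- that's the judder.
    def pulldown_schedule(num_frames):
        return [3 if i % 2 == 0 else 2 for i in range(num_frames)]

    print(pulldown_schedule(6))  # [3, 2, 3, 2, 3, 2] -> 15 refreshes for 6 frames (0.25 s at 60 Hz)

At a 24 Hz refresh rate every frame gets exactly one refresh, so the cadence is even again.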


Sure, any kind of frame rate conversion makes it worse, but lots of 24fps footage has visible judder. If the camera pans fast, the frame rate is simply too low to show the motion fluidly.

It’s part of the 24fps look, and filmmakers need to be aware of it and avoid it as best they can. But judder is definitely a thing at 24fps.

The shutter angle (how much of the 1/24th of a second the film/sensor is exposed for) also makes a difference. The larger it is the more motion blur you get, but it looks smoother. The lower it is the less blur in each frame, but the more obvious the judder as they’re played back.
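
To put numbers on the shutter-angle arithmetic (a quick sketch, not production code):

    # Exposure time per frame from shutter angle and frame rate.
    def exposure_time(shutter_angle_deg, fps):
        return (shutter_angle_deg / 360.0) / fps

    print(exposure_time(180, 24))  # classic 180° at 24 fps -> ~1/48 s of blur per frame
    print(exposure_time(90, 24))   # 90° -> 1/96 s: crisper frames, more visible judder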


I agree. I truly do not understand how people accepted first sound and then color in movies but now they are completely balking at frame rates higher than the completely arbitrary and actually-too-slow 24fps. 48fps makes motion so much smoother.


agree agree agree


High refresh rate makes people feel sick. It's "too real", such that the brain can't synchronize what it's seeing with what it's feeling motion-wise. Ultimately a 2D plane is not optimal for a realistic experience. By its very nature, film is impressionist.


Do you have a citation for that? Lots of people play video games on big screens for hours at a time and don’t get motion sickness.


There were lots of reports of this with The Hobbit, e.g. https://theweek.com/articles/469863/why-isthe-hobbit-making-....

Being on a big screen probably makes it much worse.

Also I’d conjecture video games are different. It’s like driving a car. Since you are usually directly controlling the camera, I think this helps your brain avoid nausea. Although there are plenty of people who can’t play FPS games either, because it makes them feel sick.


Those reports were about the imax version though, and I have personally seen that happen in imax theaters showing 24fps content. That’s field of view, not framerate.


People who play games typically don't feel sick during cutscenes, even long ones, even when they're running at 60/120/144 FPS.


Sorry, but my personal experience with motion sickness and games says it's triggered by too-low framerates. Early 2000s laptops were particularly unbearable.


I'm not sure what you think "objective" means, but typically it means independent of experience. When you say "objective improvement," in what sense, independent of experience, is higher frame rate better? Obviously there are 2x the visual data because there are 2x the frames, but that's borderline tautological.


I'm confused.

You don't think having 2x the visual data is better independent of experience?

> Obviously there are 2x the visual data because there are 2x the frames, but that's borderline tautological.

Isn't any straightforward application of a dictionary definition borderline tautological?


> You don't think having 2x the visual data is better independent of experience?

What does "better independent of experience" mean? The experience is the whole point, and its quality is not linearly proportional to the amount of data on screen.


That's like saying white images are better than black ones, because white is encoded as 0xff, 0xff, 0xff, which is numerically greater than 0x00, 0x00, 0x00. Quantifiably better! We're taking a totally subjective preference and saying one is objectively better, based entirely on some arbitrary measure of the "amount of visual data". I personally think "greater visual data" looks worse than "less visual data." This is a totally subjective preference! Is this preference objectively wrong?


I'd say it's like saying white images are whiter than black ones. Whether or not you want a white image is subjective. How white a white image is, is objective.


Better for whom or for what purpose? Remember, we are using value-laden terms here.

The parent essentially said higher frame rate is better because higher frame rate is better.

There are many reasons for which higher frame rate is worse.

2x frame rate requires twice the file size. It makes CGI, practical effects, costuming, and lighting more difficult. Most importantly, the audience doesn't like it.


Higher frame rate makes CGI special effects harder to make convincing. E.g. the Hobbit.


It is clearly the case that twice as many frames is an objective, quantifiable, technical improvement. That’s objective. The problem isn’t that the original poster misused objective, it is that they didn’t ask the follow up question — does that objective improvement result in a better subjective experience.


See how you qualified your use of "improvement" with "technical?" The parent post did not, which is why I commented.


A technical improvement is a type of improvement.


The purpose of a movie is to give the illusion of motion. A higher refresh rate creates a better illusion of motion than a lower one.


> The purpose of a movie is to give the illusion of motion.

Argument?


This trend started when the first video-capable full-frame DSLRs appeared on the scene about a decade ago, and they led to a mixing of the visual language of photography and filmmaking. That was combined with the fact that camera lens developers are working hard to remove flaws from their products, making their picture more and more perfect, but also removing any character a lens can have.

The classic Super35 frame size is more akin to APS-C in digital photography, whereas the new top tier cameras use full frame sensors, which are roughly 1.5x larger in linear dimension (the crop factor).

The larger the sensor, the more bokeh you get, with a shallower depth of field.

Modern tech for focus pulling allows for extra shallow depth of field even if the actors and the camera are moving. Something that would have been mostly out of focus footage 30 years ago can be easily shot with a small crew.

Large-aperture photography lenses and large sensors also mean much better low-light capabilities. Which means you can shoot with fewer studio lights, in a much lighter setup with less crew, less planning, etc.

Blurry backgrounds also allow for cheaper sets.

Which means the cheaper the production budget is, the more likely they'll go with shallow depth-of-field shots.

And we the viewers will inevitably connect the cheapness of production with these visual cues.

In contrast, when an older movie had shallow depth of field scenes, those were expensive shots that had to be carefully planned, and painstakingly executed with "flawed" optics which gave them a lot of character.
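
To make the sensor/aperture/depth-of-field relationship above concrete, here's a hedged sketch of the usual thin-lens approximation (the circle-of-confusion value and example numbers are just illustrative assumptions):

    # Depth of field via the hyperfocal-distance approximation.
    # The circle of confusion scales roughly with sensor size, which is why a
    # larger sensor at the same framing and f-stop yields a shallower DoF.
    def depth_of_field(focal_mm, f_number, subject_dist_mm, coc_mm):
        hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
        near = hyperfocal * subject_dist_mm / (hyperfocal + subject_dist_mm - focal_mm)
        far = hyperfocal * subject_dist_mm / (hyperfocal - subject_dist_mm + focal_mm)
        return far - near

    # 50 mm at f/1.4, subject at 2 m, full-frame CoC ~0.03 mm -> roughly 13 cm in focus.
    print(depth_of_field(50, 1.4, 2000, 0.03))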


> The number one thing that bothers me the most in the "netflix aesthetic" is the weird looking background bokeh that every scene seems to have.

I complained about this with a group of friends the other day and it seemed I was the only one that even noticed it. I feel like I'm taking crazy pills here.

Sure, when you focus on something the background is a bit blurry. But on some modern movies, EVERY shot has a bokeh background. Like, imagine sitting down for dinner, you focus on your sandwich and suddenly the plate and table are blurred and out of focus. It's exaggerated to the point where it seems more unrealistic than if they hadn't done any post-prod.


One reason this change happened is the move from super35 to full frame sensors combined with moving to ever faster lenses. f/4 on super35 used to be common, now it's all f/1.4 on full frame, increasing the bokeh massively.
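
A quick way to compare those two setups is the "equivalent aperture" rule of thumb (f-number times crop factor, same framing assumed; the ~1.5x crop factor for Super35 vs full frame is approximate):

    def equivalent_f_number(f_number, crop_factor):
        # Depth of field on the smaller format behaves like this f-number on full frame.
        return f_number * crop_factor

    print(equivalent_f_number(4.0, 1.5))  # f/4 on Super35 ~ an f/6 full-frame look
    print(equivalent_f_number(1.4, 1.0))  # f/1.4 on full frame stays f/1.4 -- far shallower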


I noticed that too. And I guess this is a way to offset the high resolution that shows every minute detail. It must be cheaper and easier this way.


This background blur becomes a necessity if the filmmakers are using the "LED wall virtual set" technology. Otherwise, you could see the individual LEDs if the background was in sharp focus.

E.G.: "Why 'The Mandalorian' Uses Virtual Sets"

https://www.youtube.com/watch?v=Ufp8weYYDE8


This is why the scene in Mando that’s clearly just some random hill outside LA still looks so much better than anything else in the show. Andor has some great scenes that are shot on location in Scotland.


IIRC the Netflix blur on the top/bottom of the frame is actually from using 1970s anamorphic lenses. It's super distracting to me because sometimes it blurs the foreground too


It's called falloff, I think.

And yeah, it's the effect of some anamorphic lenses.


I think one of the reasons everybody keeps producing content with massive background bokeh is digital video delivery[1]: having a blurry background and a relatively still subject in focus requires relatively little video bandwidth, so the video can still look good even with a kind of crummy 2-4 Mbps encoding.

1: edit, I guess it's mostly about having to stream it over the internet


As the other comments say, that's probably just the choice of lenses. The effect that's much worse is the vignette effect added for reasons I can't fathom.

I took some screengrabs of The Serpent Queen on Starz, here: https://imgur.com/a/pAfcZg1.

It's actually an excellent show, with good production values and plenty of good cinematography, but the overuse of digital vignetting drives down my enjoyment. Sometimes it's used to darken an overexposed sky (an ancient trick; Kubrick's opening scene in Barry Lyndon famously uses it, and it looked bad then, too), but it's evidently also an attempt to add depth and to highlight the characters in the center of the screen in post rather than during filming; it's just lazy filmmaking, and I see it constantly.

I don't like the use of anamorphic lenses since I find the corner distortions distracting, and I generally dislike the use of super widescreen aspect ratio that a lot of filmmakers use (4:3 is often a much more natural format, especially for less epic dramas), but at least this is an artifact of the technology used and one that goes back half a century or more.


Yeah, maybe since everybody has to stand out, they came up with that color grading setting.

But what do I know. This article opened my mind to compare and contrast between movies. I was not one to critique the video; I just watched for the story.


some thumbnails of television commercials exhibiting this slightly out of focus background https://i.jollo.org/jiWn4jTj.png


One aspect of this I’ve talked about before is the phenomenon of the “permanent golden hour”: what used to be a narrow sliver of time when you could get the most dramatic lighting is now easily replicated in post. So as a result tons of movies have all their outdoor shots set in this incredibly unnatural-looking light that really pulls me out of the movie. I remember the banker-in-Switzerland scenes in The Wolf of Wall Street, with the open window, and it just looked so hilariously fake I thought Scorsese was doing a bit. Nope, that’s just how outside scenes/shots are lit now.


Ironically this kind of thing is something that's been done for ages, although in a different context -- old films would shoot "nighttime" scenes using underexposure (see https://en.wikipedia.org/wiki/Day_for_night ), so a lot of shots esp in 50s/60s films (I recall them most clearly in westerns) supposed to take place "at night" have instead a distinct "darkened sky" look, which really doesn't resemble nighttime at all.


The Wolf of Wall Street is a movie that would surprise most people by how much special effects it has:

https://www.youtube.com/watch?v=pocfRVAH9yU


You know, what gets me watching that - 50 years ago being an actor meant a life of traveling to exotic locations for shoots. Today, it apparently means spending just a lot of time in a studio with a green screen. Whatever the reason or benefits or whatever, man, I feel like that’d be incredibly disappointing as an actor.


Some of these just seem completely pointless lol


This is also amplified by the new HDR formats (HDR10/HDR10+/Dolby Vision, etc.). Having everything perfectly exposed in a frame looks totally unnatural; real scenes don't look like that.


I disagree, my experience as a photographer has been that almost everything is “perfectly exposed” in my visual experience, but committed to a photo it is suddenly a mess. This isn’t really because the camera is bad; our brain creates a visual experience with huge dynamic range without us realizing it.

A simple example is backlit subjects on a cloudy day. We aren’t even aware the subject is backlit, but a light meter will reveal multiple stops of difference.

The result is a “real” photo frequently doesn’t actually look like your experience of the scene, and the photographer needs to do the work your brain is normally doing for you. Poorly done, it does look fake. This is where photography (or cinematography) becomes art.
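
To put a number on “multiple stops of difference”, a tiny sketch (the meter readings are made-up values):

    import math

    # A "stop" is a doubling of light, so the difference is just a log2 ratio.
    def stops_between(bright_reading, dark_reading):
        return math.log2(bright_reading / dark_reading)

    print(stops_between(4000, 250))  # ~4 stops between the sky and a backlit face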


It’s such an odd choice too. HDR shows nicely on HDR displays or when combined with a well-chosen exposure level (such that the relevant parts of the shot are visible) on an SDR display.

Instead, it sometimes seems like the dynamic range is compressed to just fit mostly everything on an SDR display? Do directors not pay attention to dynamic range anymore and then just “fix” it this way in post? Surely the old-timers at least would still know how to do it.
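
For illustration, the kind of global curve that "squashes" a high-dynamic-range image down to SDR looks something like this (a simple Reinhard operator; real grading pipelines are far more involved):

    # Map scene luminance (relative, 1.0 = mid) into a 0..1 display range.
    def reinhard(luminance):
        return luminance / (1.0 + luminance)

    for L in (0.05, 0.5, 2.0, 10.0):
        print(L, "->", round(reinhard(L), 3))  # highlights get compressed far more than shadows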


A lot of recent (high-budget!) shows are very incompetently graded and mastered.

The recent game of thrones spinoff had a lot of scenes with terrible grading, where they'd clearly done something wrong in the grading pipeline - maybe accidentally squashed the entire HLG range down into SDR before exporting to PQ.

The grading on Wednesday was quite good. They had a number of technically challenging lighting scenarios that they pulled off quite well, and the output was properly mastered PQ.

It's possible studios don't actually care about this because, from what I can tell, most viewers don't even notice stuff like this.


Pirate encodes have long been fixing terrible banding and other artefacts on Blu-ray releases; it seems now they'll have to start messing with gamma math too. Wonderful.


As Plinkett would say: “you might not notice it… but your brain did”.


Possibly so! However, often people will not only deny being able to see e.g. compression artifacts, but also react negatively to the idea that other people can! Frequently they will draw comparisons to e.g. people who claim to be able to hear the difference with gold-plated audio cables. Beyond ambivalence to addressing quality issues, there is perhaps some hostility towards it.


> real scenes don't look like that.

Movies never looked like real life; they had a butt-tonne of artificial lighting. Even up to the 80s, outdoor scenes routinely had artificial lighting to get better exposure.


There's also Midsommar as a recent example of a very daytime movie with lots of outdoor artificial lighting.


But this isn’t true at all. Look at a movie like The Conversation. Or Mon Oncle. Or Pulp Fiction. The lighting when intended to look natural very much does, although the latter also makes intentional use of stylized lighting (such as the taxi ride scene).


HDR just looks so weird and unnatural. Used wisely, it can produce great results; used everywhere, it is just boring.


Yes! The entire new Matrix movie looks like that as well. Granted, those scenes were taking place inside the Matrix, but that choice did not really work for me. Seems as if the director wanted to move as far away as possible from the old movies, in terms of what the simulation looks like.


It may just be confirmation bias, but I've been noticing this with some outdoor scenes shot in front of LED walls. They have almost too-perfect, beautifully colorful lighting conditions.

Examples that come to mind right now:

1. The scene in The Batman where Bruce and Selina have a moment in the under-construction tower at golden hour. Maybe some other scenes in that movie too.

2. The entire opening sequence in Thor: Love and Thunder.


I've said it and I'll say it again: most or some of the reasons why it looks weird to us (people who grew up with a TV) come down to camera quality.

Most TV camera setups were built around lighting and the amount of it that was needed.

For a few years now cameras have been so good that we get almost real blacks and real whites, sometimes even in the same frame, filmed naturally without looking like total shit. We have crazy aperture sizes and we use them, hence the often-criticized gaussian/blurred backgrounds. We get very sharp images even in very bad lighting conditions, so some movies are very dark.

I think the younger ones are already very used to it and wonder "how odd old films look".


Back in the 80s we had films that were in color. These days, films are all in blue and orange. The blue and orange looks as bad as the 1930s two-strip technicolor, where the palette was green and red.

Case in point: House of the Dragon. The beautiful sets and costumes are all hard to enjoy when it's all drab blue and orange. Take a look at the picture:

https://www.theguardian.com/tv-and-radio/2022/sep/15/house-o...

This happens in movie after movie after movie.


Not only that. Different cultures who have traditionally had their own sense of color have all started copying each other...cause you know to "appeal to the largest number of chimps possible". Stuff from Europe, Japan, Korea and India all starts looking similar in color.

Its similar to the unimaginative western boring shit color palette that exists in Windows, Apple and Android permeating all software guis across the world.


>Its similar to the unimaginative western boring shit color palette that exists in Windows, Apple and Android permeating all software guis across the world.

Yep. Modern movies have become the cinematic equivalent of Corporate Memphis.


For anyone who wants to see what color movies used to look like, I just watched Purple Noon. It's so beautiful you can just turn the sound off and use it as moving wallpaper.


Wizard of Oz is color magic.


It is an aesthetic choice. Not one I would make, but it is one.

By drenching everything in one monochrome color gradient it is also easier to hide bad set design, costumes and VFX. Doing a crystal clear shot with a steady camera, full color and a believable effect is much harder to pull off (and thus: more expensive).

Technically nothing speaks against doing any other style of lighting and coloration, but then you cannot hide behind the colors, and then you have to think a bit more deeply about color schemes than "blue = calm/depressive" and "red/orange = aggressive/dangerous".

But Netflix is not in the business of producing motion pictures that are deep, Netflix wants to crank out as much content that people watch as possible, for the least budget possible.


The color tone is used by the artists to impart a mood. Sticking with the GoT theme, the drab colors of Winterfell, Castle Black, etc. are meant to convey the dreary nature of the cold. The drab colors of King's Landing vs the vibrant colors of Dorne or Highgarden. When done right, they can serve a purpose that helps the story. No story-serving purpose comes from trend-following styles.


> The color tone is used by the artists to impart a mood.

The mood it imparts on me is annoyance.


> Back in the 80s we had films that were in color

those films were mostly grain.

But as other people point out, the grade of a film has been a deliberate choice for over 20 years. None of what you've seen has ever been natural, but now it's a bit more obvious.


> a bit more obvious

I.e. obviously worse. Much worse.


But not in Schitt’s Creek. Worthwhile things have always been rare.


You've never watched Midsommar, did ye?


In other words: dynamic range.

That late 80's film is set up in daylight in a brightly lit joint, so that cameras could actually record it. The other scene is in a dimly lit dining room with directly visible light sources (that don't saturate). You can bet that 80's film could not have captured this kind of low-light high-contrast scene.


A famous counterpoint to your statement that eighties film stocks would not be able to capture that kind of low-light high-contrast scene: Barry Lyndon (released in 1975).

However, it's pretty much guaranteed that both the old and new scenes mentioned in TFA have a _ton_ of extra lights and flags just out of camera. If anything, the article is pretty naive in assuming that they just walked up to some diner and started filming using available daylight.


Yeah, the difference is just that the first movie’s DP was aiming for a naturalistic shot, whereas the second DP wasn’t. Tons of 80s and 90s movies have heavily-stylised non-naturalistic lighting (e.g Tim Burton’s Batman)


Exactly.

Going further back in history there's also this thing called Film Noir that further disproves the argument that digital cameras (or, more narrowly, Netflix) is the origin of the high-contrast look.

I mean, even the most celebrated film of all time, Citizen Kane, is celebrated among other things for its stark contrast and that was shot in 1941 or thereabouts.


For Barry Lyndon, Kubrick had Zeiss build ridiculously fast f/0.7 lenses for him that had previously only been used by NASA. That’s how he was able to capture those low-light high-contrast scenes that look so gorgeous.
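
For a sense of scale, a quick sketch of how much more light f/0.7 gathers than a typical fast lens:

    import math

    # Each stop is a doubling of light; f-stops scale with the square of the ratio.
    def stops_faster(slower_f, faster_f):
        return 2 * math.log2(slower_f / faster_f)

    print(stops_faster(1.4, 0.7))  # 2 stops: 4x the light of an f/1.4 lens
    print(stops_faster(2.0, 0.7))  # ~3 stops vs a common f/2 cine lens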


Except Barry Lyndon did exactly that.


You’re not wrong, but Kubrick famously used some of the fastest lenses available at the time to shoot the candlelight scenes. I believe the folklore is that he had to borrow them from NASA, so it’s probably a stretch to expect that kind of technical availability in general film productions from the period.


Not so much a "borrow" but rather "they had already been designed so the hard part was done - he just bought some."

https://neiloseman.com/barry-lyndon-the-full-story-of-the-fa...

> Kubrick obsessively researched the problem. He eventually discovered that Nasa had commissioned Carl Zeiss to build ten Planar 50mm f/0.7 stills lenses in the sixties, which were used to take photos of the dark side of the moon.

> ...

> Anyway, Kubrick promptly bought three of the Zeiss Planars. He liked to own equipment himself, rather than hire it in, and to this end he had also purchased at least one Mitchell BNC camera. As befits Kubrick’s perfectionism, these were perhaps the world’s most precisely engineered cameras, previously used for special effects work.

https://en.wikipedia.org/wiki/Carl_Zeiss_Planar_50mm_f/0.7

> In total there were only 10 lenses made. One was kept by Carl Zeiss, six were sold to NASA, and three were sold to Kubrick.


A Zeiss 50mm f/0.7? Heck, I don't have a camera for that lens, nor the budget, but this combination sounds like a dream...


Some ideas - http://www.naturfotograf.com/need_speed00.html (note: its a rather old style site that hasn't been updated in two decades - I can't find it properly linked from http://www.naturfotograf.com/index2.html and the navigation is on the right hand side header and footer).

Make sure you look at the http://www.naturfotograf.com/need_speed03.html - the Scentless Mayweed is really neat for its depth of field.

The AI Noct Nikkor 58mm F1.2 is another neat lens. https://imaging.nikon.com/history/story/0016/index.htm (make sure you look at the full size examples of the 58mm f/1.2 and the 50mm f/1.2 and look at the car lights in the corner of the frame).

(and digging - found it). You might also enjoy Philip Greenspun's review of the Canon 50/1.0L https://web.archive.org/web/20040605021643/http://www.photo.... and the Graceland paperweights https://web.archive.org/web/20040617130935/http://www.photo....


Seems like I need to use my 50mm f/1.8 more often!


A 50mm f/1.8 (or f/1.4 if you have a bit more money) is a beautiful lens to work with. On a full 35mm frame, it provides a normal field of view which approximately matches the field you see (kind of, we've got two eyes and so more of a panoramic view, but unless you're pulling out an xpan ( https://www.hasselblad.com/about/history/xpan/ ) it isn't something that you can capture - still catches the part that you see looking forward).

And it doesn't have any weird perspective issues with flattening on long lenses or the exaggerated perspective of things closer.

It's just the right lens for so many situations. Light enough to handhold without any issue. Fast enough to get a nice shallow depth of field.

As an aside, if you are looking at primes, a 35mm f/2 is a good "I want something wider" and the 85mm f/1.8 is a very nice "I want something longer" while not being ungainly or slow.
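
If you want to see why the 50mm counts as "normal", a small sketch of the angle-of-view arithmetic (full-frame width of 36 mm assumed):

    import math

    def horizontal_fov_deg(focal_mm, sensor_width_mm=36.0):
        return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

    print(round(horizontal_fov_deg(50), 1))  # ~39.6° -- the "normal" look
    print(round(horizontal_fov_deg(35), 1))  # ~54.4° -- noticeably wider
    print(round(horizontal_fov_deg(85), 1))  # ~23.9° -- a short telephoto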


For everyday use I have a 24-120, the first VR model, ultra sharp on a D700, so I got lucky even if the VR doesn't work anymore. Lately I found myself using a manual focus 300mm often... The 50mm was mostly used for low light; now I need to experiment with shallow depth of field a bit more. I'm missing something on the short end, either a 20mm prime or a 16-35 (or something similar). An 85 f/1.8 sounds tempting too, as does an 80-300 f/2.8, or a... Gear acquisition syndrome is a bad disease...


> Gear acquisition syndrome is a bad disease...

For my Nikon 35mm cameras... I have a 16mm f/2.8 fisheye; 17-35mm f/2.8 zoom; 35mm f/2.0; 50mm f/1.4; 85mm f/1.8; 105mm f/2.8 micro; 135mm f/2 DC; 70-200 f/2.8; 80-400 f/4-5.6 VR; 200-500mm f/5.6

The 80-400 is very nice and I used it for a long time but when I got the 70-200 (I have, though it is in the car bag rather than hiking bag) the 80-400 just doesn't have as much of a place.

The 16mm I'm not even sure where it is now - it was too easy to make mistakes with it ( https://www.deviantart.com/shagie/art/Fisheye-Oops-2767240 ) - the 180° field of view means that straight down is straight down - lean out and do funky tripod setups to avoid having what is straight down show up in the frame.

And yea... Nikon... I've also got a Mamiya 645E with 3 lenses, Hasselblad XPan with 2 lenses (I just couldn't justify the price tag on the 30mm f/5.6), and two Horseman 4x5 (one field, one rail) and I typically have the Nikkor 210mm f/5.6 on it though I've got a shorter lens too - just that one doesn't have the image circle allowing for useful movements... I think it's a Schneider 135mm f/5.6, but I'd have to check. The common design for a 135mm has a 200mm image circle while the Nikkor 210mm has a 295mm image circle - and that's a big difference.

---

Another thing to do with the shallow depth of field is to get some extension tubes. I've seen photographers use the thinnest extension tube for some shots so that it reduces DoF (it gets into macro equations), but not so much that they lose too much distance focus. And since there's no glass in there you're not degrading image quality with another refraction (compare teleconverters).

https://www.bhphotovideo.com/c/product/375238-REG/Kenko_AEXT...

https://www.diyvideostudio.com/do-extension-tubes-cause-shal...


This is some nice equipment! I'm kind of envious, especially if you find time to shoot all of that somewhat regularly.

I was that close to pulling the trigger on a great-condition 80-200 f/2.8 AF-D, but then I realized that the, again manual focus, 80-200 f/4.5 just sits on a shelf... But heck, is that old thing sharp; I'd say on par with the 50mm.

Regarding teleconverters, the old 300mm works incredibly well with the equally old 2x TC; the 1.6x AF TC had worse image quality. And telling the camera that it is a 600mm f/7.1 lens gets exposure just right; shooting handheld at ISO 200-400 for stills and 800 for birds and such gets quite some keepers. It does lack the flexibility of a zoom, but I love this 300 regardless. It is such a masterpiece of precision mechanics and optics, without any electronics, that I don't even miss AF. Well, most of the time that is...

I like the fish-eye shot with the shoes in it! Just thinking of that problem with a camera where the VF only shows 95% of the image...

Are those Kenko rings any good? I'm asking for a friend with occasional bouts of GAS...


It's been a while. My last dedicated time was 2017 (the eclipse). When I lived in California, I went to Lee Vining for two weeks in October every year.

http://shagie.deviantart.com

https://shagie.smugmug.com

And then there was '09 after I was laid off and just drove around for a few months.

Outside of that... I really haven't had the time in the past several years to take a dedicated chunk off and just go. Maybe this year.

The Kenko rings work fine for AF lenses (G lenses have some difficulty). Two of my photos from long ago shot with 'em:

https://www.deviantart.com/shagie/art/Daisy-s-Closeup-511810...

https://www.deviantart.com/shagie/art/Macro-Rose-5117983


Those shots are beautiful! I hope you find some time this year, it is difficult when you hold a day job, but someone has to pay for all the trips and such!

I'll look for some used, affordable Kenko rings then; the only G lens I have is the 24-120, all others are either AF or Ai-S ones.


>some of the fastest lenses available at the time

?? more like ever. are you suggesting that we have lenses widely available that are faster today?


This is an unnecessarily hostile reading. I just wasn't sure that it was the fastest lens at the time, and I didn't want to say something incorrect.


Probably nobody in a non-classified situation has built such a thing recently. But if you put enough engineering into it, you could exceed those parameters today. Glass, glass manufacture, lens shaping, etc. have all improved tremendously. It would be obscenely expensive; every lens would be completely custom, you'd probably burn through hundreds of blanks before you had something worth using, and there are other, easier ways to achieve low-light imaging anyway.


probably? are we just making stuff up now just to prove a point?


I mean, I work with optics, and I know the history of making good glass, so.... no?


The fastest commercially available glass is 0.75-1.0. That means the 0.7 glass Kubrick used is still the fastest. I seriously doubt that even someone the likes of Kubrick would have the clout to approach the NRO and ask for whatever the best they have is to shoot a feature film, so your non-classified caveat is kind of pointless.


John Alcott, Kubrick's cinematographer, who got an Oscar for that film and is known for using natural light, is probably more responsible for that.

https://www.imdb.com/name/nm0005633/bio


i.e. one of the greatest auteurs was able to achieve what can be done by a TikTok today (just about)


Couldn't have put it better (about lighting). I'm from the younger generation and have always kinda felt that movies from the 90s are too bright


I always thought it was about visual clarity. Yes, we can have very dark scenes now, but why would you want to when you can't tell what's happening? A prime example is the Battle of Helm's Deep vs the Battle of Winterfell. I get the feeling of nighttime from both of them, but in only one can I actually tell what's happening.


It's an interesting question: can filmmakers actually use this newfound range in a meaningful way? I feel the same about those two scenes -- one I can follow, one I cannot. But I wonder if anyone out there is filming high range scenes at night in a good way, and the Battle for Winterfell is just a failed experiment in figuring that out?

The final episode of Midnight Mass comes to mind. And much of The Vvitch. Both aren't totally ideal -- they can be difficult to follow at times -- but I think they use the range to convey chaos and fear of the unknown in a meaningful way.

Curious if anyone has other examples of this pulled off well.


One big hurdle is that just because the camera can deal with the dynamic range, that doesn't mean the viewing setup can. I can watch the Battle of Winterfell just fine on my laptop screen, or with an expensive beamer in a dark room. But on a TV that reflects the bright window in the dark parts of the screen?

The lack of dynamic range in movies meant we got away with glossy TV screens in brightly lit rooms, often combined with compression algorithms that do worse in dark scenes. And with more and more content shot for television or streaming instead of cinema, this will limit what filmmakers can do without causing problems.


We started watching House of the Dragon, and stopped because it was impossible to see anything on the TV in our bedroom with any ambient lights. Only way to watch is on our basement home theater setup which allows us to black out the environment. It's ridiculous. When we watched the GoT final episodes we had to wait until after dark to watch them.


I’ve had this discussion enough times to know that this is a minority opinion, but the battle of winterfell is a masterpiece. The darkness in that scene does so much to add to the tension that it’s essentially another character.

Watch it on a good screen in a dark room, and come away wishing other nighttime shots were done that way.

Edit: darkness is like a good score. It adds a ton but it doesn’t degrade well and people with potato setups won’t be able to see/hear.


GoT had several nighttime battle scenes with great lighting. I think that was the problem. They had to go darker than the fights they already showed.


I don’t understand the reasoning here, are you saying that GoT was a tech demo? To me it made sense that as the show got further along and the situation grew more dire that it would be visibly darker as a metaphor.


Sure, it's a tech demo. If you set the brightness on the demo to 7 early on and say "This is dark," you can then turn it down to 5 later and that will seem even darker. The idea is to start at 7, not at 5, because when you have to turn it down to 3 you can't see anything.

The Craster's fight and the wall fight were both at night and looked great. But maybe they should have been a bit brighter so there was more room for the narrative dimming and actually seeing things. This all feels like betting into an empty pot, though, because the darkness of episode 3 isn't even a top-10 season 8 complaint for me.


The problem is that if you've already done a battle scene that's as dark as reasonable, you have a problem when you then want to "go darker".


The former always seemed more like gloom than night to me. Which can be compelling, but it isn’t a replacement for night.


They were lit and graded with the pre-home-theater home video market in mind.


could be lighting. could also be a deliberate choice to make movies darker[0].

[0]: https://www.youtube.com/watch?v=Qehsk_-Bjq4


In part.

But another important factor was that movies couldn't be too dark because tvs couldn't reproduce them properly.

We all now have devices that can display the full SDR range, and HDR is becoming more common.

Filmmakers were forced to shoot and deliver bright images. So now the pendulum has gone to the other side. It's probably going to stabilize at some point.


It's hinted at here, but film produces a much better overall look than digital. It is just that digital is way cheaper and easier to work with.

There is certainly more versatility in what you can create in post production with digital but the base look of what is captured is more beautiful on film, and this is the same with photographs. Look at high quality old images captured on film, they blow most digital images out of the water. Much higher dynamic range and depth of colors and contrast.

This is why Tarantino, Nolan, and P.T. Anderson still only use film. I think this is most glaring with the mid to low budget range of films that can't make up some of the ground with production techniques and post. Old 80's and 90's mid range films on film look so much better especially in night scenes.


I'm highly skeptical of the technical claims you're making here.

Quality photography paper has about 10 stops of dynamic range (which is why the zone system for exposure and printing goes up to Zone X). While the negatives themselves may have slightly more dynamic range, there's also the clarkvision page[1] indicating they, in fact, have less dynamic range. This makes sense: you have less area to work with, and need to get something done with bigger grains because of light and time constraints.

You may well think it looks better, but it does not capture more information. More noise, yes. Bleeding artifacts you like, sure. More information, no.

[1]: https://clarkvision.com/articles/dynamicrange2/


I'm skeptical, too, since a lot of things shot on film still went through a digital intermediate.

I think this is more about overall production aesthetic/budget/execution, than a film v. digital.


Do you disagree that film looks better than digital?


Film was a lot more expensive, so there's a selection bias. Digital is newer, so we haven't yet benefited from survivor bias.

In the hands of a sufficiently talented team, film does not look better than digital, because digital can be made to look just like film: https://www.polygon.com/2020/2/6/21125680/film-vs-digital-de...


This should not be overlooked.

A film that exists is better than one that never gets made.

Digital reduces the cost required to get certain shots; makes certain shots possible that would be impossible with film; integrates more directly into cheaper digital production workflows that make it possible to pull off effects that would be impossible on film…

Much as we may nostalgically enjoy the aesthetic of the optical compositing of the original Star Wars special effects, they were incredibly expensive and - compared to what can be done today for a fraction of the cost with all-digital workflows (e.g. in the Disney Plus Star Wars shows) - objectively not as good.


> A film that exists is better than one that never gets made.

Some might disagree. The democratization of content creation that happened with the internet and social media was in general a good thing, but it can't be denied that it also resulted in a massive reduction of the signal-to-noise ratio in content quality. The reason the Google search result page is much worse today is precisely because there exists tons of content today that should never have been made.


Sturgeon's law applies - 90% of everything is crap. 90% of movies shot on film were terrible too. Everything in the direct-to-video bargain bin in Blockbuster in the 80s and 90s was shot on film, and trust me, they were not all Tremors II.

So when more stuff gets made, more crap gets made. But the 10% that's good gets bigger too.

You want a world that contains Everything Everywhere All At Once? You have to also accept The Hobbit: The Battle of the Five Armies as well. Sorry.


There is nothing film can do that digital photography cannot do today. Any claims of film superiority must be based on subjective interpretation and existing familiarity with the film-like look.

Digital cinema can 100% emulate a film-like look if the creators wish to add rendering artifacts.


I liked the argument in the article: film “works” because it can do a bit less: it adds enough hassle that the rest of the production (e.g., lighting) is done more carefully, and thus the whole thing comes out better.


On a similar thread to this is the aspect of how film shoots did not have a video village. The primary people to see what the camera was shooting were the DP and the director. Video assist came along (the old b&w flickering TV near the film camera) that allowed more people to see the image. Once a video feed could be provided to anyone, everybody started to provide opinions. This actually slows things down.


Barriers to entry tend to raise quality. But people tend not to consider that a good argument for keeping them.


This just isn't true. Take a 35mm camera and a digital camera side by side to shoot the same scenes, and there will be noticeable differences. Some maybe subtle, but they are there. As I just commented somewhere else here, the highlights are a key area of this difference. Digital clips where film rolls off. I've seen so so many digitally shot scenes where there's an actor against a bright sky. There is nearly zero detail left in the sky as they've pushed the exposure for the actor's face. The details in clouds are absolutely gone. It makes me cringe every time I see it.

Edit: Wanted to add another example of the differences. Digital records by separating the RGB of the light, where film does not. As an example of digital oddity from this, I’ve had to work with footage shot on an Alexa (IIRC) where the white whipped cream on top of the desserts had a weird exposure thing: when the exposure was pushed in post, the white went to magenta. In order to get it white, you had to keep pushing the exposure to get the third primary color up enough to catch up to white. By then, the other two channels were well past usable. Since it was a commercial for desserts, the white in the cream was important to keep the details of the texture. Film would not have this problem.


Doesn’t colour film have different layers of emulsion sensitive to different wavelengths of light?

That’s not so different from digital having RGB filters right?


Could you expand on the "rolls off" phrasing you just used, if you don't mind?

I have a decent amount of experience with digital audio is why I ask.


I don't think this is true either, taken to this extreme. Sure, for a given film setup you could come up with a better digital setup, but that doesn't mean every digital setup is strictly better now than what you'd use if digital wasn't an option. And you could always construct some monstrosity of a camera that just throws more film at the problem to get a better result, though IMAX seems to be the limit of what people are willing to do on that front.


Why is digital trying to look more like film instead of the other way around?


Because people are funny.

It's the same reason you'll still find scenes and photos shot in sepia tone and black and white. People prefer the way film looks even though it's less true to what was actually shot.

People like typewriters, vacuum tube amplifiers, and Vinyl records; Not because they are better than their more modern counterparts, but because of nostalgia and a preference for the "feel" of these things.

But, it just so happens that it's easy enough to recreate film effects that actual film ends up being just a masochistic activity.


This is it 100%.

Vinyl and Film may technically be less accurate at reproducing the original, but the ways in which they are imperfect gives them a “feel” which many like.


Because sometimes digital footage has to be matched to film. Alternatively, some directors enjoy film artifacts just like some people enjoy analog distortion artefacts in audio.

Additionally, you seem to be asking a lot of leading questions but not volunteering any opinions of your own. Is there something you'd like to share?


Because film has a color grade built in to some extent, while digital does not. The "look" people built into the film back in the day because they couldn't just do it in post can still be a great look today, and is pretty orthogonal to the objective quality of the information captured.


Basically all of film innovation history has been about constantly trying to look more like digital.


It's rare that people try to do that now; the person you were replying to said that it's possible, not that it's common.


Hmm, not across the whole pipeline though. For instance, 35mm film tops out at around the equivalent of 12k columns of resolution, while a lot of digital cinemas are still equipped with 2k projectors.


This is a completely subjective opinion though. One person could prefer film, and another digital. What does this prove?


Maybe it's subjective, but if you ask X million random people and, say, 51% prefer film, then film is better than digital in the only metric that matters (which is, "people liking it"). In practice I'd guess it'd be more 70/80%...


That sort of false binarization doesn't help anything. If it's 51-49, there are lots of good questions to ask, like "who feels which way" and "what do they like about it" and "what things do they see as in each bucket"?

The notion that there's just one "film" and just one "digital" is another false binarization. There are many, many components to each, with notable changes over the years, as with most technologies. And there are cultural aspects to each as well, further confusing the picture.

The way forward isn't in zero-sum arguments over false binaries. It's finding the next step via the sort of careful examination and experimentation that got us the many previous steps.


It's the same as with audio: Many people like the effects of common limitations of non-digital tech. Digital processes can choose to add these effects if the creators agree - but they don't have to. And just because something can be done with digital doesn't mean doing it is better (e.g. in audio, mastering for vinyl vs brickwall-masterings for CD releases)


I'm not sure this is a fair question. An awful lot of people seem to think that instagram filters look "better" than simple photos. Style and clarity are not necessarily congruent.

You could just as easily watch Casablanca and proclaim "black and white produces a much better overall look than color".


If a movie production wants a digitally produced thing to look like film, they can do that, to a degree that you can't tell the difference. So no, film doesn't look better than digital.


What part of the process of light coming through a lens, being focused onto film, the film being developed, and then light shone through the film onto a sensor to capture a digital copy of the frame, versus the light from the lens being focused onto a sensor and digitized directly, makes the film based image ‘look better’?

Because the frames are going to be digitized, right? Nobody’s arguing that optical printing and editing by splicing reels of film together produces better looking results than digital workflows, surely?


I do. The talk of dynamic range, etc. is nonsense. That may have been true in 2008, but sensor technology has improved incredibly since.


Emphatically. Film looks bland. People like film for the same reason they like vinyl; it’s psychosomatic.


Film looks better because it shows less information, not more information.

Gandalf's makeup and nose pores taught that to the moviegoing public.

Motion blur is a more common example.


The main difference to me between film and digital isn't the dynamic range, but how the edges of whatever the range is looks. In digital, it's clipped. In film, it rolls off. That is one of the key differences to me. An area that is overexposed in film looks much less harsh than the look of the digital overexposed.


One thing I notice when doing night photography is that digital sensors tend to saturate quickly, leading to loss of color saturation if one of the channels gets fully exposed.

Film tends to deal with that a bit more gracefully so you don't get white centers on colored point lights.


Tarantino uses film because he likes the artifacts film gives, not because it's objectively better. Same for PT Anderson. Indeed, both directors favor grungy 1970s aesthetics, which should give you an indication where they're coming from, stylistically. Nolan typically uses large film gauges like 65mm which are not comparable to 'regular' 35mm film.

Additionally, film does not have more dynamic range than digital. Take the latest sensors used by the Arri Alexa cameras for instance, they have a range of 17 stops. This is better than regular 35mm film. [1]

[1] https://www.arri.com/en/camera-systems/cameras/alexa-35#2730...


Audiophiles who prefer vinyl prefer it for the artifacts, not because it's objectively better.

However if you ask them, they will tell you that vinyl is objectively better, that faithful-up-to-the-Nyquist-limit digital reproductions sound "cold" and "mechanical", that digital doesn't have the dynamic range of analog, and that nothing can come close to capturing the sound of a live performance except an AAA recording and mastering process to vinyl disc.

It's the same with Tarantino. To him only 24fps on grainy flickery celluloid is cinema; the newer technologies are cold and lifeless and they hollow out the performances.


Yeah I always think about that Brian Eno quote:

>“Whatever you now find weird, ugly, uncomfortable and nasty about a new medium will surely become its signature. CD distortion, the jitteriness of digital video, the crap sound of 8-bit - all of these will be cherished and emulated as soon as they can be avoided. It’s the sound of failure: so much modern art is the sound of things going out of control, of a medium pushing to its limits and breaking apart. The distorted guitar sound is the sound of something too loud for the medium supposed to carry it. The blues singer with the cracked voice is the sound of an emotional cry too powerful for the throat that releases it. The excitement of grainy film, of bleached-out black and white, is the excitement of witnessing events too momentous for the medium assigned to record them.”

― Brian Eno, A Year With Swollen Appendices


I'm wondering how much of the critique in the article is down to the Arri Alexa look - it's pretty distinctive, so much so that I can usually pick out if a show or film was shot in it (latest example was watching "Dark" and I double checked, and indeed, it's an Arri Alexa prod).


Digital is vastly superior to film for photography. I shoot both, sometimes side by side. What differs is that film is more pleasing to the eye without significant post processing, digital requires it.

But in both cases, if you get your photo right in the viewfinder before you capture it in the camera, digital always comes out better in the end. There’s simply so much more information captured, and you can in fact get more stops of DR with digital than with film.

Edit: To add to the above, I use a light meter and check DR in post processing. Film is usually good for 8-9 stops of DR (depending on film type) and is better in overexposed scenes with bright lights. Digital generally gets 12-14 stops of DR and handles underexposed scenes better at constant ISO.

Another factor is that in dark scenes, due to finer grain, digital is capable of running higher ISO while producing recoverably usable photos with denoise filters (Nik’s tools). On my current rig, ISO 3200 is usable; on film, typically ISO 800 is the max I would run, though there are a few good ISO 1200 films, but they are black and white. ISO isn’t directly analogous between film and digital, but it’s close enough for the purpose of this comparison.


Unless you're illuminating through slide film with a somewhat pure white light source, film also requires significant post-processing.


None of this is true, nor does it need to be. As my photography professor says, “digital is better, but I don’t want better.”

Roma is as beautiful as anything shot on film, Amelie from 1995 is as teal-orange medium-shot photo filter as anything on Netflix. The Crown is often beautiful because of impressive practical lighting effects.

[EDIT: Amelie is from 2001, I mixed up the date with the same director’s City of Lost Children from 1995]


Amélie is from 2001, not 1995. Interestingly, it is one of the first films that was colour corrected completely digitally. This is also why the movie's colour palette is so striking. Not necessarily to everyone's taste, but I really found it to 'pop' in the cinema.


Sorry, I was also thinking of the same director’s City of Lost Children, which is 1995 and still has a teal/orange aesthetic. (Apologies to the French for using the translated titles.)

It looks like City of Lost Children was graded by Yvan Lucas via an analog process called ENR. Subsequently, Amelie was the first digital coloring he did, and, since it was before LUTs, his analog experience was crucial.

https://www.digitalmediaworld.tv/post/2917-colourist-yvan-lu...


Is ENR similar to bleach bypass? They seem to be mentioned in the same breath but it's a bit vague.


I believe it’s a specific method for a partial bleach bypass on prints, but that’s basically me being ChatGPT from random internet knowledge.


Film has become vastly inferior to digital both in stills and cinema if we're talking about technical aspects like dynamic range and colour accuracy.

The problem is, just like with CGI, which has gotten pretty good at modelling light transport, it feels off because it's too clean and clear. Digital is too good at capturing reality. Film introduces a bunch of artefacts and limitations that 'look' nicer than accurately captured reality. In part this might be because most of us grew up watching cinema styled that way, in part because it leaves more room for the brain to compensate for those limitations with imagination.

With the advent of easily available stylisation AI trained on film, we should see digital close that gap any time now.


Can you do an 8K remaster of a movie shot in 4K digital? With film you can just rescan the negative with a higher resolution.


This will not magically make extra detail appear, though. Film does not have infinite resolution.

For instance, the ARRISCAN film scanner goes up to 6k for 35mm. They could go higher, but there's no point.


Yes? There's a ton of AI upscales coming out and they're becoming really good; plenty of examples on YouTube, including iconic scenes from iconic movies. Perhaps another 12 months max before AI upscales go from the enthusiast/amateur remaster community to commercial.


Having an AI generate 75% of the movie pixels that weren’t there in the original work doesn’t really count as a ‘remaster’ imo


It's what most of those "HD remasters" of film are doing. (Well, sometimes it's a human or a chemical/mechanical process rather than an "AI", but either way you're fabricating details out of noise and/or thin air).


I suspect that getting objective improvements from a 4K to 8K film scan would require extremely well-produced film and probably a large-format film like 70mm.


> P.T. Anderson still only use film

https://www.imdb.com/title/tt11271038/fullcredits/?ref_=tt_c... it might be shot on film, but it's graded digitally. So the argument doesn't really apply here. The colours you see here are a choice made in the DI stage.

> film produces a much better overall look than digital.

No, not anymore; that hasn't been the case since about 2010 (and ignore RED cameras, they are terrible at colour). All the (big) films you've seen since about 2007, and all films since 2012, have been graded digitally (apart from maybe a Mike Lynch film). This means the colours are not what they are because it's film, but because of a choice made by the director of photography, producer, or director. The thing you think is a filmic look is mostly digital. Yes, that includes films that were shot on film, like the Nolan Batmans.

Yes, that means even the film you saw at the cinema on a film projector was graded digitally; it was just lasered out to film and printed for distribution. The other bit to remember is that new prints look much nicer than old prints. Old prints look like shite: dirty, scratched, and sometimes washed out.

"Real" film shot now is nothing like the grainy stuff of 25 years ago; it has a far higher dynamic range and much smaller grain. It's just that digital cinema cameras are now even better and more consistent. This crossover happened in about 2012.


Those 80's and 90's night scenes aren't better because they're shot on film. They look better because they're better LIT.

Digital is easier to work with in low light, so some productions will try to save money by cutting back there. You can't really do that with film without it being painfully obvious.


This is utterly false and borders on religious belief.

Steve Yedlin has dedicated decades of his life to ultimately prove [1] that digital and film can achieve the same exact look, save for the fact that digital is infinitely easier to deal with.

[1] https://yedlin.net/DisplayPrepDemo/index.html


28 Days Later was one of the first digitally shot movies and has some breathtaking scenes in it. I don't think digital is to blame.


I'm pretty sure at this point high-end digital cinema cameras can straight up outperform film in all but the most extreme cases (like giant IMAX negatives, maybe), but that's also a much more recent development than most people would think.

More freedom in post allows more freedom to do things audience may or may not like but I don't think it's technical limitations holding things back anymore.

Similarly, I'm a bit dubious about the LED lighting comment by the anonymous redditor. Yes, LED color rendering used to be pretty horrible, but these days it really doesn't have to be. All the major LED manufacturers have high-end white parts for color-critical applications, and even color mixing is actually good now with the advent of phosphor-converted broad-spectrum colored LEDs.

All this stuff increases the available creative space, but at the same time it might not be easier to find the good stuff in a larger space.


There is some small truth to this - at least for image acquisition. The Bayer filters in modern digital sensors are way too broadband, which perhaps counterintuitively reduces saturation, but more importantly reduces color accuracy from a human perspective. This is especially apparent in the green-red spectrum.

The sensitivity curves of Kodachrome 64, and presumably 35mm film stock, much more closely match the CIE 1931 effective bandpass curves and result in much more accurate color.

However, most of this can easily be fixed in post with modern tools. Almost all of what you are seeing today is an artifact of the director/cinematographer/editor's artistic choices, not the acquisition tech.
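
To make "fixed in post" a bit more concrete: camera RGB is commonly pulled toward a reference colour space with a 3x3 colour correction matrix, which compensates for the broad, overlapping filter responses. A minimal sketch with made-up numbers (real matrices are fitted per camera and illuminant from chart shots):

    import numpy as np

    # Hypothetical colour correction matrix (CCM); purely illustrative numbers.
    # Each row sums to 1, so neutral greys pass through unchanged.
    CCM = np.array([
        [ 1.60, -0.45, -0.15],
        [-0.30,  1.50, -0.20],
        [-0.05, -0.45,  1.50],
    ])

    def apply_ccm(camera_rgb, ccm):
        """Apply a 3x3 CCM to linear camera RGB values of shape (..., 3)."""
        return np.clip(camera_rgb @ ccm.T, 0.0, 1.0)

    # A desaturated 'camera' red becomes more saturated after correction
    print(apply_ccm(np.array([0.60, 0.30, 0.25]), CCM))  # ~[0.79 0.22 0.21]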


Soderbergh has shot his last few films on iPhone and they look fantastic. Lighting and framing while shooting and good editing and grading afterwards are far more important factors.


This surprised me because I’d watched No Sudden Move by Soderbergh relatively recently and thought it looked pretty good. Reading more it seems like only Unsane and High Flying Bird were shot on phones (the latest of which was 2019).


Vignette is a useful and appealing effect. That doesn’t mean a lens with vignette is superior. The images straight out of the camera may be more appealing. Yet on the other hand, with a low vignette lens, you can add exactly as much or little as you want in post.

Film vs digital seems very similar, with a greater degree of “natural processing” happening in film.


> film produces a much better overall look than digital

To add one more data point to all the other sibling comments, Vince Gilligan (a self-described huge fan of film) talked on one of the Better Call Saul podcasts about how they did a blind test, and the consensus was that digital looked better. This was around five years ago.


Star Trek The Next Generation was famously shot on film. The HD scans are so good that the props can be seen to be fake in many places. Later franchise shows that were shot in digital simply cannot be released in HD as there is no HD source material.


This is completely incorrect. Later shows like Voyager and DS9 were also shot on film.

However, all post-production happened in standard definition, so all special effects, editing and colour correction would have to be redone, which is incredibly expensive.

They did that with TNG, but since those blurays didn't sell enough, they didn't bother doing the same for DS9/VOY.


Thank you, I did not realize that the difference was in post. There's no better way to get correct information on the internet, than to post an incorrect statement on the internet!


No worries! There's a good chance that anything I post that sounds remotely authoritative turns out to be super inaccurate as well :)


So film is better than the digital from that time, but not necessarily better than today's digital


> Later franchise shows that were shot in digital simply cannot be released in HD as there is no HD source material.

This is a persistent myth, but false. DS9 and Voyager were shot on film, like TNG and TOS. TOS was also edited on film, TNG and later were scanned and edited on videotape. This makes it much more work to remaster TNG and later shows for HD, which was done for TNG but not later shows because the TNG sales plus the perception that the other shows wwre less popular and would sell less made it not worthwhile from a business perspective.


> This is why Tarantino, Nolan, and P.T. Anderson still only use film.

Or maybe they are just old and incapable of adapting anymore.

There is a saying: "Physics advances one funeral at a time". Meaning that you need to get rid of the old-timers which are preventing the field from advancing.


Wasn't that the whole point of the movie? How horrifying the uncanny valley can be. Not just M3gan herself but the entire world that the movie takes place in. James Wan is a brilliant director.


Agreed. I think a lot of this comes down to stylistic preference of the movie.

Why should a movie about near future technology tie itself to scenes one recognizes today?

This edges on shaking fist at cloud.


Yes, yes it was. The author has missed the point; it's meant to make you feel unsettled and empty.


I haven't seen this movie, but I could see that. There are definitely some movies that use this quality as a part of their setting. See: Gunpowder Milkshake.


Personally it's poor writing & bad camera work that bugs me more than anything. Sure this digital style is apparent, but I guess it doesn't fundamentally ruin a movie for me (whereas the other two factors do).


There seems to be a major difference between Netflix US and their international productions. The later European ones are, IMHO, somewhere between good and great in writing, production, and acting. Those are usually done by Netflix and some local partner.

Every once in a while I come across a Netflix US thing, and it feels like having grabbed something off a direct-to-DVD sale...


There was an article about a year ago that explains this. There was an internal political battle between a lady who produced most of the Netflix originals that really hit it out of the park, like House of Cards and The Queen's Gambit, and folks who wanted more, but cheaper, production. Well, she got fired. The international stuff AFAIK is just licensed by Netflix.

I rotate through streaming services every few months and devour what's new like a locust swarm. Haven't subscribed to Netflix in over six months, but can't find enough to justify subscribing again.


> Everyone is lit perfectly and filmed digitally on raw and tweaked to perfection. It makes everything have a fake feeling to it.

It wasn’t until I started watching “Stuntmen React” videos on Corridor’s channel recently that I realized people are even still doing stunts. The footage in modern movies is so massaged and strangely well lit that it feels fake to the point that it might as well be the CG as I’d assumed it was.

You’re not getting the visual benefit to justify injuring these poor stunt people when it doesn’t even feel realistic at the end. Needs some grit, some imperfections. The things that made Jackie Chan’s early work so amazing.

They just don’t feel like they’re really happening, so there is no weight to it.


> You’re not getting the benefit of injuring these poor stuntmen

As someone who knows a lot of stunt people, I'd just like to point out that the benefit is not in injuring them, it's in the effortless action scenes. But yes, a byproduct of the action are the injuries.

We really need to credit stunt people better. They're the most likely to actually lose their lives during filming. Hence why some are pushing for a stunt category in the Oscars: https://movieweb.com/best-stunt-oscars-needed-why/


That's not at all what he meant. He was saying that in both cases the stunt man is getting injured, but in the case of digital it doesn't look better than CGI, so at that point why not just do CGI and eliminate the injuries.


I worded that somewhat poorly. I didn’t mean to insinuate that injuring them was the goal, I meant to say we are not getting benefit from the final product that justifies the means. I have reworded the post.

To my point though, the “effortless action” is the problem, the final product of all this work and effort feels effortless in that it feels floaty, fake and without consequence. You don’t watch a modern fight scene and get the visceral reaction older things bring.


I was watching through the Marvel movies just to catch up. Mostly I was surfing on my phone at the same time, as they're not that engaging. At one point during Winter Soldier though I caught myself having put down the phone and was sitting on the edge of my sofa, fully engaged in the action scene with an elevated heart rate.

Some time later I saw the "Stuntmen React" where they went through the scene, and it suddenly all made sense: that whole scene, the chase/fight on the overpass, was almost entirely practical stunts.


The John Wick movies have spoiled me for this exact reason; too many cuts in a fight sequence, and it feels jarring and unreal. It's sort of counterintuitive, but long uncut sequences demonstrating the quality of the choreography really ramps up the tension (IMO).


With all the special effects and video postprocessing in photos and films today, it's no surprise they look unnatural. I think we're seeing in film and TV the same phenom that's already big in photography and music — digitally filtered and synthesized media feels unnatural. Well duh. It is.

Video and audio are being compressed to exclude noise and 'purify' the primary signal to make the product pop (something that's also making TV ads feel very unnatural today). But this isn't new, of course. For decades, postproduction and direction of film and TV have been simplified and their primary signals boosted to make more impact. The cost, IMHO, is that acting has taken a back seat to visuals and background audio, creating the zombified chess-piece actors of today, which are also a big part of this “uncanny valley” of all forms of video. The result is that the expressive subtleties present in our remaining great actors (who are now age 60+), like Judi Dench or Daniel Day-Lewis, are absent in nearly all actors under age 40, unless they got their start on stage or outside American media, where the practice of overproduction has not been as rampant.

I think what we're seeing is simply the viral spread of special effects, digitization, and overproduction into all of media, with video simply suffering its effects last.


Also botox.


This is a great topic and a great starting point for interesting discussions.

It’s quite interesting that Japanese anime is adding a more movie-like look and feel by adding complex scenes and backgrounds (see, e.g., a Dec 2022 anime series, Chainsaw Man), whereas Netflix goes the other way around. It’s said that anime is trying to gain more mainstream recognition in overseas markets this way, although it’s at odds with the relatively low budget of an anime as well as the way they are created.

I think Netflix vs traditional movies has something to do with expectations about the screen size it will be watched on, TV vs cinema. It’s about information density. There are media specifically tuned to be watched on smaller screens, i.e., phones.

I used to advocate simplicity in artmaking, largely influenced by East Asian art styles and also the tech advertising and interface design of the early-2010s era. But as I grew up, I started to appreciate complexity, texture, and imperfection, especially after I moved to NYC. So much so that an empty-ish room now makes me uncomfortable: not organic, not homey. I wonder if this has something to do with age and audience as well.

Netflix is tuned to the younger generation, whereas traditional movies are tuned to the older one. And there is also Netflix's tuning for overseas markets that are unfamiliar to American audiences.

There is much to be discussed under this topic, but here are some sporadic thoughts of mine.


That’s an interesting point about phones being the viewing space, not TVs, which so many people commenting seem to be assuming. That would obviously affect the choices made in production, and it wasn’t addressed in the Substack post that I recall.

It reminds me of David Lynch telling people to stop using their phones to watch films: https://www.youtube.com/watch?v=wKiIroiCvZ0


I want to talk about the blandness of the acting in Netflix shows. As part of the young-adultification of everything, they tend to have this po-faced life-and-death fake gravitas. They're full of scenes like "a vampire coven takes a vote on whether to excommunicate, kill, and devour one of their own because she fell in love with a human". But the characters seem like children or young teenagers, despite being played by, and even representing, adults. It's like watching a precocious, somewhat psychotic 10-year-old making her Barbies plot to murder each other.

Contrast that with, say, Beverly Hills, 90210, which aged remarkably well for a 30+ year old series. It too gets rather self-serious at times, but the acting talent was really good and well-directed and they play their parts with a sort of relaxed, open California swagger that makes them seem honest and likable. 90210 has some interesting cinematic choices as well: the colors appear saturated but both the video and audio are soft and fuzzy, for what I guess is sort of a wistful look that seems less like how things actually happened and more like how the main cast's older selves remember it happening.

Some exceptions exist, of course. Stranger Things is incredibly well acted and goes for 80s verisimilitude, right down to authentic Tandy props and making the bedrooms look like they came out of E.T. with the lighting and set decoration. I Am Not Okay With This has a real honest feeling to it as well. They shot on location in Brownsville, PA, and really tried to capture the feel of the town in the background.


I decided not to watch their Sandman adaptation despite generally positive reviews because the unmistakable and glaring Netflix aesthetic of the trailers soured me on the prospect. It's like they ruined a graphic novel with an AI style transfer to "in the style of Netflix".


It looked pretty good and it was a great adaptation. It also didn't really look that much like their other shows (many of which are indeed extremely mediocre) and it sounds like you just have some anti-netflix bias.


That’s an unfair rebuttal and a bit defensive sounding. I agree with OP that it has that Netflix look which would make sense in it turning off a fan of the comic.


I'd say it sounds like I made a superficial judgement based on the trailers, which is exactly true as described!

Would you say the finished product has more visual flair than the trailers indicate? It may be they intentionally picked shots or changed the trailer footage to fit in with the brand guidelines.


Focusing on whether it has "visual flair" seems to be slightly missing the point. For what it's worth, I found the visuals solid but not mind-blowing, and yes, it does feel quite 'Netflix' in places. However, beyond that, Sandman is probably the best example of a virtually panel-by-panel comic-book-to-live-action adaptation I've seen.

Just getting to watch David Thewlis as John Dee stealing every scene he's in, and seeing what Gaiman managed to do with his second chance at 24 Hour Diner, is easily worth the price of admission. Add to that the pitch-perfect recreation of The Sound of Her Wings, and the surprisingly well done Dream of a Thousand Cats and Calliope, and you have a show that more than makes up for some of its missteps (I thought the collectors' convention didn't work anywhere near as well as it did in the comics, for example). If you don't want to watch the whole thing, I recommend giving episodes 4-6 + 11 a go.


IMO it was a good looking show and I'm not a fan of super hero movies or really Netflix content.


Sandman was a remarkably good adaptation and I enjoyed it thoroughly but... yes, it has that bland/cheap Netflix look. I think it's still worth watching though.


I am happy for Sandman to have this slightly surreal aesthetic, as it fits with the story line. It can be more annoying in "real life" stories.


There are a lot of reasons to not like an adaptation. I've watched Sandman, and I completely disagree with you. They did a great job. Even to the extent of doing certain scenes I didn't enjoy, but still thoroughly liked and respected. It was a breath of fresh air in an otherwise sea of "greatest common denominator appeal". Avatar, God of War, they all suffer from this.


At least generative AI can set the temperature to vary between bland and crazy. Netflix is stuck on T=0


Try turning off the colour improvement settings on your TV. It was a good series, I enjoyed it.


At first I thought this was about how Netflix arranges movies for you to select, where each movie repeats down a horizontal line and you are forever scrolling right in cardinally equivalent but combinatorially different sets of movies. :D

Things you might like:

A, B, C, D

GREAT MOVIES:

B, C, A, D


I don't really care about technical specs of cameras or film or whatever, but I think the author is largely just wrong. At least on their comparison between Moonshot and When Harry Met Sally. Those are just different films. The Moonshot still is CLEARLY using color with great intention. It's not a realistic scene. It's a carefully designed shot. The characters, the coffee, and the table are yellow. The background is green. That's interesting. It makes the scene feel very small and intimate. The Harry diner scene is cluttered and chaotic. You have a bunch of background characters that add to the scene by looking at the main characters. You get a lot from this still. That's also interesting.

I have not seen any of these films


> The Moonshot still is CLEARLY using color with great intention. It's not a realistic scene. It's a carefully designed shot. The characters, the coffee, and the table are yellow. The background is green. That's interesting.

I would be much more inclined to agree with that if every fucking movie in the past twenty years wasn't also using the exact same gold-cyan two-color palette: https://tvtropes.org/pmwiki/pmwiki.php/Main/OrangeBlueContra...

It's just lazy. There's nothing interesting about it. Because every movie has a digital color pass applied now, they feel obligated to do something with the color, but they have no idea what, so they just do the same thing over and over.


Well, first of all, this scene is clearly green and yellow, not orange and blue. This is not a common color palette. It's not a completely new color palette, because that's a dumb concept, but it's not common. Green and yellow are analogous whereas blue/orange are complementary.

Second, the orange/blue contrast is more of a film trailer / film poster thing than something in the actual films themselves. Most people will agree that trailers and posters are purpose-fit for marketing more than art.

Third, it's not just a lazy trope, it's also because contrast is useful, and alternatives are not as easy to work with. Pink and purple don't fit most natural contexts. Orange has fire and natural light hues. Blue has the sky, water, and is a good way to imply darkness without the content actually being dark and hard to see (when contrasted with orange).

Fourth, it's really not every movie. That's a lazy exaggeration. Googling top movies right now, I see fewer than 1 in 10 movies with this color scheme on their promo image, even if we're willing to extend it generously to bright blue and yellow: specifically Devotion (white, blue, yellow); Emily the Criminal (blue/gold); Spiderman: No Way Home (blue, gold, red). It is also true for 0 out of the top 10 shows and 0 out of the top 10 movies in the US currently shown on the Netflix homepage (although Netflix will swap these around for different viewers; I think I've seen some Glass Onion thumbnails that are bright blue/yellow).


I’m gonna have to totally disagree with you and say that the scene is definitely amber and teal, not green and yellow. Look at their skin tones, and the plates/walls.


My eyes are drawn to the apron and the average of the two plate colors personally. But sure, whatever. We can at least agree that amber and teal are still not at all the same thing as the orange blue contrast as they are different colors and not complementary (both taking a step towards yellow).


> every fucking movie in the past twenty years wasn't also using the exact same gold-cyan two-color palette:

You really should watch more movies. Colour has found new popularity after Spiderverse


We've come full circle: 'black and white' -> 'color' -> 'blue and orange'. Modern movies are too dark and badly lacking in color. All of that to cheapen production, I suppose.


> But every time I watch something made before 2000, it looks so beautiful to me—not otherworldly or majestic, but beautiful in the way the world around me is beautiful.

For a contemporary example that underscores this point, check out "All Creatures Great and Small" https://www.pbs.org/wgbh/masterpiece/shows/all-creatures-gre...


Counterpoint. Refn’s “Copenhagen Cowboy” - which was recently released on Netflix - is a visually stunning masterpiece of Neo Noir aesthetic.

However you might feel about Refn’s work - he is kind of polarizing - bland would be the last word you would use to describe it.

So I think the problem is less about formats and technology and more about artistic vision, skill, and execution. There is a lot more content being produced now, and thus a dilution of top talent.


This week I read an article in Portuguese about how Brazil's largest TV network is having difficulty creating new soap operas.

On social media I saw one interesting argument: the current generation producing the shows grew up watching GoT and lost touch with how soap operas used to look. Until recently, Brazil used to sell soap operas to other countries ("The Clone" was a big hit). By creating content that mimicked US TV shows, they lost the uniqueness of their content. Maybe something like that is happening at Netflix as well? People watched GoT/CSI while in college, and now we have blandness everywhere.

Google translated article: https://translate.google.com/translate?sl=auto&tl=en&hl=pt-B...


In the US soap operas died because reality tv was cheaper to film and people found the fake drama in reality shows more interesting than the fake drama in soaps.


This same narrative is used in music all the time. Technology makes it ever easier to produce near-perfect music.

Every note is perfectly in tune and on time. Every drum hit is perfectly on the grid and even acoustic drums are typically layered with samples of perfect drum hits.

Fewer analog instruments are used in favor of synthesized sounds.

Some of this is aesthetics, but much of it is profit driven.

I don’t think it’s my, or anyone’s, place to say if it’s bad or good though.

If anything, I think we’re just unfortunate to be living in a time between organic analog and digital that’s so good it’s even more organic than analog.

30 or 50 years from now, I think this will all be sorted out. Music will be super easy to make and will sound like musicians in a room like the good ol’ days. Movies will feel rich and deep and real. Video games will finally not feel like Doom with higher resolution.

Of course, then people will long for the simple days of the 2020’s when movies, music and games were so much better.


There's a link after the main article's content to "The most interesting thing I read this week[1]," which was also pretty interesting:

[1]: https://www.garbageday.email/p/the-scooby-doo-psyop


I'll take this thread to plug my totally subjective, harebrained theory/bias.

I've noticed that I like movies that are shot with the blue filter (or at least that's what I call it) - everything has a tinge of blue (the filters are almost closer to blue-grey, something like #374358). For some reason, to me, they also tend to turn out to be good movies. Examples would be Margin Call[0] (though in some scenes the blue is mixed with a golden hue, which actually makes it look nicer) and The Grey[1].

[0]: https://www.imdb.com/title/tt1615147/

[1]: https://www.imdb.com/title/tt1601913/


Certain cameras/sensors and lenses have distinctive visual aesthetics, as does the colour grading process. Much of it is a mix of art and science: the artistic vision for colour is embedded in the creative and technical aspects of filmmaking, from sets and costumes, to what a given camera sensor and optics combination looks like, to how it is changed or enhanced in post-production.


It wasn't shot with a blue filter. It was graded in post to have that look. Before RAW, it wasn't unheard of to have a "blue filter" from shooting with the wrong white balance setting, but it was a totally different blue filter look.
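
To illustrate the difference: a physical filter or a wrong white-balance setting shifts colour at capture, whereas a grade is just math applied to the finished pixels. A toy sketch nudging pixels toward the blue-grey mentioned upthread (#374358); a real grade would work on lift/gamma/gain or curves, and the blend amount here is arbitrary:

    import numpy as np

    BLUE_GREY = np.array([0x37, 0x43, 0x58]) / 255.0  # the #374358 mentioned above

    def tint(image, colour, amount=0.25):
        """Blend RGB values (floats in 0..1, shape (..., 3)) toward a tint colour."""
        return np.clip((1.0 - amount) * image + amount * colour, 0.0, 1.0)

    # A warm, skin-tone-ish pixel drifts toward the cold blue-grey look
    print(tint(np.array([0.80, 0.60, 0.50]), BLUE_GREY))  # ~[0.65 0.52 0.46]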


Just a current trend. You can watch older shows, wait a few years for the next trend, or watch shows made in different countries (not for netflix of course), if you like something else.


I recall thinking about this recently when watching a Netflix series. I'm far from a TV/film production connoisseur, but there's something about the aesthetic that makes it stand out. I wouldn't say it's better or worse, just different. With most things related to design, I'm sure the trends will shift over time as a means of differentiation. Perhaps, once AI video generation becomes more standard, we'll see entirely new production styles.


It's going to look goofy and dated in 10 years.


Yes, hello. It seems to be a matter of emphasis and sane defaults. Narrative is all about controlling emphasis. Film vs digital changes the default choices around which you build emphasis. Noise is an unreasonably effective tool for creating depth to contain emphatic range, so removing it from the defaults is going to create affective flatness on the margin.


Not long ago there was a French short doc about the disappearance of colors.

Not sure if it's based on this: https://www.youtube.com/watch?v=5hYlF3YIIjA

But there's an evolution toward the neutral (in colors and textures, IMO).

The post-20s are anthropologically mind-boggling.


Stranger Things immediately came to mind when I read the headline. Aside from the amateurish writing and the storyline that quickly became complete nonsense, the look and feel of the show was very much the equivalent of a fast-food chain: overly saturated colors, everything punchy and overly sharp.


Haley Nahman failed to talk about the ACTORS and how they act; the word "actor"/"actress" appears 0 times in this article. Just compare those boring faces with the personalities of former decades!


I saw The Pale Blue Eye on Netflix last night and thought it was very well done and aesthetically detailed.

But one example isn't a counter argument and I agree with OP's assessment. I just figured it was a production trend in general these days, with some outliers.


Since we're already complaining about contemporary video production, may I please ask all YouTubers to please stop using stock videos please. I had to stop watching a few YouTubers, because it was unbearable.


Current trend. It’s very similar to how every PS3-era game had a brown color tone.


Digital can look like crap, but it can also look pretty good. It's down to the artistry; it's easy to fuck up in ways film couldn't be fucked up. And don't discount the fact that everything is being watched on OLED, which is not as good as LCD even at rendering subtle gradients such as skin tones.

I'm watching Snabba Cash right now on Netflix. The settings are gritty, the lighting is raw and creative, the colors are splashy and true. It looks great.


Is this any different in kind from the "visual blandness" of 3-camera TV sitcoms, or of prime time dramas?

Does it matter? It's a clean backdrop for a story.


The story is boring if it's not set in a compelling world. If I wanted a clean backdrop, I'd go to a stage play.


Another thing: both movies like M3GAN and shows like Wednesday have the requisite meme dance, ready for girls to replicate on TikTok.


So much commentary on the landscape of TV and film today reminds me of the Dogme 95 movement. Maybe those guys were on to something after all.


Another possible contributing factor: Spoilers.

Location shooting exposes production details to universal public view.


The article only has two examples. If they have a point, there should be plenty of examples presented.


I think the reasons are simple: untalented staff in general. Netflix's art directors and set decorators, especially, shouldn't be allowed within 5 ft of any camera.

These two professions are usually underrated, but they often make or break a show.

The 80s had so many talented art directors/set decorators, but it seems Netflix is currently incapable of attracting similar talent.


> where the movie was set, and I realized I had no idea

I guess she missed the opening shot of Seattle?


I was triggered by the first few paragraphs of this piece. It was obviously and unmistakably Seattle throughout, with a few shots of Portland, OR for the hospital.

I find the rest of the piece as insightful and perceptive as the first paragraph.

The director is the same that made Malignant, which was also set in Seattle.

P.S. the real reason M3gan felt empty is because most of the gory parts were cut to achieve a PG-13 rating.

https://www.polygon.com/23546390/megan-sequel-release-date-u...


You know, you don’t have to watch any of this stuff.


TLDR;

But is the synopsis that background actors and set designs cost money and effort, so they didn't bother?


As someone who used to work at a digital intermediate (DI) company (think Photoshop filters, but for film), most of this is bollocks.

> But every time I watch something made before 2000, it looks so beautiful to me

No, the author is confusing nostalgia with aesthetics.

Heat, for example, is a great movie: fun story, good suspense. It looks OK, but its colours are all over the place. You don't notice it, because it's an old film and old films should feel like that.

The quote:

> Tungsten allows for more accurate color in camera but LEDs are cheaper, cooler, and more convenient. So the solution is to film on a nice digital camera and fix the color in post.

Very much nope. None of the stuff you've seen in films since about 2005 looks the way it was shot. No film ever gets to screen without colour correction. In fact, every setup will have a Macbeth chart filmed so that the colours can be conformed and kept consistent.

Moreover, film is colour corrected in 16-bit log, which is way more colours than most of us can observe, and a fuckton more dynamic range than can be displayed.
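
(For anyone unfamiliar with "log" here: the footage is stored with a logarithmic transfer curve, so code values are spread roughly evenly per stop of scene brightness rather than being spent mostly on highlights. A toy sketch of the idea, not any real camera or scan curve:)

    import math

    STOPS = 14.0  # assumed scene range covered by this toy curve

    def encode_log(linear: float) -> float:
        """Map linear scene light in (2**-STOPS, 1] to a 0..1 code value."""
        linear = max(linear, 2.0 ** -STOPS)
        return 1.0 + math.log2(linear) / STOPS

    def decode_log(code: float) -> float:
        """Inverse of encode_log."""
        return 2.0 ** ((code - 1.0) * STOPS)

    # Each full stop of exposure lands the same code-value distance apart,
    # which is why grading in log keeps shadow detail workable.
    for stop in range(0, 15, 2):
        print(f"{stop:2d} stops down -> code {encode_log(2.0 ** -stop):.3f}")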

Yes, shite LED lights have a worse spectral response than tungsten, but it's fairly easy to correct for (hell, if you are good enough you can film under sodium lamps and make it look good).

But ARRI lights have full RGB, with loads of effects that were not possible before. I, as a mere mortal, can light a room like the Hollywood golden age for <$1000 (but not with ARRI lights, because they are $5k each). For the big players, you can do something ridiculous like this: https://ymcinema.com/2022/09/26/lighting-a-desert-magnificen...

> Another user mentioned that shooting on film required a level of forethought, planning, and patience that digital simply doesn’t

Yes, and it also means that you can't take risks, because every foot of film costs a fuckton to process: you have to rush it to get developed, then view the rushes, print the stuff you like, or re-shoot. Reshoots are very expensive. If you miss a shot, you are fucked, because it'll blow a hole in your budget.

Digital allows instant playback, much less resetting, and longer shooting in much more challenging lighting situations. For example, Barry Lyndon could now be shot without custom lenses.

Digital allows multi-camera shooting, meaning that you can get much better acting from everyone, because a "two shot" takes half the time.

So, to the meat: "all of Netflix looks the same."

No, you are looking at fashion. The colour and feel of a TV show is completely under the control of the producers/directors/DoP. The colours you are seeing are a conscious choice. When the next Coen brothers movie pops up, you'll see a rush to have warm, low-contrast, richly coloured movies again.

Marvel movies look like that, because they chose to.

Marvel Tv series look like that, because it was a choice.

They respond to fashion, which means that at the moment shit is dark and grungy, because that's what's in.

Black and white will come around shortly.

TL;DR: the author is talking shit about stuff they are making assumptions about. None of the data there is accurate. It's just a reworked classic: "everything was better in the old days, the youth don't put any effort into the craft", which, given that this is literally the tail end of the golden era of TV, is utterly risible bollocks.


From someone who is really appreciating the last 10 years of movie and TV productions: a heartfelt thank you for calling out the nostalgia baiters.


> most of us can observe

Perceive.

Sorry for nitpicking


don't worry, this is how we learn


That's true.

But regarding nostalgia and colours I can tell you a personal anecdote:

I first saw Blade (1998, yes, that one with Snipes) on a pirated VHS copy on a not new Funai recorder on a 14" SHARP TV.

It was a dark, grainy film, blue on black, with occasional red and white splashes. Visually it was close to Sin City.

Years later I stumbled on it on public TV. It was so bright it almost made sense for Snipes' protagonist to wear sunglasses everywhere - it was bright as day even in the night scenes.

I can't watch "a proper" Blade, not because it's quite a shitty movie (it's fine for its niche) but because I first saw a dark, noir-like action film.

And we watched Babe (1995) on that set too, but I needed to constantly correct the tracking because the video was out of sync all the time. Years later I found out about Macrovision, and years after that I put two and two together.


The author doesn't address that blockbuster movies have always found a "trendy look" and run with it. Is what she claims Netflix is doing any different than any other popular cycle in filmmaking?

"It's actually, specifically, about how movies these days look. That is, more flat, more fake, over-saturated, or else over-filtered, like an Instagram photo in 2012, but rendered in commercial-like high-def."

The "2012 Instagram filter" was pulled from the same playbook as the popular movies of the 2000-2010 era. However, the author isn't arguing that this trend has been taken too far; just stating that everything looks bland and the same. Of course, it does. Only some movies are original and try something new because audiences like familiar things. Successful visual storytelling methods are recycled just the same as scripts.

When comparing "When Harry Met Sally" and "Moonshot" the author says,

"... The latter is more polished and "perfect," but to what effect? It looks strange, surreal, both dim and bright at the same time. Everything is inexplicably blue or yellow, and glows like it's been FaceTuned. Meg Ryan and Billy Crystal, meanwhile, are sitting in a downtown New York deli that actually exists. The image is a little grainy, the lighting falling somewhere in the normal daytime range, and they look like regular human beings. ..."

One is a science fiction film; the other is set in a New York deli. They aren't trying to evoke the same feeling in the audience with their settings.

I had to stop reading when the author said,

"At the risk of using an anonymous Redditor as an expert, lol, I found a comment under a thread called "Why do movies look so weird now?" that captures a lot of these same complaints:"

Then she goes on to quote the Reddit comment. LOL, right? The Redditor's comment was a single paragraph and summed up the author's entire point of view. She just went on to write a longer post that failed to say anything additional. Read the Reddit comment she quoted and skip the article.


> The author doesn't address that blockbuster movies have always found a "trendy look" and run with it. Is what she claims Netflix is doing any different than any other popular cycle in filmmaking?

I think this is pretty clearly the author's thesis--movies now look a certain way and they used to look characteristically different. I think the main point is that the author (and apparently hundreds of people here) find the current visual style really lacking compared to past stylistic movements. They aren't refuting the historical existence of the trend as much as the current character of it.


Let me summarize the author's viewpoint.

"I've changed/am unhappy now, I don't know why, a deep introspective look into my life is difficult, therefore older movies are better and I'll find a way to justify that"


On a related note, why is Netflix's interface so locked down? Why would it be bad for me to turn off autoplay for trailers? Why can't I say "never show me this show again, I'm never going to watch it"? And why, in this age of AI, are they still putting white text over a white background? That last one, especially, made me ask: "do people at Netflix eat their own dogfood?"


>On a related note, why is Netflix's interface so locked down?

Partly this is the fault of Apple. They championed the idea that UI is an artistic expression and that 'configurability' and 'customizability' are therefore akin to heresy. This view has now infected many existing products.

Part of the reason why many companies jumped on this bandwagon is also because 'customizability' is hard(er) to build in, and certainly more expensive to maintain.


> Part of the reason why many companies jumped on this bandwagon is also because 'customizability' is hard(er) to build in, and certainly more expensive to maintain.

It's also harder to A/B test with so many variables. If tests aren't statistically significant, the value of user analytics and UX experimentation decreases from a "% lift" perspective. It's harder to know whether a feature change or a user-defined config had a causal relationship to some other metric.

It may be the A-B testing tail wagging the dog.
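
To make "statistically significant" concrete: a bare-bones two-proportion z-test is the kind of per-variant check an experimentation pipeline runs, and every extra user-configurable setting multiplies the segments and shrinks each arm's sample size. A minimal sketch with illustrative numbers:

    import math

    def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
        """Two-sided z-test for a difference in conversion/retention rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
        return z, p_value

    # Illustrative: 10,000 users per arm, retention 61.0% (A) vs 62.5% (B)
    z, p = two_proportion_z_test(6100, 10_000, 6250, 10_000)
    print(f"z = {z:.2f}, p = {p:.3f}")  # roughly z = -2.18, p = 0.029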


That's a good insight. A bit like unit tests, A-B tests drive the design of the website to make them testable. Not sure that is ultimately the best choice for UX, but here we are. I suppose it makes it a lot easier to justify one's perf rating.


Maybe on the perf rating, but on a bigger scale it could make it way more complex for analytics departments to function. Complexity adds real costs when that department exists to increase revenue and retention, and iterate quickly.

Those types of tests serve two very different purposes. UI is also unit testable.

Unit tests are more of a binary pass/fail. A-B tests are looking for cause-and-effect relationships by comparing some metric between a control group and a variant group.


Neither of those is likely the reason. This is about A/B testing your way to success, growth and engagement. Allowing too much disabling of engagement driving features or too much customizations hurts metrics. It's the same reason FB forces you from the date sorted timeline back to "magic" sorting every few days.


Ironically, Apple customization has become really good. Not in terms of tweaking the look, but the functionality tweaks like mentioned above? iPhone is super customizable.


Can I put all the app icons in the bottom right corner where I can reach them with one hand yet? Or does it still force them to float at the top?


What about “make this arbitrary wav file a ringtone”. No?


It has to be an AAC file with the magic extension .m4r, but yep, custom ringtones are possible, have been for years.


First google result:

"How to turn autoplay previews on or off"

https://help.netflix.com/en/node/2102

On mobile:

1. From the Netflix app home screen, tap the profile icon or More

2. Tap Manage Profiles

3. ...


I have that turned off, but they still play.

I contacted Netflix support, and they basically said, "Sorry. That doesn't necessarily actually stop them. We can't share anything about whether we have plans to change that."

It has been like this for years. I don't know if it's a glitch with my account or if they are doing it on purpose. Neither possibility makes me a happy customer.


Call customer service, tell them you'd like to be removed from all A/B tests, because it's messing up your experience. They should still have a button to do that.


Most people are in the "On all other devices" category. Having to pull out a computer for what should be an in-app setting is ridiculous when you can sign up on some of those devices directly.


Are there third party frontends? Why can't you just pay them, authenticate, and use whatever frontend or player you want?

I suspect that the answer is ads. The Netflix app is an ad billboard on your phone and TV. Also, copyright and DRM.

Edit: I wonder if draconian copyright laws are ultimately to blame here. Nobody is allowed to provide such a service. This is getting a bit off topic though since this article is specifically about content aesthetics.


I feel the same way about Spotify. I'm fine paying an honest price for a useful service, but the upsells and dark patterns have gotten so aggressive I can't stand the client any more. I wish they'd at least allow third party clients to access basic functionality... but decisionmakers don't understand the attraction of a third party client in the first place, let alone feel like unlocking that functionality when it could negatively impact engagement with their dark patterns.


I don’t think I’ve ever seen an upsell in Spotify, to what does this refer? Are you not paying for Spotify?


Aggressive podcast placement is the big problem, but they also allow artists to "promote" their music by accepting lower royalty payments. And those artists show up much more often than other artists in radios and suggested playlists.

As someone who doesn't like the podcast walled garden Spotify is trying to build (I prefer my podcast market with competitive open standards and choice of clients, thank you very much), the podcasts really bothered me. As a paying user, I should be able to turn them off entirely, but they kept overhauling the UI to make them more and more prominent. I stopped using Spotify a year ago today, and I've been very happy managing my own music library with Jellyfin since then.


Because third-party front ends will show competitors' shows next to Netflix shows, and none of them want that.

That said, we are sort of there. Google can tell you where a show is streaming, and I think Apple TV displays shows from different streaming services. Just not to the point where anyone who wants to can build one.


I’d probably call it promoted content, but yes, that seems a likely explanation.

And quite possible the ads-driven version, but I’ve never looked at that.


It's an ad though. They aren't going to advertise a new dishwasher but they are taking money from somebody to promote something.


But while there’s zero chance I’d willingly watch an ad without good reason, I might actually enjoy that promoted content because it just so happens to be of interest to me. That’s the difference between the two for me.


> Why can't I say "never show me this show again, I'm never going to watch it"?

Maybe someone is paying Netflix to suggest this movie (edit:content) to you ?


I've never heard that this is the case, and if there were a secret deal like that, I think it would have been leaked years ago. It would also be difficult for this to be happening without it being public knowledge, because the accounting details of large companies are often inspected by news organizations and others on the lookout for something to complain about.

But maybe it can be that it is public knowledge and I'm one of the ten thousand who doesn't know? If so, do you have a link?


> But maybe it can be that it is public knowledge and I'm one of the ten thousand who doesn't know? If so, do you have a link?

Nope, it was just a thought. Sorry if I came off as implying shady things. I agree with what you wrote, and it would be surprising if something like that had been going on for years without anyone noticing (or revealing it).

I suppose it'd be less expensive to advertise a show outside of Netflix UI.


My interpretation of "Maybe someone is paying Netflix to suggest this movie to you?" is: the answer is you. We like to think we have control, but we tend to prefer not having control and having good-enough decisions made for us.


> … accounting details of large companies are something that is often inspected by news organizations and others …

Which is why Apple withholds fabulous amounts of detail in their financial disclosures.


Sort of? Netflix paid an upfront cost for its own content, but a recurring fee for third party content. So they try to push users to consume their content, which they'll own forever, as a bargaining chip and savings potential when the time comes to renegotiate those recurring fees.

So nobody's paying them, but Netflix effectively pays more (in the long run) to show you third party content and would prefer you watch content they produce for a one-time upfront cost.


This is actually backwards. First-party content costs Netflix more. They license everything the same way -- paying someone else for the content. For example, they didn't make Squid Game or Wednesday; they just got an exclusive deal for a period of time. This actually costs a lot more than licensing already existing content, because they have to pay extra for the exclusive rights and they have to pay enough to cover the cost of making it.

They push you towards exclusive content because it's what differentiates them from the competition.


Good point. Is it correct to say that on a long enough timescale, self-produced content is cheaper than licensed content? Or is the production cost so high that that basically doesn't matter?

Makes a lot of sense to push the content that folks can ONLY get on your platform, since that's a reason for customers to stick around. It's too bad they overdo it to the point where I actively dismiss (and get annoyed by constant promotions of) Netflix shows unless a personal friend indicates one is actually worth watching.


> Is it correct to say that on a long enough timescale, self-produced content is cheaper than licensed content? Or is the production cost so high that that basically doesn't matter?

Depends on the production, but it definitely isn't a rule or a trend. For example, look at this page with the cost to make each episode:

https://movieweb.com/best-sitcoms-of-the-2000s-ranked/

Many of those are Netflix originals. What it means to be an original is that it was on Netflix first and it's the only place you can find it at first. And back in the day, that only applied to the USA (now Netflix does worldwide rights, but House of Cards for example premiered on other networks in Europe at the same time it was released on Netflix).

But once that contract expires, anyone can license it for the next go around. Generally Netflix does a long license and locks up IP rights to prevent that, but that just makes them have to pay even more for the content.

If you look at the cost per hour streamed, the originals definitely aren't the best -- the winners are things like a show from the 1960s that they got super cheap and that ends up being super popular. But the originals are the prestige items, so they're willing to be less efficient there for the unquantifiable brand boost.
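
A quick back-of-the-envelope with entirely made-up numbers shows why cost per hour streamed tends to favour cheap catalogue titles over prestige originals:

    # Hypothetical titles and figures, only to illustrate the metric described above.
    titles = {
        "prestige original": {"cost": 100_000_000, "hours_streamed": 80_000_000},
        "cheap 1960s licence": {"cost": 2_000_000, "hours_streamed": 40_000_000},
    }

    for name, t in titles.items():
        print(f"{name}: ${t['cost'] / t['hours_streamed']:.2f} per hour streamed")
    # prestige original: $1.25 per hour streamed
    # cheap 1960s licence: $0.05 per hour streamed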


I can't stand the fact that pausing Netflix to go fix up a plate of dinner inevitably leads to an obnoxious adroll for shows I've already seen or don't want to watch. And when I try to resume my show, I end up out of the video player, have to click the resume button, and I then deal with 20 seconds of garbage quality video as the bitrate settles and buffers.


I think there are three reasons:

1. They've decided to have a single interface for every device and Netflix supports a lot of devices. So it basically can't have any features that require more than a d-pad input.

2. They want to obscure the fact that their catalogue is really quite small. That's why there are very limited manual filtering options and no advanced search. You'd very often get 0 results.

3. They're still in the "A/B testing can solve anything" and "we must optimise for engagement!" phase and haven't realised the problem with that. They probably A/B tested showing you stuff you'd already seen, found it increased engagement and said "ok it must be good".


How about Netflix stops showing me movies I’ve already watched at the expense of screen space for discovering new content? Just keep all those movies in the “Watch it again” row and out of everywhere else.


I think people in "the middle" watch and rewatch the same things. I don't do that for movies, but I am guilty of doing it for some shows, like The Office (still on Netflix in my region... for now).


If you ask me the true reason is that we've accepted DRM to exist in our software and on our devices.


On my laptop, I was able to disable autoplay for the trailers. IIRC it's in your profile settings.


Not just interface lockdown, but the insistence that They Know Better. Netflix, YouTube, pretty much any video service I've tried, actively prevents someone from excluding search results while simultaneously putting the offerings They Want You To See toward the top, almost without regard for the search prompt. I know that this would make the mile-wide, inch-deep problem for most catalogs readily visible to many people, but not being able to say, "DON'T GIVE ME THE RESULTS I DON'T WANT", does not endear a site to this user and many others.


"thumbs down" a movie to say "I'm never going to watch it", you won't see it again (great for getting rid of dumb movies in the big promotional banner on the home screen as well)


> On a related note, why is Netflix's interface so locked down?

A/B testing. The product org has always been driven by A/B testing, which means it sometimes finds a local maximum which it then sticks to. They've also been driven by keeping things simple, both for the user and for the developers. Having a single interface means the user gets a consistent experience (ironically broken by having so many A/B tests) and also means that testing on the backend is easier, because you don't have a geometric explosion of combinations of settings to test.

The fewer settings there are the fewer things to test when you want to change something.

> Why would it be bad for me to turn off autoplay for trailers?

Because A/B testing has shown that overall it's better for customer retention. They're willing to give up some users in exchange for the gains they get from having it on.

> Why can't I say "never show me this show again, I'm never going to watch it"?

Because people lie to themselves. A lot of people will say, "Schindler's List is an amazing movie!" and keep it on their watch list, yet never watch it again, and then will say "Jackass 3 is the dumbest shit I've ever seen", never put it on their list, but watch it 15 times. People's intentions and actions don't always match. Netflix used to have a way to remove stuff from being shown, and then got calls to customer service from people who had used that button asking how to get the movie back, because they actually wanted to watch it.

> And why, in this age of AI, and they still putting white text over a white background?

While you could fix this with AI, it would take a heck of a lot of computer power for a relatively small fix for few people (not a lot of people use captions). You'd have to review every film with every set of captions. That's a lot of hours of AI review.

> That last one, especially, made me say "do people at Netflix eat their own dogfood"?

When I worked there, most everyone watched Netflix every day. In fact, the only thing we didn't dogfood was the billing system because we all had free Netflix, and then when it broke, they gave everyone an $8/mo raise and told us all to sign up for paid accounts so that we could test the billing system with different billing days and different payment methods.

But very few that I knew of who used captioning. Most people don't like it and find it distracting. So when there were captioning errors they were usually caught by customers.


"Growth & engagement".


You can disable autoplay when logged in the browser, the app will respect that setting.


They're preparing to pivot to telescreen development.


[flagged]


What do you think is bad about it, and why?


Don’t watch Netflix movies. Watch actual lower budget/indie movies like Red Rocket that still look and feel like movies..


I think this is true for the people writing articles about "The Visual Blandness of Netflix" - which makes me wonder who they're actually writing for, besides the echo chamber. A24 is great for this crowd. But they're already aware of it...

The majority of people watch TV and movies to be mindlessly entertained, not to be enlightened by some high form of art. Netflix and Disney+ are just fine for this crowd.

You're not going to write an article and convince anyone their movie watching habits are wrong.


Once you're aware of something, it's very hard to make yourself unaware of it. Once someone pointed out orange-and-teal grading to me, I couldn't help but see it.

It's like UX - once you understand it a bit, you see things everywhere that can be improved. You can't help it.


Good point. I actually don’t have a huge problem with the “Netflix aesthetic”. I just try to mix in a healthy dose of older movies + newer indie movies to balance it out (think along the lines of Kelly Reichardt / Riley Stearns / Sean Baker / Lynn Shelton (RIP))


If the majority of people are just fine with watching Netflix and Disney+ how do you explain their steep drop in stocks?


Netflix added more than 7 million new subscribers in the last 3 months, bringing the number of households to over 230 million worldwide. And that's just people who are paying. Many people share their accounts so we can assume another few million people who regularly watch but don't pay.

https://variety.com/2023/tv/news/netflix-subscribers-earning...


But if you look at the chart over two years, you'll see that even if it's building back up, it's still a steep drop compared to their high point circa 2021.


Short term knee jerk reaction. Most people in the world with broadband connections still don’t have streaming TV, but everyone wants to watch shows and movies. Netflix is best positioned, focused, and has a stellar leadership team.

Let’s compare notes on stock price in 5 years.


That's because of the change in interest rates and the dollar, economy-wide. If you think a stock is doing poorly, you have to show its alpha vs the market.

Also, Disney is more than a TV channel.


Disney being more than a TV channel is obvious, but if you bring that up, you also have to square it with the theme parks having been bleeding money.

On the first point: then why are analysts still bearish on Netflix and giving it a "no-invest"?


Interest rates


This is probably addressed to me. There are a few problems. I don't know which "actual" movies to watch. Netflix has a list for me. It's news to me, but I don't know what movies look or feel like. I guess I'm not very discerning, but the Netflix look doesn't bother me, if I can even notice it, which I wouldn't bet on.


I’d recommend browsing new releases on Vudu. Sure, you’ll get some Netflix-aesthetic movies mixed in, but you’ll also get plenty of indie movies and others that don’t look like that.


> Don’t watch Netflix

I know a few people who are very proud of their DVD collections, and they have rare stuff you simply won't find on Netflix. Many of them make rips of the DVDs so they have a backup in case a DVD rots and becomes unusable (an actual thing that can happen).


Note that TFA is not arguing that Netflix movies are bad as movies, it's just making an observation about how they look (which the author dislikes).



