Pain of the New: The Hobbit at 48 fps (kk.org)
335 points by mbrubeck on Jan 9, 2013 | 205 comments



>Knoll asked me, "You probably only noticed the odd lighting in the interior scenes, not in the outdoors scenes, right?" And once he asked it this way, I realize he was right. The scenes in the HFR version that seemed odd were all inside. The landscape scenes were stunning in a good way. "That's because they didn't have to light the outside; the real lighting is all that was needed, so nothing seemed amiss."

I would love to see cinematographers experiment with natural lighting for shots. Kubrick (in)famously did this in Barry Lyndon[1] 37 years ago. To do this he used f/0.7 Carl Zeiss lenses that were designed for NASA.

Nowadays we are far less limited due to the incredible light-sensitivity of modern image sensors. And if that doesn't work we can go for larger sensors, which would still be far cheaper than doing the same with film.

I have a feeling Kubrick would be having a field day with current tech and would have been one of the first supporters of 48Hz.

[1] https://en.wikipedia.org/wiki/Barry_Lyndon#Cinematography


>Starting with The New World, Malick has instituted “rules” in his filmmaking, including using only natural light, no cranes, no big rigs, and handheld cameras only [1].

You might want to check out The New World if natural lighting interests you. Malick is known as the "director's director" and has done a lot of very interesting work. He loves trying to capture "unrepeatable moments": in The New World, for example, he used a clip of an in-costume Christian Bale taking a break to smoke a corn cob pipe, when Bale didn't even realize he was on camera.

[1] http://stillsearching.wordpress.com/2011/05/21/39-facts-abou...


"Don’t allow yourself to fall in love with the camera. Keep it in its place and keep your eye open for those little moments you didn’t plan for. It’s my definition of what it is to be a movie director: A man who presides over accidents." -Orson Welles


Similarly, Werner Herzog said:

"Coincidences always happen if you keep your mind open, while storyboards remain the instruments of cowards who do not trust in their own imagination and who are slaves of a matrix... If you get used to planning your shots based solely on aesthetics, you are never that far from kitsch."


Great quote, haven't heard this before, thanks.


There are also the films of the Dogme 95 group (e.g., Lars von Trier): http://en.wikipedia.org/wiki/Dogme_95

Their notable rules include avoiding music that doesn't exist in the environment, shooting only handheld, and using only the barest of lighting.


I've seen many poor Dogme95 films, but I can heartily recommend Thomas Vinterberg's Festen as both a prime example of the experiment (he was one of the co-institutors of Dogme) and a great film on its own merits.


Do you have a source for the story about the corn cob pipe? I'd love to see that scene, but can't seem to find any mention of it online.


As I recall it was discussed in the DVD extras on The New World. I remember that scene in the movie, but it's been years since I've watched it.


Kubrick would have loved the high sensitivity of current sensors, which makes it practical to shoot without artificial light*, though it has to be said that film stocks have improved a great deal since Barry Lyndon was shot. But I'm much less certain about whether he would have embraced 48 FPS (not Hz, because we're not modulating a coherent beam). He had such technology fully available to him as far back as the '60s and chose not to employ it.

* Lighting is not just about making sure there's enough light. Some scenes look great just as they are. Most...don't. Your brain filters out a lot of what you don't care about in the real world, but when you see it on screen it sticks out like a sore thumb. Lighting is more of an artistic than a functional thing in most cases, and is used to draw the viewer's attention towards or away from what the director wishes.


The unit Hz is used for frequency, by definition "per second". If "frames" are implied, surely 48 Hz is acceptable?


It doesn't bother me, but it's not the standard nomenclature in the film industry.


> Kubrick (in)famously did this in Barry Lyndon[1] 37 years ago. To do this he used f/0.7 Carl Zeiss lenses

And about 250 candles (not exaggerating). There is nothing "natural" about that. Normal humans use maybe 10 candles in a room that size; the camera needed way more.

I never understood what made candles so much better than electric lights with a color filter for that scene. The number of candles made it completely unnatural.


This reminds me of the Dogme 95 movement from Danish directors[1]. It's a set of 10 filmmaking rules among which is:

> The film must be in colour. Special lighting is not acceptable (if there is too little light for exposure the scene must be cut or a single lamp be attached to the camera).

[1] : http://en.wikipedia.org/wiki/Dogme_95


> I would love to see cinematographers experiment with natural lighting for shots.

What are you talking about? Many films are made with natural light. Kubrick wasn't alone.


I think it is interior shots filmed using only candlelight, as described here: https://en.wikipedia.org/wiki/Barry_Lyndon#Cinematography

This was/is unusual.


indoors.


Yes. Some Dogme95 films. Pedro Costa's Colossal Youth. Amadeus. I'm sure I'm missing many others...


I believe there are some, but it's still unusual.


Here's a great piece on that lens[1]. Also, it's on view at LACMA through June 30[2].

[1] http://www.visual-memory.co.uk/sk/ac/len/page1.htm

[2] http://www.lacma.org/art/exhibition/stanley-kubrick


I hear writers moan and wring their hands about the high-framerate version of The Hobbit, how it doesn't look "cinematic" or what have you — but anecdotally, nobody I know had the kind of complaints you hear from film insiders. Nobody I know complains about it looking like a PBS show; nobody feels like they can see the actors' makeup. Those who saw both versions unanimously liked the HFR version better (a common reason being that it's "prettier").

They did agree that the first 10 minutes were painful, but I think that's a combination of the fact that it's not what you were expecting and the fact that the first 10 minutes of the movie were awkwardly shot and acted ("DRAAAAAGOOOON!"), which the blurring effect of the lower framerate helps to disguise.

Sure, I believe that some people probably didn't like it as much. Different strokes for different folks and all that. But I can't help but feel like there is something of an "old guard" effect at work here, where people fetishize the incidental details of something they're heavily involved in, and those people are responsible for a lot of the noise.


My wife and I saw the HFR version and we were both put off by it. We're not exactly movie snobs or hard to please either...we've been to a theatre like 5 times since we had our first child 10 years ago.

I was completely unimpressed. It was painfully obvious that we were on a set instead of in some fantasy world. If this really is the future, then I hope that Knoll is correct that it will be solved by improved sets, props and lighting.


It must be solved by improved sets, props, lighting, makeup... HFR is but one aspect of technology's ability to destroy the illusion of film. Improved sensor resolution affects this too. As time goes on and Things Get Better, it's easier to notice the flaws beneath the makeup used for cover-up. That is, until techniques catch up.


Just to counter all the "I didn't like HFR" people: I saw both HFR 3D and non-HFR 3D, and I liked the HFR version better. I read some blogs that hated HFR before I went and so was going to skip it, but other friends said they had no problems with it, so I figured it would be good to experience it.

There were a couple of places it was distracting which I assume was because it's "different". On the other hand there were places in the non-HFR that were distracting because they were shot for HFR and stuttered horribly.


I absolutely loved the HFR version. It felt weird for the first scene but after that it was a subtle addition similar to how 3D wasn't gimmicky for Avatar.


Disclosure: I'm a cinematographer. Prior to seeing the film, I was excited because I had read interviews with PJ about how HFR makes 3D so much better and cleaner. Looking back now, I don't know why I was fooled into thinking that there is something wrong with 3D. The supposed benefit of HFR is that it reduces headaches for people who get them from 3D. I don't get these headaches so there was nothing wrong with 3D for me before. Is it worth shooting all 3D films in HFR for the percentage of people who get headaches from 3D?

Regarding what you said about the "old guard" effect: It would be interesting to see some polls of audiences who have seen both versions. Unlike you, my non-film friends didn't have polarized reactions, they just thought it seemed different. The question I'm getting to is: Will the public fall in love with HFR enough for an industry shift in the way films are shot, despite the fact that many of the people in the film industry feel that the look of HFR cheapens the story?

All that aside, it's pretty interesting that we've gotten to the point where when single big films like "Avatar" and "The Hobbit" come out, it spurs conversations about huge industry shifts.


That's really interesting. Have you gone to watch both versions? I did, just to make the comparison for myself. I don't usually get headaches from 3D, but I definitely found the HFR 3D to be much more attractive than the 24 FPS version. I mean, no, it's not as mind-blowing as Peter Jackson might lead one to imagine, but I found (for example) a lot of the "geography porn" shots look nicer in the HFR version.

> Regarding what you said about the "old guard" effect: It would be interesting to see some polls of audiences who have seen both versions. Unlike you, my non-film friends didn't have polarized reactions, they just thought it seemed different.

I may not have expressed that well if that's the impression you got. They didn't all love it. Some loved it, some expressed mild approval along the lines of, "Huh, it was weird, but I think I like it." I was just saying that I didn't get a single "Ugh, I saw makeup!" or anything along those lines.


> Will the public fall in love with HFR enough for an industry shift in the way films are shot, despite the fact that many of the people in the film industry feel that the look of HFR cheapens the story?

I'm guessing they will. 24fps action scenes end up as a blurry numbing mess if there's too much motion. The 48fps action scenes were exciting and easy to follow. The landscape shots were also improved.


So I'm not a cinematographer, but I find watching 3D at 24 FPS pretty annoying whenever the camera moves too quickly, because you get a painful strobing effect. I had noticed it since the first 3D movie I saw and it has bothered me ever since. The Hobbit in 48 FPS was markedly better here (not perfect, but an incredible improvement).


I dunno if it's a small group or not that gets headaches from 3D, but I definitely do. The HFR completely eliminated that, though. I hope they keep it up (as long as they keep trying to do 3D, that is) for purely selfish reasons :)


Almost everyone I know (even those who loved the movie) thought it looked a bit odd in HFR. None of them are "film insiders".


The only person I know who mentioned having seen the HFR version complained about it. There's obviously a selection bias toward extreme reactions there, though. I've not seen either version yet...


> Nobody I know complains about it looking like a PBS show

I went to see it with a friend before reading any articles on it. I didn't even know what HFR meant. And this is exactly what I thought about while watching it ... that it felt like a freaking TV show.

My friend didn't get the same feeling, but on the other hand he nearly fell asleep.

Looking back, it was a good movie; too bad I ended up seeing the HFR version.


Humans like flaws.

When CDs first came out, people argued that they sounded "cold," even though they're near-perfect recreations of the music that was recorded. People like the hiss and compression of records and tapes.

This is also the same reason why people like Instagram filters. Normal iPhone pics are too good. Let's fuzz em up a bit.

Also, look at v1 of Facebook, Twitter, YouTube, and other popular sites - they were far from pretty.

There's a lesson here. Somewhere.


CDs sounded cold because converters were bad and engineers didn't have much experience mastering for it, not because they sounded too perfect.

Early CDs still sound bad, and I would take the respective vinyl over the CD any day still. Modern CDs sound amazing, if we leave the loudness wars aside for a moment.

I suppose there are two reasons why people use Instagram filters. Firstly, because most phone cameras aren't very good and the filters distract from the bad picture quality. Secondly, because it's in fashion, and people just do it because the cool people do.

I don't even understand where you were trying to go with the Facebook/Twitter/Youtube thing. It's a trivial statement and provides no illumination.

24 frames per second with a 180° shutter will always be more engaging than 48 frames per second with a 240° shutter or whatever they use, unless they manage to make the last one look more like the first one and add to it in some way. Like with modern CDs and vinyl.
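The shutter-angle comparison above is simple arithmetic: per-frame exposure time is (shutter angle / 360) / fps, so a higher frame rate can actually mean less motion blur per frame. A minimal sketch (the 270° figure for HFR is an assumption for illustration, echoing the "or whatever they use" above; `exposure_seconds` is just an illustrative helper):

```python
# Per-frame exposure time from shutter angle and frame rate:
#   exposure = (shutter_angle / 360) / fps
# Longer exposure per frame means more motion blur smeared into each frame.

def exposure_seconds(shutter_angle_deg: float, fps: float) -> float:
    return (shutter_angle_deg / 360.0) / fps

film_24 = exposure_seconds(180, 24)   # classic film look: 1/48 s of blur per frame
hfr_48 = exposure_seconds(270, 48)    # wider shutter at 48 fps: 1/64 s, so less blur

print(film_24, hfr_48)
```

So even with a wider shutter, each 48 fps frame carries less blur than a 180°/24 fps frame, which is part of why HFR footage looks so much crisper (or, to its detractors, so un-filmlike).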


It wasn't so much that engineers didn't know how to master, but that the analog-to-digital sampling was done at the CD sample rate (44.1 kHz). That meant that the analogue signal needed to be brick-wall filtered to remove any information above 20 kHz, and analogue brick-wall (hi-Q) filtering does really, really bad things to time and phase. Since the late '80s, the sampling rate on conversion has been much higher and the removal of high-frequency information (potential aliasing) is done in the digital domain. (And it really didn't help that early players did no dithering on playback either. Early CDs still sound horrible, but you really need to play them on early players to get the full dentistry-without-anaesthetics effect.)


Yes. That's the reason why I put converters first, engineers second. I shouldn't have said engineers though. What I really meant was the peripheral equipment they didn't have at hand to provide a proper transfer to the digital domain.

Steve Albini explains it succinctly here: http://www.electricalaudio.com/phpBB3/viewtopic.php?f=4&...


Yes.

If I'm not mistaken, when sampling at 44.1 kHz your analog filter has to go from 0 dB at 20 kHz to "nothing" (-20 dB or even less) at 22.05 kHz. Of course it doesn't (or at least, not nicely).

Now, if you record at 96 kHz your analog filter has to do that between 20 kHz and 48 kHz (and even that's not really needed most of the time, as there's almost nothing up there to begin with).

Downsampling digitally from 96 kHz to 44.1 kHz is easier, with much more control over the process.
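The numbers behind that comparison are easy to check. A rough sketch (assuming the usual ~20 kHz audibility limit as the passband edge; `transition_band_octaves` is an illustrative helper, not a standard API):

```python
import math

# The anti-alias filter must pass the audible band (up to ~20 kHz) and be
# fully attenuated by the Nyquist frequency (fs / 2). The gap between the
# two is all the room the filter's roll-off gets.

def transition_band_octaves(fs_hz: float, passband_hz: float = 20_000.0) -> float:
    """Width of the anti-alias filter's transition band, in octaves."""
    return math.log2((fs_hz / 2.0) / passband_hz)

# Sampling straight at CD rate: ~0.14 octaves (20 kHz -> 22.05 kHz),
# which forces a brutally steep analog filter.
cd = transition_band_octaves(44_100)

# Recording at 96 kHz: ~1.26 octaves (20 kHz -> 48 kHz), so a gentle
# analog slope suffices; the brick wall is then applied digitally
# before downsampling to 44.1 kHz.
hi_rate = transition_band_octaves(96_000)

print(round(cd, 3), round(hi_rate, 3))   # 0.141 1.263
```

Roughly nine times the transition bandwidth, which is why a cheap, phase-friendly analog filter works at 96 kHz where 44.1 kHz demands a brutal brick wall.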


This. I still have an old Sade CD from the early '80s, and the sound on that CD is very flat.


I suppose that's why all those "Remasters" came out in the late 90s. (http://www.amazon.com/The-Best-Sade/dp/B00005AWMF)


> Humans like flaws.

> When CDs first came out, people argued that they sounded "cold," even though they're near-perfect recreations of the music that was recorded. People like the hiss and compression of records and tapes.

> This is also the same reason why people like Instagram filters.

This sounds more like "hipsters like flaws." CDs absolutely ROFLstomped both vinyl and cassette very quickly once they became affordable for normal people. People had the option of buying CD or cassette in the mid-'90s, and they generally preferred CDs. MP3 did the same thing despite similar complaints.


I think you're pretty close on this point. I share most of the views from the post above you, but I'd like to expand on them a bit.

First of all, I don't think people like flaws. If you take someone who has never listened to a vinyl record and ask them to listen to one and compare it with a CD recording, I'd bet they would cite the CD as being the better experience.

What people like is what they are comfortable with, and not just because of habit: distinctive patterns, especially in highly emotionally charged areas such as movies and music, generate emotional connections and attachment. Those who grew up listening to vinyl associate its "fuzzy/scratchy" sound with the moments they spent listening to it. There's very potent emotional attachment to those flaws, and that's why people are attracted to them. When I hear the warbly sound and see the poor-quality visuals of old VHS tapes, I'm taken back to watching Disney movies as a child.

As explained in this article[0], these emotional attachments are the reason why people enjoy Instagram to the degree that they do. The washed-out, blurry, grainy photos generated by Instagram hit on our emotional connections to old photographs of our parents and of our childhoods.

Similarly, I think the same follows for 24 fps vs 48 fps movies; think of all the intense moments you've had while watching a movie in your lifetime. How many times have you laughed or cried or gasped at something happening in a movie shot in 24 fps? Given that any movie you've watched up till now has been shot at 24 fps, you've probably done it many times. Each time you feel those strong emotions, every aspect of that moment is associated with that feeling: the smell of the popcorn, the low lighting, the chair you were sitting in, the people with you. All of those details are now associated with that memory and that experience.

Now repeat that process of strong emotional association for every single movie or TV show you've ever seen. With this information, it's obvious why people are so attached to this small aspect of a movie: because there are strong emotions associated with the 24 fps frame rate.

[0] http://thesocietypages.org/cyborgology/2011/05/14/the-faux-v...


> People had the option of buying vinyl, CD or cassette in the mid-'90s, and they generally preferred CDs.

While I agree with you, you haven't proved that this is due to CDs' objectively better sound quality. CDs are also more convenient than either tapes or vinyl: vinyl is bulky and much easier than a CD to damage, and tapes are smaller but also easy to damage. Both wear out with use faster than digital media.

Case in point: MP3 also took off massively despite objectively worse sound quality due to lossy compression, because it was more convenient than CDs, and that convenience was boosted each time hard drive sizes and internet speeds increased.


> While agree with you, you haven't proved that this is due to a CD's objectively better sound quality.

I hadn't intended to, as that would be impossible. My point was more that "people preferred records and tapes for their flaws" is only clearly true of some early enthusiasts.

And it was not my experience that tapes were much easier to damage. Tapes were relatively durable unless you pulled the actual tape out of the cassette — and even then you usually got away with just minor signal degradation. CDs were extremely vulnerable to scratching, any scratch could render them pretty much unlistenable, and they were fairly brittle with a wide, thin surface area.


CDs are not all that vulnerable. If they're either in the case or the machine, they're fine. You can wipe dust off with a lot more vigour than with vinyl.

I've had tape machines that would pull the tape out and scrunch it up. Strands of tape everywhere is more than "minor signal degradation". CDs are not prone to repeated signal degradation or to turning into spaghetti.


Except in the case of MP3, it was a while before the encoders and bitrates became good enough that MP3 sounded as good as CD. In that period, people weighed the portability of MP3 against the sound quality and chose convenience.

Of course, these days it's pretty much impossible to tell which is the MP3 and which is the CD recording, but there was a long period of time when this was simply not the case.


Stopped reading at "ROFLstomped".


> Humans like flaws.

I don't think that's it. I think humans dislike change, and a lot of change (especially in technology) involves the removal of flaws. Almost every argument I've read that's critical of The Hobbit in HFR can be put in historical perspective by replacing "HFR" with "dialogue audio," "wide aspect ratio," or "color." You can find this sentiment anywhere you look.


I think this hits the issue. Humans grow "fond" of flaws/quirks they have to interact with for long periods of time.


I think that's the wrong way of looking at what happened, at least with CDs. When CDs first came out, the music industry had decades of experience mastering for LPs and cassettes, and no experience at all mastering for CDs. It took some time to figure out how to best use the format, and in the meantime thousands (millions?) of crappy-sounding CDs were released.

A record player naturally adds a certain amount of reverb to the music when it is playing -- that's the warmth. Recordings were engineered to take advantage of that. Then the same recordings were dumped to CD without compensating for the missing reverb. The CD would more accurately produce the sound on the master tapes, but those sounds were never intended to be heard that way. Frequently the digital recording levels were all messed up too.

From the article, it sounds like the exact same sort of thing is happening with 48fps. Right now the movie world is still learning how to use it. I kind of wish they'd practiced on films I don't want to see, but that's life.


> People like the hiss and compression of records and tapes.

I'd argue that they don't. People like what they know, and a different reproduction creates an annoying cognitive dissonance, like hearing a new interpretation of your favorite piece after having listened to the original for decades.

It's pretty much the source of the Uncanny Valley issue.


Did you know a lot of contemporary music is still recorded onto tape in the studio because engineers and musicians like the character it adds to the signal? Consumer tapes sound hissy partly because the tape is so narrow that they have a low signal:noise ratio. Recording onto larger-size tape reels eliminates that issue, but the distortion/compression from deliberate overdriving of the inputs still sounds pleasing.


Not only that, but modern digital recordings often digitally add a track of tape hiss to warm up the sound. I don't mean a distortion plugin or anything, I mean a recording of a 2" tape machine playing nothing but hiss!

That being said, I agree with masklinn, we do this because people are used to hearing it, not because hiss is intrinsically good sounding.

Will the children of the early 2000s grow to yearn for the sound of 128 kbps MP3s?


At least one study has shown that younger listeners prefer the 'sizzle' sound of MP3s over higher quality recordings, presumably because it's what they're used to.

http://radar.oreilly.com/2009/03/the-sizzling-sound-of-music...


"There's a lesson here. Somewhere."

The lesson is: just summarize the main points of the article (even reusing the article's own analogies) to get the top post on HN.


The most recent posts appear at the top and rapidly decay downwards. So unless it was from three hours ago, it isn't likely to be the top post for long.


CDs sounded terrible when they first came out because hardly anyone knew how to master for digital, and the recordings/mixes were demonstrably inferior. Tape is a lot more forgiving: when you overdrive it, it sounds quite nice because it saturates gradually. Overdriving a digital recorder results in hard clipping, which sounds hideous.


>There's a lesson here. Somewhere.

"Did you know that the first Matrix was designed to be a perfect human world? Where none suffered, where everyone would be happy. It was a disaster. No one would accept the program. Entire crops were lost. Some believed we lacked the programming language to describe your perfect world. But I believe that, as a species, human beings define their reality through suffering and misery. The perfect world was a dream that your primitive cerebrum kept trying to wake up from."


The CD parallel doesn't work entirely: CDs prevailed because they were better quality than tapes and considerably more convenient than vinyl. I'm not sure that 48 fps has the same argument to put forward.


I might be wrong about this, but I imagine that even early on, it was also much cheaper to make CDs than cassettes.


Quasi-pop psychology: The human brain is a remarkable machine. It is fantastic at extracting data from any stream of data it can find. It is aided in this by the neural ability to accommodate and model constant elements of the stream. In the simplest form, you can see this effect with the simple persistence of vision phenomenon [1], but the effect can also be applied to fantastically complicated phenomena by our overpowered brains; for instance, can you see your blind spots? (And that's still comparatively simple.) The brain builds a neural net that adapts to the deficiencies of the medium (whatever it may be, movie, audio, etc), streams the data "through" this net, and undoes the medium-induced damage until we are no longer seeing a stuttering 24-fps image, but instead we are seeing John's desperate and heart-tugging last-ditch effort to win the affections of Marsha before she finally ties the knot with another guy. And I mean that literally, not merely poetically: the brain is processing the image all the way to its semantic content, which is in some sense the "real" movie.

But the brain, for all its truly amazing capabilities, still has some trivial limitations that any computer would laugh at, and one of them is that the choice of model is not consciously driven, and it can not be changed on a dime. So you go to a 48-fps movie for the first time, and now your brain, sitting in a theatre, smelling the popcorn, watching the corny premovie trivia stuff, flicks in its standard "movie filter", but, alas, oh no, now a 24-fps filter is being fed a 48-fps signal! Now the amplifiers are overamplifying and the net is miscalibrated and what works its way up to the semantic part of the brain is not about John and Marsha but about "Hey! This signal is wrong! The inputs are out of spec! Hey! Hey! The output isn't coming out right! Hey! Hey!" In the audio case, "Hey, after running the automatic sound unwarmers the sound doesn't sound right! Hey!" You "need" the warm distortion of analog audio, you expect it, because you've basically got a hard-coded compensator that won't shut off right away undoing distortion that is no longer present, which is itself a distortion.

The semantic part of the brain, which is still not consciously driven, does not truly understand why it is getting this signal, because it's not like it did anything to create the original net or anything, so it just reports that it is unhappy, and starts casting around for the most likely reason that may be. Not being sufficiently introspective, it decides that rather than its own processing algorithms being miscalibrated for the incoming signal, it decides that the fault must lie in the signal, for which the conscious mind readily provides the seemingly-salient detail that "this is in 48fps", and, bam, explanation achieved. Man Was Not Meant To Know Movies In 48 Frames Per Second. Some people seem to act like this is some sort of Obvious Truth without asking how on Earth such a... specific law could have come to be encoded in our genes, or, yea verily, the laws of the universe itself or something.

Of course, as everyone grows a new neural net to handle the filtering of this new input into semantic content, the lower layers will stop screaming "Hey! Hey!", and the brain, being the magnificent seeker of Information that it is, when it has fully formed 24fps and 48fps nets available to it, will notice that 48fps has a much higher information content, and that it likes it better when given the choice, and indeed may very well decide that 24fps totally sucks.

You will someday stop noticing 48fps. You will someday specifically notice 24 fps and say, wow, how did I ever watch that.

But not today.

If you don't spend much time with 3D, you can observe this effect in yourself usually within the span of one 3D movie. At the beginning, the 3D jumps out at you and you can't help but catch yourself repeatedly "noticing" it. But you'll adjust by the end and not really "notice" it per se; it has faded to the background and you are once again only experiencing the semantic content of the movie. Your neural nets have adjusted. (They're very good at it.) And once you observe the "feeling" of an "unhappy" neural model, you can start noticing it almost any time you're doing something new, new language, new sport, new hobby, almost anything, really.

[1]: http://en.wikipedia.org/wiki/Persistence_of_vision


> You will someday stop noticing 48fps.

It took me about half an hour to stop noticing the oddness of it.

> You will someday specifically notice 24 fps and say, wow, how did I ever watch that.

I watched The Hobbit in 24fps a week later and even little initial things like Bilbo sitting on his porch and waving his hands at Gandalf were noticeably flickery and inferior. But after half an hour or so I got used to it ;)

In 24fps, it was noticeable how much of the screen was completely dark or so close to it in indoor and night scenes that it yielded nothing of interest. And the opposite in 48fps: Everything, everywhere had well-lit detail, even when it moved. It was occasionally too much.

I totally agree with the article that the film people haven't yet worked out how to best light indoor 48fps scenes. Too much clarity at 48fps, and not enough at 24fps. I also totally agree with the article's basic conclusion: this doesn't mean that 48fps is worse, and indeed it isn't. We just don't know how best to make it yet ... or how to see it.


Hmm, no. You don't necessarily want endlessly increasing information content because that's more work for your brain to filter. As I pointed out below, 48 FPS has been well within our technical capabilities for a long time - far longer than the current polarized-3d technology, for example. Yet there isn't a body of 48 FPS films out there, in contrast to 3d, super-widescreen, movies with vibrating chairs, 'smell-o-vision' and various other technical innovations.

TV frame rates (at least in the US) are ~25% faster than film, and have been for years. Standard-definition video doesn't have the resolution of film, but it's still quite viewable on a medium-size projection screen. HD is ~4x better and can be projected on large screens in excellent quality. So why hasn't higher frame rate video become popular for film (by which I mean movies, not celluloid)? There's no problem in projecting it (most commercial theaters have an HD projector next to the main one); there's no magic about projecting at 48 FPS, and indeed many affordable pro and semi-pro HD cameras can shoot at 60 FPS progressive.

You might as well argue that dance music should approach ever-higher BPMs because increasingly fast tracks incorporate more musical information. Now I do (sometimes) like very fast dance music of 200+ BPM but that's so high above normal human heart rates that it has an extremely limited audience because most people just don't want beats coming at them that fast.


Information theory is a tricky and subtle thing. A dance song at 300BPM may not actually contain any more "information" than one at 150BPM.

And the idea that the brain will get too tired processing that much data explains too much... you just explained why reality is too real for the brain to handle.

Also, I'm not claiming More Is Always Better, but in this case we've got a pretty decent bound being put in place by the bandwidth of the eyes and ears themselves, both impressive, but both quite a bit less than our brain convinces us that they are.


It may not, but then again it may. When I work on a fast tune I certainly don't feel that I can safely reduce the number of musical events therein and have it work well.

Reality is of course not too real for the brain to handle, but when we are being entertained we do not necessarily wish to work at it in the same way as we do when processing reality - we accept a somewhat limited sensorium in exchange for enhanced semantic complexity. Consider that in things like sports or stage performances (two areas where high-framerate video has proved especially popular) the audience is dealing with a narrowly-tailored field of interest.

You're ignoring the elephant in the room: 48 FPS and a variety of other frame rates have been affordably available for a long time on both acquisition and playback hardware, both analog and digital. Why hasn't it caught on?


Nah. That's a bad analogy. The film equivalent to BPM is composition (scene switch, montage) speed. Much better to think of 48 fps as going from ~ 40 kHz to 48 kHz sampling rate: instead of fitting more things into a movie, what you film has higher temporal resolution. Net gain, easy to downgrade with a low pass filter... People can even notice the change, which is more than you can say for 48 kHz MP3 to 96 kHz FLAC, for instance.


> more work for your brain to filter

No matter how much resolution and FPS they put into movie screens, your brain still won't have to process more than it does when looking at the real world.

The analogy with BPMs makes no sense. Beats are perceived individually -- they're individually semantic. Exceptions like Moby's 1000 mentioned below aren't exceptions -- just because there are 1000 kick drums per minute doesn't mean the perceived BPM is 1000.


> most people just don't want beats coming at them that fast.

I'm sure people used to say this about Rock & Roll, or Jazz. The revolution in speed is probably just a decade away, judging by past acceptance.

I want increasing information content ad nauseam, because what we have now is simply a pale shadow of real experience. When we actually achieve comparable bandwidth with real life, we'll talk. The day the holodeck is seamlessly integrated into our lives is the day we can discuss limits, but by then it may not matter.


But oddly, if you have listened to enough techno, you may find that there is often more informational content in a 120bpm track than a 220bpm track. If a 4-bar phrase is shorter, you can put fewer notes, fewer sounds, fewer subtleties of any kind in it. With 220bpm gabba, there's no room for much besides the machine-gun beats.


> The revolution in speed is probably just a decade away, judging by past acceptance.

I wouldn't be so sure of that. This is about 15 years old: http://www.youtube.com/watch?v=r0-BWdABmhY



It might only be coming into vogue now because strobing effects are far more noticeable in 3D than 2D and so the limitations of 24 FPS are becoming apparent as never before.


I completely disagree with you.

> Humans like flaws

Humans like perfection, humans like beauty.

> People like the hiss and compression of records and tapes.

People were used to the hiss and compression of records and tapes.

> This is also the same reason why people like Instagram filters

Instagram doesn't downgrade the quality of a picture; it adds effects and plays with the colors, like professionals do with Photoshop (though not at the same level). I don't think you understand what Instagram does to a picture.

> Also, look at v1 of Facebook, Twitter, YouTube, and other popular sites - they were far from pretty.

Look at the videos on youtube, before they brought HD. Could you go back to that?


I think it's the opposite - people don't like flaws. The higher accuracy of the CD and hi-res photos makes the flaws in data capture more obvious because there is less distraction. Adding hiss, pops, blur, or other distractions reduces our focus on the flaws.


My impression is that the popularity of Instagram filters is to cover up for the fact that 1) the image sensors for the iPhone (and other brands) are inferior to those found in DSLR cameras, 2) that the average user is an untrained photographer, 3) most photos are taken indoors with poor lighting, and 4) the average photo is boring.

Instagram filters quickly add interest to photos and mask poor lighting and noise.


There's a lesson here. Somewhere.

People don't like change but they'll probably get used to it.


Records don't hiss. And 16 bit audio really isn't "near-perfect" recreation.


No, but they usually play back tape hiss.


16 bit is good enough for human ears.


if by "flaws" you really mean "unique qualities" that give a medium its character.


Word.

One reason, however, that some CDs really did sound cold and clinical in the early days was that the transfers to CD were using EQ curves that toned down bass and heightened treble frequencies for pressing vinyl. It made the vinyl records more even, but it made the CDs sound weak and harsh. My brother has been spending time in his home studio remastering some of his 80's CDs, mainly with normalization and EQ changes, and they sound tons better.

Like the article says, HFR may require similar "mastering" to suit the newer format better, changing filmmaking techniques that no longer apply like they did in the 24fps era.


This captures it concisely. There is years and years of experience lighting things to make them look good in the theatre. There is also years of experience in lighting things to make them look good on video. Unfortunately the video lighting people and the cinema lighting people aren't the same people.

On a related note, I saw a clip of Patton running on a television that was doing active motion compensation filtering. It made the movie look like it was shot on video, and in that mode the various prop items stood out. In one scene the jeep in the background, and the background in general, is clearly painted on a flat surface. Without the motion compensation on that jeep looked like it could be a real jeep in the scene. Very weird effect.


That motion compensation that TVs do is the most egregious sin of all, and makes 48fps film pale in comparison.

It's the first thing I turn off.


It is god awful.

I have a sibling who seems to have, somehow, adjusted. Probably out of bias: he just dropped $1600 on that fancy 240Hz TV. This is just the new new thing. I HATE watching movies at his house. It ruins them.


> One reason however that some CDs really did sound cold and clinical in the early days was because the transfers to CD were using EQ curves that toned down bass and heightened treble frequencies for pressing vinyls. It made the vinyl records more even, but it made the CDs sound weak and harsh.

You seem to refer to the RIAA equalization curve.

http://en.wikipedia.org/wiki/RIAA_equalization

It is true that bass is reduced and treble is increased before recording a vinyl LP, but the opposite curve is then applied when the LP is played back. The overall frequency curve of the whole chain is flat (or as close to flat as technically possible).

The purpose of the whole thing was not to make "vinyl records more even". The actual reasons were:

1. Reduce noise (which is mostly located in the high frequency area, and is reduced at playback due to the complementary EQ curve)

2. Reduce the very high amplitude meandering motion of the needle that would be imposed by high-amplitude bass (thereby requiring a much wider space between neighboring grooves and reducing the total duration of the LP). Because bass is reduced while recording, the needle doesn't wander that much; at playback, bass is restored to the right level.

Moreover, the RIAA curve is extremely deep. It's + or - 20 dB at either end, and that's A Whole Lot. Any CD mastered with such a curve applied at recording (but without the opposite curve applied at playback) would sound downright awful. It's not a subtle effect; it's about the same as turning bass all the way down, and treble all the way up, on a classic EQ with two knobs. That's not "cold and clinical", that's broken.
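For the curious, the "+ or - 20 dB at either end" figure can be checked numerically. A minimal sketch, assuming the standard RIAA time constants (3180 µs, 318 µs, 75 µs) and normalizing the gain to 0 dB at 1 kHz; the function name is mine, not from any library:

```python
import math

# Standard RIAA time constants, in seconds:
# 3180 us (pole ~50 Hz), 318 us (zero ~500 Hz), 75 us (pole ~2122 Hz)
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_playback_db(f_hz, ref_hz=1000.0):
    """Playback (de-emphasis) gain in dB, relative to the gain at 1 kHz."""
    def mag(f):
        w = 2 * math.pi * f
        return abs(1 + 1j * w * T2) / (abs(1 + 1j * w * T1) * abs(1 + 1j * w * T3))
    return 20 * math.log10(mag(f_hz) / mag(ref_hz))

for f in (20, 1000, 20000):
    print(f"{f:>6} Hz: {riaa_playback_db(f):+6.1f} dB")
```

Running this gives roughly +19 dB at 20 Hz and -20 dB at 20 kHz, so a CD mastered with the recording curve but never de-emphasized really would sound like two tone knobs at their extremes.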

Perhaps some clueless sound monkeys indeed made such an egregious error back then, on a few CDs, but the backlash was no doubt very strong. When LP fans talk about the "cold" sound of CDs, this is not what they refer to. They are talking about the lack of a certain kind of added harmonic distortions that give vinyl its specific sound, that some people apparently prefer over a clear signal, for some strange reason.

Source: I'm old enough to have actually built preamps for LP players (and cassette recorders). Often my stuff was as good as high-end professional electronics, at a fraction of the cost. But it often looked like something a robotic gorilla threw up after an indigestion.

> HFR may require similar "mastering" to suit the newer format better, changing filmmaking techniques that no longer apply like they did in the 24fps era.

I totally agree.

They use 48 fps, but the shooting and lighting techniques are the same as in the 24 fps era. Of course there seems to be a disconnect.


Lighting is so extraordinarily important to film shooting. Visit a set (not in a studio) and see the rumbling generators, the giant lights everywhere, and the cranes, diffusers, and reflectors needed to get the light into the scene.


A good illustration of the importance of lighting (even though it is in a virtual world) is "How to Train Your Dragon". One of the big reasons that looked so damned good was the lighting. They brought in Roger Deakins, cinematographer for most of the Coen brothers' films and many other well-known films, as a visual consultant.

On Deakins' forum on his web site, a puzzled reader posted asking what a cinematographer does on an animated film. Here was part of Deakins' answer:

---------

My role on 'How to Train Your Dragon' was as a consultant and very much as part of a team. It is, for sure, a less personal involvement if only because of the numbers of people involved in the process. However, from an early stage I was involved in discussions about the 'look' of the film with the directors, designer and effects super. This involved all aspects of the production from the choice of lens lengths to our approach to 3D. Following that we created a template of images to reflect the arc of the film. Next we created a number of animated reference shots, which depicted the characters in various lighting situations such as a candlelit room, a foggy day in a forest and in moonlight. For the next 12 months I was working with the layout artists (who set the camera) and the lighters (who light the rendered images) to follow through on the whole process. During part of that 12 months I was shooting a (live action) film but I was kept in touch with the production through a dedicated work station.

---------

Full thread here: http://www.rogerdeakins.com/forum2/viewtopic.php?f=3&t=1...

See also this NYTimes article: http://www.nytimes.com/2010/03/21/movies/21dragon.html?_r=0

The 12 months Deakins spent on this is a bigger fraction of the film's production time than you might expect. Normally animated movies can take 3 or 4 years, but after a couple of years of production the producers of HTTYD decided that the story and characters they had would not work. They were doing a faithful reproduction of the book (in which dragons and humans have long been at peace...and the kids are 10), but that was aimed at a younger audience. They needed something that could also appeal to teens and adults, so they replaced the director with Chris Sanders and Dean DeBlois, and told them they needed to write a new story and new script, with older characters--AND they had to stick to the release date, which was something like 16 months away. So, that 12 months Deakins was involved was effectively pretty much the entire development time of the film we actually got.

In one of those weird twists you'd expect to find in the movies, not real life, the reason Sanders was at Dreamworks is that when John Lasseter took over Disney animation, he suggested a bunch of changes to the movie Sanders was writing and directing for Disney, "American Dog". Sanders didn't like the changes, and ended up leaving. (Some reports say Lasseter fired him). "American Dog" got a new director and writer and story and a new name, "Bolt". What it did NOT get was a new release date.


"48fps is just above the threshold that the human eye/brain can detect changes"

No, it's not. The actual threshold is unknown, but it's generally assumed to be around 100fps. Like the threshold for color, it's a complex question.

You don't see stuttering in a movie (usually) because each frame has an exposure time that is almost the full duration of how long the frame will appear on screen. The result is that anything moving will blur on the film (instead of getting a crisp shot), and so motion sequences are much more natural.

I'm not entirely sure why 48fps was chosen, but I know that too much more and many of the projectors that are currently showing the HFR film would be unable to. It's also worth noting that in 3D, each eye only gets a new frame at half of the rate (because each eye only sees half of the frames), so each eye is getting refreshed at the rate of a regular movie.

I know that there were lots of other technical obstacles when filming in 48fps, such as color muting by the camera. This could have also played a role in the choice of keeping it at just 48fps instead of something higher.


>It's also worth noting that in 3D, each eye only gets a new frame at half of the rate (because each eye only sees half of the frames), so each eye is getting refreshed at the rate of a regular movie.

With active shutter 3d yes, but with polarized 3d no. I think most movie theaters are polarized 3d.


48fps was chosen because it was compatible with a certain type of 3D projector that most digital theaters had already installed. Thus theaters didn't need to invest in an entirely new projector just to show the HFR version. The IMAX-certified 3D projectors (note: not IMAX domes) that RealD was pushing around the time Avatar came out were capable of this.

Technically, I think those projectors can go up to 60fps in 3D, but I can't remember/easily find where I read it so I'm not that confident about that figure.


48fps was chosen because it was a direct multiple of 24. You can cut the framerate in half and project it in all the old 24fps projectors.


More importantly, you have an entire video and audio editing/production pipeline built for the timings of 24 fps. If you're going to increase the rate of all those systems, it's far easier to double the rate than to increase it by some arbitrary fractional amount.


The rate at which the eyes/brain cannot detect changes will depend on many things, including the size of objects and the speed (and possibly smoothness) of motion.

Using polarised 3D, both eyes should still get 48fps; they just receive different(ly polarised) light. TVs using active shutter glasses do reduce the frame rate by half, but will usually be reducing it from 96Hz or more.

By the way - in the cinema at 24fps, each frame is usually flashed twice, which encourages the psycho-visual system to fill in the blanks and smooth out the motion, so it appears sharper and less blurred than it would on an LCD/plasma TV screen that usually holds the image for the duration of the frame. Some high-end models can do black frame insertion if they are using the right type of LED backlighting and are set to the right mode.


R&D on Showscan showed 72-76 fps was the threshold.


From personal experience that is far too low, I'd like to see the source for this.


I've read it in Cinefex a long time ago; there might be something online too. Douglas Trumbull did some sort of controlled experiment in front of a crowd with 70mm projection.

Which footage have you seen played back faster than 72 fps? I've seen one test played back at 120 fps, but it was in a camera R&D lab in Germany. 72 fps is really more than enough. Consider that 3D projection actually projects 3 frames per eye, so 3D@72fps would actually mean the projector is churning out 432 frames each second. Or you're talking about games, which is another thing entirely since input is part of the experience too. In that case, I agree.


They project each frame 3 times so that your eye can't pick up flicker from the light going on and off. I'd imagine that at the full 72fps it would not be necessary to flash each frame 3 times.

Finding a 'framerate' for the eye is difficult. Under certain conditions, 60Hz (think fluorescent lighting) can be enough that your eyes won't notice a flicker. (Your eyes hold a residual image for some period of time, and you never see the darkness.) But if you are flashing a series of super-crisp photos, your eye is much more likely to pick up on stuttering, because each photo has just as much light.

And then, if you have some action sequence in a movie where parts of the scene are really bright and fast-moving, and others are dim, the minimum number of frames per second you need to create a 'perfect' experience for the eye is anybody's guess. It would probably also make a difference if you were a person used to looking for flaws in a picture, vs. a person who is just trying to enjoy a movie and doesn't know anything technical about framerates, etc.

Regardless, all forms of currently available cinema have framerates with room for improvement.


While I am not positive, my assumption is that 48fps was chosen as it is exactly twice the normal frame rate, allowing them to easily toss away every second frame to create the 24fps versions of the film.
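That halving really is just dropping every other frame. A toy sketch of the idea (the frame list is a hypothetical stand-in for real footage, not any real editing API):

```python
def decimate_to_24fps(frames_48):
    """Downconvert a 48fps frame sequence to 24fps by keeping every second frame.

    Illustrative only: a real conform would also handle audio sync, timecode,
    and (as discussed elsewhere in the thread) a shutter-angle choice that
    keeps the per-frame motion blur acceptable at both rates.
    """
    return frames_48[::2]

one_second_48 = list(range(48))   # stand-in for one second of 48fps frames
one_second_24 = decimate_to_24fps(one_second_48)
assert len(one_second_24) == 24
assert one_second_24[:3] == [0, 2, 4]
```

An arbitrary fractional rate change (say 24 to 30 fps) offers no such clean mapping, which is why doubling is the path of least resistance for the whole pipeline.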


I saw this last night so it's fresh in my mind: the first few minutes were strange and new; I had to adjust, and there was also the general curiosity about how HFR was going to look. I spent quite a lot of time looking at Bilbo's face (young and old), marvelling at how I could tell that they were completely hairless and questioning whether they had to wax their beards off to get that look. You could see every pore and the clean, clean makeup. After I got used to it, it was similar to what the article says: the indoor shots were like watching live theatre and the outdoor shots looked spectacular. I came out of it thinking there is massive scope here for artistry, given time and expertise.


> The text-book reason filmmakers add makeup to actors and then light them brightly is that film is not as sensitive as the human eye, so these aids compensated for the film's deficiencies of being insensitive to low light and needing the extra contrast provided by makeup. These fakeries were added to "correct" film so it seemed more like we saw. But now that 48HFR and hi-definition video mimic our eyes better, it's like we are standing on the set, and we suddenly notice the artifice of the previously needed aids. When we view the video in "standard" format, the lighting correctly compensates, but when we see it in high frame rate, we see the artifice of the lighting as if we were standing there on the set.

This sounds entirely wrong to me, regardless of his appeal to experts.

There was no "film" versus "high definition". So far as I know, the Hobbit was not filmed in both 48 and 24, nor both in film and in digital: I think it was filmed on RED in 48fps HD digital and converted to 24 HD digital in post, by adding motion blur. Thus there was only one sensor type, aperture, shutter speed (likely 1/96), and ISO setting for both film versions. The blogger's description above seems to make assumptions which are simply not true.

If this guy saw some difference in lighting, this difference must be solely due to the 24fps motion blur conversion.


There was no motion blur conversion, the 24fps version was made by taking every second frame.

The film was shot at 48fps with a 270 degree shutter angle, which is equivalent to a shutter speed of 1/64s. This was done to balance the look of the motion blur midway between the look that you'd get shooting 24 and 48 with a standard 180 degree shutter angle (1/48s and 1/96s respectively). So the motion blur should still be a bit sharper than a usual film when watching the 24fps version, but not as extremely sharp as it would be at 1/96s, since that would cause more stuttering and strobing.

I believe the author's point is not that the lighting was different, but that it 'appeared' different, which I agree with. The difference in the appearance I put down to getting twice as much visual information per second. Even though the spatial resolution is the same for each frame, you're getting twice as much. For example a small detail like a facial pore may not resolve between two 24fps frames, but you can see it in the 'in-between' extra frame. This could be especially an issue since the original plates were shot at 5k, resized down to 2k, potentially 'anti-aliasing' away details.

This is probably having a similar effect to when HD spatial resolution came in for television, everything looked more 'real' and productions had to adjust makeup and lighting techniques.

(disclaimer: I work in the visual effects industry)


Edit, messed up frame rate calc, it's not 1/72s, it's 1/64s: 48/(270/360)=64
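The relationship behind that arithmetic is simple enough to write down. A sketch (the function name is mine, for illustration only):

```python
def exposure_time(fps, shutter_angle_deg):
    """Exposure time per frame for a rotary shutter: the shutter is open
    for shutter_angle/360 of each frame period (1/fps seconds)."""
    return (shutter_angle_deg / 360.0) / fps

# The Hobbit: 48 fps with a 270-degree shutter angle -> 1/64 s
assert abs(exposure_time(48, 270) - 1 / 64) < 1e-12
# Conventional 180-degree shutter at 24 and 48 fps, for comparison
assert abs(exposure_time(24, 180) - 1 / 48) < 1e-12
assert abs(exposure_time(48, 180) - 1 / 96) < 1e-12
```

So 1/64s sits between the 1/48s and 1/96s figures above, which is exactly the "midway" motion-blur compromise described.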


> If this guy saw some difference in lighting, this difference must be solely due to the 24fps motion blur conversion.

And to the fact that you are seeing half as many frames in the same time span.


> Those high frame rates are great for reality television, and we accept them because we know these things are real. We’re always going to associate high frame rates with something that’s not acted, and our brains are always going to associate low frame rates with something that is not. If they’re seeing something artificial and it starts to approach something looking real, they begin to inherently psychologically reject it.

Translation:

I learned to do things in this particular way, and I cannot unlearn it.


I'm genuinely curious: in cases like this, how do we differentiate between what is truly not pretty and what we don't like just because we're habituated to something else? Does the former even mean anything, since we humans are the only ones experiencing the technology?


Apparently, this is a difficult question to answer.

http://plato.stanford.edu/entries/beauty/


In this case, run some experiments with people from various cultures who have never seen film or television before.

Humanity is fairly diverse, and many behaviors or preferences have little to do with genetics.


As we should have learnt from the audiophile business, unless you are doing a blind study, people will write all kinds of BS about their impressions. It tells you nothing about whether they really are able to perceive a difference, or whether the impression itself is positive, or whether they just like the feeling of trying something new.

Much of this sounds just like the people describing the "warm" sound they get after changing their cables to gold-plated ones.


Yeh no. 48fps is the new 3D.

Some people will like it, even love it. The majority of people won't, and its use will be restricted to 'blockbuster' movies only.


I did this as well - watched both 48fps and 24fps, both in 3D, and the former in IMAX. I didn't talk to Knoll, of course, but some notes follow -

1. The IMAX HFR 3D was on the whole awesome, and I got a glimpse of where cinetech is heading. I was surprised by how good it was 'cos I expected it to be visually a lot more "soap opera"-ish, iykwim. Peter Jackson has made some really good compromises so the film would look good in either format. It could've been a lot worse!

2. I was left with a craving to watch the 48fps again after I saw the 24fps. The craving was not for the outdoor scenes, but the indoor scenes which felt a lot more intimate in imax hfr 3d.

3. The outdoor scenes felt a bit bland compared to the 24fps! However, I think it was not the lighting that made them feel bland, but a feeling like I was moving through vacuum along with the camera. There is air out there in the scene, and I was not breathing it or feeling the wind as the camera moved. These scenes worked in 24fps. Perhaps adding some sound indicating the air and wind might help during the sweeping outdoor shots.

4. Some cinematic techniques felt "old". The "zoom in on character and fly around" effect (on Oakenshield) didn't work for me at all in HFR, but was spectacular in 24fps.

5. Slow motion needs to be reinvented. The slo-mo battle scenes between orcs and dwarves (iirc) had feeling to them in 24fps, but I thought "why are they moving so slowly? .... oh, it's slow motion!" during the HFR. It really needs something more to indicate that it is for emotional effect.

6. High resolution HFR 3D graphics totally rocks! The trolls were real and alive for me, as were the orcs and goblins. I think the digital team might've broken some new ground here regarding compositing scenes, in some way different from what you see in games at 60fps. (Or maybe not!) At least, I can't wait to see Cars or WALL-E in IMAX HFR 3D!

7. Some 3D oddities (parallax) were disorienting in both. E.g. as the camera pans to the young Bilbo letting off smoke rings, it looked like someone was pushing the bush behind him into place. But overall, 3D rocked in HFR for me compared to 24fps.

8. Traditional background score didn't work as well for me in HFR compared to 24fps. The scenes being more intimate and lively, I continuously had a feeling that the orchestra was out of place. I'd much rather just have the sounds necessary for the scene. Also, the 3D placement of the sounds needs to be more faithful to the geometry in the expansive shots. Some sounds just felt too loud for the distance.

Edit: minor bugfixes and clarifications plus new point on sound.


3. makes complete sense, you almost expect to hear wind against the microphone like in a natural history documentary

8. this could be a real game-changer. I've noticed most Hollywood films basically have a constant musical backdrop all the way through. European films tend to have more silence, which make them sound more 'stark'.

If this is true, Hollywood will have to completely change how they use audio as an emotional crutch to convey to the viewer the tone of the film.


I don't think there's an "imax 3d hfr" version of the film -- IMAX just happens to be one of the venues that's showing the 3D HFR version. They also show a different IMAX 3D version, but that's at 24fps.


Here in the Netherlands, only one of the five IMAX cinemas actually showed the IMAX 3D HFR version. The other four only showed IMAX 3D and 3D HFR.


This is really interesting, thank you. I haven't seen a lot of discussion about how traditional filmmaking techniques might need to be changed to work with HFR.


A bit of nonsense. I was just in San Diego visiting family and my Dad, brother and I went to an IMAX theater and saw the 48fps 3d version.

SPECTACULAR

A long time ago I did some entertainment dev for Nintendo and Disney. Both companies provided me with high end SGI Reality Engines that were good for 64 fps. Not being able to detect new frames really adds a lot to the experience.


According to the article, the outdoor scenes were awesome in 48FPS. It was the indoor scenes that looked fake and weird, due to the fact that 48FPS made it very apparent there was lots of makeup being used, because they had unnatural light blasting on the actors (where as there was no extra light added to outdoor scenes, obviously).

Shouldn't he have just toned down the extra lighting indoors, knowing 48FPS was going to pick up more detail?


> due to the fact that 48FPS made it very apparent there was lots of makeup being used

How does better temporal resolution (higher frame rate) enable better spatial resolution (see makeup detail)? In no or low motion, there can be no difference.


More updates per second to each pixel means each pixel is more faithfully tracking the real-life image, which in turn allows the eye to more accurately interpolate spatial details in the displayed picture.

As you say, for a truly static scene there is no difference. In practice there is always noise in the picture, so even in static scenes a higher frame rate will tend to reduce noise by averaging.
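That averaging intuition is easy to sanity-check with simulated sensor noise. A sketch (all names and numbers here are made up for illustration), assuming independent Gaussian noise per frame:

```python
import random
import statistics

random.seed(0)  # deterministic for the example

def noisy_frames(true_value, sigma, n_frames, n_pixels=2000):
    """Simulate repeated captures of a static scene with Gaussian sensor noise."""
    return [[random.gauss(true_value, sigma) for _ in range(n_pixels)]
            for _ in range(n_frames)]

def temporal_average(frames):
    """Average corresponding pixels across frames -- roughly what the
    eye/brain can do with more frames of an (almost) static scene."""
    n = len(frames)
    return [sum(pixel_values) / n for pixel_values in zip(*frames)]

frames = noisy_frames(true_value=100.0, sigma=8.0, n_frames=4)
noise_single = statistics.pstdev(frames[0])
noise_averaged = statistics.pstdev(temporal_average(frames))

# Averaging 4 independent frames should cut the noise by about sqrt(4) = 2
print(round(noise_single / noise_averaged, 1))
```

Doubling the frame rate is only a factor of sqrt(2) by this argument, but it is a real, measurable gain on near-static shots.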


There is never no motion, unless you're in hard vacuum or a clean room. If you doubt this, shoot some video of a static scene, then take one frame out of it and repeat it for the same length of time. People's pulse makes the color of their skin change subtly. Lights flicker in phase with the alternating current. Outside, the air moves, and the color temperature of the sun changes subtly with it, even if the sky is perfectly clear.


> How does better temporal resolution (higher frame rate) enable better spatial resolution (see makeup detail)? In no or low motion, there can be no difference.

Because you're not watching a single still frame - it's more like your brain is taking in an integral of visual information over time. You don't need much motion to be able to perceive a visual difference (even if it's just a 'feeling'). Eg. http://iwdrm.tumblr.com/


The HFR camera uses a notably different lens, I believe as a result of the faster shutter speed. When you double the fps, you halve the exposure time, which calls for a different lens.


Why would 48FPS have any particular effect on lighting or any other detail other than movement? Why would makeup stand out more?

[I've only seen the Hobbit in 24FPS (unfortunately the 48FPS is only available in conjunction with 3D, which I won't watch).]


I'm not an expert so someone please correct me, but I believe the answer is roughly: if the frame rate is lower than the "rate that your eye sees", then your brain replaces the frames that are missing with a blur so that the motion still looks fluid (hence why video at 24 fps still looks kinda fine to us, even though the frame rate is less than reality). This blur that your brain adds in would probably also cause your brain to have difficulty following details, since the details -- of the makeup, for example -- are constantly moving. However, if the frame rate is 48 fps, then your brain does not add the blur between frames since all the necessary data is available and it can more easily perceive details (like makeup).


I liked the goblin and Orc battles (indoor or dark, and thus probably sets or lighted) in HFR. I didn't get much benefit during the walking around New Zealand scenes.

By far the best part of the movie was New Zealand, though. I would have been happier with a 3h IMAX of NZ nature. Maybe with LOTR music.


Agreed. That explanation seemed a bit glib. It's not like adjusting the color balance to reduce sensitivity to light and color is difficult with digital video.

The points about the resolution of details on props make more sense to me.


Yeah I think the lighting in the scenes that take place in Bilbo's home tends to look bad even in the 2D standard-def 24fps trailers you can find online. It really doesn't have anything to do with HFR.


But the scenes they pick out in particular are in the hobbit hole.. this all happens within the first 10-20 minutes of the film, perhaps before you have adjusted to the high FPS.


And shoot every indoor scene twice?


I am one of those people who were REALLY excited about this technology. I was fully, 100% prepared to walk out of the cinema happily proclaiming that HFR video is the best thing ever and that every single film from now on should be shot and shown with this technology. More frames per second MUST mean that it will be better for the viewers, right?

And then I went to a cinema, and could not get used to this effect. Everything the characters did seemed accelerated. I did not think that the video looked amateurish or home-made - no, absolutely not. But each scene in Bilbo's house, and generally all inside scenes, looked like they were playing at 2x the normal speed - the characters moved too fast, it was unnatural. But I know that it couldn't really have been moving at 2x the speed - the sound was in sync, so there was nothing wrong with the cinema. I have no idea how this could happen - I have seen plenty of videos shot in 60fps and never noticed anything so disturbing. Sorry, but the Hobbit in 48fps was unwatchable for me.


I also experienced this effect, but only for the scenes where Bilbo was writing with a quill. Everything else was fine for me.


I used to have this effect watching video game streams in 60FPS, and I have absolutely no conception of how this works at all (I mean games run at that FPS anyway, I cannot figure it out at all).

It goes away after a while and then it just seems smoother, it is very disorienting at first though I admit.


A cogent argument, well explained and articulated. However, I don't agree with the conclusion - that what audiences want is ever more realism. Realism diminishes things; the 'film look' adds a dream like quality which is highly desirable for narrative work.


Certainly audiences might come to reject some new techniques and accept others. But what current generations think of as "film look" will eventually be a historical curiosity, just as past generations' "film look" is a curiosity today. Future generations will have their own "film look" that will probably be more realistic and high-fidelity in at least some ways.

Though there are a few exceptions, generally people today don't spend a lot of time watching black-and-white 35mm silent films. When color film was first introduced it disrupted the look of film, but eventually filmmakers tamed color and found ways to make it as real or unreal as needed.

Maybe high-frame-rate video will be similar. Some scenes might demand different interpolation or blurring than others, and I think we've hardly begun exploring that space of possibilities.

EDIT: I can't resist this link: "Screen realism may be a little too real" -- newspaper review of Citizen Kane from February 1942. http://trove.nla.gov.au/ndp/del/article/8231805


To be honest, we have already been doing this with video for a long time. The two biggest things you do to video for a 'film look' are to shoot at 24 fps (or fake it in post) and set your shutter speed to 1/48 second to mimic the behavior of a film shutter. This gives a particular kind of motion blur that can be dispensed with on video (and film, incidentally, since many film cameras have adjustable shutters), but which people in the audience strongly prefer in most cases. The alternative is the HD look of sports and entertainment programming. It's boring, because it's too similar to real life.
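The 180-degree-shutter motion blur described above can be roughly simulated in software by averaging pairs of consecutive frames from a higher-frame-rate source. A minimal numpy sketch (the function name and array layout are my own, purely for illustration):

```python
import numpy as np

def simulate_film_look(frames_48fps: np.ndarray) -> np.ndarray:
    """Downsample 48 fps footage to 24 fps with a simulated 180-degree shutter.

    A film camera running at 24 fps with a 180-degree shutter exposes each
    frame for 1/48 s. Averaging each pair of consecutive 48 fps frames
    approximates that exposure window, producing similar motion blur.

    frames_48fps: array of shape (n_frames, height, width, channels)
    returns:      array of shape (n_frames // 2, height, width, channels)
    """
    n = frames_48fps.shape[0] // 2 * 2          # drop a trailing odd frame
    pairs = frames_48fps[:n].reshape(-1, 2, *frames_48fps.shape[1:])
    return pairs.mean(axis=1).astype(frames_48fps.dtype)
```

By contrast, a high-shutter-speed "video look" corresponds to simply dropping every other frame instead of averaging, which keeps each frame crisp but makes motion stutter more.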


I've pretty much stopped watching "action" films because they're too blurry to make out anything. A clear not-blurry image actually helps my suspension of disbelief.


I'd like to get to the point where frame rate is a creative choice and not a technical limitation. Perhaps some kinds of films/scenes/shots are better suited to different frame rates, for the impression and feeling they impart on the viewer. It was great to see Ang Lee start doing this a bit (but with aspect ratio) in Life of Pi, using it as another tool in the tool box, not an absolute right or wrong way to do things.


And yet, the movie that took the most Oscars in 2012 was a silent black-and-white film.


"The Artist" may have won Oscars, but it didn't even make the top 10 at the box office that year.


Thanks for the link. The parallels are interesting.


You could have said the same thing about higher resolution video, or color video, or talkies. But they've all been accepted, even though they increased the level of "realism".

As the article says in its opening line:

> New media technologies often cause an allergic reaction when they first appear. We may find them painful before we find them indispensable.

At this point, such reactions to new technologies are to be expected. People grumble for a while, then they go along with them. There will always be idiots like this at the beginning:

> One critic even suggested that directors should put a soft-focus filters to debase the clarity of the new digital recordings and restore the "painterly" aspect of classic films.

Of course, changes will be made to the filmmaking process so that any problems are ironed out. For example:

> Imagine you had the lucky privilege to be invited by Peter Jackson onto the set of the Hobbit. You were standing right off to the side while they filmed Bilbo Baggins in his cute hobbit home. Standing there on the set you would notice the incredibly harsh lighting pouring down on Bilbo's figure. It would be obviously fake. And you would see the makeup on Bilbo's face in the harsh light. The text-book reason filmmakers add makeup to actors and then light them brightly is that film is not as sensitive as the human eye, so these aids compensated for the film's deficiencies of being insensitive to low light and needing the extra contrast provided by makeup. These fakeries were added to "correct" film so it seemed more like we saw. But now that 48HFR and hi-definition video mimic our eyes better, it's like we are standing on the set, and we suddenly notice the artifice of the previously needed aids. When we view the video in "standard" format, the lighting correctly compensates, but when we see it in high frame rate, we see the artifice of the lighting as if we were standing there on the set.

This is easy to fix - change the makeup and lighting process.


This is what I'm disagreeing with. Color was a big plus, and you'll find little opposition to more resolution among filmmakers - when we can afford it, we'll take as much resolution as is available, likewise color depth.

We've been able to shoot 48 fps for years; there aren't any major technical obstacles to doing so on film, even on relatively cheap cameras; for example, an Arri Super 16mm camera available for under $10k (a fixture on the indie movie scene for many years) can be cranked up to 50, 60, 75, 90, 120 fps dependent on the model. Later models allow one to dial in an exact frame rate rather than simply selecting one. High-speed shooting is a fairly standard feature, because you need to overcrank in order to shoot a scene in slow motion, and everybody looooves slow motion. I've worked with film cameras shooting at up to 240fps, and you can get specialized ones that go a lot faster (up to a few thousand FPS, if you have sufficiently fast film or sufficiently powerful lighting). Re-engineering a film projector to play back at 48fps is trivial. The only difference it makes in practice is that you have to hire more projectionists because they need to change the reels twice as often. Projectionists live near the bottom of the movie food chain so the additional expense isn't really a deterrent either.

So on a technical level, 48 fps is quite pedestrian. Nothing other than the cost of the celluloid and processing chemicals has prevented filmmakers from using it, and for all but the lowest budget films, those costs are a small fraction of the overall budget. It's only new(ish) in digital filmmaking, where until recently it's been a hassle to deal with the huge volumes of data involved in doubling the frame rate for everything, and few cameras had sufficient bandwidth because there was little demand for it. The reason that there are no 48fps movies shot on film is that nobody has ever liked the results enough to justify doing a feature on it and pushing it out to market. We've done that with plenty of new technologies in the past, from anamorphic projection to things like Cinerama, which required three synchronized projectors for the ultimate widescreen experience (which can only be viewed in 3 theaters in the US today).

Sometimes things hurt because they're new. But other times they hurt because they just don't satisfy us. In the case of 48 fps we could have rolled this out decades ago.


That's a very interesting opinion about what people like about film. To me, the one take-away point I remember from a film-making class I took in high school days with my best friend was the teacher quoting some eminent film-maker's idea that the essence of film is montage

http://en.wikipedia.org/wiki/Montage_(filmmaking)

(that is, editing), and since then I've been very aware of the director's and editor's joint choices of shots to build a scene in any film I watch. If I can get good montage, it matters not to me what technology is used to capture the images or display them.


That's certainly true. Film buffs and insiders love long complex single-take shots (the opening of Orson Welles' Touch of Evil is still a high-water-mark of technical craft), but over the length of a whole film (eg Russian Ark, where magazine changes in the camera were done when ducking behind actors or furniture) it becomes exhausting.


It can add a quality that is desired for the narrative work. Everything has its place.

Honestly, my main real complaint with the 48fps version of the film was that it seemed like the cinematographers took the liberty to pan faster, or at least it felt like they were panning faster. That was an unpleasant experience to me, but little else was.


I agree about the pans seeming faster, but in 24fps the pans keep the same angular rate but you don't notice any weirdness at all.


I agree, and I think that those who say they want realism really only want the illusion of realism.


I think a key take away is that film isn't supposed to be hyper realistic. When there are dragons on the screen, a slightly slower refreshed image for your eyes simply aids in the suspension of disbelief.

Of course, it's easy to say "you ain't used to it so it's odd" but that seems dismissive. Film is manufactured fantasy, so imperfect reproduction is totally acceptable and perhaps preferred.


Why is it unthinkable that not everything new is also better? 3D is pretty much the same thing.

In animation you have the same issue with the uncanny valley - people accept a certain realism, but once you cross a realism threshold, you suddenly need to deliver perfection, or people notice something is off.

HFR, for me, plays into the uncanny valley.

I think the future is in adaptive framerates. HFR shines in the big sweeping shots and when quick movements or panning are required, but it absolutely sucks for close-ups and dialogue-heavy scenes.

Another example of this is music videos - notice how they could have been HFR for a long time already, but aren't? They are being filmed sped up on purpose and then slowed down for the final product, just to make them appear larger than life.


I have not watched the Hobbit in HFR, but I felt something very similar when watching The Dark Knight on a new 3D LED 200Hz whatever TV at my friend's house. It almost felt like a documentary, so clear that you could feel the filming set instead of being transported into the movie's world. It felt very strange, and after reading this, that's exactly how it felt.

Can't say whether I like it or not yet, but it concerns me that because of this, production budgets will have to rise, because you can literally see every detail. It will make every process of film making more costly, the same as has happened to video games in the last 10 years.


I remember several people saying exactly the same thing after seeing a DVD on an HD TV for the first time. 10 years later, nobody has that complaint anymore.


A DVD does not offer any HD resolution; you need Blu-ray for that.


I don't have much to add to the discussion except to say that I don't think the CD vs. vinyl debate that we had in the 90s is a good comparison for the current controversy over 48FPS. Even if you accept that the first CDs really did sound more accurate than their vinyl counterparts -- and some posters are contesting this -- the underlying points of contention are different for 48FPS vs 24FPS. CDs, in theory, brought you closer to the ideal situation, which would be listening to the music in its purest form, unadulterated (perhaps, live?). But this is not what 48FPS does. Although 48FPS brings you closer to reality, it is the wrong reality. As pointed out in the article and by others, 48FPS makes you more aware of the artificial contrivances of the set -- the extra makeup, harsh lighting and the fake props. This is not what movies are supposed to do! The thing to strive for in movies is the fantasy realm that you're trying to depict. 48FPS puts a much greater burden on the filmmaker to live up to this fantasy expectation, and when he or she falls short, as many are claiming PJ did, you have people complaining.


This article does a good job of explaining that "different" thing you see.

The best way I could describe it to people was "remember the first time you saw porn shot in HD?"


> Because 48 frames per second is just above the threshold that a human eye/brain can detect changes

This is false. A human eye can detect changes at 60Hz, for sure.


I Ctrl-F'ed this. Only 2 paragraphs in and there's a tremendous factual inaccuracy. 90 or 120 Hz would be appropriate as the point where average people can't detect the change anymore, but some people can distinguish minute frame changes even beyond that, because different people have different eye characteristics.

I'm sure some people can't distinguish 48. I'm sure some can distinguish 120. People are different. I personally don't notice the difference beyond 90, but I can easily notice a difference between 120 and 60.


From one extreme to the other. It's actually interesting that the Hobbit has gone to 48fps HFR, considering that image mastering of Lord of the Rings was very poorly done for its theatrical releases. The digital master used for the final prints seemed to be bit-rate limited, causing very noticeable (for me anyway) compression artifacts when I saw the final installment.


There is a larger point here about technology in general. The things that are supposed to make us happier are not, in fact, making us happier. To the technologist, more pixels and faster frame rates are an undisputed good. As Kevin Wines from THX tells me, “When film was the only medium for cinema, using lower frame rates was based on economics. With film, directors and producers would have had to use more film, making it cost prohibitive to use high frame rates, but now that we use digital data, there’s no fiscal reason not to use high frame rates.” No reason, of course, other than that the impact on viewers may be different than what the technologists were expecting. To people in the user-centered practices, every product, and every feature of every product, is a proposition, a question we ask users. Like the ophthalmologist giving an eye exam, “better now, or now?”

http://onforb.es/VPE5u2


I also felt that the shots, now full of visual information, moved too quickly between scenes. I wanted to spend some time staring at the shot, taking it all in (especially in an early shot of a marketplace), but had no chance to.

I felt like someone was making me sprint through a museum.

I'm in favor of HFR, but agree, we really need visionary directors to bring out the full potential.


I don't get all the fuss around it but I'm reassured to read such a blog. I was really scared at first that directors would drop the >24fps because of the reception of The Hobbit.

In my mind The Hobbit was a persistent slap in the face; like everyone else I had to adapt, and I felt like it was going fast-forward for the first minutes of the movie. But once I got used to it, it was like I was in the movie. I usually have a problem with action movies because of the lag that I can see. I've noticed I can see it really well due to tweaking my fps when playing FPS games competitively. The Hobbit was different; I don't think I can go back to 24fps movies. Maybe for normal movies, but not for CGI or action movies that require a high FPS to display in a smooth way.

And now the bad thing I'm thinking about is that I'll have to buy a new TV so I can watch that kind of movie again. I thought that full HD and 3D were just some small evolutions, but HFR is a revolution.


I just saw the movie (HFR 3d at Metreon), and while I mainly disliked that it seems to be a single solid 2.5-3h movie split into two badly edited 3h movies, the HFR was pretty good.

Not a fan of 3D, though; everything good about 3D seems to be handled just as well through depth of field, and since a lot of viewers will end up watching in 2D anyway, the 3D is never non-redundant: it is either pointless in a scene, or backed up with depth of field, composition, color, or other ways of indicating depth and importance.

The HFR really sucked, IMO, in the early Shire scenes, which were boring indoor things. I've seen HFR before so I don't think it was adjustment. It worked well in battle or action scenes. A movie like Black Hawk Down or maybe sci-fi would do really well with HFR I think; not drama or fantasy.

What did seem to work in 3d were some of the 30 minutes of text and graphics beforehand. I am excited about 4k or 8k realtime rendered graphics for user interfaces.


When I viewed the film in 48 fps, I noticed some visible "stuttering" in some scenes, like a video file being first stuck because of CPU load and then played too fast. This happened a couple of times in the early scenes. Did others see this? What could be the reason for it? Theater specific problem or some production time effect?


The frame rate was the least of The Hobbit's problem. If the movie had been good, I don't think anyone would be discussing this. When a movie's opening half hour is as utterly dull as the scene at Bilbo's house, what else can the audience do but take the time to closely examine every little detail of makeup and lighting.


I found the 48 fps aspect of the Hobbit to be the most significant improvement to realism since 3D. The lighting looked a bit stylized in some scenes, but since I interpreted it that way (and hadn't seen the slow, jumpy version first), it didn't trouble me, and otherwise the whole film was a wonderful smooth glide that provided the best suspension of disbelief I've had in a long time.

I'm really puzzled by people clinging by habit to slow FPS as making a film more epic - even more so by the tendency in anime to use an even slower frame rate. Both are now just obsolete cultural idioms, much like an audiophile who only believes in his high fidelity experience if he can hear the snap and pop of dust in the record groove. Baffling.


> "Instead of the romantic illusion of film, we see the sets and makeup for what they are. The effect is like stepping into a diorama alongside the actors, which is not as pleasant as it might sound… Never bet against innovation, but this debut does not promise great things to come." – C. Covert, Minneapolis Star Tribune

As a fan of the off-Broadway immersive theater production, Sleep No More, I have "stepped into a diorama alongside the actors" several times in the last year. It is a phenomenal experience.

Artistic expression does not require "romantic illusion", and even if it did, heightening the audience's sense of realism doesn't preclude it. I haven't seen The Hobbit yet, but I welcome the innovation.


I thought that most scenes in The Hobbit 48 fps were fine, but a few scenes /seemed to be playing in fast forward/. Especially in the opening scenes, Bilbo was walking through his hobbit-hole at faster than natural speeds. I have a distinct memory of feeling like the speed of motion was out of sync with the sound.

This effect was only apparent in a few scenes in the film, but I'm pretty sure it was real, as I watched the film twice, once at each speed, and only noticed it in 48 fps. It almost seemed like the producers didn't film certain scenes at 48fps natively, and sped up their 24 fps film to double speed instead. Did anyone notice similar artifacts?


Yes, I had the same thing. It looked like the film was playing at 2x the speed, but the sound was in sync, so I don't know what was going on.


So basically Peter Jackson's crew did a bad job with lighting, makeup and sets. Faced with a new medium, they failed to master it on the first try.

I wonder how long we'll have to wait for a movie that does the same for 48fps that Avatar did for 3d.


One of the most interesting aspects of the HFR debate -- and basically the main reason for it -- is the brightness, which as the article notes seems to make makeup and internal lighting look fake.

I don't believe the film itself is any brighter. What happens is that with 48fps your eye works less hard compensating for motion blur and is freed up to perceive each frame more brightly.

From what I've read, the effect is limited only by how fast projectors can change an image, so I suspect we're only at the cusp of discovering the ups and downs of HFR.


"Because 48 frames per second is just above the threshold that a human eye/brain can detect changes" - Sure about that? Did you try 60, 100, 120, 200 fps too?

These comments are so funny. I think that all video games that have other than 10 fps CGA graphics are "unreal" and very distracting. Praise to alley cat and sopwith 2. ;)

Latest 3D games are super distracting, you can't even always tell if it's about movie, video film, or game.

Every new Windows version is really bad, but after a while everybody is asking, who's still using the old version?


> I was surprised though that the movie in 48HFR looked so different. (The 3D did not have an effect.)

Strange, but I saw it in 2D 24fps then in 3D at 24fps, and the 3D made a huge difference to me. Azog and the Goblin King looked fake and didn't work for me in 2D. An animator friend of mine tells me that most of the emotion portrayed in animation is through body language, so maybe this was optimized by the artists to work in 3D, but not quite perfect when translated to 2D.


A slight bit off topic from the article here, but was the 3D version the only one shown in 48fps? I saw the 3D version a few days ago and was actually disappointed that I wasn't able to have the 48fps experience.

To me the 3D effect seemed to cause a lot of jitter and poor fps so I was a bit ticked the whole time. But now having read this article I'm even more ticked off that I didn't even realize I was watching 48fps.

...and that makes me wonder why I couldn't tell the difference.


There are three versions: 24FPS 2D, 24FPS 3D, and 48FPS 3D.

You most likely saw the 24FPS 3D version, unless it was specifically advertised as "HFR". The main aim of HFR is to make the 3D work better and not give the usual headaches, etc. people get when watching 3D.

I haven't actually seen the 24FPS version yet, but I imagine there's a possibility the 24FPS version has a much faster shutter speed than would typically be seen in a 24FPS movie, which could give it a 'poor fps'/jittery feel like what you experience in a PC game with low FPS.


Ironically, I saw the 48FPS 3d and that made it far worse. Any scene with motion was literally unwatchable with both eyes open - and this was true for every one of my friends that went that night. We suspect it was something wrong with the projector itself, as the 3d previews at 24fps looked fine. I'd really like to see it again properly if the operators get things straightened out, as the low-motion scenes looked incredible.

Contrast that to Avatar, for example, which was an awful 3d experience for me as well, but mostly because I was stuck in the front row (I arrived "only" an hour early)


I saw the 2D-only version and there wasn't anything particularly jittery about it. It had the usual special effects issues with physics of unrealistic things looking, well, unrealistic (especially apparent in the dragon attack and the long shots when everyone is trying to run out of the troll mountain), but I didn't notice anything amiss that I could attribute to the 48fps->24fps down-conversion.

Either the 3D/24-fps combo makes things weird or JungleGymSam saw the movie in a place with a janky projector.


It was labeled as HFR and cost me a few extra bucks at that. Thinking back more I do remember a few scenes where I felt like things were smoother. Perhaps my expectations were too high or other people are simply way more sensitive to the effect?

I'd like to see it at 4K, 48fps, and without 3D.


There's also an IMAX 3D version. IMAX used to be filmed on 4x larger film stock, but now that everything is digital, I'm not sure if the IMAX version is different than the standard 24FPS version.


I believe there were three versions: 2d 24fps, 3d 24 fps, and 3d 48 fps. Something like 400 theaters in the country had the HFR version. You probably saw 3d 24fps.

Not all 3d projectors can do 48fps, and theaters wanted to hedge their bets about people hating it.


I saw the movie first in 2D 24 FPS and found it passable with the expected super-heavy storytelling.

The second time was in 3D HFR and I thought it was really enjoyable and odd at the same time. HFR distinctly made me think of reality TV or sport, though, and I would put all acting under increased scrutiny. I was definitely more focused on the instant and less on the story. Overall, I want to see what HFR films can do in the future.


Are you sure you watched the HFR version? Not all 3D showings had HFR.

I think you may have only seen the regular version. HFR was really noticeable (for me, at least).



From now on, frame rate will be just like aspect ratio: another choice for the filmmaker. Some movies will still be in 24fps, some will be 48fps (or some other value), and eventually some will probably be variable from scene to scene.

As for the Hobbit, I think HFR could certainly have used a better "ambassador" film. Maybe James Cameron (who has also talked about 48fps from time to time) will do a better job of it in his next movie.


I'm guessing you'll only see 48fps employed for recordings of real events, like musical performances, that Cirque du Soleil film they're pushing just now, and lavish documentaries in exotic locations such as nature films.



I think the author's main points can be summarised by the 'Uncanny valley' - http://en.wikipedia.org/wiki/Uncanny_valley

The key point is that as an artificial situation gets closer to reality we actually experience a perceptual drop in comfort level of our response to it.


I'm a little confused. I saw the movie last night in 2D. From the article, it seems that that was at 24fps.

But one of my biggest gripes was that the makeup seemed noticeable and unbelievable, and axes and hammers looked painted. Doesn't this have more to do with the quality of the images than the frame rate? Is it even worse at 48fps?


Here you can download the original trailer in 48FPS. Watch it at 2X Speed on VLC to see what 48FPS looks like.

https://rapidshare.com/#!download|931p5|868604620|The_Hobbit...


Looks cool to me. Except the part when Bilbo is on the horse. And 2 seconds before and after that. That part looks like cheap TV.


The first movie I saw in HD was Bond's Casino Royale. And somehow I noticed something I'd never seen before: with every new shot I had the feeling you could read the "aaand action!" in the actor's face.

So maybe too real isn't fun to watch unless it's a documentary?


Presumably both the 24 and 48 fps versions were shot with the same cameras, with the 48 fps raw being mixed down to 24. Why would that affect sensitivity? Shouldn't both versions be equally sensitive? If not, couldn't this effect be compensated for in post?


Are we dealing with an uncanny valley situation here? At HFR and high res, are films just short of being good enough in such a way that they are freaky? Will another decade push it over the edge into real realism, and then it comes into its own?


I've yet to try it, but I fail to see how a linear improvement in frame rate will buy anything. Movies are impressionist experiences, not data sampling. I don't wanna see reality; I want someone to make my imagination dance and trip.


I'd really like to know the technical reasons why the lighting appears different in 48 fps than 24 fps. His paraphrase from John Knoll was interesting, but didn't entirely satisfy my desire to know what really is going on here.


The lighting explanation makes sense to me, but the one thing I'm not clear on is how a computational frame rate reduction magically adjusts the lighting back to what it would have been if it was originally shot in 24 fps.


I don't get it, have none of you played video games?

I saw The Hobbit in 3D (HFR) on an IMAX screen and I didn't notice anything amazingly different. Perhaps because I am used to playing computer games at 60FPS or more?


I saw it in HFR 3D and it looked great. No idea what the fuss is about.


Why 48 FPS? Just because it's double the typical 24 FPS? Still it seems pretty arbitrarily chosen. They should just go with 60 FPS, which is what every screen has.


While 60 makes sense in the US, Europe wants 50 (because they use PAL, with 25 fps). In this case, since film has always been done at 24 fps, 48 was chosen for compatibility with that (and with cinema projectors).

Frame rate conversion has always been strange, though. To make 24fps content play at 30fps (really 60 interlaced fields), they perform something called 'telecining' (3:2 pulldown), which interleaves fields from adjacent frames to build the new frames. For PAL, they instead just speed up the video (and audio) to 25fps, so shows actually run slightly shorter.

For modern TVs, the screen usually has a rather high refresh rate (60+ Hz), so it can show older framerates (24, 25, 30) quite well by timing frames across multiple refreshes of the screen. Newer framerates can cause trouble, but to compensate you even have some 120 Hz TVs showing up. A lot of modern TVs also have some form of motion compensation, which inserts interpolated frames into the video. The techniques used for this try to approximate the in-between frames, and usually look 'too smooth', because the motion blur is not correct.
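The 3:2 pulldown cadence mentioned above is easy to illustrate. This sketch works on whole frames for simplicity (real telecine alternates interlaced fields, which is what produces the characteristic combing artifacts); the function name is my own:

```python
def pulldown_3_2(frames):
    """Expand a 24 fps frame sequence to 60 Hz with a 3-2 repeat cadence.

    Even-indexed source frames are shown 3 times, odd-indexed ones 2 times,
    so every 4 source frames become 10 output frames (24 * 10/4 = 60).
    """
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out
```

The PAL route avoids this cadence entirely: playing 24 fps material at 25 fps is a 25/24 ≈ 4.2% speedup, which is why film runtimes on PAL TV are a few minutes shorter (and the audio slightly higher-pitched unless corrected).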


After my eyes actually adapted to the 48 frames/s, I really liked the experience. What I found strange, though, was that when displaying landscapes there was still a depth of field in the movie - which kept it from being a fully "real" visual experience. I guess this is a technical limit of the cameras the movie was shot with rather than intended. Anyone got some details on this?


What makes you think filmmakers are interested chiefly in visual "realness"? There's very little in the medium of cinematography that your mind accepts readily as "real" besides, perhaps, sound.


A more accurate digital format reveals more flaws or approximations in art and so requires more refined art.


Are all film critics snobby gits? They sure sound that way from the quotes.


I thoroughly enjoyed the Hobbit at its frames per second.


minor niggle: the word is Kodachrome, not kodakchrome


tl;dr too much makeup


I know what's wrong with high FPS: with quality being improved, including in the time domain, you more clearly see the scene being filmed - but that scene is actors, decorations and computer graphics, as in a video game, only higher quality, and it is not so convincing anymore. The movements in films never used to be so polished in the time domain. With the new temporal resolution you see all the motion more precisely, and it just isn't as good as you used to think it was at 24 FPS. Despite the stupid myth, 24 FPS is way below what you need for the picture to feel realistic. Now the moving picture starts looking realistic, but what you realistically see is actors, decorations, and CG; the old approaches are not convincing anymore. Film makers just need to adapt to the high-definition time domain to make films look cool again at the new level of precision.



