
James Cameron ("Avatar", "Titanic", etc.) used to argue that high frame rate was more important than higher resolution. If you're not in the first few rows of the theater, he once pointed out, you can't tell if it's 4K anyway. Everyone in the theater benefits from high frame rate. This may be less of an issue now that more people are watching on high-resolution screens at short range.

Cameron likes long pans over beautifully detailed backgrounds. Those will produce annoying strobing at 24FPS if the pan crosses the frame width in less than about 7 seconds. Staying at or below that pan rate makes a scene drag.
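As a rough sanity check of that rule of thumb (the 4096-pixel frame width and 24fps are illustrative assumptions, not figures from the comment):

```python
# Sketch: how far the image shifts between consecutive frames during a pan.
# Assumes a 4096-pixel-wide frame and 24 fps; both are illustrative.

def pan_pixels_per_frame(frame_width_px, pan_seconds_per_width, fps):
    """Pixels the image shifts between consecutive frames of the pan."""
    return frame_width_px / (pan_seconds_per_width * fps)

# A pan that crosses the frame in 7 seconds at 24 fps:
shift = pan_pixels_per_frame(4096, 7, 24)
print(f"{shift:.1f} px/frame")  # ~24.4 px/frame; faster pans shift more and strobe
```

A pan much faster than that moves the image tens of pixels per frame, which the eye reads as discrete jumps rather than motion.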

Now, Cameron wants to go to 4K resolution and 120FPS.[1] Cameron can probably handle that well; he's produced most of the 3D films that don't suck. He's going to give us a really nice visual tour of the Avatar world. For other films, that may not help. "Billy Lynn's Long Halftime Walk" was recorded in 3D, 4K resolution and 120FPS. Reviews were terrible, because it's 1) far too much resolution for close-ups, and 2) too much realism for war scenes. Close-ups are a problem - do you really want to see people's faces at a level of detail useful only to a dermatologist? It also means prop and costume quality has to improve.

The other issue with all this resolution is that it's incompatible with the trend towards shorter shot lengths. There are action films with an average shot length below 1 second. For music videos, that's considered slow; many of those are around 600ms per shot.[2] They're just trying to leave an impression, not show details.

[1] https://www.polygon.com/2016/10/31/13479322/james-cameron-av... [2] http://www.cinemetrics.lv/database.php?sort=asl




You neglect to mention the fact that we are so used to seeing 24 fps that anything above it doesn't look like a movie.

Why do home videos have that "home video" look? The biggest reason is the 60 fps frame rate - it just doesn't feel cinematic. Even the 48 fps of the Hobbit films felt too "lifelike" and not cinematic enough.

A lot of prominent directors, as you've mentioned, say they'd like to move towards a world with higher frame rates. But that'll be a bitter pill to swallow for a viewing public that unconsciously believes "cinema" means 24 fps.

3D in particular is very difficult to watch at frame rates as low as 24 fps - a big reason it makes so many people nauseous, and a big reason so many directors are saying we need higher frame rates.

High resolution may not be a huge positive but it is definitely not a negative. There's nothing inherently cinematic or better about low resolution like there is about 24 fps, and if excessive sharpness feels jarring in a scene, the cinematographer can elect to use a lens with a softer focus.

And the strobing effect you mention: unless we're talking 3D (where motion blur feels wrong), a slow shutter speed and the consequent healthy amount of motion blur easily avoid strobing.
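For reference, the conventional 180-degree shutter rule ties exposure time (and thus motion blur) to frame rate. These are textbook values, not figures from the thread:

```python
def exposure_time(fps, shutter_angle_deg=180.0):
    """Exposure time per frame for a rotary shutter of the given angle.

    A 180-degree shutter exposes each frame for half the frame interval,
    which gives the classic cinema look: 1/48 s at 24 fps.
    """
    return (shutter_angle_deg / 360.0) / fps

print(exposure_time(24))       # 1/48 s, the traditional cinema exposure
print(exposure_time(48, 270))  # a wider shutter angle restores blur at 48 fps
```

Opening the shutter angle at higher frame rates is one way a cinematographer can keep per-frame motion blur while still gaining temporal smoothness.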


Viewers will get over it. There were industry people opposed to sound, to color, to wide-screen movies, and to digital cameras. They're mostly over that. Adding grain in post-processing is on the way out.

(Film is really dead. There are some directors still making noise about shooting on film, but it's digitized for editing.)


we've had >24fps for a number of years.

It just doesn't look film-y. I suspect it won't be until VR that we'll see proper high framerate.

It's just such an oddity that I don't think people will take the risk, given the expense of retrofitting cinemas. (Plus virtually no TV is actually capable of properly doing 60fps.)

just one point:

>Adding grain in post-processing is on the way out.

That's here to stay. Most film grain from 2008 onwards is digital (yes, even on movies shot on film), because most footage will have gone through a VFX pipeline with a digital intermediate (DI). Grain is stripped out and put back in afterwards.

Grain is a good visual cue for a number of things, just like desaturation of colour. It's a tool that's not going to go away.


"It doesn't look film-y" is exactly the kind of excuse we used to hear in the past. The reality is that once you've watched movies in 48/60fps you can't really go back to low-framerate movies; you see them as blurry and stuttering. I personally can't wait for 24fps to be a thing of the past, especially for action movies.


At which point it will look like a youtube video to many, not like a movie in the cinema. High frame rates haven't been successful for several years, I don't see why this should change. Same for 3D. Maybe there will be another trend that enables it, but as of now 3D wasn't a great success.


> but as of now 3D wasn't a great success

I too often find 3D gimmicky, but probably because the technology varies greatly between productions, cameras and theater displays. On the other hand, there are 3D showings every day at every big cinema, and 3D TVs as well. So I'm not sure we can say that it wasn't a great success.

> High frame rates haven't been successful for several years

The number of cinemas that can display 48fps is not great, and the number of cinemas that can display 60fps may well be zero. So I don't know how you can say that "high frame rates haven't been successful for several years".

Actually if you look on Youtube, high FPS videos are successful.

> At which point it will look like a youtube video to many, not like a movie in the cinema

There are some great youtube videos out there, don't know why you're saying this. Cinema is what you're defining as 24fps, sure because you're used to it. If tomorrow we start watching a lot of 60fps movies then you will define it as Cinema. Objectively 60fps is better for action movies anyway, the rest will follow.


> The number of cinemas that can display 48fps is not great, the number of cinemas that can display 60fps is zero?

any cinema that can do 3D can do 48FPS, at least.

RealD uses a single projector, with an electrically controlled circular polarising filter on the front.

This is why 3D in cinemas is terrible for any kind of action: you get juddering nastiness.

To get round this, some places project at 96 FPS (well, I thought it was 144, but that might have been the limit of the projector where I worked).


> any cinema that can do 3D can do 48FPS, at least.

You'll have to explain to me why they weren't showing The Hobbit in 48fps then. You sometimes had to go to a different country to see it in 48fps.


because buying the film in HFR costs extra...


AFAIU 3D TVs are not a thing anymore

http://www.businessinsider.com/3d-tv-is-dead-2017-1


I think it will change because high frame rates look much better. It's not what people are used to, but what people are used to changes over time.

3D has two major problems. First is that the technology sucks. The glasses are heavy, bulky, and don't do a particularly good job of filtering out the opposite eye's channel. Second is that filmmakers don't understand how to do 3D at all. Every 3D film I've seen loves to add parallax where there should not be parallax. They don't understand that binocular depth perception only works out to a few dozen feet, which causes anything with observable parallax to be perceived to be nearby, and that in turn causes large objects to look tiny. Seeing a spaceship or airplane or mountain that looks like a toy because the filmmakers decided to "pop" it out of the screen is the exact opposite of a cinematic experience.

High frame rate doesn't have this problem. The technology is good, and using it properly in films doesn't appear to be a failing.


Exactly!

Fake parallax that just turns epic scenes and scenery into tabletop models.

It's obvious, but why do they ruin their efforts like that? Don't they watch their own movies after the 3D post-editing?

I think even Avatar took it too far. I think I've watched some animated films that didn't totally blow, but almost every other film that I have seen in 3D was a disappointment.


This is complete speculation, but my guess is something like: the people who might understand this (skilled directors and such) are used to 2D and don't much care for 3D, and the people who push 3D (executives) are too obsessed with making things "pop" to realize what they're doing.


True 3D cameras are a massive pain in the arse

Either they are huge, to get two cameras side by side, or they have a half-silvered mirror arrangement (with colour disparity).

Add to that the rig wobble (vomit-inducing) and the fact that the distance between the cameras is far too wide; it all looks a bit poop, or requires a huge amount of post work to make it fly.

So the normal thing to do is manually cut out each object (rotoscope) and adjust the divergence to place it in 3d.

every object, every frame.

it mostly looks a bit poop.

Not to mention it's normally done quickly; Clash of the Titans, for example, was converted in about a month.


Having worked in the industry, I've seen UltraHD with a proper setup. (with the 192 channel sound)

for documentaries and sports, yeah, it's brilliant.

But for "film"? it sucks.

why?

Because it looks like a play, but with bad acting. Everything that we have learnt, subconsciously, about film, is based on 24 FPS. Any action of any narrative substance in a modern film is linked to a slowmo. This relies on 24FPS. Because things are smooth, we register it as different.

Now, I suspect where high framerate will be a thing is in VR. But that's a new medium, in the same way the talkies were.


Watch a bunch of 60fps movies and I assure you that you will look back at 24fps movies and think those look weird.


... I have, I do, I see lots of them.

UltraHD is 8K @ 60fps.

It's great for sports and nature documentaries. Films look like plays; actors look stilted and wooden.


In the 2D animation industry it's mostly irrelevant since we are animating on ones, twos, and fours (every single, second, or fourth frame at 24fps) depending on what is happening in the shot.

There is also not nearly as much tweening as you might expect. Sometimes animating on ones just cannot accurately give you the same effect as letting a person's brain fill in the missing info. Which is why watching The Hobbit in 48fps pulled me out of the movie at some points: I appreciated the extra clarity, but there were details that would otherwise have just been a blur, and they became distracting.


> there were details that would have otherwise just been a blur that became distracting

That's a good explanation which fits with my experience. I wonder, could that aspect of 48fps be mitigated by bumping up motion blur (i.e. lowering the shutter speed) during such moments?


When I first watched the Hobbit, I didn't really notice any difference other than pan-intensive scenes didn't look as washed out. I don't recall thinking it was too lifelike or realistic. When I heard that it was because it had a higher framerate, I decided to start using frame interpolation via SVP[0] on my video player to artificially create more frames between each original frame, and I'm really happy. It isn't perfect, in some scenes there can be artifacts, but it mostly looks great. In action-packed fighting scenes you can finally see what's happening, and not just one big mush of colored abstracts.

I liked it so much that I even went out of my way to buy a TV that had a built-in frame interpolation that's been said to be better than most other high-end TVs.

[0]: https://www.svp-team.com/wiki/Main_Page
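Real interpolators like SVP estimate motion vectors between frames; purely for illustration (none of this is SVP's actual algorithm), the crudest possible approach just cross-fades adjacent frames, here represented as rows of grayscale pixel values:

```python
def blend_frames(a, b, t):
    """Naive frame interpolation: linear cross-fade between frames a and b.

    t=0 gives frame a, t=1 gives frame b. Motion-compensated tools like
    SVP do far better; a plain cross-fade ghosts on fast motion instead
    of tracking it, which is where the visible artifacts come from.
    """
    return [(1 - t) * pa + t * pb for pa, pb in zip(a, b)]

frame_a = [0, 0, 255, 0]  # a bright pixel...
frame_b = [0, 0, 0, 255]  # ...moves one position right in the next frame
print(blend_frames(frame_a, frame_b, 0.5))  # [0.0, 0.0, 127.5, 127.5]
```

The midpoint frame shows the pixel half-faded in both positions (ghosting) rather than moved halfway, which is exactly the failure mode motion estimation exists to fix.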


I finally understand people who don't like 3d movies. Your comment makes me feel ill! I hate high framerates in movies. Just ruins it imo.


Is this a real concern of ordinary people? I am not a movie expert, just somebody who watches movies a lot and I have never thought to myself "this new movie with 48fps doesn't look like a movie, it's too lifelike, not cinematic enough".

I assume people who think like that are a very marginal minority of movie experts. I don't believe the average viewer would even think of such an argument.


It's the opposite, actually. "Experts" are pushing for higher framerates, but "average viewers" complain that something feels wrong, home-video-y, and just plain weird on high-FPS movies. They can't put their finger on it, they certainly won't mention the framerate, but they tend not to like it.


Because ordinary people don't spend the time to appreciate it; it just looks too different to them. Exactly how color movies looked too different at the time.


You're pretty wrong here. Critics panned color movies; consumers liked them. This is the inverse of that.


Not true. This is the same pattern you saw with the iPhone ("nobody would use that"), or CDs, or iPads, or video games, or... there are always people who are against progress.


It's not a technical argument they make, it's just a gut reaction to 48fps looking different. The fact that it's more lifelike can actually make it feel fake, because it can feel like you're watching actors on a set.

Personally I do think people will get over it eventually, especially for less "cinematic" stuff. Or possibly variable framerate will become a thing, and directors will choose different framerates for different effects.


I see. I remember seeing Hobbit movies for the first time and they looked slightly different from other movies. But to me it seemed like a better quality so I didn't complain.

Though, I heard some people say that scenes in old LOTR movies looked more realistic. This was mostly because Hobbit used more CGI though, for example orcs in Hobbit movies were all CGI but before in LOTR they had real actors to play orcs.


> The fact that it's more lifelike can actually make it feel fake, because it can feel like you're watching actors on a set.

I personally didn't get that feeling either. Just got a feeling of "it looks better". Especially that dragon scene.

> variable framerate will become a thing

It already is a thing with slow motion, but not in the way you're thinking of.

Have you ever tried to do slow motion with a 30fps video? It doesn't look good. So your idea will probably look bad; inserting 24fps sequences into a 60fps movie will just look laggy.


4k is less resolution than 35mm film, which was used very successfully for most of the history of cinema. Chris Nolan shot Interstellar & his Batman films on 70mm, which exceeds 8k resolution.

So I think you're quite wrong about 4k being "too much resolution for closeups" or being "incompatible with... shorter shot lengths".


4K is about 25MP. That's around the resolution limit of 35mm film in still photography. The exposed area is considerably smaller in movies as the film is run vertically rather than horizontally.


Without getting into the old and terribly complicated film vs digital argument, I think it's widely agreed that 35mm and 4k are at least in the same ballpark of resolution, and that IMAX exceeds 4k, so I maintain that shooting 4k does not present any new issues around closeups or quick cuts.


How is it 25 MP? 4096 × 2160 pixels = 8.8 MP.


Oh, oops, I thought 4k referred to the vertical resolution. In that case, yeah, film possibly has a bit more resolution still.


If you do that for three colors separately that is around 25MP.
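The arithmetic in this subthread, spelled out (DCI 4K dimensions; the "×3" counts each color channel as a separate sample, which is how the ~25MP figure arises):

```python
width, height = 4096, 2160  # DCI 4K frame dimensions
pixels = width * height

print(pixels / 1e6)      # ≈ 8.85 megapixels as normally counted
print(pixels * 3 / 1e6)  # ≈ 26.5 "MP" if R, G and B are counted separately
```

So both posters are right under their own counting convention: ~8.8MP of pixels, ~25MP of color samples.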


24 frames/second is a minimum. Their FAQ says you can shoot at a high framerate.

Therefore, the best way to interpret "Bitrate of at least 240 Mbps (at 23.98/24 fps) recording" is probably that if you shoot at 24 frames per second, then you must have a bitrate of 240 Mbps. Not that 24 frames/second is the only framerate allowed.
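Under that reading, the per-frame budget at the stated minimum works out as follows (simple arithmetic on the quoted numbers, not anything from the FAQ itself):

```python
bitrate_bps = 240e6  # the quoted 240 Mbps minimum at 24 fps
fps = 24

bits_per_frame = bitrate_bps / fps
print(bits_per_frame / 1e6)      # 10.0 megabits per frame
print(bits_per_frame / 8 / 1e6)  # 1.25 megabytes per frame
```

Presumably the point of phrasing the requirement that way is to fix the per-frame quality, so a higher frame rate would need a proportionally higher bitrate.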



