Can’t find an “ask tech” community, so asking here.

Most screens have a refresh rate of at least 60 Hz, our storage space is massive, and it hurts to watch videos with fast-moving objects or scenes.

Anti Commercial-AI license

  • randombullet@programming.dev · 22 days ago

    Because the classic motion-blur look is still based on 24 fps.

    If you shoot 60 fps, then using the 180° shutter rule you’re looking at a 1/120-second shutter, which lets in much less light per frame than 24 fps (1/48 or 1/50 second).

    Also, slow motion shot at 240 fps will now only be 4 times slower (240/60) rather than 10 times slower (240/24).
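
    A quick sketch of the arithmetic in this comment, using the same example frame rates:

    ```python
    # 180-degree shutter rule: shutter speed = 1 / (2 * frame rate)
    def shutter_speed(fps):
        return 1 / (2 * fps)

    # Slow-motion factor: capture frame rate / playback frame rate
    def slowmo_factor(capture_fps, playback_fps):
        return capture_fps / playback_fps

    print(shutter_speed(24))        # 1/48 s  (~0.0208 s)
    print(shutter_speed(60))        # 1/120 s (~0.0083 s), less light per frame
    print(slowmo_factor(240, 24))   # 10x slower when played back at 24 fps
    print(slowmo_factor(240, 60))   # 4x slower when played back at 60 fps
    ```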

  • kitnaht@lemmy.world · 22 days ago

    What you can search to find more information on the subject is the “Soap Opera Effect”. Essentially, people are so used to roughly 24 fps (23.976 fps in practice) that higher framerates are considered to ‘look funny’ when the camera pans, etc.

  • 8263ksbr@lemmy.ml · 22 days ago

    In addition to the other commenters: there is still a lot of handmade visual-effects work (painting/masking frame by frame), so costs would explode with more frames. As far as I remember, “The Hobbit” was shot at a high frame rate (48 fps) and partially shown that way. While I loved it, the make-up crew suddenly had trouble hiding all the glue on the dwarves’ beards, and skin suddenly looked realistic instead of “Hollywood”.
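
    To put rough numbers on that frame-by-frame cost, here is a small sketch; the runtime is just an illustrative assumption:

    ```python
    # Hand-painted/rotoscoped work scales roughly linearly with frame count.
    # The runtime below is an illustrative assumption, not The Hobbit's.
    runtime_minutes = 150
    for fps in (24, 48, 60):
        frames = runtime_minutes * 60 * fps
        print(f"{fps} fps -> {frames:,} frames to touch")
    # 24 fps -> 216,000 frames
    # 48 fps -> 432,000 frames
    # 60 fps -> 540,000 frames
    ```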

    Here is some random link about that: https://www.forbes.com/sites/johnarcher/2014/12/16/hate-the-hobbit-in-high-frame-rate-you-just-need-to-see-it-again/

    • Squibbles@lemmy.ca · 22 days ago

      I also remember a behind-the-scenes piece about The Hobbit that talked specifically about how the cameras they were using added a colour cast, which required making all the makeup much redder. So if you see on-set photos/video, everyone looks really red/flushed in order to compensate for the high-speed cameras’ colour shift.

  • frezik@midwest.social · 22 days ago

    Bandwidth is one. Going from 30 fps to 60 fps doesn’t necessarily double your bandwidth (due to how video compression works), but it doesn’t help.

    On the recording side, things get even worse. You want very light compression, preferably none if you can. At higher resolutions, that can stress the limits of hard drive throughput, and flash storage is still expensive.
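
    For a rough sense of scale, a sketch of the uncompressed data rate; the resolution, bit depth, and chroma subsampling here are assumptions for illustration:

    ```python
    # Rough uncompressed data rate: pixels per frame * bits per pixel * fps.
    # 4K UHD, 10-bit, 4:2:2 chroma (~20 bits/pixel) are illustrative assumptions.
    width, height = 3840, 2160
    bits_per_pixel = 20
    for fps in (24, 30, 60, 120):
        gbit_s = width * height * bits_per_pixel * fps / 1e9
        print(f"{fps:>3} fps -> {gbit_s:5.1f} Gbit/s (~{gbit_s / 8 * 1000:.0f} MB/s)")
    ```

    At 60 fps that already works out to roughly 1.2 GB/s before any compression, well beyond what a single hard drive can sustain.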

    At really high resolutions and framerates, the CPU usage of just moving data around can be a stress point. You don’t want to have a CPU fan running in the background during a shoot. You need either a passive cooling solution on a low end CPU, or a bunch of thermal mass to absorb heat for the length of the shoot and then let the fan go crazy as soon as you stop. Oh, and that heat has to stay away from the camera sensor or you get thermal noise.

    Then there’s lighting. Doubling the framerate means half the light hitting the sensor per frame. So you use more light. But studio lights remain expensive because they need high CRI and (especially at higher framerates) need to be flicker-free. The power supplies that make LEDs flicker-free tend to be less efficient or more complicated, so they’re not mass market with economies of scale. Or you can use older, hotter, and even less efficient forms of lighting. The kind that can literally cook an egg on its backside.

    Alternatively, you can turn up the ISO, but now you’re introducing noise again. Edit: you can also use a lens with a big aperture, but those are also expensive and affect the depth of field (the part of the image that’s in focus).
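
    A small sketch of the exposure trade-off being described here; the starting ISO and aperture values are purely illustrative:

    ```python
    import math

    # Raising the frame rate (with a 180-degree shutter) cuts the light per
    # frame, costing stops that must come back from lights, aperture, or ISO.
    def stops_lost(old_fps, new_fps):
        return math.log2(new_fps / old_fps)

    loss = stops_lost(24, 60)                       # ~1.32 stops less light per frame
    print(f"24 -> 60 fps costs {loss:.2f} stops")

    base_iso = 800                                  # illustrative starting point
    print(f"ISO to compensate: {base_iso * 2**loss:.0f}")             # ~2000
    print(f"Aperture to compensate: f/{4 / math.sqrt(2**loss):.1f}")  # f/4 -> ~f/2.5
    ```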

    It’s all stuff that can be solved with some amount of money, but even the LTT or Mr Beast level channels struggle to get there.

  • thingsiplay@beehaw.org · 22 days ago

    Maybe a better place to ask would be one of the following places:

    For your question, are you talking about streaming videos? At least on YouTube, videos can be streamed at 60 fps. The higher the fps, the more bandwidth and server storage is required, which is why YouTube limits 4K at 60 fps to paying customers. And for casual smartphone videos, isn’t 30 fps at 4K the default? People prefer higher resolution because they probably don’t understand what fps is (I mean the non-tech moms).

    Are you talking about movies? Movies usually aren’t filmed at 60 fps, so it would be a waste to stream them at 60. I’m not sure if this is still true for modern films, though; I haven’t kept up with the tech much.