• leave_it_blank@lemmy.world · 1 year ago

    I must admit, when I got my 144hz monitor I was excited, coming from a 60hz one. But even when a game runs at 144 fps I don’t see much of a difference. Many people do, but I don’t. It’s a bit smoother, but not by much.

    But if a game runs at 30 fps it’s horrible. The Crew, for example, can be switched between 30 and 60 fps, and that’s night and day!

    • EmiliaTheHero@possumpat.io · 1 year ago

      Yeah, 144hz makes a significant difference for competitive FPS games (especially fast paced ones like Overwatch), but I hardly notice a difference when playing single player or PvE oriented games.

      Hell, on some games (e.g. Borderlands 3 and CP2077) I actually prefer to play on my 60hz monitor since a smooth 60hz is much more enjoyable IMO than an inconsistent 100-144hz experience. My computer is admittedly pretty old though.

      • flashgnash@lemm.ee · 1 year ago

        144hz in Overwatch feels like putting glasses on for the first time; my brain can actually track movement properly.

        Most other games I barely notice the difference though

      • onion@feddit.de · 1 year ago

        You can cap the fps in software; no need to switch monitors.

        Also, personally I always notice the difference, even when scrolling webpages.
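        A software fps cap is essentially a per-frame sleep: measure how long the frame took, then wait out the rest of the frame's time budget. In-game limiters and driver-level tools (e.g. RTSS) do a more precise version of this. A minimal sketch in Python, where `render_frame` and the 60 fps target are just illustrative stand-ins:

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.67 ms per frame at 60 fps

def render_frame():
    # stand-in for the game's actual per-frame work
    pass

def run_frames(n):
    start = time.perf_counter()
    for _ in range(n):
        frame_start = time.perf_counter()
        render_frame()
        # sleep away whatever is left of this frame's time budget
        leftover = FRAME_BUDGET - (time.perf_counter() - frame_start)
        if leftover > 0:
            time.sleep(leftover)
    return time.perf_counter() - start

elapsed = run_frames(6)
print(f"6 frames took {elapsed * 1000:.1f} ms")  # roughly 100 ms at 60 fps
```

        Real limiters use busy-waiting near the deadline instead of plain `sleep`, since OS sleep granularity would otherwise cause frame pacing jitter.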

      • leave_it_blank@lemmy.world · 1 year ago

        Yes, many do. I’m just one of the unlucky ones. But at least I can see the difference between 1080p and 4k. It’s the little things in life…

    • QueriesQueried@sh.itjust.works · 1 year ago

      Just to make sure, since it happens a lot: you did change your monitor’s refresh rate in your OS, right? Windows for some reason really likes to default to no higher than 60hz. You’d also probably want to enable variable refresh rate (VRR) in your GPU settings if it’s available. And if you do have VRR, some games are weird and have a specific Vsync option for it, while in others VRR works just fine with normal Vsync.

      • Orygin@sh.itjust.works · 1 year ago

        Was gonna say the same. I’ve had this discussion before… “Dude, 144hz is a scam, it’s the same as 60 for me.” My brother in Christ, did you enable it in Windows?!

        • QueriesQueried@sh.itjust.works · 1 year ago

          Yeah, it honestly shocks me (I mean… not really, but y’know) that Microsoft has not done anything about it. Surely someone from the team that keeps trying to jam Edge down people’s throats could just port that over for when people have 60hz+ monitors plugged in.

        • leave_it_blank@lemmy.world · 1 year ago

          Yes, everything is configured correctly, but thank you. I’ve often read that people forget that, so it was the first thing I checked after connecting the monitor.

      • leave_it_blank@lemmy.world · 1 year ago

        Yes, yes and yes :) But thank you, I’ve seen enough posts where people forgot it, so this suggestion definitely was worth mentioning!

        • QueriesQueried@sh.itjust.works · 1 year ago

          Dang, I was hoping that was the issue since it’s so easy to fix haha. May I ask what monitor you have, just out of curiosity? The other thing I’d mention is the Overdrive setting being off: if it’s too low it’ll look smeary, and if it’s too high… well, it’s also smeary, just “reversed” compared to low. Sorry it isn’t working/noticeable for you though, it is great when it’s present.

          • leave_it_blank@lemmy.world · 1 year ago

            It’s the Gigabyte M32U. It’s a good monitor; like I said, it’s just me, some people just don’t seem to benefit from it. It is a bit better than 60 Hz, but to me the difference isn’t big.

            I followed and tried many things from here: https://pcmonitors.info/reviews/gigabyte-m32u/, including overdrive. I bought a new cable and tinkered with the settings on both the monitor and the driver. It’s just me :) maybe it has to do with age!

            • QueriesQueried@sh.itjust.works · 1 year ago

              There’s a chance you might just not notice it, for sure, though most people can tell immediately when they get put back in front of a standard 60hz display. It might be worth running the UFO test to check both your eyes and your monitor. On that site it should be very obvious whether your eyes are doing the tricking or the monitor isn’t performing correctly. If you have a newer phone with a 90hz+ display, you can also use that as a sanity check.

              I hadn’t heard of that site before and their writing seems… odd. There’s still a couple of things it could be, though they get more funky. FRC might be enabled on the monitor, which on some models causes issues at high refresh rates, as can adaptive sync (G-Sync/FreeSync). It could also (if you’re unlucky) still be the cable, the port on your GPU, or the GPU itself if it’s too old to support Display Stream Compression. There’s also a chance the GPU straight up can’t do 4k while your settings are set to 120hz, or vice versa, or, even more fun, it might claim to be doing one of those while doing neither (or just one, but saying it is doing both). Monitor issues are the worst lol.

              Anyways, sorry if I couldn’t help. I’m certain there’s a pretty good chance it is not your eyes, but between Windows… being as it is, and monitors being notoriously annoying to diagnose, it’s not a fun one to track down.

              • leave_it_blank@lemmy.world · 1 year ago

                The monitor is working; like I said, there is a difference from 60 Hz, it’s just not as big a deal as I was expecting. I don’t feel like I’m missing out when playing on my old 60 Hz monitor. When I run the UFO test I can see it’s working, and the games I usually play reach high frame rates. I used the cable that came with the monitor and also got a new one (primarily because the original was a bit too short). It’s on me.

                But don’t get me wrong. The workspace alone was worth the upgrade, so I’m not depressed or something like that. And I have fun playing my games, and that’s what matters.

                Thank you for your tips, I really appreciate it!!!

    • miss_brainfart@lemmy.ml · edited · 1 year ago

      Two things are important here:

      1. The faster something on screen moves, the higher your framerate needs to be for a given level of motion blur.

      A 2D point-and-click adventure at 30 fps could have comparable motion blur to a competitive shooter at 180, for example.

      2. Framerate is inversely proportional to frametime, which is what makes the difference harder to notice the higher you go.

      From 30 to 60? That’s an improvement of 16.67 ms. 60 to 120 makes 8.33 ms, 120 to 240 only improves by 4.17 ms, and so on.
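      Those numbers fall straight out of the definition: frametime in milliseconds is 1000 divided by the framerate, so each doubling halves the remaining gain. A quick check:

```python
def frametime_ms(fps):
    # one frame's duration in milliseconds at a given framerate
    return 1000 / fps

for low, high in [(30, 60), (60, 120), (120, 240)]:
    gain = frametime_ms(low) - frametime_ms(high)
    # prints 16.67, 8.33, 4.17 ms respectively
    print(f"{low} -> {high} fps saves {gain:.2f} ms per frame")
```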

      Ah, something I want to add:
      That’s only explaining the visual aspect, but frametimes are also directly tied to latency.

      Some people might notice the visual difference less than the latency benefit. That’s the one topic where opinions on frame generation seem to clash the most, since the interpolated frames provide smoother motion on screen, but don’t change the latency.

    • Holzkohlen@feddit.de · 1 year ago

      It’s super dependent on the game. Baldur’s Gate 3? 30 fps is more than enough. League of Legends? Yeah, I’ll take those 144hz, though to be honest I don’t notice a big difference compared to 60 fps.