Especially with details like fur or hair. I'm used to older games having blocky and angular models (more or less depending on when they were made), with jagged edges from aliasing and the simple transparency effects used for stuff like hair.

Modern games just look less defined up close. Is it a bunch of post-processing effects? Does my monitor have especially chunky pixels?

  • makotech222 [he/him]
    ·
    2 years ago

    Disable

    • Depth of Field
    • Motion Blur
    • TAA/DLSS
    • Chromatic Aberration

    Enable

    • Render Resolution x 1.5 or alternative upscaling

    That fixes most of the blurriness of modern games. It really sucks because at max settings these games look like a blurry fucking mess.

    • MoreAmphibians [none/use name]
      ·
      edit-2
      2 years ago

      I think sometimes these settings are hidden under a generic "post-processing" setting. If you can't find the above, try turning post-processing down and see if that helps.

      • makotech222 [he/him]
        ·
        2 years ago

        yeah, always check PCGamingWiki for a game, usually someone will document how to disable specific post-processing effects.

  • beanyor [she/her]
    ·
    edit-2
    2 years ago

    I think the biggest contributor to what you said is that complicated effects like hair or clouds tend to be rendered over the course of multiple frames now. Instead of rendering the complete effect every frame, they render ~25% of the effect and then combine that with the previous couple of frames to get the final result.

    I first heard about it when Horizon Zero Dawn used this technique in order to make realistic clouds: https://www.guerrilla-games.com/media/News/Files/The-Real-time-Volumetric-Cloudscapes-of-Horizon-Zero-Dawn.pdf

    As it says on page 93 of the document I linked, this decreases render time by a lot without significantly changing the underlying algorithm, which makes techniques like this very attractive to (overworked game industry) graphics programmers trying to hit their performance goals.

    I think it looks pretty good for smooth things like clouds, but now it's used all over the place. And because it is only actually approximating the full image, it can lead to obvious loss of detail when used in the wrong places.
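    The amortized rendering described above can be sketched in a few lines. This is a toy illustration, not code from any real engine: instead of computing every pixel each frame, it refreshes ~25% of the pixels per frame and reuses the previous frames' results for the rest. The function name and pixel layout are made up for the example.

```python
# Toy sketch of amortized ("temporal") rendering: update 1/4 of the
# pixels each frame, keep the history for the other 3/4. On a static
# scene the image converges after 4 frames; on a moving scene the stale
# pixels are what you see as lost detail or trailing.

def render_amortized(target, history, frame_index, fraction=4):
    """Refresh 1/`fraction` of the pixels this frame; reuse history elsewhere."""
    out = list(history)
    for i in range(len(target)):
        if i % fraction == frame_index % fraction:  # this frame's slice
            out[i] = target[i]                      # freshly rendered pixel
    return out

# A static 8-"pixel" scene, starting from stale (black) history.
target = [10, 20, 30, 40, 50, 60, 70, 80]
image = [0] * len(target)
for frame in range(4):
    image = render_amortized(target, image, frame)
print(image)  # after 4 frames every pixel has been refreshed
```

    Note that each frame does only a quarter of the work, which is exactly why the technique is so attractive for expensive effects like volumetric clouds.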

    • save_vs_death [they/them]
      ·
      2 years ago

      i think i noticed in FF7: Remake, when i flicked the camera around i would sometimes see weird trailing artifacts from certain sources, i am now sure this was why (that and me managing to frustrate the built-in checks that are supposed to stop this from happening), huh, neat

  • macabrett
    ·
    2 years ago

    This combined with how detailed games are getting has made it really hard for me to like... understand what I'm looking at when playing a game. Like, there's something about modern games where I just cannot follow what I'm supposed to be doing or where I'm supposed to be going unless it has a giant UI element pointing me in the right direction.

    If a game has cartoon-y graphics or is an indie game with a less detailed world, I have no issue. 100% only a problem with AAA games.

    • hexaflexagonbear [he/him]
      ·
      2 years ago

      That's partly a design choice. With photorealism, making important objects visually distinct (or even glowing) breaks immersion in a sense, since the norm is that there's no visual distinction between generic world objects and important ones. But as you said, now you need all these UI elements to point you where to go, which also breaks immersion.

  • culpritus [any]
    ·
    edit-2
    2 years ago

    As beanyor mentioned, a lot of shaders now exist for these kinds of animated effects, and they aren't necessarily made to look detailed.

    Shaders can provide all kinds of interesting effects, but I think many devs want a 'cinematic' look, so this leads to a lot of shaders that simulate things like lensing effects, depth of field, vignetting, etc. This ultimately degrades the sharpness of the image compared to older games that didn't do these kinds of things. It can look cool in specific instances, but when you want to look at some arbitrary element of the game, it really can get in the way as well. This is part of the reason chasing photorealism doesn't make a lot of sense for many games. An art style that isn't trying to look like a movie can be very engaging and evocative without looking oddly blurry or smudged.
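    To make one of those 'cinematic' shaders concrete, here's a minimal sketch of a vignette, which darkens pixels toward the screen edges. Real engines run this per pixel on the GPU; the function name and the quadratic falloff constant here are arbitrary choices for the illustration.

```python
# Minimal vignette sketch: scale a pixel's brightness down based on its
# distance from the screen center. strength=0.5 means corners lose half
# their brightness (an arbitrary value for the example).

def vignette(pixel, x, y, width, height, strength=0.5):
    """Darken `pixel` based on its normalized distance from the center."""
    cx, cy = (width - 1) / 2, (height - 1) / 2
    dist = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5       # distance from center
    max_dist = (cx ** 2 + cy ** 2) ** 0.5               # distance to a corner
    falloff = 1.0 - strength * (dist / max_dist) ** 2   # 1.0 center, 0.5 corner
    return pixel * falloff

print(vignette(1.0, 2, 2, 5, 5))  # center of a 5x5 image -> 1.0 (untouched)
print(vignette(1.0, 0, 0, 5, 5))  # corner -> 0.5 (half brightness)
```

    Even this tiny effect throws away information at the edges of the frame, which is the kind of sharpness loss being described.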

  • Commander_Data [she/her]
    ·
    2 years ago

    DLSS is the biggest contributing factor to this, imo. Allegedly it's much better with DLSS 3.0, but I don't have an RTX 4000 series card to confirm or deny.

  • Grandpa_garbagio [he/him]
    ·
    2 years ago

    I know exactly what you're talking about and I have no idea what it is either. Feels like no one notices but me, so I assumed it's because I'm not on 1440p or whatever the standard is now

    • Lovely_sombrero [he/him]
      ·
      2 years ago

      1440p on a 27" monitor seems to be the sweet spot at the moment. But you also need to turn off all post-processing effects like motion blur and film grain.

  • hexaflexagonbear [he/him]
    ·
    2 years ago

    Temporal anti-aliasing has replaced traditional anti-aliasing. It's much less computationally expensive, but it tends to lose sharpness, so it requires an artificial sharpening filter. It's also an average between frames, so cutting between scenes or moving the camera quickly reveals additional artifacts.
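    The frame-averaging part can be sketched as an exponential blend of each new frame into a running history, which is roughly how TAA accumulation works (the blend factor 0.1 is an illustrative value, not from any particular engine). It also shows why a hard cut takes several frames to settle:

```python
# Sketch of TAA-style accumulation: blend the current frame into an
# exponentially-weighted history. With a small blend factor, a sudden
# scene change (a "cut") lingers for several frames, which is the
# trailing/softening artifact described above.

def taa_accumulate(history, current, alpha=0.1):
    """Per-'pixel' exponential blend of the new frame into the history."""
    return [(1 - alpha) * h + alpha * c for h, c in zip(history, current)]

history = [0.0, 0.0, 0.0]   # dark scene
bright  = [1.0, 1.0, 1.0]   # scene cuts to a bright frame
for _ in range(5):
    history = taa_accumulate(history, bright)
# After 5 frames the history has only reached 1 - 0.9**5 ~ 41% of the
# new value, so the old scene is still visibly "ghosting" through.
print(round(history[0], 2))
```

    Real implementations add history rejection (the "built-in checks" mentioned upthread) to detect when the old pixel no longer matches and discard it, but when those checks fail you get exactly the trailing artifacts people report.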