What upscaling was supposed to be for.

  • Lucy :3
    49 · 1 day ago

    On older graphics cards, a.k.a. the 7800 XT? Man, I’d argue the 7xxx gen is not even last gen, but like half-current gen.

    • @Brett@feddit.org
      12 · 1 day ago

      Ha, when reading the headline I thought GCN would get some upgrade in the Linux driver or something. But no, it’s possible FSR4 compatibility with 2-year-old cards to make the image quality less bad.

  • @kugmo@sh.itjust.works
    24 · 1 day ago

    Incredibly sad state of graphics when people say upscaling looks better than the raw image your GPU was supposed to be displaying.

    • @the_riviera_kid@lemmy.world
      14 · 1 day ago

      God, I hate up-scaling with a passion. Lowering your render resolution does a much better job of improving performance while maintaining quality. This was something everyone seemingly knew up until recently. More importantly, it doesn’t take extra processing power just for the end result to look like crispy fried shit.

      • Lucy :3
        2 · 1 day ago

        And (M)FG changes nothing but marketing numbers.

      • Beacon
        1 · 1 day ago

        iirc some GPUs have dedicated upscaling cores, so theoretically in that scenario there should be little/no hit to performance when doing upscaling

    • @brucethemoose@lemmy.world
      9 · edit-2 · 1 day ago

      I mean, DLSS looks great. Can’t speak to FSR, but if it’s anywhere close that’s incredible.

      I’m speaking as a major pixel peeper. I’ve spent years poring over vapoursynth filters, playing with scaling algorithms, being really obsessive about video scaling, calibration, proper processing bit depths, playing games at native res even if I have to run on mostly low, omitting AA because it felt like blur, modding shaders out myself… And to me, DLSS quality or balanced (depending on the situation) looks like a free lunch.

      It’s sharp. Edges are good. Overprocessing artifacts are minimal. It’s not perfect, but infinitely better than naive (bilinear or bicubic) scaling to monitor res.

      My only complaint is improper implementations that ghost, but that aside, not once have I ever switched back and forth (either at native 1440P or 4K) and decided ‘eh, DLSS’s scaling artifacts look bad’ and gone back to native, unless the game is trivial to run. And one gets pretty decent AA as a cherry on top.
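
      For reference, the “naive (bilinear or bicubic) scaling” the commenter contrasts DLSS against is just interpolation between neighbouring pixels. A minimal sketch of bilinear upscaling on a 2D grayscale image (pure Python, function name illustrative):

      ```python
      def bilinear_upscale(img, new_w, new_h):
          """Naively upscale a 2D grayscale image (list of rows) via
          bilinear interpolation: each output pixel is a weighted blend
          of its four nearest source pixels."""
          old_h, old_w = len(img), len(img[0])
          out = []
          for y in range(new_h):
              # Map the output coordinate back into the source grid.
              sy = y * (old_h - 1) / (new_h - 1) if new_h > 1 else 0.0
              y0 = int(sy)
              y1 = min(y0 + 1, old_h - 1)
              fy = sy - y0
              row = []
              for x in range(new_w):
                  sx = x * (old_w - 1) / (new_w - 1) if new_w > 1 else 0.0
                  x0 = int(sx)
                  x1 = min(x0 + 1, old_w - 1)
                  fx = sx - x0
                  # Blend horizontally on the two bracketing rows, then vertically.
                  top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
                  bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
                  row.append(top * (1 - fy) + bot * fy)
              out.append(row)
          return out
      ```

      The blending is exactly what produces the softness the commenter calls “infinitely” worse than DLSS: every upscaled pixel is an average, so edges smear.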

    • @inclementimmigrant@lemmy.worldOP
      4 · edit-2 · 1 day ago

      100% agree.

      It also infuriates me that when all of this upscaling was introduced back in 2018, it was touted as a way to extend the useful life of your GPU. And now in 2025, it’s basically buy a new card, and it’s mandatory to use it if you want the newest games to be playable.

    • @fuckwit_mcbumcrumble@lemmy.dbzer0.com
      1 · 24 hours ago

      Does anyone say it looks better than native? Or do they just accept that the lower resolution + scaling is “good enough”?

      That said I hate it. Give me perfect integer scaling or nothing at all.
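
      “Perfect integer scaling” here means nearest-neighbour scaling by a whole-number factor: each source pixel becomes an exact N×N block, so nothing is interpolated and the image stays perfectly sharp. A minimal sketch (function name illustrative):

      ```python
      def integer_upscale(img, factor):
          """Integer-scale a 2D image (list of rows): every source pixel
          is duplicated into a factor x factor block. No interpolation,
          so the result is pixel-perfect, just larger."""
          out = []
          for row in img:
              # Repeat each pixel horizontally...
              scaled_row = [px for px in row for _ in range(factor)]
              # ...then repeat the whole row vertically.
              out.extend([list(scaled_row) for _ in range(factor)])
          return out
      ```

      This only works cleanly when the target resolution is an exact multiple of the render resolution (e.g. 1080p → 4K at 2×), which is why it is often demanded as an all-or-nothing option.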