TL;DW:

  • FSR 3 is frame generation, similar to DLSS 3. It can greatly increase FPS, by roughly 2-3x (see the sketch after this summary).

  • FSR 3 can run on any GPU, including consoles. They made a point about how it would be dumb to limit it to only the newest generation of cards.

  • Every DX11 & DX12 game can take advantage of this tech via HYPR-RX, which is AMD’s software for boosting frames and decreasing latency.

  • Games will start using it by early fall; the public launch will come by Q1 2024.

It remains to be seen how good or noticeable FSR 3 will be, but if it actually runs well, I think we can expect tons of games (especially on console) to make use of it.
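
For intuition, here is a back-of-the-envelope sketch of where a 2-3x figure can come from: upscaling raises the render rate, and frame generation then presents roughly one generated frame per rendered frame. Everything below is an illustrative assumption, not an FSR 3 benchmark.

```python
# Rough arithmetic behind "2-3x FPS" claims for upscaling + frame generation.
# Illustrative only: real gains depend on the game, GPU, and settings.

def presented_fps(native_fps: float, upscale_speedup: float, frame_gen: bool) -> float:
    """Upscaling raises the render rate; frame generation then inserts
    roughly one generated frame per rendered frame, doubling presented FPS."""
    rendered = native_fps * upscale_speedup
    return rendered * 2.0 if frame_gen else rendered

base = 40.0                                       # hypothetical native frame rate
print(presented_fps(base, 1.0, frame_gen=False))  # 40.0  (native)
print(presented_fps(base, 1.5, frame_gen=False))  # 60.0  (upscaling alone)
print(presented_fps(base, 1.5, frame_gen=True))   # 120.0 (combined, 3x native)
```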

  • kadu@lemmy.world

    You’re not wrong about the reason, but why AMD is behind isn’t the point - they’re fundamentally behind, on technologies that actually do have a large impact. And tensor cores are hardware, there’s no getting around that.

    I’m not buying a card that doesn’t support DLSS ever again. Even if the performance is not needed, deferred rendering forces at least a couple of temporal reconstruction steps, and they look horrendous; there are entire communities revolving around discussions of how ugly this looks. DLSS genuinely looks better - in many tests, better than native.

    So here’s a fundamental improvement to modern rendering and AMD simply can’t keep up. Let alone the gap in ray tracing performance.

    • Dudewitbow@lemmy.ml

      Which then comes down to the question of price/perf. It’s not that “DLSS is better than FSR” is wrong, but when you factor in price, some price tiers start to get funny, especially at the low end.

      For the LONGEST time, the RX 6600, which out of the box was about 15% faster than the 3050 and significantly cheaper, was still outsold by the 3050. Using DLSS just to match the performance another GPU delivers natively (which is objectively better: no artifacts, no added latency) is where the argument of never buying a GPU without DLSS gets weak, because in some price brackets what you could get at the same or a similar price might be significantly better.
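
      As a rough illustration of that price/perf math (the 15% gap is from above; the dollar prices are made-up placeholders, plug in real street prices):

      ```python
      # Hypothetical price/perf comparison for the RX 6600 vs RTX 3050 example.
      # Relative performance: RX 6600 = 1.15, RTX 3050 = 1.00 (the ~15% gap above).
      # Prices are illustrative placeholders, not historical market data.

      def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
          return relative_perf / price_usd

      rx_6600 = perf_per_dollar(1.15, 230.0)   # faster and assumed cheaper
      rtx_3050 = perf_per_dollar(1.00, 280.0)  # baseline
      print(f"RX 6600: {rx_6600 / rtx_3050:.2f}x the perf per dollar")  # ~1.40x
      ```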

      In terms of modern GPUs, the 4060 Ti is the one card that, for the most part, everyone should avoid (unless you’re a business in China that needs GPUs for AI due to the U.S. government limiting chip sales).

      Sort of the same idea with RT performance too. Some people make it sound like AMD can’t do RT at all. Usually AMD’s RT performance is a generation behind, so a matchup like the 7900 XTX vs. the 4080 could swing toward the 4080 on value, but in a situation like the 7900 XT, which was at one point selling for $700, its value, RT included, was significantly better than the 4070 Ti’s as an overall package.

      • kadu@lemmy.world

        I think we will fundamentally disagree here, not that I’m claiming you’re wrong.

        If AMD performs better without DLSS, but with DLSS the Nvidia card can pull ahead, the Nvidia card is ahead. There’s no “fake” performance. I don’t care how a frame gets rendered if it looks good and performs correctly.

        • Dudewitbow@lemmy.ml

          Which is what I’m saying, on the condition, of course, that the GPUs are priced close enough (e.g. 4060 vs. 7600). But when there’s a deficiency in a card’s spec (e.g. 8 GB of VRAM) or a large discrepancy in price, it would usually favor the AMD card.

          It’s why the 3050 was a terribly priced GPU for the longest time, and currently the 4060 Ti is the butt of the joke; nobody should pick those over the AMD cards in the same price range, due to both performance and hardware deficiency (VRAM, in the case of the cheaper 4060 Ti).

          • kadu@lemmy.world

            Unless they want DLSS to avoid the ugliest possible TAA artifacts. Or they care about ray tracing. Or they want DLSS 3 to play games that aren’t latency critical at significantly higher performance.

            Being better at rasterization would be the end of the argument 10 years ago. It isn’t today.

            • Dudewitbow@lemmy.ml

              In the case of the 4060 Ti 8 GB, turning on RT pushes it past the 8 GB threshold, killing performance, so hardware deficiency does matter in some cases.

              • kadu@lemmy.world

                In the very few games where this was true, recent updates fixed that - mostly because they were hammering VRAM as a quick compensation for their reliance on console decompression hardware.

                Seriously. Give me a title where this is true, and we can search for data on the most recent patch.

                • Dudewitbow@lemmy.ml

                  Many games fixed that by adjusting loading on the fly, and how well that works can vary depending on how often assets have to be swapped. A game that still has odd issues with 8 GB of VRAM is Halo Infinite, mainly because it’s hard to test: the problem arises once you reach the open-world part of the game, which takes about 30 minutes. It was discussed in a HUB video a month or two ago. Models and textures like bushes start to look worse from that point on.

                  Games are adjusting assets on the fly, so even though the framerate may seem “normal,” the visual quality nowadays might not be.
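
                  A minimal sketch of that on-the-fly adjustment (all names and numbers below are hypothetical, not Halo Infinite’s actual streamer): when the working set exceeds the VRAM budget, the engine drops mip levels on the largest textures, so the framerate holds but visuals quietly degrade.

                  ```python
                  # Hypothetical texture streamer: keep the frame rate by degrading
                  # texture quality once the VRAM budget is exceeded.

                  def fit_textures(sizes_mb: list[float], budget_mb: float) -> list[float]:
                      """Drop one mip level (~1/4 the memory) from the largest
                      texture until the working set fits in the budget."""
                      sizes = sorted(sizes_mb, reverse=True)
                      while sum(sizes) > budget_mb and sizes[0] > 1.0:
                          sizes[0] /= 4.0  # one mip level down quarters the footprint
                          sizes.sort(reverse=True)
                      return sizes

                  # ~5.5 GB of textures vs a 4 GB budget left after RT structures, etc.
                  print(fit_textures([2048.0, 2048.0, 1024.0, 512.0], 4096.0))
                  # -> [2048.0, 1024.0, 512.0, 512.0]: FPS "normal", quality reduced
                  ```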