Old Nvidia gaming GPUs enjoy huge performance uplifts from new mod — DLSS 3 to FSR 3 mod enables frame generation to deliver up to 75% better performance on previous GeForce RTX GPUs

GeForce RTX 2080 Ti (Image credit: Nvidia)

Don't fret if you don't own one of the best graphics cards. Since AMD released its FSR 3 source code a few months ago, modders have been busy creating mods that inject FSR 3 frame generation into DLSS 3-supported titles, letting users run FSR 3 in games where it isn't officially implemented. Digital Foundry reviewed one of these FSR 3 mods and found it can breathe new life into older RTX 20-series and RTX 30-series GPUs, boosting performance by up to 75%. The only major downside is that these mods only work on GeForce RTX GPUs, not on any AMD GPUs.

Digital Foundry used a global DLSS 3-to-FSR 3 conversion mod that works with any game that natively supports DLSS 3. Like the FSR 2 mods of the past, the FSR 3 mod requires users to drop two new .dll files containing the FSR 3 code into a game's directory. For RTX 20- and 30-series users, the mod Digital Foundry used also includes a registry edit that tricks Windows into thinking these GPUs support DLSS 3. This is necessary so that DLSS 3 frame generation can be toggled on inside the game, which in turn activates FSR 3 through the modded .dll files.
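In practice, the whole process amounts to a file copy plus a registry value. Below is a minimal Python sketch of those two steps; the .dll names, game path, and registry key are placeholders, since the real ones come from the specific mod's own instructions, not from this article.

```python
import shutil
import winreg
from pathlib import Path

# Placeholder names and paths -- the actual file names, game folder, and
# registry key are defined by the mod itself, not by this sketch.
MOD_DLLS = [Path("mod/fsr3_framegen_example.dll"), Path("mod/dlss_shim_example.dll")]
GAME_DIR = Path(r"C:\Games\ExampleGame")  # the folder holding the game's .exe

def install_mod_dlls() -> None:
    """Step 1: copy the mod's two .dll files (which carry the FSR 3 code)
    into the game's install directory."""
    for dll in MOD_DLLS:
        shutil.copy2(dll, GAME_DIR / dll.name)

def apply_registry_tweak() -> None:
    """Step 2 (RTX 20/30 cards): registry edit so the game exposes its DLSS 3
    frame generation toggle, which the modded .dlls redirect to FSR 3."""
    key_path = r"SOFTWARE\ExampleVendor\FrameGenOverride"  # hypothetical key
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "EnableFrameGeneration", 0, winreg.REG_DWORD, 1)

if __name__ == "__main__":
    install_mod_dlls()
    apply_registry_tweak()
```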

Digital Foundry discovered fantastic performance uplifts with the FSR 3 mod in action. The outlet tested several modern AAA titles, including Cyberpunk 2077 and Spider-Man: Miles Morales, on an RTX 3080. In Cyberpunk, the RTX 3080 saw an impressive 67% uplift at 1440p with DLSS Performance upscaling and RT Overdrive enabled. Without FSR 3, the RTX 3080 hovered around the 60-70 FPS mark, but with FSR 3 enabled, the GPU consistently achieved over 100 FPS, with frame rates reaching as high as 120 FPS.

Miles Morales saw a similar performance uplift to Cyberpunk 2077, hovering around 65-75% better performance with FSR 3 enabled. The game was run at 1440p with DLAA and maxed-out graphics settings, including ray tracing. With FSR 3 turned off, the game hovered around 50-60 FPS; with FSR 3 enabled, it ran at about 100 FPS.
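For readers who want to check the math, the uplift percentages are simply the frame-generated frame rate measured against the base frame rate. A quick Python sanity check using the Cyberpunk 2077 numbers above:

```python
# Percentage uplift = (frame-generated FPS - base FPS) / base FPS * 100.
def uplift_percent(base_fps: float, fg_fps: float) -> float:
    return (fg_fps - base_fps) / base_fps * 100.0

# Cyberpunk 2077 at 1440p with RT Overdrive: ~60 FPS base vs. ~100 FPS modded.
print(f"{uplift_percent(60, 100):.0f}% uplift")  # -> 67% uplift
```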

However, Digital Foundry found that the FSR 3 mod suffers from artifact issues not present in most games with proper FSR 3 implementations, including corrupted HUD elements, sizzling hair, and shadow flicker.

The FSR 3 mod isn't perfect, but it demonstrates how advantageous FSR 3's open compatibility is compared to Nvidia's DLSS 3 frame generation, which is strictly limited to RTX 40-series GPUs. With FSR 3, older GPUs can benefit from frame generation technology, significantly extending their useful lifespan. A 65-75% performance gain is roughly what you'd get from upgrading to a GPU two generations newer. The only caveat is that you'll need to get used to FSR 3's additional latency penalty to truly enjoy that extra performance.

The icing on the cake is that FSR 3 can now be modded into games that already support DLSS 3. This is a significant advantage, since relatively few titles natively support AMD's new FSR 3 frame generation tech.

Aaron Klotz
Freelance News Writer

Aaron Klotz is a freelance writer for Tom’s Hardware US, covering news topics related to computer hardware such as CPUs and graphics cards.

  • oofdragon
Fake frames do not equal a performance boost, for the last time. Heck, they don't even equal a motion boost; it's just a gimmick like those old "120hz" TVs that actually ran at 60hz but looked super unrealistic
    Reply
  • Jagar123
    oofdragon said:
Fake frames do not equal a performance boost, for the last time. Heck, they don't even equal a motion boost; it's just a gimmick like those old "120hz" TVs that actually ran at 60hz but looked super unrealistic
    Yep, as Daniel Owen on YT puts it, frame generation is a motion smoothing technology, not a performance enhancing technology. I wish outlets would stop reporting this as performance boosting, especially because you need a good base frame rate to even utilize this technology.
    Reply
  • rluker5
    I'd like to see a comparison of the frame interpolation techniques of Nvidia, AMD and TV manufacturers.

    I've used that AMD driver one and TV ones and the interpolation on my TVs seems more consistent and has less artifacting than AMD's driver based one. I imagine Nvidia can beat the frame interpolation on my TVs, but not the latency on my Samsung at least.
    Reply
  • bit_user
    rluker5 said:
    I'd like to see a comparison of the frame interpolation techniques of Nvidia, AMD and TV manufacturers.

    I've used that AMD driver one and TV ones and the interpolation on my TVs seems more consistent and has less artifacting than AMD's driver based one. I imagine Nvidia can beat the frame interpolation on my TVs, but not the latency on my Samsung at least.
    Invasive framerate-enhancing technologies (e.g. DLSS 3) should typically have superior quality to your TV, since the game engine is computing accurate motion vectors instead of trying to reverse-engineer them through optical flow.

Interestingly, Nvidia made a case in their DLSS 3 presentation that accurate motion vectors aren't always best. I think an example they gave was of a shadow moving across a textured surface. If the surface texture is low-contrast, you get fewer artifacts when the interpolation follows the shadow edges rather than the texture. I think this was the case they made for why their newer GPUs have a hardware optical flow engine, and the DLSS inference stage performs fusion between the two options.

    Ideally, the interpolation engine would also get lighting information from the game and could interpolate the texture & shadow (or other lighting effects) differently, but I doubt they're that sophisticated. One thing they have to balance is how much work they impose on game developers, in order to add support for these post-processing stages. If it's too much work, then fewer games will adopt it and some may even use substandard implementations that have worse quality than if they'd correctly implemented a simpler method.

    I think the requirement of motion vectors was probably seen as reasonable, since a lot of games were starting to compute these for TAA, anyhow.
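To make the distinction concrete, here's a toy numpy sketch: if the engine hands you exact per-pixel motion vectors, synthesizing an in-between frame is basically a warp halfway along them, whereas a TV first has to estimate that vector field from pixels alone (optical flow). Purely illustrative; this isn't how DLSS 3 or FSR 3 are actually implemented.

```python
import numpy as np

def warp_half_step(frame: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Backward-warp `frame` (H, W) by half the per-pixel motion field (H, W, 2)."""
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Sample each output pixel from halfway back along its motion vector.
    src_y = np.clip((ys - 0.5 * motion[..., 1]).round().astype(int), 0, h - 1)
    src_x = np.clip((xs - 0.5 * motion[..., 0]).round().astype(int), 0, w - 1)
    return frame[src_y, src_x]

# Toy example: a bright square moving 4 px to the right per frame.
frame = np.zeros((8, 8)); frame[2:4, 2:4] = 1.0
motion = np.zeros((8, 8, 2)); motion[..., 0] = 4.0   # exact vectors from the "engine"
midpoint = warp_half_step(frame, motion)              # square lands 2 px further right
```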
    Reply
  • d0x360
    rluker5 said:
    I'd like to see a comparison of the frame interpolation techniques of Nvidia, AMD and TV manufacturers.

    I've used that AMD driver one and TV ones and the interpolation on my TVs seems more consistent and has less artifacting than AMD's driver based one. I imagine Nvidia can beat the frame interpolation on my TVs, but not the latency on my Samsung at least.

The one on a TV will add massive amounts of latency; I wouldn't use it.

Latency aside, the little chip in literally any TV won't compare to even an APU for video processing of any kind, so you will get significantly better results without the massive latency penalty. The chip in a TV is roughly 500x slower than a GPU core from the last decade, and only a small fraction of the TV chip is going towards that terrible interpolation.

As far as image quality goes, FSRG or DLSS frame gen will look infinitely more temporally stable without the artifacts, and it will also run at the proper refresh rate in everything but the first 2 FSRG games, and in every modded game with maybe a couple of exceptions.

I can't advise you strongly enough to never use your TV to do interpolation on games; it's just not a good idea even if you're not sensitive at all to the latency, which you must not be (and that's not a dig; some people aren't).
    Reply
  • Makaveli
    Hopefully they can continue to improve on the mod. The UI issues presented in the video would be an issue for me.
    Reply
  • Pierce2623
    While I don’t care about BS frame generation, I would be interested in hearing from the fanboys who insisted that Ampere and Turing were incapable of supporting DLSS3 rather than just being artificially locked out by Nvidia.
    Reply
  • bit_user
    d0x360 said:
The one on a TV will add massive amounts of latency; I wouldn't use it.
    Well, some TVs have a "game mode". That usually disables things like motion interpolation, but there could be some TVs which have a low-latency version.

    d0x360 said:
Latency aside, the little chip in literally any TV won't compare to even an APU for video processing of any kind, so you will get significantly better results without the massive latency penalty. The chip in a TV is roughly 500x slower than a GPU core from the last decade, and only a small fraction of the TV chip is going towards that terrible interpolation.
You're basing this on what, exactly? I'm pretty sure TVs use hard-wired circuitry in ASICs or FPGAs for motion interpolation. Nvidia built an optical flow accelerator into their GPUs for the past two generations. Why do you think TVs don't have their own optical flow engines?

    d0x360 said:
As far as image quality goes, FSRG or DLSS frame gen will look infinitely more temporally stable without the artifacts
    My TV is 10 years old and its motion interpolator doesn't have temporal stability problems. It doesn't work terribly well when the image jumps by a large amount, but it just falls back to using the non-interpolated frame, rather than showing a bunch of artifacts.

    d0x360 said:
I can't advise you strongly enough to never use your TV to do interpolation on games; it's just not a good idea even if you're not sensitive at all to the latency
    I would sometimes turn on my TV's interpolator to play things like racing games and adventure games. It was glorious.
    Reply
  • scottsoapbox
    Fake frames do not equal a performance boost.

    The fact that the author on a tech blog does not understand this is embarrassing.
    Reply
  • Sleepy_Hollowed
    scottsoapbox said:
    Fake frames do not equal a performance boost.

    The fact that the author on a tech blog does not understand this is embarrassing.
    I mean... it's marketed as such, and you know what they say about marketing (ahem, propaganda).

I'd love for either middleware tools to get good, or for everyone to stop thinking that they can run current games at 4K max settings without breaking the bank.
    Reply