As a gamer, I've always built my computers with one goal in mind: a smooth gaming experience. Two decades ago, that meant something entirely different from what it does now, and for the most part, many of us were happy to run the latest games at 1024×768, if not lower. These days, with ever-growing hardware demands and games that aren't especially well optimized, we're dealing with a whole host of different issues.
At a time when 60 fps is no longer enough for many, and even 30 fps isn't possible in many games without upscaling and frame generation, it feels like getting the PC of your dreams is an unattainable goal … and yet, that PC relies on software to make those frames happen a lot of the time.
It kind of makes me wonder: when did hardware upgrades become so meh, and software upgrades become so crucial? And is that a good thing or a bad thing?
Native 60 fps didn’t die, but it definitely changed
It’s a constantly moving goalpost.
Chasing 60 frames per second (fps) is something all PC gamers have been doing for ages, but these days, that goal doesn’t mean much to a large subset of gamers.
Sure, many gamers are still running 60Hz monitors, but 144Hz+ displays aren’t expensive anymore. You can easily get your hands on hardware that could support more than 60 fps, which means that for many, that goal is now set higher.
The problem is that modern AAA games make it hard to max out the capabilities of a 144Hz monitor, let alone something even more substantial. There’s now a big discrepancy between the hardware gamers want to have, the results they want it to achieve, and the real-world performance it’s able to provide in certain titles. Some games will be a breeze, but others, not so much.
As a result, playing at a stable 60 fps is an ever-moving goalpost. Some gamers need more, while others are satisfied with 60 fps or less. Some want to play at max settings, and others don’t see much of a difference.
With these constantly shifting goals, hitting your personal performance target has never been harder. Luckily, the last few years have brought us a major crutch in the form of upscaling and frame generation.
Frame gen changed what performance even means
And it changed PC upgrades forever, too.
Before we dive in, there's an important distinction to be made here: upscaling vs. frame generation. Upscaling is basically the trick that lets the game render fewer pixels than your monitor shows, then rebuilds the image to look like it was rendered at your target resolution. So, instead of brute-forcing native 1440p or 4K (a painful process for entry-level GPUs), the GPU renders at a lower internal resolution and lets the upscaler reconstruct the full-resolution image from motion vectors and pixel history. It's really neat.
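To put rough numbers on that, here's a minimal sketch of how much rendering work a DLSS-style upscaler saves at 4K. The per-axis scale factors below are the commonly cited approximations for each quality mode, not official specs, and the function name is just for illustration:

```python
# Sketch: internal render resolution under DLSS-style upscaling modes.
# Scale factors are commonly cited per-axis approximations, not official specs.
MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Resolution the GPU actually renders before the upscaler reconstructs it."""
    return round(out_w * scale), round(out_h * scale)

for mode, scale in MODES.items():
    w, h = internal_resolution(3840, 2160, scale)  # 4K output
    saved = 1 - (w * h) / (3840 * 2160)
    print(f"{mode:>17}: renders {w}x{h} (~{saved:.0%} fewer pixels per frame)")
```

Even Quality mode cuts the pixel count by more than half at 4K, which is exactly why the performance uplift is so dramatic.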
Frame generation is a whole different beast, although it often gets lumped in with upscaling as if it's the same thing. It doesn't add pixels to existing frames the way upscaling does; it creates whole extra frames in between the ones the game actually renders. It can boost perceived smoothness even if your GPU wasn't really made for running, say, Cyberpunk 2077 at max settings.
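Here's a toy sketch of that structure, with a naive average standing in for the optical-flow and AI models that DLSS Frame Generation and FSR 3 actually use; everything in it is illustrative, not how either vendor implements it:

```python
# Toy frame interpolation: present a generated frame between each rendered pair.
def interpolate(frame_a, frame_b):
    # Hypothetical stand-in for the real interpolation model: a plain blend.
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

# Pretend each "frame" is just the position of a moving object.
rendered = [[0, 0], [10, 4], [20, 8]]

presented = []
for prev, nxt in zip(rendered, rendered[1:]):
    presented.append(prev)                    # real, rendered frame
    presented.append(interpolate(prev, nxt))  # generated in-between frame
presented.append(rendered[-1])

print(presented)  # nearly twice as many frames shown as were rendered
```

Note that the in-between frame can't be shown until the next real frame exists, which is where frame gen's extra input latency comes from.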
Introducing frame generation was a big move for Nvidia. At a small perceived cost to image quality, Nvidia was able to improve frame rates drastically across hundreds of titles. Nvidia’s controversial RTX 5070 vs. RTX 4090 claim may finally be on the verge of coming true, too.
With the introduction of upscaling and frame gen, 60 fps became a much easier goal to hit across a wide range of titles. Sure, caveats still exist, and DLSS frame gen works best when the GPU itself can offer a semi-decent frame rate before any of the software stack comes into play.
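That base-frame-rate caveat comes down to latency. Interpolation has to hold back the newest rendered frame until the generated in-between frame has been displayed, so input lag grows by roughly one base frame time. Some back-of-the-envelope arithmetic, assuming 2x generation and ignoring generation overhead and Reflex-style mitigation:

```python
# Back-of-the-envelope: 2x frame generation on top of various base frame rates.
# "Added latency" is a rough approximation of about one base frame time,
# ignoring generation cost and latency-reduction tech like Reflex.
for base_fps in (30, 60, 120):
    frame_time_ms = 1000 / base_fps
    print(f"base {base_fps:>3} fps -> ~{base_fps * 2} fps perceived, "
          f"added latency on the order of {frame_time_ms:.1f} ms")
```

Doubling 30 fps to 60 looks great on an fps counter, but the game still feels like 30 fps (or worse) to your hands, which is why frame gen shines most when the base frame rate is already respectable.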
But caveats aside, the bigger point stands: Nvidia's DLSS, followed by AMD's FSR and even Intel's XeSS, redefined what performance even means. For the first time in a long time, the responsibility for solid performance moved from the hardware to the software in a big way.
DLSS 4.5 isn’t the problem, it’s the symptom
But what else could be the problem?
A few years in, most of us gamers have learned to embrace DLSS (and similar tech). It’s mature technology now, which means that artifacting is much less of an issue than it was in the early days of DLSS. The addition of DLSS 4.5, with its second-generation transformer model, made it even better, with improved image quality and stability across modes. You can now safely play on DLSS modes that preserve quality and still see steady, healthy fps, even on older cards.
But for many, frame gen and upscaling feel like cheating. Whatever happened to the days of running dual GPUs to max out Crysis? Why are these so-called “fake frames” all we get from upgrades?
I get both sides of the argument, honestly. In a way, it is disappointing to see entry-to-midrange GPUs offer limited gen-on-gen hardware upgrades; instead, the gains are all in the AI capabilities and the latest iteration of frame generation software. Even at the high end, the major generational boost only really applies to the top xx90 card. The RTX 4080 to the RTX 5080 wasn't a massive jump, for instance.
Whether we like it or not, this is the reality of PC hardware right now: incremental upgrades hardware-wise, but a lot to be gained software-wise. I'm not here to tell you if that's good or bad, but I will tell you this: DLSS is not the problem here; it's the symptom of a few different things coming together.
Is DLSS an easy way out or a miracle fix?
Or maybe it’s neither of those two things.
PC ports and overall optimization have been clunky over the last decade, to say the least. Gamers are often greeted by console ports that barely work, games that munch on VRAM like it's candy, and steep hardware requirements that are hard to justify.
I'm not trying to judge any single game studio here; it's really more of an observation on the state of AAA gaming. Sure, a lot of things get smoothed out over time, but the first few days are often a mess, and some issues never get resolved at all. In some games, you just have to embrace the suboptimal performance and work around it as best you can.
That workaround is often not buying more hardware, but simply toggling on upscaling and frame gen. It can make the difference between “unplayable” and “decent.”
But at this point, it’s hard to say what came first. Did games start getting optimized for frame generation, assuming that most users will have access to it if they want to play at high settings? Or did GPU manufacturers learn to settle for less hardware-wise, because how much can you really scale a GPU before it’s no longer a consumer product?
Certainly, the semi-disappointing hardware upgrades have been made up for with frame gen, but it’s not a blanket fix for every problem under the sun.
In some titles, even high-end GPUs struggle to hit that mythical 60 fps figure at maximum settings at 4K. And that doesn’t feel good, which lends credibility to the “fake frames” side of the coin. If you’re spending $1,000 and up on a graphics card, shouldn’t you be able to game at max settings for a few years without turning on any sort of artificial frames?
DLSS vs. no DLSS often sparks heated debates among gamers. I have to say that, a few years in, I'm definitely in the camp that's learned to embrace it. It's clear to me that a lot of the more substantial GPU-side upgrades in the next decade will rely on software more than on hardware.
The upside? You won’t have to buy the most expensive GPU to run your games at a reasonable level. I expect that as the technology matures even more, we’ll have fewer problems and better experiences with each generation.
The downside? Among other things, one major issue will always be compatibility. Games that don’t support frame gen or aren’t optimized for it may run poorly no matter what you do, and VRAM requirements will continue rising.
In a decade, PC gaming will likely be unrecognizable, but the hardware we use to run our games may be shockingly similar to what we use today. It’ll just be capable of doing a lot more with AI.