Have Graphics Cards Gone As Far as They Can?

With the announcement of the 5000 series of graphics cards this week, there was an odd reaction. Even the highest-end 5090 – £2,000 or more – wasn’t enough.

The talk about “real” rendering versus processing like DLSS and frame generation came to a head in PC spaces. A small, vocal minority were extremely angry that their expensive new graphics card was mostly focussed on tricks to make the image look better instead of genuine, traditional raster techniques. That was how the discussion was framed, at least – as though there had never been a single graphical trick utilised until this point.

But however silly it is to try to differentiate between real and fake in rendering, it raises a very real point. Not for the first time in recent years, we have to ask: is this as far as graphics improvements need to go?

It’s something we’ve been talking about with consoles for a while. Not that things can’t always get better. They can. The techniques for upscaling will improve. A few more frames and a few more “real” pixels will be eked out, creating a better bed from which to upscale. But the upscaling will remain an integral part of the process from here on out. And it changes the definition of what good enough means going forward.

I’ve been playing God of War: Ragnarok. At 1440p, I can get a solid 120fps at all times. It looks flawless. There are people who want more – higher resolutions, higher framerates. But we’re into diminishing returns: it takes a lot more power to look only a little better.

Ragnarok is not a significantly better game at 240fps. It’s not a significantly better game at 1440p than at 1080p, or even at 720p.
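To put rough numbers on those diminishing returns, here is a minimal sketch. It assumes GPU cost scales roughly with the number of pixels shaded, which is a simplification (bandwidth, geometry and CPU load don’t scale the same way), but it shows how quickly the pixel counts balloon.

```python
# Back-of-the-envelope pixel counts, assuming shading cost scales
# roughly linearly with rendered pixels (a simplification: memory
# bandwidth, geometry and CPU work don't scale the same way).
RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

baseline = 1920 * 1080  # compare everything to 1080p

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name:>5}: {pixels / 1e6:5.1f} MP, ~{pixels / baseline:.1f}x the work of 1080p")
```

4K is four times the pixels of 1080p and 8K is sixteen times, and that is the gap a card would have to close natively before framerate even enters the picture.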

Graphics Cards: The End of Evolution?

We all have a lower limit we can tolerate, and that is fine. But when 720p can look like 1080p or above, and 30fps can look like 90fps (and, with Reflex, potentially even feel better than plain 30fps), the conversation must change. And, like with consoles, the waters become muddy and the question of “good enough” becomes confused.
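As a rough illustration of how that changes the arithmetic, here is a small sketch. The figures are assumptions for illustration only – a render scale of around 0.67 per axis (in the ballpark of DLSS’s Quality mode) and a 3x frame-generation multiplier – and the real numbers vary by game, mode and hardware.

```python
def effective_load(output_w, output_h, render_scale, base_fps, framegen_multiplier):
    """Rough model: the GPU shades only the internal render resolution,
    upscaling fills in the output resolution, and frame generation
    multiplies the presented framerate. Latency still tracks base_fps."""
    rendered_pixels = int(output_w * render_scale) * int(output_h * render_scale)
    native_pixels = output_w * output_h
    return {
        "rendered share of native pixels": rendered_pixels / native_pixels,
        "presented fps": base_fps * framegen_multiplier,
        "rendered fps (latency roughly follows this)": base_fps,
    }

# Example: 1440p output, Quality-style upscaling, 30fps rendered, 3x frame generation.
print(effective_load(2560, 1440, render_scale=0.67, base_fps=30, framegen_multiplier=3))
```

The presented framerate triples while the rendered workload is less than half of native – which is exactly the trade the “real frames” argument objects to, and exactly why “good enough” gets harder to pin down.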

Ragnarok is flawless. The Ultra models look real to me. How can it get any better than this?

The answer is a bevy of ultra-costly advancements that make your computer chug no matter how much you’ve spent. Indiana Jones uses raytracing no matter your settings, and it scales really well. Is full path tracing significantly better than the standard settings? Some would say yes, and will demand a £2,000 graphics card that can run it without any “fake” trickery. But for most of us, what we have is good enough, trickery and all. You can find the right combination of settings, upscaling and AI enhancements to hit your sweet spot.

There is no doubt in my mind that games like Indiana Jones will come to rely on things like raytracing more and more. Games will get more difficult to run, and the scalability won’t always be as good as it is in an id Tech game. New advances in DLSS and frame generation will ensure those left behind have some recourse for keeping up.

We may not see any major advancements in pure graphics rendering any time soon. We’re not suddenly going to make the leap to 8K and 400fps without DLSS and frame generation, even on the most expensive GPUs. But most of us don’t need it, and don’t want it. Our graphics cards will keep us going for many years to come.

 

Article By

Mat Growcott has been a long-time member of the gaming press. He’s written two books and a web series, and doesn’t have nearly enough time to play the games he writes about.

Follow on:
Twitter: @matgrowcott