If you’ve paid any attention to the tech space recently, I’m sure you’ve heard that NVIDIA’s shiny new 50 series GPUs (graphics processing units) have finally been released. My first reaction was, “wow, this is amazing, where’s the but?” Then I saw a demo. Watching NVIDIA’s in-house footage, it looked unbelievable at first: a GPU running Cyberpunk 2077 at 200-250 FPS (frames per second). But when I looked closer, artifacting was everywhere. For those who don’t know, artifacting is when a computer leaves little outlines or visual inaccuracies that look out of place, and it can even be a sign of deeper issues in your machine.

Artifacting with AI frame generation is usually minimal, as long as you feed the AI enough real frames to work from. The old king of AI frame gen, which I personally loved, was DLSS (Deep Learning Super Sampling). Along with frame generation, it could also upscale your resolution artificially. It worked by inserting one AI-generated frame between the real frames computed by your GPU, letting you double your frame rate without sacrificing visual quality. The problem with DLSS 4 (exclusive to the 50 series lineup) is that it introduces Multi Frame Generation, which lets users insert up to three AI-generated frames between each real one.

This is where the artifacting happens. Even on the flagship 5090 in the demo, there was artifacting all over the place, because only one in four frames is real: the AI makes up three quarters of what you see, and that is bound to include little things the AI gets wrong. The 90 tier used to be reserved for GPUs so powerful that no game, no matter the settings, could make them bow; GPUs that drew so much power they literally melted their power connectors. Well, at least the 50 series still melts power connectors. But if three quarters of what I’m looking at is fake, then the game isn’t bowing to the card, it’s bowing to AI, at the cost of clarity.
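To make that math concrete, here’s a minimal sketch of how frame generation changes the ratio of real to displayed frames. The numbers are my own illustrative inputs, not NVIDIA’s actual pipeline:

    # Minimal sketch of the frame generation math described above
    # (illustrative numbers of my own, not NVIDIA's actual pipeline).

    def frame_gen_stats(rendered_fps: float, generated_per_real: int) -> dict:
        """For every truly rendered frame, insert `generated_per_real`
        AI frames: 1 for classic DLSS frame gen, up to 3 for DLSS 4
        Multi Frame Generation."""
        displayed_fps = rendered_fps * (1 + generated_per_real)
        real_fraction = 1 / (1 + generated_per_real)
        return {"displayed_fps": displayed_fps, "real_fraction": real_fraction}

    # Classic DLSS frame gen: 60 real FPS -> 120 on screen, 1 in 2 frames real.
    print(frame_gen_stats(60, 1))   # {'displayed_fps': 120.0, 'real_fraction': 0.5}

    # Multi Frame Generation: 60 real FPS -> 240 on screen, only 1 in 4 real.
    print(frame_gen_stats(60, 3))   # {'displayed_fps': 240.0, 'real_fraction': 0.25}

Notice that the bigger the advertised FPS multiplier, the smaller the fraction of frames your GPU actually rendered.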

What about non-gaming scenarios?

The problem gets worse when you think about computational tasks that don’t just need to look nice but actually need the raw power. VFX artists, for example, constantly need faster GPUs for rendering. Testing the 5090 Founders Edition (the version sold directly by NVIDIA), TechPowerUp’s review found only a 35% increase in raw performance over the 4090 FE. That’s still good, but nothing compared to the 51% jump from the 3090 to the 4090. And AMD has stated it won’t compete with the 5090 at all, saying it would rather make cards for mainstream consumers and that a competitor card would be too expensive. We all know this means they can’t compete, not that they won’t; before you disagree, remember this comes from the company that sold a CPU called the Threadripper for $999. With AMD’s current flagship sitting roughly 75% behind NVIDIA’s 5090, and NVIDIA itself slowing down in the raw-performance race it used to dominate, a few things could happen: VFX artists and people with similar jobs fall back to previous cards like the 40 or 30 series until those slow production too much and there aren’t enough raw-performing cards available, or competitors like AMD finally see the demand for pure performance and dig deep to design a true rival to the 5090 and beyond.
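If you’re wondering how a generational uplift number like that is calculated, here’s a quick sketch. The relative scores are placeholders I made up, with the older card normalized to 100; only the 51% and 35% figures come from the reviews cited under Sources:

    # Quick sketch of how a generational uplift percentage is computed.
    # The relative scores below are placeholders (old card normalized to 100);
    # only the 51% and 35% figures come from the reviews cited under Sources.

    def uplift_percent(new_score: float, old_score: float) -> float:
        """Percent improvement of the new card over the old one."""
        return (new_score / old_score - 1) * 100

    print(f"3090 -> 4090: {uplift_percent(151, 100):.0f}%")  # 51%
    print(f"4090 -> 5090: {uplift_percent(135, 100):.0f}%")  # 35%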

Concerns going forward…

The issue I have with this isn’t that it’s bad now; it’s that we’re starting down that spiral. What if next year it’s only a 20% increase, but they have newer AI? And why should you, the consumer, care, as long as it looks good? The fatal flaw with AI frame gen is that it can be accurate without being consistent. I’m fine with it being a little less accurate, but whether you’re walking through a field in Fallout 4 or swinging through the streets in Spider-Man, there are little things, like the texture of a brick or the gritty look of a pile of sand, that I worry AI-reliant cards will miss, leading to inconsistency. So, NVIDIA, please use AI as a tool, not a shortcut to sell cards just because AI is the thing right now. Let us go back to a time when the premium price meant an enjoyable experience, not a glitchy facade.

Sources:

https://www.pcworld.com/article/2592474/the-fate-of-nvidias-geforce-rtx-50-series-is-in-dlss-4s-hands.html

https://www.techpowerup.com/review/nvidia-geforce-rtx-5090-founders-edition/46.html

https://www.galaxus.de/en/page/the-rtx-5090-disappoints-in-terms-of-raw-performance-36477

https://www.ndtvprofit.com/technology/nvidias-reliance-on-ai-makes-gpus-cheaper-but-not-without-caveats
