I’m almost afraid that some people will hate me for this article. Not because I deliberately want to provoke, but because it expresses a truth that many people don’t want to hear. The realization that the development of classic rasterization performance has been stagnating for years is perhaps uncomfortable for some. But these thoughts haven’t just come to me today. They have grown over the years, and I still remember the exact moment when I first realized where the journey would take me.
It was with Battlefield V, when ray tracing was advertised for the first time. Suddenly, entire worlds were reflected in the floors, as if someone had driven good old Mr. Clean through the scenery. It was an effect that was technically impressive, but at the same time felt strangely artificial, as if reality itself had been given a graphics filter. At the time, I described ray tracing as a beautiful slide show, just a very slow one, because while it raised image quality to a new level, it also pushed performance into a range that made playing more of a test of patience. But at the same time, I realized that this was the future: no longer higher FPS through raw processing power, but new rendering techniques that rely on tricks and optical illusions to make the image look more realistic.
Since then, I have become increasingly aware that classic GPU development is heading for a dead end. The belief that each new generation would automatically bring a huge leap in performance was justified in the early 2000s, but today this mechanism simply no longer works. Instead of real progress in rasterization, technologies such as ray tracing, AI upscaling and temporal reconstruction dominate, giving the impression of higher performance without improving actual raster performance to the same extent.
So this development was quite foreseeable, and yet I’m surprised at how long the myth of constant leaps in performance has persisted. If I now address this so openly, some will dismiss it as pessimism or accuse me of wanting to talk down technical progress or NVIDIA’s new 5000 series. But that is not my intention at all. Rather, I see it as a sober analysis of what has become apparent over the years, and perhaps this is precisely the point at which it will be decided whether the GPU architecture as we know it will still play a central role at all in the future.
For years, the development of graphics card performance has followed a pattern that suggests progress at first glance but in reality conceals stagnation. Particularly in rasterization, the classic rendering method without ray tracing, it is clear that the performance gains are far less dynamic than the manufacturers’ marketing suggests. NVIDIA and AMD would have done well to be more honest and transparent with end customers about these realities.
The switch to smaller production processes is, above all else, the reason why performance is still increasing at all, if only at a manageable rate. More efficient nodes enable higher clock rates and more transistors per unit of die area, which translates into more computing power. What is often overlooked, however, is that the architectural advances that were once responsible for significant leaps in performance have increasingly failed to materialize in recent years. At the turn of the millennium and well into the 2010s, new GPU generations often brought impressive gains, frequently in the region of 30 to 50 percent and sometimes even more per generation. This was due to fundamental improvements in the architecture, such as more efficient shader units, wider memory interfaces or new algorithms for rasterization and compression. Today, the situation is different: pure raster performance is growing in small steps at best, as architectures approach their physical limits and the remaining potential for optimization is limited.
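To put those orders of magnitude into perspective, here is a minimal back-of-the-envelope sketch in Python; the per-generation growth rates are assumptions made for the sake of the argument and do not describe any specific GPU lineup:

```python
# Illustrative only: the compound effect of per-generation raster gains.
# The growth rates below are assumptions, not measured values for any GPU family.

def cumulative_gain(per_gen_gain: float, generations: int) -> float:
    """Relative performance after n generations, each adding per_gen_gain."""
    return (1.0 + per_gen_gain) ** generations

GENERATIONS = 4  # roughly eight years at a two-year cadence

for label, gain in [("early-2000s pace (+40 % per gen)", 0.40),
                    ("optimistic today (+15 % per gen)", 0.15),
                    ("stagnation scenario (+5 % per gen)", 0.05)]:
    print(f"{label}: x{cumulative_gain(gain, GENERATIONS):.2f} after {GENERATIONS} generations")
```

Compounded over four generations, the old pace roughly quadruples performance (about x3.8 at 40 percent per step), while single-digit gains barely move the needle (about x1.2). That difference is exactly what the marketing slides no longer show.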
Another indication of stagnation is the lengthening of product cycles. Whereas there used to be real leaps every two years, generation changes are barely noticeable nowadays, with at most incremental refreshes in between. The advances in miniaturization, made possible by new manufacturing processes such as 5 nm or soon 3 nm, maintain the appearance of progress, but they do not change the fundamental problem: raster performance is no longer growing at a rate that keeps pace with the expectations of technological evolution.
This development raises the legitimate question of whether the classic GPU architecture has already reached its zenith. If performance increases can only be achieved through external factors such as more efficient production or alternative rendering techniques, this could indicate that a fundamental paradigm shift is imminent. Whether this will take the form of new hardware approaches, an increased focus on specialized AI-supported rendering processes or a further shift to cloud gaming remains to be seen. What is clear, however, is that the major leap in performance in the field of rasterization that many expect will probably not happen in the near future.
It is clear that gains in quality and performance are increasingly tied to other technologies, such as ray tracing or upscaling methods like DLSS and FSR. Although these technologies represent progress in their respective areas, they conceal the fact that classic rasterization performance is stagnating. This becomes particularly noticeable in a direct comparison of older and newer GPUs in games that do not use modern rendering techniques: even the latest generation of high-end graphics cards often delivers only a moderate increase in FPS, which is disproportionate to the growing power consumption and rising prices.
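To make that "disproportionate" relationship tangible, here is a small sketch of FPS per watt and FPS per euro; the two cards and all of their figures are hypothetical placeholders, not benchmark results:

```python
# Illustrative only: raster FPS per watt and per euro across two hypothetical
# cards from successive generations. All numbers are placeholders, not benchmarks.

cards = {
    "previous-gen card (hypothetical)": {"fps": 100, "watts": 300, "price_eur": 700},
    "new-gen card (hypothetical)":      {"fps": 115, "watts": 360, "price_eur": 900},
}

for name, c in cards.items():
    fps_per_watt = c["fps"] / c["watts"]
    fps_per_100_eur = c["fps"] / c["price_eur"] * 100
    print(f"{name}: {fps_per_watt:.2f} FPS/W, {fps_per_100_eur:.1f} FPS per 100 EUR")
```

With these assumed numbers, a 15 percent FPS gain still translates into slightly worse efficiency and noticeably worse value per euro, which is exactly the pattern described above.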
Back then, when I first became aware of ray tracing in Battlefield V, I smiled at it and dismissed it with a rather flippant remark. It was a quip that seemed appropriate at the time, because the visuals were certainly impressive, but performance dropped so much that you had to seriously ask yourself whether it was worth it. Today, years later, ray tracing is no longer quite so ruinous for the frame rate, but the fundamental problem remains: raw performance is stagnating, and instead new technologies are constantly being introduced to polish up the image without increasing actual processing power. Upscaling, frame generation and ever more aggressive temporal algorithms prop up the optical illusion, while hardware development has long since stopped delivering the progress it actually promises.
So you could say: more images per second through fewer real images per second. Because that is exactly what many modern technologies boil down to. Smoother animation and higher resolutions are no longer achieved through raw performance, but through calculations designed to compensate for the missing rendering power. Even if these methods undoubtedly have their justification, the feeling remains that they increasingly serve to conceal the actual stagnation. And perhaps this is the real paradigm shift: in the past, a leap in performance was delivered through pure hardware power; today it is merely suggested through software-side tricks. What used to be seen as a stopgap for weak hardware is now sold as innovation.
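As a back-of-the-envelope illustration of "more images per second through fewer real images per second", consider this small sketch; the internal resolution, rendered frame rate and frame-generation ratio are assumptions, not measurements from any particular GPU or game:

```python
# Illustrative only: how displayed FPS can rise while the number of natively
# rendered pixels per second actually falls. All figures are assumptions
# chosen to keep the arithmetic easy to follow, not data from a real game.

def pixels_per_second(width: int, height: int, fps: float) -> float:
    """Natively rendered pixels per second at a given resolution and frame rate."""
    return width * height * fps

# Hypothetical native baseline: 4K at 60 FPS, every frame fully rendered.
native = pixels_per_second(3840, 2160, 60)

# Hypothetical upscaled + frame-generated case: internal 1440p render,
# 45 rendered frames per second, every second displayed frame interpolated.
rendered = pixels_per_second(2560, 1440, 45)
displayed_fps = 45 * 2  # frame generation doubles what the monitor shows

print(f"Displayed FPS:            {displayed_fps} (vs. 60 native)")
print(f"Natively rendered pixels: {rendered / native:.0%} of the native baseline")
```

Under these assumptions the monitor shows 90 frames per second instead of 60, while the GPU actually rasterizes only about a third of the native pixel throughput. The rest is reconstruction and interpolation.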
Another point that makes this development particularly bitter is the price. Whereas in the past each new generation of graphics cards was not only faster but often also similarly priced, or sometimes even cheaper, than its predecessor, today even mid-range models have reached price regions that used to be reserved for high-end cards. It’s easy to complain about an RTX 5080 or 5090, but the real problem starts much further down, where the real price-performance battle used to be fought and where 600 or 700 euros for a mid-range GPU is now considered normal. Instead, crude comparisons are trotted out, claiming that a GeForce RTX 5070 with AI-generated frames achieves the performance of an RTX 4090 at an extremely attractive price. Applause for anyone who believes it…
It’s almost ironic that NVIDIA, the biggest beneficiary of the current hardware era, no longer earns its money with gaming graphics cards. The huge billions in profit come from the AI sector, where data centers are willing to pay sums the gaming market cannot compete with. For NVIDIA, classic GPUs for gamers are now more of a side business, one that is carried along but no longer the focus. And you can see it. Prices are rising, while actual performance outside of ray tracing is hardly making big strides anymore.
Nevertheless, many find it difficult to accept these price increases, even as we watch, almost resigned, as other products become more expensive. Food and energy have risen in price far faster and more sharply than graphics cards in recent years, and yet the resistance to more expensive GPUs is particularly loud. The reason is probably that technology has long been an area where progress and price stability were taken for granted. Nobody expects a loaf of bread to get better every year while costing the same; you’re happy if it doesn’t get worse. But that is exactly what was considered normal in the IT world for decades.
However, this expectation is now being disappointed. If you spend 1,000 euros on a new graphics card today, you no longer get the leap in performance that the same amount would have bought ten or twenty years ago. Instead, a lot is sold via software optimizations or upscaling technologies such as DLSS, which may make sense in certain scenarios but cannot hide the fact that raw performance is barely increasing at all.
The result is a market that is moving further and further away from what many gamers were used to. While hardware used to convince with performance, the argument today is about algorithms. While an upgrade used to be a clear improvement, today you have to ask yourself whether it is really necessary or whether an older card with a few tricks won’t last a few more years. And while many have long trusted that things will get better with each generation, the realization is slowly dawning that this progress is no longer happening at the pace we once took for granted.
In the end, the question remains as to how customers should deal with this development. On the one hand, it is frustrating to see graphics cards becoming more and more expensive, while raw performance is barely increasing. On the other hand, it is up to everyone to decide whether they want to play this game or not. Nobody is forcing you to buy a new GPU every two years if the old one is still doing its job without any problems.
Perhaps it’s time to question your own buying reflex. Do I really need a new graphics card just because a new generation is being released? Or will my old card last for years with a few software tricks? The truth is that many upgrades are now made out of habit rather than genuine necessity. Marketing does its bit to trigger this reflex, but in the end it’s the customer’s wallet that decides. Those who buy into it pay ever higher prices for ever smaller advances. On the other hand, those who approach the matter with more sense and less compulsion to buy not only save money, but also disappointment.
The dilemma, however, is that this conscious consumption also has its price – not for the buyer, but for the development itself. If fewer people buy new graphics cards, manufacturers lack the incentive to drive real innovation. The standstill that we are already experiencing today could then only get worse. But maybe that’s not such a bad thing. Perhaps we should simply accept that we don’t have to go through every cycle while others continue to plunge into the financial abyss.
And that’s exactly why the best strategy is: let the lemmings run. Instead of spending thousands on minimal leaps in performance, buy yourself a big bucket of popcorn, sit back and watch in amusement as the market reduces itself to absurdity. Because if one thing is certain, it’s that the spectacle is far from over.
Just buy it!
When Avram Piltch (does anyone still remember him today?) published his legendary “Just buy it” appeal on Tom’s Hardware back then (and quietly edited it a few times afterwards), you could almost get the impression that he had either already recognized the holy truth at the time or had simply taken a particularly thorough spin through the green marketing car wash. The message was clear: don’t question anything, don’t think too much, just buy. An attitude that has fueled exactly what we see today: constantly rising prices, stagnating performance and a customer base that all too often convinces itself that everything is fine.
The question you have to ask yourself is: what did he get for it? Was it just an inner enlightenment after years of hardware consumption, a kind of Stockholm syndrome towards the GPU industry? Or was it something more tangible? Maybe it was just the reassuring feeling of being part of an industry that has long since stopped caring about the price-performance ratio of its products? But perhaps it was nothing more than a thorough brainwashing. After all, the belief that every new GPU is not only better, but also worth every penny, has to come from somewhere. Perhaps from a deep trust in the manufacturers, perhaps from the fear of missing out – or perhaps simply from the simple refusal to admit that the great leap forward has long since failed to materialize.
Either way, we remember “Just buy it” as the motto of an era in which progress was blindly taken for granted and the customer was treated like a dumb herd animal. Today, when reality has long since caught up with this, perhaps a new motto would be appropriate: “Just think first.” But that’s exactly what manufacturers and certain influencers would rather avoid. And even if I find certain technologies interesting and challenging for the very reasons I’ve just discussed, that doesn’t mean I fully endorse them. That’s exactly what I had to get off my chest today, because you should also read between the lines in my reviews. I like to let new technology captivate and inspire me, but it won’t dictate my purchasing behavior. That’s where I tend to be conservative. I stopped smoking many years ago and I haven’t drunk alcohol for a long time, so pixel withdrawal should be manageable too.
Vote tomorrow and vote wisely. The German word for ballot box, Wahlurne, may contain the word urn, but that does not mean you have to bury your vote in it. Everyone will have to sort out the rest for themselves on Monday at the latest. But if you want to preserve democracy, then you have to give it a voice, your own. For whomever you choose, because that too is part of it. However, anger and indifference are poor advisors.