As AMD and Nvidia start releasing their latest graphics cards, there’s one thing that’s clear as day: AMD is moving to re-establish itself as the market leader when it comes to affordability, and I couldn’t be happier.
I’ve had the privilege of playing many of the best PC games of the last few years using the best graphics card on the market, as well as many of the best cheap graphics cards at any given time, and a number of things are coming into focus in ways they might not have been before the RTX era.
First, we all know that graphics cards are getting more expensive, especially high-end cards, and in the Ampere-and-Big-Navi era, the price gap between the two major card makers narrowed considerably (excluding the RTX 3090 and RTX 3090 Ti, which had no competing AMD Radeon RX card to compare against).
Additionally, we can recognize that simply using these cards is becoming much more expensive, both in terms of the additional hardware required and your actual electricity bill. In fact, I wrote a hard-hitting opinion piece on this topic not too long ago.
Now that AMD has released its Ryzen 7000 series chips, and especially after it has announced its Radeon RX 7000 series graphics cards, I realize that I may have been too hasty to lump AMD together with the worst offenders in this regard.
When good enough is good enough
One of the things about chasing high-end graphics cards is that you quickly reach the point where you have far more power than you really need, and the RTX 4090 is a perfect example.
It’s unquestionably the most powerful consumer graphics card on the planet, but unless you’re a creative pro who needs that level of raw performance, it’s absolute overkill for everything else.
Yes, it can play Cyberpunk 2077 in 4K with all settings maxed out and stay above 40fps natively, but what’s the point? You can do much better with an RTX 3080 using DLSS set to performance. And honestly, it still looks great, especially if you’re not comparing the two side by side.
And that’s considering that Cyberpunk 2077 is one of the most demanding games on the market. Most PC games don’t go that far.
Meanwhile, the RX 7900 XTX seems to be somewhere between the RTX 3090 and RTX 4090 in terms of performance, which is pretty much all you’ll ever really need for gaming.
Beyond that point, you’re just paying the extra $600/£600 for the bragging rights. Even the Nvidia RTX 4080, which has yet to go on sale, has a significantly higher MSRP. So even if you compare the Radeon RX 7900 XTX to its declared competitor, it comes out ahead.
Ultimately, if the RX 7900 XT and RX 7900 XTX come close to their promised performance, it’s going to be pretty hard to recommend anything else to anyone outside the super-enthusiast set.
About those power cables…
There’s also the issue of the 12VHPWR power cable that Nvidia adopted from the RTX 3000 series.
This cable, which takes four standard 8-pin connectors that come with all recent power supplies and converts them into a single 16-pin power connector, has been in the news lately. RTX 4090 customers have reportedly seen their very expensive graphics cards burned by failing power adapters and, in at least some cases, native 12VHPWR cables from ATX 3.0 power supplies.
We didn’t see anything wrong with the power cable on our RTX 4090 test unit, and without the results of an official investigation by Nvidia and its partners, or any independent tests that might verify the issue, it’s best to treat these as possibly isolated incidents involving individual cables rather than a more systemic problem (for now).
But you know what is a systemic problem? Creating a proprietary power adapter that requires additional investment from consumers who have already spent a lot of money on a graphics card. Sure, the card does come with an adapter, but there’s something to be said for a graphics card that only uses the same 8-pin connectors that everyone else uses, which is the path AMD chose for the RX 7900 XTX. Point, AMD.
And those power requirements…
A new benchmark indicates that an RTX 4090 Ti is on the way, and while it looks impressive on paper, the RTX 4090 already has a power requirement of 450W, which can be pushed well north of the half-a-damn-kilowatt level when overclocked. What will an RTX 4090 Ti look like? Do we even want to know right now?
There are advertising campaigns running right now urging people in the UK to avoid the expected high energy costs this winter by taking a 30-day trip to Europe, because it’s cheaper than heating their homes. Is that exaggerated? I have no idea, but the resigned shrugs I’m seeing from some UK colleagues over the prospect of higher energy bills tell me it’s at least capital-T True, if not literal fact.
Climate change and the many problems inherent in that nightmare aside, Nvidia and Intel seem to have decided that the way to stay on top is to brute-force their way to dominance by pushing as much power as possible through their transistors, which is an increasingly expensive proposition.
Even in the US, power bills are higher than they used to be, and running an obscenely high-powered graphics card or processor (or both) for the pleasure of an extra 30-40 fps on top of the 90+ fps you’d get with a lower-powered card is simply not a worthwhile trade-off for the vast majority of people.
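To put rough numbers on that electricity argument, here’s a quick back-of-envelope sketch. The 450W and 335W figures are the cards’ rated board powers; the four hours of daily gaming and the $0.15/kWh electricity price are purely illustrative assumptions, so plug in your own rate and habits.

```python
# Back-of-envelope yearly electricity cost of running a GPU at its board power.
# ASSUMPTIONS (not from official specs beyond the wattages): 4 hours of gaming
# per day, $0.15 per kWh, GPU running at full rated power the whole time.

def yearly_cost(watts, hours_per_day=4, price_per_kwh=0.15, days=365):
    """Return estimated yearly cost in dollars for a given sustained wattage."""
    kwh_per_year = watts / 1000 * hours_per_day * days
    return kwh_per_year * price_per_kwh

rtx_4090 = yearly_cost(450)     # RTX 4090 at its 450W rating
rx_7900_xtx = yearly_cost(335)  # RX 7900 XTX at its 335W board power

print(f"RTX 4090:    ${rtx_4090:.2f}/year")
print(f"RX 7900 XTX: ${rx_7900_xtx:.2f}/year")
print(f"Difference:  ${rtx_4090 - rx_7900_xtx:.2f}/year")
```

Under those assumptions the gap works out to a couple of tanks of gas per year per card, and it scales up quickly if you game more, overclock, or live somewhere with pricier electricity.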
That was probably the biggest complaint in the opinion piece I mentioned above, and it’s one where AMD at least seems to be making progress. If the RX 7900 XTX really does squeeze the kind of performance AMD claims out of just 335W of board power, that, I’m convinced, is incredibly impressive.
Add to that the AMD Ryzen 7000 series: it isn’t the most powerful on the market, and it’s not exactly low-power, but it’s well ahead of Intel’s dogged approach of throwing more power at the problem for better performance.
We have yet to see how the RX 7900 XTX and RX 7900 XT actually perform, so only time will tell, but at this point, I’m already sold on AMD this generation, and I can’t imagine I’ll be the only one.