The Nvidia GeForce RTX 2080 is finally a real, tangible GPU, and with Nvidia claiming up to six times the performance of its Pascal-based GTX cards, it’s a beast. But how does one of Nvidia’s most powerful graphics cards ever stack up to its eternal rival, AMD, and the Radeon RX Vega 64?
Well, it’s never going to be an apples-to-apples comparison when the products come from different companies. However, since both aim to generate incredible visuals for your favorite PC games, we’ll pit them against each other on paper before taking them into the lab later this year.
So, from their designs, to their projected performance, to how much that performance will cost you, let’s look at which graphics card is the better option for you: the Nvidia GeForce RTX 2080 or the AMD Radeon RX Vega 64.
Right off the bat, the designs of these two graphics cards are drastically different. The AMD Radeon RX Vega 64 uses a traditional blower-style cooler, with a single fan drawing in cool air and driving it out the rear vents, while the RTX 2080 uses a dual-fan approach that pushes air across a large heatsink to disperse heat. (The idea here is greater temperature control, but we won’t see whether that bears fruit until a full review.)
AMD has a slight leg up here on Nvidia in producing an official liquid-cooled version of the RX Vega 64, whereas Nvidia has yet to publicly discuss any such option for its newest card. We’ll see whether Nvidia’s dual-fan air cooling system bridges that gap.
The other major design factor, connectivity, is where the RTX 2080 pulls ahead, with a USB-C (VirtualLink) port in addition to the usual DisplayPort and HDMI ports. This comes standard on Nvidia’s Founders Edition version of the card, and is likely available for third-party manufacturers to implement if they so choose.
The RX Vega 64 has no such option, but of course includes all of the latest connectivity standards otherwise.
This distinction is important, because USB-C is poised to become the de facto connection for virtual reality (VR) hardware and applications in the near future. The RTX 2080 is ready for this next phase in simplifying VR setups; the RX Vega 64 is not.
Of course, we have yet to run any benchmarks on the RTX 2080 because of how fresh-off-the-presses this card is. However, we can compare the two graphics cards on paper using their ratings for various basic performance metrics.
Before we tackle the RTX 2080, let’s lay out what the RX Vega 64 is capable of. This graphics card runs at a base processor frequency, or clock speed, of 1,247MHz, and can boost up to 1,546MHz.
The RX Vega 64 GPU itself contains 4,096 stream processors and houses 8GB of High Bandwidth Memory 2 (HBM2) video RAM, which offers up to 484.3 gigabytes per second (GB/s) of bandwidth at a memory speed of 1.89 gigabits per second (Gbps).
Now, the RTX 2080 runs at a base clock speed of 1,515MHz, just a hair shy of the RX Vega 64’s boosted speed, and at 1,710MHz when boosted. While Nvidia’s CUDA cores and AMD’s stream processors are different, they generally accomplish the same task: rendering pixels and carrying out other compute tasks.
Against the RX Vega 64’s 4,096 stream processors, the RTX 2080 has just 2,944 CUDA cores. However, core counts aren’t directly comparable across the two architectures: the cores differ in design and in how much work each can do per clock, so a raw tally tells you little on its own.
Back to more meaningful comparisons: the RTX 2080 also has 8GB of video memory, but on the new GDDR6 standard, a follow-up to GDDR5X, rather than AMD’s HBM2. This memory actually has marginally less bandwidth than the RX Vega 64’s, at up to 448GB/s, despite a considerably higher per-pin speed of 14Gbps. That’s because GDDR6 runs over a much narrower memory bus than HBM2 (256 bits versus 2,048 bits), which offsets the faster transfer rate.
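If you want to see where those bandwidth figures come from, the arithmetic is simple: peak bandwidth is the per-pin speed multiplied by the bus width, divided by eight bits per byte. A quick sketch (the bus widths, 2,048-bit for the RX Vega 64’s HBM2 and 256-bit for the RTX 2080’s GDDR6, are the cards’ published specs):

```python
def peak_bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s from per-pin speed (Gbps) and bus width (bits)."""
    return pin_speed_gbps * bus_width_bits / 8  # 8 bits per byte

# RX Vega 64: 1.89Gbps over a 2,048-bit HBM2 bus
vega64 = peak_bandwidth_gbs(1.89, 2048)   # ≈ 483.8 GB/s
# RTX 2080: 14Gbps over a 256-bit GDDR6 bus
rtx2080 = peak_bandwidth_gbs(14.0, 256)   # = 448.0 GB/s

print(f"RX Vega 64: {vega64:.1f} GB/s, RTX 2080: {rtx2080:.1f} GB/s")
```

This is why a much faster memory type can still end up with slightly lower total bandwidth: the width of the bus matters just as much as the speed of each pin.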
The RTX 2080 does all of this with 215 watts (W) of power, while the RX Vega 64 requires 295W from your system’s power supply.
In the end, judging by processor and memory speed alone, as well as power draw, the RTX 2080 looks to be a clear winner here. However, this doesn’t account for differences in processor design and other factors, so only a full run of benchmarks will hold the true answer.
Now, for the ultimate deciding factor: how much these things cost.
AMD’s current suggested retail price for its RX Vega 64 is $499 (£549, about AU$630), but third-party manufacturers, like Gigabyte and Asus, are still selling the card for far more than that on account of increased demand from cryptocurrency miners.
Meanwhile, the RTX 2080 is considerably more expensive, with the Founders Edition costing $799 (AU$1,199, about £602). That’s a much higher price than the GTX 1080 of yesteryear, a mere $599 (£600, AU$925) in comparison.
Of course, you should also consider the running cost of these graphics cards, judged largely by their power draw. The RX Vega 64 requires 80W more power than the RTX 2080, which could see you needing a beefier power supply to support it, an added cost if your system isn’t properly equipped for the job.
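That 80W gap also shows up on your electricity bill, though only modestly. A rough sketch of the difference, where the wattages are the cards’ published board power but the electricity rate ($0.13/kWh) and gaming time (20 hours a week) are illustrative assumptions of ours, not figures from either vendor:

```python
def yearly_cost_usd(watts: float, hours_per_week: float = 20,
                    usd_per_kwh: float = 0.13) -> float:
    """Estimated yearly electricity cost of a component at full draw."""
    kwh_per_year = watts / 1000 * hours_per_week * 52  # watts -> kWh over a year
    return kwh_per_year * usd_per_kwh

# RX Vega 64 at 295W versus RTX 2080 at 215W
delta = yearly_cost_usd(295) - yearly_cost_usd(215)
print(f"Extra yearly cost of the RX Vega 64: ${delta:.2f}")
```

Under those assumptions the difference works out to around $11 a year, so the bigger practical concern is the power supply itself rather than the ongoing bill.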
Which card is the better pick?
As pricey as it is, the RTX 2080 offers features in games that simply aren’t possible with the RX Vega 64 – namely real-time ray tracing for more realistic lighting and shadows. That fact, not to mention the price, makes this matchup a difficult one.
Sure, on paper, the RTX 2080 wins in almost every regard. But, that doesn’t account for what you actually need from a graphics card.
If you seek the absolute most up-to-date capabilities from your graphics card, then clearly the RTX 2080 – or even the slightly cheaper $599 (AU$899, about £451) RTX 2070 – is your ticket to the latest and greatest.
The RTX 2080 looks as if it’s going to burn through games at Ultra settings and 4K resolution, to say nothing of making them look better with ray tracing. However, if you simply want to game at 1080p or even 1440p with the settings cranked all the way up, the RX Vega 64 is by far the more cost-effective way to get there.
So, even in 2018, the classic dichotomy between Nvidia and AMD graphics cards remains: splurge for Nvidia if you need the absolute latest and greatest, but AMD’s best will serve more mainstream gamers just fine for far less.