Every nerd loves a good tech war: Windows vs Mac, Apple vs Android, Intel vs AMD. They give us something to armchair argue about over beers with friends—or to rant over in the comments of illustrious tech blogs. After spending the weekend playing with AMD’s new Vega 64 and Vega 56 graphics cards, I think I can safely say an old tech war is back on—even if AMD’s latest salvo feels paltry. Nvidia might be leading the discrete graphics card industry, but AMD’s two newest cards are cheap and fast enough to finally compete. And that can only mean good things for PC users.
AMD Vega 64 & AMD Vega 56
$400 (Vega 56); $500 (Vega 64)
WHAT IS IT?
Discrete GPUs that are competing against graphics titan Nvidia.
LIKE
They’re fast enough.
NO LIKE
Don’t expect any flash.
AMD, which purchased Nvidia’s onetime archrival ATI Technologies, has been losing the GPU war for a while. Nvidia currently produces the majority of discrete graphics cards found in computers today. According to Jon Peddie Research, by the end of 2016 Nvidia had more than 70 percent of the discrete graphics market, while AMD trailed far behind with just 29 percent. So AMD decided to focus on building cheap cards to go in cheap machines, like the RX 550 I reviewed back in April.
The AMD Vega 64 and Vega 56 are an attempt to lure people away from the ~$600 Nvidia 1080 with cheaper options that start to approximate its performance. When the Vega microarchitecture the cards are based on was announced back at CES, people didn’t immediately leap out of their seats. AMD didn’t have a cool hook like when Nvidia announced it had spent “billions” to develop its latest card. All AMD had was a promise of speed when the cards arrived this summer.
Eight months later, the Vega 64 and 56 are here. The AMD Vega 64 retails for $500, while the AMD Vega 56 retails for $400, and both come with 8GB of memory built in. The big difference between the two is the number of compute units; think of those like the cores in a CPU (the more the better). The Vega 64 has 64 compute units and the Vega 56 has 56.
When I compared the Vegas to the 1080, what I found was far less exciting than what I’d hoped for. While the AMD Vega 64 was, on occasion, faster than the 1080, it never blew my hair back. Instead it was sort of like going to the car lot and having to choose between a Honda Civic and a Toyota Corolla. They’re both very nice, cost the same, and do the same damn thing.
What was really astounding was the performance of the $400 AMD Vega 56. Despite being considerably cheaper than both the Vega 64 and the Nvidia 1080, it played Overwatch and Civilization VI only marginally slower. The 1080 managed 112 frames per second playing Overwatch on Ultra at 4K, and the Vega 64 did a slightly faster 114, but the Vega 56 pulled off 99 frames per second at 4K with the graphics cranked to Ultra. That’s not just a little respectable, that’s really damn good. In Civilization VI the difference was even smaller, with the Vega 56 only taking about 1.5 milliseconds longer between frames than the 1080 or Vega 64.
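If frame-rate figures feel abstract, they translate directly into frame times, the milliseconds each frame takes, with a little arithmetic. This is just back-of-envelope math on the numbers above, not anything from AMD’s software:

```python
def frame_time_ms(fps):
    # Milliseconds per frame for a given frames-per-second figure.
    return 1000.0 / fps

# Overwatch at 4K/Ultra, using the review's numbers:
print(frame_time_ms(112))  # Nvidia 1080: ~8.9 ms per frame
print(frame_time_ms(114))  # Vega 64: ~8.8 ms per frame
print(frame_time_ms(99))   # Vega 56: ~10.1 ms per frame
```

In other words, the Vega 56’s “slower” showing costs you barely more than a millisecond per frame.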
In one case the Vega 56 actually beat the Nvidia 1080, and was on par with the pricier Vega 64 too. When I rendered a frame in Blender, 3D graphics software that lets you create big, complex scenes that heavily tax a discrete GPU, the Nvidia 1080 took 9 minutes and 34 seconds. The Vega 64 rendered the same frame in 9 minutes and 28 seconds. The Vega 56? Just 9 minutes and 29 seconds. With that kind of neck-and-neck performance, there’s really no reason to buy an Nvidia 1080 or a Vega 64 over the Vega 56.
But speed isn’t the only factor to consider when buying a discrete graphics card. See, the cards that go in your desktop PC are very power hungry, and if your power supply can’t provide enough juice, the GPU is worthless. Power draw is one place where Nvidia beats AMD every time: the Nvidia 1080 demands 180 watts to run, while the Vega 56 requires 210 watts and the Vega 64 a whopping 295 watts.
All that extra juice means you have to use not one, but two 8-pin power connectors from your power supply. The Nvidia 1080 requires just one.
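If you’re wondering whether your power supply is up to it, a rough sanity check is to add the card’s draw to the rest of the system and leave some headroom. This is my own back-of-envelope rule, not AMD’s official guidance, and the ~250-watt “rest of system” figure is purely an assumption:

```python
def psu_ok(card_watts, rest_of_system_watts, psu_watts, headroom=0.2):
    # True if the power supply covers the total load plus a safety margin.
    # The 20% headroom default is a common rule of thumb, not a spec.
    return psu_watts >= (card_watts + rest_of_system_watts) * (1 + headroom)

# The Vega 64's 295W draw plus an assumed ~250W for the rest of the
# build wants a healthy supply; a 750W unit clears the margin easily.
print(psu_ok(295, 250, 750))  # True
```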
With the new Vega cards, AMD is trying to get around the power constraints of its GPUs by offering some software solutions. The first, and most notable, is the Radeon Chill feature, and it’s actually sort of clever. It operates under the assumption that people don’t really need the fastest video card, they just need one fast enough for their monitor. Both AMD and Nvidia have technology that allows cards to “sync” with monitors to deliver top-level graphics without straining the GPU (Nvidia calls its tech G-Sync, while AMD calls it FreeSync).
But you have to have a special monitor that works with the syncing technology, and no monitor works with both AMD’s and Nvidia’s sync tech. Radeon Chill works with any monitor. You simply tell it how many frames you actually want to see per second. Got a monitor that refreshes 60 times a second? Set the max to 60 frames per second and Radeon Chill makes magic happen. The software also doesn’t try to churn out 60fps if nothing is moving on screen. Instead it recognizes moments with static visuals and dramatically cuts down on how much power is being used, and you can tweak that number as well. Want it to never go below 30fps? Just set the slider in the AMD software.
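The basic idea behind a frame cap like Chill’s can be sketched in a few lines: a loop that sleeps away whatever is left of each frame’s time budget instead of rendering flat out. This is my toy illustration of the general technique, not AMD’s actual implementation:

```python
import time

def capped_loop(render_frame, max_fps=60, duration_s=1.0):
    """Toy frame limiter: after rendering each frame, sleep out the rest
    of the frame's time budget so the loop never exceeds max_fps."""
    budget = 1.0 / max_fps              # seconds allotted per frame
    frames = 0
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        frame_start = time.monotonic()
        render_frame()                  # stand-in for real GPU work
        spare = budget - (time.monotonic() - frame_start)
        if spare > 0:
            time.sleep(spare)           # idle instead of burning power
        frames += 1
    return frames
```

With a trivial “frame,” the loop settles at roughly the requested cap rather than running as fast as the hardware allows, which is where the power savings come from.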