Is the Radeon RX Vega 'Too Little, Too Late'?

AMD’s flagship GPU arrives with some big questions.


Officially announced last night at the SIGGRAPH conference in Los Angeles, AMD’s new Radeon RX Vega GPUs are the talk of the PC hardware town. Arriving a little over two years after the company’s flagship Fury-series cards, the RX Vega line shares some similarities with its predecessors. Both series offer air-cooled and liquid-cooled variants, along with an option for small form factor systems: formally announced after the big show, the RX Vega Nano will fill the niche previously occupied by AMD’s own Fury Nano. GPUs in both series also deliver strong DirectX 12 and Vulkan API performance. The similarities mostly end there, however.

In the lead-up to the RX Vega reveal, AMD's marketing positioned its new GPU as a competitor to nVidia’s 2016 flagship, the GeForce GTX 1080. Early leaks and data from AMD itself have shown that the top-end Vega GPU, the Radeon RX Vega 64, trades blows with the GTX 1080, and the company has priced it accordingly, matching the $499 MSRP of nVidia’s card. Unlike its Fury predecessor, which launched against the GTX 980 Ti, the RX Vega 64 will not be competitive with nVidia’s current flagship offering, the GeForce GTX 1080 Ti.

Not being the fastest GPU on earth is not a big problem for the RX Vega 64 in itself, but arriving over a year after its intended competition, and doing so with nearly twice the power draw, certainly dulls the allure of the brand-new GPU. Availability can also be an issue with any GPU launch. PC gamers who have waited more than a year for AMD’s answer to nVidia’s Pascal cards may have to keep waiting until enough cards reach the retail channel to ease the inflated pricing typically seen with new launches.

The wait for that price stabilization may take longer than normal due to the ongoing GPU price spikes driven by cryptocurrency mining. AMD’s mid-range GPUs have been unavailable at MSRP for months now. Many PC gamers who owned those cards prior to the mining boom have sold them at a profit to miners, with more than a few choosing to wait for the RX Vega launch to pop a new GPU into their systems. This pent-up demand will exceed what is typical of a normal graphics card launch. Due to its reduced cryptocurrency mining capabilities, the nVidia GTX 1080 has not suffered stock depletion as badly as its lower-powered sibling, the GTX 1070, or AMD’s Polaris GPUs. The GTX 1080 can still be had at or near its $499 MSRP, with several models often going for less during sales.

With low supply and high demand ensuring that higher prices will be part of the RX Vega 64 launch, does it make sense for PC gamers looking for a new high-end GPU to go with AMD? The RX Vega cards' extremely high power consumption, coupled with limited availability, makes for an option that appears to offer little value against the GTX 1080. Buyers who already own a FreeSync-enabled monitor, however, may be more inclined to spend extra to go with AMD so that they can make use of their monitor's adaptive-sync features.

AMD appears to be betting on this scenario, as the company made the difference between its FreeSync and nVidia’s G-Sync experiences a focal point of recent RX Vega demos. The company will also sell special RX Vega “packs”, some of which include discounts on a FreeSync monitor and its own Ryzen CPUs if gamers are willing to pay extra for the RX Vega 64 upfront. The RX Vega 64 pack includes the liquid-cooled version of the card, discount coupons for the monitor and CPU, and a couple of pack-in games for $699, the same price nVidia asks for the GTX 1080 Ti. AMD is betting that these packs will offer an enticing value for prospective buyers and lock them into AMD’s gaming ecosystem. For PC gamers who plan to keep their current monitor and system, the $699 pack loses much of its appeal.

For those still on the fence about which GPU to buy, is the RX Vega bundle pack something you would consider? Do you think that the RX Vega 64 by itself works for $499, even with the increased power draw? Let us know in the comments!

Contributing Tech Editor

Chris Jarrard likes playing games, crankin' tunes, and looking for fights on obscure online message boards. He understands that breakfast food is the only true food. Don't @ him.

From The Chatty
  • reply
    July 31, 2017 10:00 AM

    Chris Jarrard posted a new article, Is the Radeon RX Vega 'Too Little, Too Late'?

    • reply
      July 31, 2017 10:10 AM

      It sounds like you're saying this card doesn't meet the Jarrard Standard.

      • reply
        July 31, 2017 11:00 AM

        [deleted]

        • reply
          July 31, 2017 11:18 AM

          Hopefully Vega will be able to claw itself into your good graces.

          • reply
            July 31, 2017 11:23 AM

            It'll be interesting when they finally unmask those benchmarks

            • reply
              July 31, 2017 11:32 AM

              That Vega sure does look nice though

              • reply
                July 31, 2017 2:34 PM

                Maybe they'll come to their senses and slash prices

    • reply
      July 31, 2017 10:26 AM

      Aww bummer

    • reply
      July 31, 2017 10:26 AM

      2017: The Year of the AMD Desktop

      • reply
        July 31, 2017 10:45 AM

        Ryzen 1600x sounds good. Vega doesn't.

        • reply
          July 31, 2017 10:47 AM

          If DX12 and mGPU were more mature, Vega would be in a good position for 4k 60fps+ in crossfire. But only like 3 or 4 games have decent DX12 mGPU implementations. Even DICE is dragging their feet with it in BF1.

    • reply
      July 31, 2017 10:31 AM

      Meh. Yes, probably. The card itself might be decent, but it's not the leap people were expecting. 1080 performance with higher temps and power draw, and a 1080 can be found for the same price if you look hard enough. Decent bundle? Might be worth considering. Otherwise it's another miss.

    • reply
      July 31, 2017 11:04 AM

      The power draw on these cards is insane. They would be a tempting alternative to nVidia for the HBM2, but that power draw is nuts.

      • reply
        July 31, 2017 4:17 PM

        I bought the Fury X at launch, and was eager for Vega, but this power draw news makes me want to wait for a lower process node card. Fury X's are still going for ~$300 on ebay, it's semi-tempting to do the swap, but ugh.

        Plus, there are few desirable FreeSync 2 monitors available, and nextgen VR displays are at least a year off. It's frustrating, but I'll probably keep the Fury X and wait til, what? Q2 next year?!?

    • reply
      July 31, 2017 11:09 AM

      Any luck on fixing your pubg woes crabs?

      • reply
        July 31, 2017 11:26 AM

        [deleted]

        • reply
          July 31, 2017 11:34 AM

          That sucks. A friend of mine who went from his old HP ZR24 to an XB271 said that spotting was much better as well. Mostly I think due to the brightness/contrast difference. Are you using your 970 on the 4k screen?

          • reply
            July 31, 2017 11:37 AM

            [deleted]

            • reply
              July 31, 2017 11:45 AM

              Damn, pushing that 970 to the limit! Were you waiting on Vega before buying a new gpu?

              • reply
                July 31, 2017 11:52 AM

                [deleted]

                • reply
                  July 31, 2017 12:01 PM

                  Yeah I figure it's another year or so before they get HDR all ironed out. Too bad you don't have gsync/freesync that actually would be really useful for 4K. Does it still look good when you run it @ 1440p?

    • reply
      July 31, 2017 11:12 AM

      Thus far I'm filing Vega under "quite disappointing, might be a workable value proposition with the bundles"

      I'm still interested to see what the draw stream binning rasterizer brings to the party when they finally switch it on though.

    • reply
      July 31, 2017 11:24 AM

      [deleted]

    • reply
      July 31, 2017 11:42 AM

      I don't get why power draw is so high on this newer node. wtf.

      • reply
        July 31, 2017 11:48 AM

        It's basically at maximum possible clocks and way past the efficient point for the process node.

        Part of why this is such a turd - you can't overclock it like the 1080 cards.

      • reply
        July 31, 2017 11:54 AM

        They are having to resort to very high clock speeds to get to a performance parity with nVidia. That requires a ton of juice on this design.

    • reply
      July 31, 2017 3:13 PM

      I have a freesync monitor, I knew that vega was going to be around a 1080 in terms of performance which is fine it's still an upgrade from what I have now, but double the tdp of a 1080?? That could be a deal breaker.

      I'll wait and see what reviews look like before deciding, but if I had to decide right now I think I would have to pass.

      • reply
        July 31, 2017 3:59 PM

        That makes no sense to me at all... How the hell do they justify needing a 500-600W power supply JUST for their video card!? How is it that Pascal is so much farther ahead!? My 1070 is great with PubG, but still... COME ON.... AMD should have literally stopped production and just gone back to the drawing board. Drafting off Ryzen is a bad idea with a failure of a video card...

    • reply
      August 3, 2017 10:42 AM

      How much ethercoin can it farm per hour.
