Is the Radeon RX Vega 'Too Little, Too Late'?

AMD’s flagship GPU arrives with some big questions.


Officially announced last night at the SIGGRAPH conference in Los Angeles, AMD’s new Radeon RX Vega GPUs are the talk of the PC hardware town. Arriving a little over two years after the company’s flagship Fury-series cards, the RX Vega line shares some similarities with its predecessors. Both series offer air-cooled and liquid-cooled variants, along with an option aimed at small form factor systems. Set to be formally announced after the big show, the RX Vega Nano will fill the niche previously occupied by AMD’s own Fury Nano. GPUs in both series also offer strong DirectX 12 and Vulkan API performance. The similarities mostly end there, however.

In the lead-up to the RX Vega reveal, AMD's marketing has been positioning its new GPU as a competitor to nVidia’s 2016 flagship, the GeForce GTX 1080. Early leaks and data from AMD itself have shown that the top-end Vega GPU, the Radeon RX Vega 64, trades blows with the GTX 1080. The company has priced the RX Vega 64 accordingly, matching the $499 MSRP of nVidia’s card. Unlike its Fury predecessor, which launched against the GTX 980 Ti, the RX Vega 64 will not be competitive with nVidia’s current flagship offering, the GeForce GTX 1080 Ti.

Not being the fastest GPU on earth is not a big problem for the RX Vega 64, but arriving over a year after its intended competition, and doing so with nearly twice the power draw, certainly decreases the allure of the brand new GPU. With any GPU launch, availability can be an issue. PC gamers who have waited more than a year for AMD’s response to nVidia’s Pascal cards will continue to wait until enough cards reach the retail channel to ease the inflated pricing that typically accompanies a new launch.

The wait for that price stabilization may take longer than normal due to the ongoing GPU price spikes driven by cryptocurrency mining. AMD’s mid-range GPUs have been unavailable at MSRP for months now. Many PC gamers who owned those cards prior to the mining boom have sold them at a profit to miners, with more than a few choosing to wait for the RX Vega launch to pop a new GPU into their systems. The additional demand will go beyond what is typical for a graphics card launch. Due to its reduced cryptocurrency mining capabilities, the nVidia GTX 1080 has not suffered stock depletion as badly as its lower-powered sibling, the GTX 1070, or AMD’s Polaris GPUs. The GTX 1080 can still be had at or near its $499 MSRP, with several models often going for less during sales.

With low supply and high demand ensuring that higher prices will be a part of the RX Vega 64 launch, does it make sense for PC gamers looking for a new high-end GPU to go with AMD? The extremely high power consumption of the RX Vega cards, coupled with limited availability, makes for an option that appears to offer little value to the end user when compared against the GTX 1080. Buyers who already own a FreeSync-enabled monitor may be more inclined to spend extra to go with AMD so that they can make use of their monitor’s adaptive-sync features.

AMD appears to be betting on this scenario, as the company made the difference between its FreeSync and nVidia’s G-Sync experiences a focal point during recent RX Vega demos. The company will also be selling special RX Vega “packs,” some of which include discounts on a FreeSync monitor and its own Ryzen CPUs if gamers are willing to pay extra for the RX Vega 64 upfront. The RX Vega 64 pack includes the liquid-cooled version of the card, discount coupons for the monitor and CPU, and a couple of pack-in games for $699, the same price at which nVidia sells its GTX 1080 Ti. AMD is betting that these packs will offer an enticing value for prospective buyers and get them locked into AMD’s gaming ecosystem. For PC gamers who plan on keeping their current monitor and system, the $699 pack loses much of its appeal.

For those still on the fence about which GPU to buy, is the RX Vega bundle pack something you would consider? Do you think that the RX Vega 64 by itself works for $499, even with the increased power draw? Let us know in the comments!

Contributing Tech Editor

Chris Jarrard likes playing games, crankin' tunes, and looking for fights on obscure online message boards. He understands that breakfast food is the only true food. Don't @ him.

From The Chatty

  • reply
    July 31, 2017 10:00 AM

    Chris Jarrard posted a new article, Is the Radeon RX Vega 'Too Little, Too Late'?

    • reply
      July 31, 2017 10:10 AM

      It sounds like you're saying this card doesn't meet the Jarrard Standard.

      • reply
        July 31, 2017 11:00 AM

        Not possible to make an accurate call on that just yet. When NDA lifts, the picture will become more clear.

        • reply
          July 31, 2017 11:18 AM

          Hopefully Vega will be able to claw itself into your good graces.

          • reply
            July 31, 2017 11:23 AM

            It'll be interesting when they finally unmask those benchmarks

            • reply
              July 31, 2017 11:32 AM

              That Vega sure does look nice though

              • reply
                July 31, 2017 2:34 PM

                Maybe they'll come to their senses and slash prices

    • reply
      July 31, 2017 10:26 AM

      Aww bummer

    • reply
      July 31, 2017 10:26 AM

      2017: The Year of the AMD Desktop

      • reply
        July 31, 2017 10:45 AM

        Ryzen 1600x sounds good. Vega doesn't.

        • reply
          July 31, 2017 10:47 AM

          If DX12 and mGPU were more mature, Vega would be in a good position for 4k 60fps+ in crossfire. But only like 3 or 4 games have decent DX12 mGPU implementations. Even DICE is dragging their feet with it in BF1.

    • reply
      July 31, 2017 10:31 AM

      Meh. Yes, probably. The card itself might be decent, but it's not the leap people were expecting. 1080 performance with higher temps and power draw, and a 1080 can be found for the same price if you look hard enough. Decent bundle? Might be worth considering. Otherwise it's another miss.

    • reply
      July 31, 2017 11:04 AM

      The power draw on these cards is insane. They would be a tempting alternative to nVidia for the HBM2, but that power draw is nuts.

      • rms
        reply
        July 31, 2017 4:17 PM

        I bought the Fury X at launch, and was eager for Vega, but this power draw news makes me want to wait for a lower process node card. Fury X's are still going for ~$300 on ebay, so it's semi-tempting to do the swap, but ugh.

        Plus, there are few desirable FreeSync 2 monitors available, and nextgen VR displays are at least a year off. It's frustrating, but I'll probably keep the Fury X and wait til, what? Q2 next year?!?

    • reply
      July 31, 2017 11:09 AM

      Any luck on fixing your pubg woes crabs?

      • reply
        July 31, 2017 11:26 AM

        Nope. If anything, I made them worse by using my 4K TV as a monitor and taking the ride on the super-high resolution train. The game still gives me very spiky framerates when I am near several enemy combatants. In-town battles under the stress of redzone artillery cause the framerate to drop under 30fps sometimes.

        I will say that going from 1080p to 4K has made enemy spotting so much easier. While my avg fps is in the toilet at that resolution, I can easily spot distant enemies on the horizon or near trees. On my old panel I kept mistaking low-LOD bushes for people, but now the difference is clear. For any folks who are having trouble spotting the bad guys, just go 4K!

        • reply
          July 31, 2017 11:34 AM

          That sucks. A friend of mine who went from his old HP ZR24 to an XB271 said that spotting was much better as well, mostly I think due to the brightness/contrast difference. Are you using your 970 on the 4k screen?

          • reply
            July 31, 2017 11:37 AM

            Yes I am. Some of my games are handled well, while others are not. For the most part, I am able to use the nVidia GPU scaling to run custom resolutions between 1440p and 4K and it has worked very well in some games. I can run Doom at a mostly solid 60fps using a mix of medium settings and a resolution of 2944x1656. I imagine that this type of upscaling is similar to what the consoles do to hit the magic 4K bullet point with sub-optimal hardware.
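
            For the curious, here's a rough back-of-the-envelope sketch of how those in-between 16:9 resolutions compare to native 4K in pixel count, which is roughly how GPU cost scales. The scale factors below are just illustrative picks (2944x1656 is the one I mentioned above):

                # Pixel cost of 16:9 render resolutions between 1440p and 4K.
                NATIVE_4K_W, NATIVE_4K_H = 3840, 2160

                def resolution_at(scale: float) -> tuple[int, int]:
                    # Snap width to a multiple of 16 so the 16:9 height is a whole number.
                    w = round(NATIVE_4K_W * scale / 16) * 16
                    return w, w * 9 // 16

                for scale in (2 / 3, 2944 / 3840, 5 / 6, 1.0):
                    w, h = resolution_at(scale)
                    pct = 100 * (w * h) / (NATIVE_4K_W * NATIVE_4K_H)
                    print(f"{w}x{h}: renders {pct:.0f}% of the pixels of native 4K")

            Rendering at 2944x1656 only pushes about 59% of the pixels of native 4K, which goes a long way toward explaining why the 970 can hold a mostly solid 60fps there when it can't at full resolution.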

            • reply
              July 31, 2017 11:45 AM

              Damn, pushing that 970 to the limit! Were you waiting on Vega before buying a new gpu?

              • reply
                July 31, 2017 11:52 AM

                Not really. I've been wanting a display upgrade for a while now and the new wave of 4K TVs arrived with specs that were pretty crazy for the asking price. I effectively doubled up on contrast ratio, got myself within a few percent of AdobeRGB color accuracy and massively reduced my input lag. All of this was gained on top of adding 4X the resolution for less than $270. Even though I can't run everything maxed at native resolution, I am still very happy. A GPU upgrade is in my future for sure, though. Because I don't have Freesync or G-Sync, I can kind of choose whatever vendor gives me the best deal.

                I will note that getting HDR to work in Windows 10 has not been easy, but the two games I got it running with sure look amazing.

                • reply
                  July 31, 2017 12:01 PM

                  Yeah I figure it's another year or so before they get HDR all ironed out. Too bad you don't have gsync/freesync; that would actually be really useful for 4K. Does it still look good when you run it @ 1440p?

                  • reply
                    July 31, 2017 12:05 PM

                    Yes, the scaling works very well. Some games look sharper than others, though. Rocket League is one title that does not play well with the scaling, most likely due to the way the grass and walls/fences are rendered. It's not much of a problem though, since the 970 can play it at native 4K at 60Hz with no issue. I tried both of the Metro titles at 3200x1800 and they looked magnificent, even with some slight frame drops here and there.

                    Adaptive sync would be nice, but a 1080 Ti would play virtually everything at 4K 60Hz on the highest settings and the impending nVidia Volta GPU release will likely show big gains at 4K for the new cards. Driving 4K at 120Hz+ will be the big problem for the next 2 years or so.

    • reply
      July 31, 2017 11:12 AM

      Thus far I'm filing Vega under "quite disappointing, might be a workable value proposition with the bundles"

      I'm still interested to see what the draw stream binning rasterizer brings to the party when they finally switch it on though.

    • reply
      July 31, 2017 11:24 AM

      "AMD is betting that these packs will offer an enticing value for prospective buyers and get them locked into AMD’s gaming ecosystem. "

      It doubles as a way to keep the Crypto hunters away, and the good part (as I mentioned earlier) is that there will be plenty of cheap screens and games for the rest of us when they have to offload them :-)

      • reply
        July 31, 2017 11:29 AM

        It's not official yet, but it is believed that the monitor coupon promo only applies to a single Samsung monitor, the Samsung CF791. I would not necessarily call that particular monitor "cheap".

        • reply
          July 31, 2017 11:32 AM

          My point is that Crypto burners are likely to buy the bundle, which leaves them with games and monitors they don't need. So they have to sell those, of course.

          • reply
            July 31, 2017 11:34 AM

            You don't get a monitor with purchase, just a voucher for a discount. Also, the hash rates for the RX Vega 64 do not appear to be anywhere near as lucrative as the Polaris cards' due to the excessive power draw. Couple that with the increasing difficulty in mining Ethereum, and I doubt that RX Vega will be as juicy a target for Crypto-bois.

            • reply
              July 31, 2017 11:53 AM

              Sure?
              "This means that if you buy a Radeon Pack SKU, you must buy the discounted hardware at the same time to get the discount."

              http://www.anandtech.com/show/11680/radeon-rx-vega-unveiled-amd-announecs-499-rx-vega-64-399-rx-vega-56-launching-in-august/2

              Regarding the raised difficulty, that is exactly where the new Vegas might shine:

              http://1stminingrig.com/amd-vega-frontier-edition-mining-performance-review/

              I know nothing about the Crypto, but that test does seem to indicate that a price around $500 would be attractive..

              • reply
                July 31, 2017 12:01 PM

                Yeah, but why pay $599 for the pack and $700+ for the monitor when you could just buy the card for $499?

                Also, the article you linked shows a Vega hash rate that peaks at 40MH/s while the Polaris cards peak around 31MH/s. Why pay more for such a small gain in hash rate when you also have to deal with the insane spike in power consumption that will eat into profits? Even if the supposed driver fix to increase hash rate is real, it may not be wise to base such a big purchase on a rumor that AMD has a secret fix.
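
                For what it's worth, here's a back-of-the-envelope sketch of that trade-off. The hash rates are the ones from the linked article, but the payout rate, power draws, electricity price, and card prices are all assumptions, so treat it as an illustration rather than a forecast:

                    # Rough Ethereum mining comparison. Hash rates from the linked
                    # article; everything else is an assumed, illustrative number.
                    CARDS = {
                        "RX 580 (Polaris)": {"mhs": 31, "watts": 180, "price_usd": 300},
                        "RX Vega 64":       {"mhs": 40, "watts": 300, "price_usd": 499},
                    }
                    USD_PER_MHS_PER_DAY = 0.10  # assumed payout per MH/s per day
                    USD_PER_KWH = 0.12          # assumed electricity price

                    for name, card in CARDS.items():
                        revenue = card["mhs"] * USD_PER_MHS_PER_DAY
                        power_cost = card["watts"] / 1000 * 24 * USD_PER_KWH
                        profit = revenue - power_cost
                        payback_days = card["price_usd"] / profit
                        print(f"{name}: ${profit:.2f}/day after power, "
                              f"~{payback_days:.0f} days to pay off the card")

                Under those assumed numbers, Vega's extra 9MH/s does not buy a faster payback once the higher card price and bigger power bill are factored in.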

                • reply
                  July 31, 2017 12:05 PM

                  Suck Nvidia's cock a little harder why don't you. Shacknews? More like Shacknews, brought to you by geforce.

                  • reply
                    July 31, 2017 12:49 PM

                    Yeah I hate when people make sense during a discussion. I also tell them to suck a dick and sometimes even throw out a nazi.

                  • reply
                    July 31, 2017 2:54 PM

                    This is not a good day for AMD.

                • reply
                  July 31, 2017 12:37 PM

                  Uhh? Because you obviously would only buy the pack when you cannot get the other cards, which is more than likely at launch (and for a long time after). And as Anandtech says, the package stuff seems to double as a good offer for gamers and a bad one for those that only want to mine crypto. Hence my idea that Cryptos will buy these packages and sell off what they don't need.

                  Which brings us back to the coupon: is it a coupon offer, or is it the way Anand describes it..?

                  The driver fix is not about Vega, but the older cards, because, as you mentioned, the increasing difficulty in mining Ethereum is going to be a problem (supposedly) for the old cards. But not for Vega. And if the fix (that you don't believe in) doesn't materialize, then Vega may be the one and only Crypto card worth getting. And obviously power draw will then be less of a problem.

                  As for the original article, too little too late depends on benchmarks. The heat is really not what I worry about, unless they are noisy of course.. https://www.youtube.com/watch?v=qspdnAYiiug

    • reply
      July 31, 2017 11:42 AM

      I don't get why power draw is so high on this newer node. wtf.

      • reply
        July 31, 2017 11:48 AM

        It's basically at maximum possible clocks and way past the efficient point for the process node.

        Part of why this is such a turd - you can't overclock it like the 1080 cards.

      • reply
        July 31, 2017 11:54 AM

        They are having to resort to very high clock speeds to get to a performance parity with nVidia. That requires a ton of juice on this design.
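
        As a toy model of why that is (every number below is made up for illustration; this is not Vega's actual voltage/frequency curve): dynamic power scales roughly with C·V²·f, and near a node's limit each extra clock step needs more voltage, so power climbs much faster than performance:

            # Toy model: dynamic power scales with (V/V0)^2 * (f/f0).
            # The baseline and the V/f points are invented for illustration.
            BASE_F_MHZ, BASE_V, BASE_POWER_W = 1200, 0.95, 180

            def est_power(f_mhz: float, v: float) -> float:
                return BASE_POWER_W * (v / BASE_V) ** 2 * (f_mhz / BASE_F_MHZ)

            for f_mhz, v in [(1200, 0.95), (1400, 1.05), (1550, 1.15), (1650, 1.25)]:
                print(f"{f_mhz} MHz @ {v:.2f} V -> ~{est_power(f_mhz, v):.0f} W")

        A ~35% clock bump more than doubles the power in this toy model, which is the general shape of the problem when a design gets pushed past its efficiency sweet spot.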

        • reply
          July 31, 2017 11:56 AM

          Probably a heat bomb too.

          • reply
            July 31, 2017 2:51 PM

            300W+ is quite the space heater.

            • reply
              July 31, 2017 2:54 PM

              I'm embarrassed to admit how long it took me to understand that power consumption always has a direct relationship with heat.

    • reply
      July 31, 2017 3:13 PM

      I have a freesync monitor, and I knew that Vega was going to be around a 1080 in terms of performance, which is fine; it's still an upgrade from what I have now. But double the TDP of a 1080?? That could be a deal breaker.

      I'll wait and see what reviews look like before deciding, but if I had to decide right now I think I would have to pass.

      • reply
        July 31, 2017 3:59 PM

        That makes no sense to me at all... How the hell do they justify needing a 500-600W power supply JUST for their video card!? How is it that Pascal is so much farther ahead!? My 1070 is great with PubG, but still... COME ON... AMD should have literally stopped production and just gone back to the drawing board. Drafting off Ryzen is a bad idea with a failure of a video card...

    • reply
      August 3, 2017 10:42 AM

      How much ethercoin can it farm per hour.