AMD Shows off Radeon RX Vega in Budapest

Radeon Technologies Group allowed the public to play with the upcoming RX Vega GPU at its community event in Hungary.

AMD’s long-awaited high-end GPU is less than two weeks away from its expected unveiling at the SIGGRAPH conference in Los Angeles. In the meantime, AMD’s Radeon team held an event in Budapest today as part of its pre-release RX Vega roadshow. While RX Vega has been teased at various events over the last year, today marked the first time the public got its hands on the GPU. AMD billed the event as an RX Community Meetup, and it was certainly different from your typical halo product demonstration.

The extent of the Radeon RX Vega demonstration was a pair of PCs at one end of the room for players to try. Both setups featured 3440x1440 ultrawide displays running at 100Hz. One PC paired the RX Vega with a FreeSync adaptive sync display, while the other paired an unidentified Nvidia GPU with a G-Sync adaptive sync display. Players were invited to try the machines under the supervision of the Radeon team. Battlefield 1 and Sniper Elite 4 were the games reported to be running on the pair of PCs. No frame rate data was shown for either machine, and reports from those in attendance said the AMD reps spent lots of time talking about how both setups offered an identical experience, with one of them costing $300 less to the end user. It was reportedly never revealed which PC contained the RX Vega GPU. According to some firsthand accounts on Reddit and Twitter, the PC on the left-hand side of the room felt noticeably smoother, despite claims from the marketing team that the experiences should feel identical.

The RX Community Meetup was not an event meant for the tech press, but for fans of the Radeon brand. Participants got a chance to see other Radeon products and get their hands on some swag.

Not all attendees were impressed by the event, judging by reactions on Twitter afterwards.

Expect all the news, benchmarks, and reactions you can handle when the Radeon RX Vega is shown off at SIGGRAPH on July 31st.

Contributing Tech Editor

Chris Jarrard likes playing games, crankin' tunes, and looking for fights on obscure online message boards. He understands that breakfast food is the only true food. Don't @ him.

From The Chatty

  • reply
    July 18, 2017 2:35 PM

    Chris Jarrard posted a new article, AMD Shows off Radeon RX Vega in Budapest

    • reply
      July 18, 2017 2:40 PM

      Ugh, why even bother if they're going to do it like that? I'll still wait for the 31st.

      • reply
        July 18, 2017 7:15 PM

        Nvidia does this too. It sucks. https://www.hardocp.com/article/2016/05/07/nvidia_gtx_1080_1070_announcement

        "1080 faster than Titan X!"
        "1070 faster than Titan X!"

        2.1GHz air cooled!!!

        At least AMD had something you could theoretically try out, instead of putting it under glass a la Nvidia.

        • reply
          July 18, 2017 8:52 PM

          On the other hand, the percentage-faster bar graphs nVidia showed at that event (and at the 1080 Ti reveal) were spot on. Also, the Doom demo at that reveal had framerate data on screen for all to see.

          • reply
            July 18, 2017 9:44 PM

            Admittedly I've only skimmed through the hour-long presentation, but there's no 1080 Ti reveal, and I missed the Doom demo. I also didn't see the bar graphs you speak of.

            Of course, that might be because Nvidia didn't reveal the 1080 Ti at the 1080 & 1070 event. That came like 9 months later.

            But none of that was really the point. My point was that both companies have had their share of overly optimistic claims and paper launches.

            • reply
              July 18, 2017 9:48 PM

              ...like I just heard him say near the end of the presentation that the GTX 1080 has twice the performance of the Titan X. Obviously that was a bit of a stretch.

                • reply
                  July 18, 2017 11:23 PM

                  The "twice the Titan X performance" bit was a repeat from their VR stuff. It is total marketing BS, but in that one specific use case, it was probably true. The big claim of faster than the 980 in SLI was proved true and was incredibly impressive for a single generation's leap.

            • reply
              July 18, 2017 11:21 PM

              I never claimed that the 1080 Ti was revealed at the same show. Doom was certainly shown off at the Pascal reveal event - https://www.geforce.com/whats-new/articles/watch-the-geforce-gtx-1080-run-doom-at-up-200-fps-using-vulkan

              I'm not sure about a paper launch. The cards were announced for May 27th, and one of my overexcited buddies ordered the 1080 FE direct from nVidia and had the card in hand the following Monday after paying ripoff overnight shipping.

    • reply
      July 18, 2017 2:58 PM

      IMO the "$300 less" claim is the key.
      Does that mean the consumer Vega will be in the 1070 - 1080 range? Maybe instead they tossed a ~$1,200 Titan XP into the Nvidia system? Or perhaps they are basing GPU SKUs off Ethereum-inflated prices?

      • reply
        July 18, 2017 3:03 PM

        I believe the "$300 less" claim referred to the GPU + monitor combo cost.

        • reply
          July 18, 2017 3:08 PM

          If that's the case, all the cost savings are in the FreeSync monitor, and if I had to gamble, probably offsetting a higher-priced Vega to boot.

          • reply
            July 18, 2017 3:15 PM

            If they can put out a card at ~$400 that is comparable to a 1080, they have a winner. Doubtful, but it would be nice considering all the new FreeSync monitors.

            Those Samsung QLEDs are pretty tempting.

        • reply
          July 18, 2017 3:40 PM

          Precisely, and $250 of it is the monitor.

        • reply
          July 18, 2017 6:51 PM

          It's still a lousy value proposition given AMD's trouble competing at the high end.

          Buy a G-Sync monitor, get a 1080/1080 Ti now, and upgrade to a Volta card etc. in the future.

          Buy a FreeSync monitor, get Vega with all of its flaws, and wait and pray that AMD doesn't fuck up Navi as well.

          If I could afford a high-end card and a monitor, I would certainly suck up the premium for the Nvidia system.

      • reply
        July 18, 2017 3:06 PM

        I expect some serious bullshit to trickle down on this claim.

        • reply
          July 18, 2017 3:15 PM

          I'm still in the "buy a monster GPU that performs above the panel refresh anyway" camp. I'd rather buy an excellent-looking monitor than worry about dumping money into adaptive sync and getting a panel with poor contrast and washed-out colors.

          • reply
            July 18, 2017 9:13 PM

            There's no GPU that can run 144Hz without any hiccups ever. Unless you play nothing but Counter-Strike at the lowest settings.

            • reply
              July 18, 2017 11:26 PM

              A panel does not need to be 144Hz to be worth considering. Many of the available 144Hz panels make some tradeoffs to hit those refresh rates as well. Obviously there is no perfect display, but adaptive sync is not a requirement to have a good gaming experience.

          • reply
            July 18, 2017 9:15 PM

            There are IPS adaptive sync monitors available.

            • reply
              July 18, 2017 9:17 PM

              Well, here's at least one counterexample... http://www.legitreviews.com/asus-mg279q-27-inch-ips-freesync-gaming-monitor-review_169461

            • reply
              July 18, 2017 11:25 PM

              IPS screens also have very suspect contrast ratios, along with backlight bleeding. All three major panel types have their weaknesses.

              • reply
                July 18, 2017 11:28 PM

                So you're anti-monitor?

                • reply
                  July 19, 2017 1:57 AM

                  No, I personally prefer the high-end VA-type panels. The contrast is unmatched (and is a huge factor in the overall look of a display), and the good ones offer color reproduction as good as or better than similarly priced IPS panels. The big drawbacks are the viewing angles (which don't bother me, as I don't use additional monitors to the side of my main one - the angles are also still way better than any TN) and the annoying ghosting some panels can exhibit when over-driven.

                  • reply
                    July 19, 2017 7:22 AM

                    Have you ever used a PG279Q or XB271HU? If not, you won't know how much better they are for gaming, imo. The drawbacks are small compared to the relative performance. However, if you don't primarily play FPS games, I could see it not mattering as much.

                    • reply
                      July 19, 2017 8:05 AM

                      Of those two, I have only seen the ASUS in person. The incredible smoothness during gaming is impressive, but for me it isn't worth having a screen that looks washed out. I'm not sure of the exact word I'm supposed to use, but in very dark scenes the thing looked splotchy, with areas of the screen that looked brighter at the top edge and corners. It was less noticeable during fast-moving colorful stuff (I mostly saw it with Battlefront), but watching it play back a YouTube video and scrolling the Chatty made it seem like the thing was busted.

                      For something that costs more than a reasonable car payment, I expected way more. I can get a bigger VA-type panel with eye-popping contrast and zero uniformity issues that runs at 75Hz for way less money. I do miss out on the smoothness for twitchy stuff, but everything else I do on my PC is so much better with the VA. In a perfect world, I'd have an adaptive sync 144Hz unit, but manufacturers seem to be focusing solely on ultrawides for the newest high-speed VA panels. Going from 16:10 to 16:9 was bad enough; 21:9 is a no-go for me.

                      • reply
                        July 19, 2017 8:19 AM

                        Yeah, the BLB on the edges is certainly a drawback. But I don't play enough games with lots of black screens/areas for it to be very noticeable. I just can't see myself going back to anything sub-120Hz and/or without G-Sync/Fast Sync.

                    • reply
                      July 19, 2017 9:14 AM

                      I regret my PG279Q a little. I never had an IPS panel before, and the IPS glow is bad in dark scenes. The G-Sync is great, though. However, next time around I think I will go for picture quality (shadows and blacks) over a high refresh rate.

                      My BenQ 1070 projector beats the piss out of my PG279Q when it comes to picture quality.

    • reply
      July 18, 2017 6:18 PM

      I'm slowly starting to lose hope that AMD will ever be competitive again in the GPU market. Goddammit.

      • reply
        July 19, 2017 7:17 AM

        AMD does best when their competition fucks up or grows stagnant. I think Nvidia is done being stagnant (especially in the mobile arena), so AMD's going to have to hope they fuck up soon.

        • reply
          July 19, 2017 8:34 AM

          Eh, debatable. They competed pretty evenly throughout the Athlon, Athlon XP and Athlon 64 era.

          On the GPU front I have no opinion.

          • reply
            July 19, 2017 9:21 AM

            That's ancient history at this point. It would be great to see them competing at that level now or in the near term but that hasn't been the case for over a decade.

    • reply
      July 18, 2017 7:05 PM

      Hmm, seems lame, but c'mon, we need that competition.

    • reply
      July 19, 2017 12:20 AM

      High-end gaming performance won't matter because no one will be able to get one. The cryptocurrency miners will exhaust the supply, and nVidia won't have to drop prices to compete because they'll have the only video cards available.

      Either way, gamers as customers are fucked for this generation.

      • reply
        July 19, 2017 8:09 AM

        The crypto craze seems to be over already.

        This Vega card pulls 300W+, so unless the hash rate is insane, it won't be dollar-efficient due to the power bill.
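
        To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python; the hash rate, revenue rate, and electricity prices below are hypothetical placeholders, with only the 300W draw taken from above:

            # Rough daily mining profit for one card (all inputs hypothetical).
            def daily_mining_profit(hash_rate_mhs, usd_per_mhs_day,
                                    power_watts, usd_per_kwh):
                revenue = hash_rate_mhs * usd_per_mhs_day             # $/day earned
                power_cost = (power_watts / 1000) * 24 * usd_per_kwh  # $/day burned
                return revenue - power_cost

            # An assumed 35 MH/s at $0.07 per MH/s per day on a 300W card:
            print(daily_mining_profit(35, 0.07, 300, 0.12))  # ~$1.59/day at $0.12/kWh
            print(daily_mining_profit(35, 0.07, 300, 0.30))  # ~$0.29/day at $0.30/kWh

        The point being that at 300W+, the power bill eats most of the margin unless the hash rate (or the coin price) is well above those guesses.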