Xbox One specs supposedly boosted by 'the cloud'

It seems highly unlikely that Microsoft's claims that their upcoming console can be greatly enhanced "by the power of the cloud" have any merit.


From a hardware standpoint, Xbox One is shaping up to be weaker than PlayStation 4, with slower memory and three operating systems that eat up a large part of the console's 8GB of RAM. But Microsoft doesn't believe it's outgunned. In fact, it claims that Xbox One can become massively more powerful due to the power of the cloud.

"We're provisioning for developers for every physical Xbox One we build, we're provisioning the CPU and storage equivalent of three Xbox Ones on the cloud," Jeff Henshaw told OXM. "We're doing that flat out so that any game developer can assume that there's roughly three times the resources immediately available to their game, so they can build bigger, persistent levels that are more inclusive for players. They can do that out of the gate."

That certainly sounds exciting, doesn't it? However, the folks at Digital Foundry doubt that Microsoft's claims will translate to the real world, with specific concerns over the limitations of latency and bandwidth. "Even assuming instantaneous processing thanks to the power of the servers, the internet is incredibly slow in terms of real-time computing," their report says, pointing out that while a game running at 30 frames per second needs its calculations done within 33 milliseconds, a round trip to the cloud can take 100ms or more.
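
For a rough sense of the gap Digital Foundry is describing, the arithmetic is easy to sketch out; the 100ms round trip below is their estimate for a consumer connection, not a measured value:

    # Frame-budget arithmetic behind Digital Foundry's latency concern.
    # The 100ms round trip is an estimate; real connections vary widely.
    frame_rate_hz = 30
    frame_budget_ms = 1000 / frame_rate_hz        # ~33.3ms to produce each frame
    cloud_round_trip_ms = 100                     # estimated consumer internet round trip

    frames_late = cloud_round_trip_ms / frame_budget_ms
    print(f"Frame budget at {frame_rate_hz}fps: {frame_budget_ms:.1f}ms")
    print(f"A cloud result arrives roughly {frames_late:.1f} frames after it was requested")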

Microsoft's Matt Booty acknowledges latency, saying that "reactions to animations in a shooter, reactions to hits and shots in a racing game, reactions to collisions" wouldn't be possible on the cloud. However, he says that "there are some things in a video game world, though, that don't necessarily need to be updated every frame or don't change that much in reaction to what's going on."

Another challenge that cloud computing must overcome in order to have a genuine impact on performance is bandwidth. According to Digital Foundry, PS4 allocates around 20GB/s (20,000MB/s) of its memory bandwidth to the CPU. By contrast, even a 50Mbps broadband connection delivers only around 6MB/s. "This represents a significant bottleneck to what can be processed on the cloud, and that's before upload speed is even considered," the report points out.
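
The bandwidth side of the argument works out the same way; the figures below are simply the 20GB/s, 50Mbps, and 30fps numbers quoted above:

    # Bandwidth gap sketch: local memory bandwidth vs. a 50Mbps connection.
    local_cpu_bandwidth_mb_s = 20_000        # ~20GB/s of PS4 bandwidth for the CPU, per Digital Foundry
    connection_mbps = 50                     # downstream megabits per second
    connection_mb_s = connection_mbps / 8    # ~6.25 MB/s

    frame_rate_hz = 30
    per_frame_kb = connection_mb_s / frame_rate_hz * 1024   # data deliverable per 33ms frame

    print(f"Connection: {connection_mb_s:.2f} MB/s vs. local {local_cpu_bandwidth_mb_s:,} MB/s")
    print(f"Roughly {per_frame_kb:.0f} KB of cloud data per frame at {frame_rate_hz}fps")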

Given these significant hurdles, it seems highly unlikely that Microsoft's claims that their upcoming console can be greatly enhanced "by the power of the cloud" have any merit. Of course, there could be some demonic magicks being employed by Redmond--something that will require a practical demonstration (and not just PR talk) to dispel any doubts.

Andrew Yoon was previously a games journalist creating content at Shacknews.

From The Chatty
  • reply
    May 28, 2013 3:00 PM

    Andrew Yoon posted a new article, Xbox One specs supposedly boosted by 'the cloud'.

    It seems highly unlikely that Microsoft's claims that their upcoming console can be greatly enhanced "by the power of the cloud" have any merit.

    • reply
      May 28, 2013 3:05 PM

      From a hardware standpoint, Xbox One is shaping up to be weaker than PlayStation 4, with slower memory and three operating systems that eat up a large part of the console's 8GB of RAM


      http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4

      • reply
        May 28, 2013 4:59 PM

        BTW, the last paragraph on the last page does bring up an interesting point on the dual VM/HyperV OSes that the XBone is running - how long before it shows up on a PC? Hmmm.... now that would be interesting...

        • reply
          May 28, 2013 5:17 PM

          This would be pretty cool. I have tons of wasted RAM on my system; 8 gigs on a PC isn't fully used for gaming. Imagine a closed system like a console where the devs can control and guarantee behavior.

          • reply
            May 28, 2013 8:43 PM

            The 8GB is shared between the GPU and the CPU though. It won't be long before all of that space is full on the console with higher quality textures and more complicated simulation.

      • reply
        May 28, 2013 5:06 PM

        Some of their analysis is a bit misguided:

        Although 32MB doesn’t sound like much, if it is indeed used as a cache (with the frame buffer kept in main memory) it’s actually enough to have a substantial hit rate in current workloads (although there’s not much room for growth).

        You actually want to keep your framebuffers in the high speed embedded RAM. The WiiU had the same design and you really don't want to have the GPU and CPU contending for memory access. Framebuffers are constantly being written to, so you see the biggest gains by moving those out of main memory. We had some problems with CPU/GPU contention to main memory on the WiiU and alleviated most of the main offenders by moving the related framebuffers to the embedded memory.

        There were only minimal gains to be had by using it as anything else, really. CPU memory bandwidth wasn't much of a problem so long as you're cache friendly. GPU reads from main memory didn't seem to be too bad (due to texture caches and whatnot). It was GPU writes to main memory that were the big problem.
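
        As a rough illustration of why framebuffer traffic is the first thing worth moving into embedded memory, the write volume can be estimated with back-of-the-envelope numbers; the resolution, overdraw, and pass counts here are made-up round figures, not any console's actual specs:

            # Estimate of per-frame framebuffer write traffic.
            # All figures are illustrative round numbers, not hardware specifics.
            width, height = 1280, 720
            bytes_per_pixel = 4              # e.g. an RGBA8 color target
            overdraw = 2.5                   # average times each pixel is written per pass
            passes = 3                       # e.g. depth pre-pass, color, post-processing
            frame_rate_hz = 30

            per_frame_mb = width * height * bytes_per_pixel * overdraw * passes / (1024 ** 2)
            per_second_gb = per_frame_mb * frame_rate_hz / 1024

            print(f"~{per_frame_mb:.0f} MB of framebuffer writes per frame")
            print(f"~{per_second_gb:.1f} GB/s of steady write traffic contending with the CPU if it lands in main memory")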

        • reply
          May 28, 2013 5:35 PM

          I'm curious. How fast is the wii u's embedded ram? I did a quick search and didn't find anything specific. Anandtech has the xbox one at 102GB/s.

          • reply
            May 28, 2013 5:37 PM

            I actually can't remember at the moment. And I'm not sure if I'd be able to say or not, anyway.

        • reply
          May 28, 2013 6:29 PM

          Yeah, the Anandtech article proceeds under the assumption that the 32MB of ESRAM could actually be a coherent hardware cache, but all the leaked documentation appears to rule that out. You could still write a software cache algorithm if you were so inclined to try and leverage the lower latency access to ESRAM for GPU compute tasks, but there's so much bandwidth overhead in copying data from DDR3 to ESRAM that it would only be justifiable for very specific access patterns. It is clear that the 32MB ESRAM was designed pretty explicitly to house your framebuffers. Like, the ESRAM could have been much faster had they cared to make it so, but they just seemed to want it fast enough that it wouldn't be saturated by the 16 ROPs.
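
          As a quick sanity check of that last point, peak ROP write traffic can be compared against the 102GB/s ESRAM figure mentioned above, using the commonly reported 800MHz GPU clock (treat both numbers as approximations):

              # Peak ROP fill traffic vs. the quoted 102GB/s ESRAM bandwidth.
              rops = 16
              gpu_clock_hz = 800e6             # commonly reported Xbox One GPU clock
              bytes_per_pixel = 4              # 32-bit color writes

              write_only_gb_s = rops * gpu_clock_hz * bytes_per_pixel / 1e9   # ~51 GB/s
              with_blending_gb_s = write_only_gb_s * 2                        # read + write per blended pixel

              print(f"ROP writes alone:    ~{write_only_gb_s:.0f} GB/s")
              print(f"With alpha blending: ~{with_blending_gb_s:.0f} GB/s vs. ~102GB/s of ESRAM bandwidth")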

    • reply
      May 28, 2013 3:08 PM

      In other words, it'll be fun times for XBox One owners when Azure has another outage. http://www.forbes.com/sites/markgibbs/2013/02/23/microsofts-azure-outage-three-reasons-why-such-things-happen-and-three-steps-to-avoid-them/

      • reply
        May 28, 2013 6:25 PM

        Yeah, just the fact that it relies on an Internet connection which can't be counted on 100% means that they can't build an engine around it because they have to assume it's going to fail at some point. The cloud processing has to be something extra... something like physics or calculations for parts of the level that are outside your view... something which is nice to have but not required. Otherwise the game will just break if there is an interruption in the network (which is going to happen with consumer grade Internet access - you don't get 99.99% up-time SLAs for Internet to your HOUSE!).

        • reply
          May 28, 2013 6:28 PM

          while this is all true, let's not pretend like MMOs and similar ilk (like Diablo 3) haven't existed just fine in the PC space using these same internet connections. People willing to buy into that requirement were compensated with certain experiences you couldn't get elsewhere (this is not an invitation for the D3 haters to reply, think MMOs if you really have to). And like I mentioned in another thread, I could also imagine this working for normally local operations that are slow but could be done faster via cloud processing (e.g., no longer waiting a long time for the CPU to take its turn in a TBS, or waiting a while to accurately sim the next week of Madden).
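
          One way to picture the "nice to have but not required" approach described in these two posts is to race the cloud request against a deadline and fall back to a cheaper local result whenever the connection is slow or down. This is only a sketch; the function names and timings are made up for illustration:

              # Sketch: offload a non-critical computation, but never let the game depend on it.
              import asyncio

              async def cloud_simulate(world_chunk):
                  await asyncio.sleep(0.1)      # stand-in for a ~100ms network round trip
                  return f"detailed cloud result for {world_chunk}"

              def local_simulate(world_chunk):
                  return f"coarse local result for {world_chunk}"

              async def simulate(world_chunk, deadline_s=0.05):
                  try:
                      # Use the richer cloud result only if it arrives before the deadline.
                      return await asyncio.wait_for(cloud_simulate(world_chunk), timeout=deadline_s)
                  except (asyncio.TimeoutError, OSError):
                      # Connection slow or gone: the game keeps running on the local path.
                      return local_simulate(world_chunk)

              print(asyncio.run(simulate("distant_city_ai")))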

    • reply
      May 28, 2013 3:15 PM

      The Xbone is sounding more and more like PS3 take 2. Initial focus on being a living room multimedia box, screwy processing system design of dubious utility, integration of a new interface with limited applications, arrogance towards their potential customers.

      This will not end well.

    • reply
      May 28, 2013 3:23 PM

      it will TOTALLY ENHANCE the in-game advertising.

      • reply
        May 28, 2013 4:06 PM

        If I bend over and show Kinect my sphincter, will it size me for a prostate massager?

        • reply
          May 28, 2013 4:17 PM

          They have an addon just for you called Kinect Thor

      • reply
        May 28, 2013 4:19 PM

        make it stop

        • reply
          May 28, 2013 5:12 PM

          i actually expect to see real world advertising in some game genres if the game is required to be online.

    • reply
      May 28, 2013 4:00 PM

      So it is always online?

      • reply
        May 28, 2013 4:19 PM

        Shush, you're stirring up a hornet's nest. MS says it doesn't need to be "always on" and I'll be damned if MS is wrong.

    • reply
      May 28, 2013 4:10 PM

      I would go so far as to say the statements coming from Microsoft about cloud computing for Xbox One are simply duplicitous. The most realistic scenarios amount to either a rebranding of dedicated servers/MMOs or just-in-time delivery of prebaked light-maps, which probably won't actually be generated on demand since it will be so much cheaper to just store a bunch of different versions for a variety of conditions and serve those up to create the illusion of something being randomly or dynamically calculated.

    • reply
      May 28, 2013 6:30 PM

      I don't think "the cloud" could be used to do any rendering or time-critical computation, but I think large-scale AI could be done with cloud computing.

      Something like Skyrim could do AI simulation for areas not directly in front of the player. I don't know how useful that would be for game developers, but it might be interesting to see what could be done.

      • reply
        May 28, 2013 8:48 PM

        AI was the first thing that came to mind for me too. You don't have as firm of restrictions on memory so you can afford to give actors more variables to influence their behavior.

    • reply
      May 28, 2013 8:53 PM

      I think this is part of the plot of BioShock Infinite. It's all in the clouds.

    • reply
      May 28, 2013 8:58 PM

      having worked on many projects that use Amazon's cloud and Azure, I have seen plenty of distributed apps with far more power than they would have if everything ran locally. Sure, they aren't twitch-based or graphics-heavy like games, but I see no reason why this can't enhance the experience. Will it bridge the gap? Probably not. But let's wait and see a game that takes advantage of this before judging it as a failure.

      Have to say Andrew, the tone on this is negative, and not because you have experience in this specifically. It's negative because you got info from other sources. Did you reach out to MS for their input? It would be great to have heard MS' response to the Digital Foundry article.

    • reply
      May 29, 2013 8:08 AM

      Will leveraging the cloud enhance gameplay? Perhaps. But that does lead into the fact that you always need to be connected to the internet.

      How about those people who do not have a stable/fast internet connection or have a cap on their data? Should they just "deal with it," as Adam Orth so eloquently put it?

    • reply
      May 30, 2013 1:43 AM

      Welp, Andrew Yoon says it's impossible. Everybody go home!
