Nvidia CloudLight tech demo shows off cloud-rendered lighting

Internet-connected devices will theoretically be able to offload graphical computation onto the cloud. But what would that actually look like?

Nvidia's CloudLight demo shows off how the cloud could be used to calculate indirect lighting in games.

CloudLight computes lighting data on a cloud-based server and then transfers it back to the end user. By focusing on indirect light, Nvidia is trying to overcome the biggest potential pitfall for cloud computing in games: latency. Because direct illumination is still rendered on the local client, the cloud-computed indirect lighting is a purely additive feature.
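
To picture the split, here is a minimal sketch (illustrative only, not Nvidia's code) of how a client might combine the two terms for a pixel; the names are invented:

    struct Color { float r, g, b; };

    // Hypothetical per-pixel combine: direct light is computed locally every
    // frame; indirect light is whatever the cloud delivered most recently,
    // which may be a few frames stale.
    Color shadePixel(const Color& localDirect, const Color& cloudIndirect)
    {
        // Purely additive: if the cloud term is stale or missing, the frame
        // just looks a little flatter rather than breaking.
        return { localDirect.r + cloudIndirect.r,
                 localDirect.g + cloudIndirect.g,
                 localDirect.b + cloudIndirect.b };
    }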

Nvidia argues that cloud rendering makes sense for this purpose because it remains effective even when an internet connection is unreliable. "In the worst case, the last known illumination is reused until connectivity is restored, which is no worse than the pre-baked illumination found in many game engines today," Nvidia's report explains. As seen in the video below, even with significant latency the artifact is subtle: the indirect lighting simply lags a little behind the rest of the scene.
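
That fallback is straightforward to picture in code. A hypothetical sketch, assuming the client simply caches the most recent result from the server alongside the baked data shipped with the game:

    #include <optional>

    struct IndirectLighting { /* e.g. a block of lightmap texels */ };

    struct IndirectCache {
        std::optional<IndirectLighting> lastReceived; // most recent server result
        IndirectLighting                prebaked;     // baked data shipped with the game
    };

    // Reuse the last illumination the server sent until connectivity returns;
    // if nothing has ever arrived, fall back to the prebaked data, which is
    // "no worse" than a conventional engine.
    const IndirectLighting& selectIndirect(const IndirectCache& cache)
    {
        return cache.lastReceived ? *cache.lastReceived : cache.prebaked;
    }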

Nvidia is exploring three approaches to cloud-based lighting, but the voxel approach is especially exciting because it is well suited to mobile devices. With this method, light data is encoded into a 3D grid of points that gets transferred to the client via H.264 video compression, putting "almost no computation burden on client devices." Although not practical for large scenes, the low bandwidth and computational requirements of voxel-based lighting could make it ideal for tablets and phones. And, "in the near future, thermal limits on mobile devices are unlikely to be overcome. To continually improve visual quality at the rate consumers have come to expect, the only solution may be to move some computation to the Cloud," Nvidia says.
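
As a rough illustration of how little the client would have to do under the voxel scheme (an assumption based on Nvidia's description; the hardware H.264 decode and the real data layout are omitted), the device mostly just looks values up in a small 3D grid:

    #include <cstdint>
    #include <vector>

    // A small 3D grid of irradiance values, unpacked from the slices of a
    // decoded video frame (the H.264 decode itself would run on the device's
    // hardware decoder and is not shown).
    struct VoxelGrid {
        int dimX, dimY, dimZ;
        std::vector<uint8_t> rgb; // dimX * dimY * dimZ * 3 bytes

        // Nearest-voxel lookup for normalized coordinates in [0, 1]: about
        // the only per-sample work the client does, which is why the scheme
        // is cheap enough for phones and tablets.
        void sample(float x, float y, float z, float out[3]) const {
            const int ix = static_cast<int>(x * (dimX - 1) + 0.5f);
            const int iy = static_cast<int>(y * (dimY - 1) + 0.5f);
            const int iz = static_cast<int>(z * (dimZ - 1) + 0.5f);
            const size_t i =
                3 * ((static_cast<size_t>(iz) * dimY + iy) * dimX + ix);
            out[0] = rgb[i]     / 255.0f;
            out[1] = rgb[i + 1] / 255.0f;
            out[2] = rgb[i + 2] / 255.0f;
        }
    };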

There are still practical concerns regarding bandwidth, but it's nonetheless a fascinating proof of concept for offloading rendering work to the cloud. One can only assume that Microsoft will want to get in touch.

Andrew Yoon was previously a games journalist creating content at Shacknews.

From The Chatty
  • reply
    July 29, 2013 8:30 AM

    Andrew Yoon posted a new article, Nvidia CloudLight tech demo shows off cloud-rendered lighting.

    Internet-connected devices will theoretically be able to offload graphical computation onto the cloud. But what would that actually look like?

    • reply
      July 29, 2013 9:20 AM

      Heh, "no worse than pre-baked illumination" on a connection interruption. I can't wait for screenshots where this system flakes out completely and makes the lighting look as wildly colorful and unbalanced as Mortyr.

    • reply
      July 29, 2013 10:12 AM

      In this tech demo, at least, it's surprising to see how minimal the effect is at normal broadband latency. But they don't really get into how much bandwidth it chews up doing this. That would become a bigger factor in multiplayer/MMO titles.

    • reply
      July 29, 2013 10:27 AM

      So they needed a Titan in the servers to achieve reasonable results?

      Yeah, I don't think that's going to work when all of the current cloud solutions are using low-powered commodity servers. This idea of giving one user huge timeslices of cloud-based processing to do heavy computations contradicts what these cloud clusters are actually built for.

      • reply
        July 29, 2013 12:40 PM

        Yeah, stuff like this just sounds unreasonably expensive.

      • reply
        July 29, 2013 3:50 PM

        [deleted]

        • reply
          July 29, 2013 3:55 PM

          That doesn't really matter when the cloud service is expected to provide a real-time response. You would need to design your cloud for peak capacity since you can't just defer the service during lesser load times.

    • reply
      July 29, 2013 10:30 AM

      but digitalfoundry told me that server side processing power was useless for gaming and they're an internet website so how could they be wrong?

      • reply
        July 29, 2013 2:13 PM

        Wow, you're using an Nvidia demo of the least latency-sensitive aspect of gaming to send to IaaS processing as a way to attack the whole base of criticism of foolhardily shoving every aspect of gaming onto IaaS, latency be damned. You're shameless.

        • reply
          July 29, 2013 4:35 PM

          yep that's what I said, dunno why the Xbox One even has its own CPU, just do it all in the cloud

          • reply
            July 29, 2013 5:00 PM

            all the instant dismissals of anything cloud gaming related here remind me of the slew of folks constantly claiming that OnLive's tech demos were surely impossible and wouldn't work in reality. And then it worked exactly as advertised.

    • reply
      July 29, 2013 12:58 PM

      Where this will look weird, of course, is with light created at the moment a user spontaneously chooses, e.g. when he shoots a gun in a dark room and the muzzle flash illuminates the environment. Even a minor delay would be acutely noticeable in a single-player game. I suppose the game will differentiate between such instances and keep them computed client-side (a toy version of that split is sketched at the end of this post).

      We're also talking about a significant, consistent load being pushed through, and I don't see the current crop of cloud servers handling that well. It sounds like game companies (or GPU manufacturers wanting to make their tech attainable) will need to purchase new, dedicated hardware for the gaming cloud to host such services.

      This will become a reality with time, but current constraints make me think this tech demo halts at proof of concept and lacks practical application for at least a decent number of years out. Then again, Microsoft and Sony will want this tech to improve their new consoles and extend their lifespans, so it would be great if they pushed hard for implementation and the market benefited.
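
      Something like this toy split is what I have in mind (pure speculation on my part, nothing from Nvidia's paper):

          // Hypothetical classification: lights the player triggers directly or
          // that change fast (muzzle flashes, swinging flashlights) stay fully
          // client-side; slow-changing environment lights can tolerate the
          // round trip to the cloud.
          struct GameLight {
              bool playerTriggered; // e.g. a muzzle flash
              bool fastChanging;    // moves or flickers noticeably per frame
          };

          inline bool computeOnClient(const GameLight& light)
          {
              return light.playerTriggered || light.fastChanging;
          }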

    • reply
      July 29, 2013 1:01 PM

      I can't wait to buy a computer I don't own

    • reply
      July 29, 2013 1:34 PM

      So when I read the title of this post I thought I was going to see a video of a tech demo showing off light shining through virtual clouds. I was pretty disappointed.

      • reply
        July 29, 2013 4:28 PM

        I misread lighting as lightning and was even more disappointed.

    • reply
      July 29, 2013 4:56 PM

      Hey dudebros. I didn't work on this, but I can talk about it a little.

      Most games use a combination of dynamic and static lighting. The dynamic lighting comes in the form of lights and some straightforward shader math. The static lighting typically comes from lightmaps, which are just textures that have data in them that say "here's how much light is at this point (and what color it is)."

      The lightmap data is very high quality, and can take things into account like "bounces". Bouncing matters very much in real life. It's the property that makes things really look connected. For example, think of a red wall next to a white floor--bouncing is what makes the red wall "reflect" in the white floor.

      By comparison, dynamic lighting almost never does anything like this, because the cost of doing so would be too high (it's more complex than simple math at that point).

      The bonus complexity is that bounce lighting really should be affected by dynamic lights. For example, if you point a flashlight or spotlight at that red wall, the reflection in the floor should become even more pronounced.
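
      In toy code, the conventional setup boils down to something like this (a sketch I'm making up on the spot, not any particular engine):

          #include <vector>

          struct Color { float r, g, b; };

          // A toy lightmap: a 2D grid of baked incoming light, indexed by the
          // surface's lightmap UVs. Bounce (the red wall tinting the white
          // floor) is baked into these texels when the level is built.
          struct Lightmap {
              int width, height;
              std::vector<Color> texels;

              Color sample(float u, float v) const {
                  const int x = static_cast<int>(u * (width  - 1));
                  const int y = static_cast<int>(v * (height - 1));
                  return texels[static_cast<size_t>(y) * width + x];
              }
          };

          // Final shading = baked indirect from the lightmap + dynamic direct
          // from simple per-light math. The dynamic term never bounces, so a
          // flashlight pointed at the red wall doesn't make the floor any
          // redder. That's the gap the cloud-computed term is meant to fill.
          Color shade(const Lightmap& lm, float u, float v, Color dynamicDirect)
          {
              const Color baked = lm.sample(u, v);
              return { baked.r + dynamicDirect.r,
                       baked.g + dynamicDirect.g,
                       baked.b + dynamicDirect.b };
          }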

      This system basically leverages GRID servers to do high-quality, static-style lighting in a realtime way. It's work that isn't quite cheap enough to do on your local GPU in addition to all of the other work you're doing to render your game. (Unless you were, say, valcan_s and wanted to have some spare GPUs sitting around to do just this.)

      The reason that the lighting is "no worse" than prebaked illumination in cases of connection interruption is that games would basically just fall back to their prebaked lighting in those cases.

      The tech is pretty cool. All of the folks involved in the research are super smart, and I hope to see some games or workstation apps take advantage of the tech sooner rather than later.

      I can probably answer limited questions about this if folks are interested.

      • reply
        July 29, 2013 5:26 PM

        [deleted]

      • reply
        July 29, 2013 5:29 PM

        Great post, thanks for the writeup.

      • reply
        July 29, 2013 5:35 PM

        Do the servers have to be in the cloud?
        I'm sure some of us have additional gaming PCs in the house that could be harvested to do this work on the LAN instead of having to farm it out.

        • reply
          July 29, 2013 5:36 PM

          I don't know--or rather I don't know if I can comment.

          • reply
            July 29, 2013 5:43 PM

            Another question:
            So, from my understanding of SLI, you've either got each card rendering half the screen, or they're doing an every-other-frame type thing.

            Could this technology offer a different mode for SLI? One card does the normal rendering, with the other card doing these specialized lighting calculations? Would there be any benefit to that configuration?

    • reply
      July 29, 2013 5:57 PM

      150 to 200ms lag? No thanks, not for my games.
