Square Enix preparing DX11 'Luminous Studio' engine

Square Enix's Japanese technology division is working on a brand new multi-platform engine, one that's fully compliant with DirectX 11.

Square Enix's Japanese technology division is working on a brand new multi-platform engine, one that's fully compliant with DirectX 11 and currently compatible with PS3, Xbox 360, and of course PC. The "Luminous Studio" engine takes its name from the crystals of Final Fantasy, and is likely to replace the current-generation Crystal Tools engine.

Luminous' focus will be on animation. According to a report by Andriasang, "unnatural animation will stand out" in the next generation, thanks to higher-quality lighting, shading, and modeling. In addition, Siliconera reports that artificial intelligence will be another focus. The plan is to integrate AI into the engine, to have "scalable AI that recognizes game scenes."

The engine is being developed separately from IO Interactive's Glacier 2 engine and Crystal Dynamics' CDC engine. However, the Japanese team has visited IO Interactive and Eidos Montreal to get a better understanding of those studios' technologies. And although Luminous is being developed as a "full engine," like Unreal Engine or CryEngine 3, Square Enix has no plans to license it outside of its internal development studios.

Currently, Square Enix is planning to show off a tech demo of the engine in late 2012 or 2013. There are no games attached to the engine yet, but more information should be forthcoming over the next year.

Here's a very brief real-time demonstration of the engine:

Andrew Yoon was previously a games journalist creating content at Shacknews.

From The Chatty
  • August 26, 2011 1:00 PM

    Andrew Yoon posted a new article, Square Enix preparing DX11 'Luminous Studio' engine.

    Square Enix's Japanese technology division is working on a brand new multi-platform engine, one that's fully compliant with DirectX 11.

    • August 26, 2011 1:02 PM

      Oh boy shaky cam in video games. I can't wait.

      • August 26, 2011 4:04 PM

        I'm getting seasick just watching that video. I hate shakeycam. :(

    • August 26, 2011 1:55 PM

      That video looks so fake. Given the channel it's on, the fact that it looks fake probably reflects what it really is.

    • August 26, 2011 5:00 PM

      Just a simple static environment with prebaked static lighting from a decent lighting model. It's a special case with nothing really going on in the scene. Looks cool. Is that shader network UI at the top part of it also, or just a random pic? (It looks polished for an in-house tool.)

        • August 26, 2011 5:06 PM

        It looks similar to the Maya shader channel interface to me.

    • August 27, 2011 6:36 AM

      Really?! DX11 for the consoles?! Someone want to explain to me how a DX9 card is going to run DX11 on those decade-old consoles?

      This story reeks of fake. I'd love to see an X1900 or a 7900 card run tessellation. And what CPU is in there, an Athlon 1.2GHz?

      And this attached video, was that a corridor from HL1?

        • August 27, 2011 9:48 AM

        Erm, pretty sure they mean it CAN run DX11 but is going to run DX9 as well for the consoles.

    • August 27, 2011 9:50 AM

      If they really wanna make this next-gen, they need to work on realistic motion blur.

      I am so sick of still frames during action sequences being completely clear as though nothing was happening at all. It's completely unnatural. And it's one reason why games look choppy at 30 FPS.

      It makes me sad that few developers make this a priority. It should be the #1 thing for all developers to fix, and it should be standard-issue in every single game.

        • August 27, 2011 10:05 AM

        It'd be cheaper to push the game to 60 or 120 FPS than to do realistic motion blur. You're talking about rendering multiple samples over time and then blending them, and good-quality blur tends to require a lot of samples: the faster something moves, the more samples you need.

        Movie/TV-quality blur would butcher your framerate: not only do you have to render things multiple times, you then need to blend them, which is even more expensive.

        I really doubt every game is going to want to sacrifice that much performance. I'd far rather they nail a 60 FPS minimum framerate, then spend resources on other elements of rendering quality.

        Fake blur effects are cheaper, but not suitable for all rendering styles.
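The cost tradeoff described in that comment can be sketched as a toy model. This is a hypothetical illustration, not any engine's actual implementation: the `render` function and the numbers are made up, standing in for a full scene render, and the point is only that accumulation-style blur averages N sub-frame renders, so its cost grows linearly with N.

```python
def render(t):
    # Stand-in for a full scene render: returns one "pixel" value for a
    # fast-moving object at time t (hypothetical, for illustration only).
    return 100.0 * t  # object moving 100 units per second

def accumulation_blur(frame_time, shutter, samples):
    """Average `samples` renders spread evenly across the shutter interval.

    Cost is `samples` full renders per frame, which is why real-time
    engines usually prefer cheaper post-process ("fake") blur instead.
    """
    total = 0.0
    for i in range(samples):
        # Distribute sample times from frame_time to frame_time + shutter.
        t = frame_time + shutter * i / max(samples - 1, 1)
        total += render(t)
    return total / samples

# A 30 FPS frame with a 180-degree shutter (open for 1/60 s),
# blended from 8 sub-frame renders:
blurred = accumulation_blur(frame_time=0.0, shutter=1.0 / 60.0, samples=8)
```

Doubling the sample count for smoother blur doubles the render work, whereas a velocity-buffer post-process blur smears each frame once at roughly fixed cost, which matches the comment's point that "fake" blur is cheaper but not suitable for every rendering style.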
