Stage is a scene assembly, lighting, and shader authoring system. It provides a tight relationship between your offline renderer and our realtime renderer. At this time we are integrating Solid Angle's Arnold renderer, but Stage is designed to support any offline renderer that provides an API, such as Chaos Group's V-Ray or Pixar's RenderMan.
Stage is in early alpha and we are seeking studios that want to support and influence its development. If you would like early access to this module and are interested in working with us to develop it further, please go here and complete the access request form.
Scene Management – Asset Styles
Asset Styles are the way that Stage handles scene complexity and variation. One of the biggest challenges when working on multiple assets across multiple shots is the management of variations: tracking multiple copies of assets across multiple shots is painful and prone to expensive errors. Stage's asset back-end is being built as a reference implementation on top of Shotgun and Tank, but it could easily be hooked into any Python-based asset management solution.
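To illustrate what "hooking into any Python-based asset management solution" could look like, here is a minimal sketch of a pluggable back-end. The names (`AssetBackend`, `resolve`, the style fallback) are invented for illustration and are not Stage's actual API; a real back-end would query Shotgun/Tank rather than a dictionary.

```python
# Hypothetical asset back-end interface -- illustrative only, not Stage's API.

class AssetBackend:
    """Minimal interface a Python asset manager could implement."""
    def resolve(self, asset, style):
        raise NotImplementedError

class DictBackend(AssetBackend):
    """Toy back-end keyed on (asset, style) pairs.

    A production back-end would resolve the same query against
    Shotgun/Tank instead of an in-memory table.
    """
    def __init__(self, table):
        self.table = table

    def resolve(self, asset, style):
        # Fall back to the asset's default style if the variation is missing,
        # so shots without an explicit override still resolve.
        return self.table.get((asset, style)) or self.table[(asset, "default")]

backend = DictBackend({
    ("robot", "default"): "/assets/robot/v001.abc",
    ("robot", "damaged"): "/assets/robot/damaged/v003.abc",
})
print(backend.resolve("robot", "damaged"))  # the "damaged" variation
print(backend.resolve("robot", "wet"))      # falls back to the default style
```

The point of the interface is that Stage only needs a `resolve`-style entry point; which database sits behind it is the studio's choice.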
Many thanks to John Martini from Ingenuity Engine for allowing us to use these assets.
Lighting: Preview Rendering and Offline Rendering
Reliable decisions require enough information to make an informed choice: ideally we want a high-quality preview renderer that shows a reasonable approximation of changes to a scene. Since we already have a powerful OpenGL realtime renderer, we were able to create a direct relationship with the offline renderer. Interaction with the renderer is performed through our Kernel Language (KL), which is compiled at runtime. This means that no particular rendering pipeline is enforced; the user can programmatically set up both the scene management and the rendering pipeline. The integration of the offline renderer into KL also allows Creation to combine offline rendering with realtime rendering, for example to perform realtime compositing of offline passes. As you can see from the video, this deep integration is extremely powerful for tasks like shader assignment and authoring.
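As a concrete (if simplified) picture of compositing an offline pass with a realtime pass, here is a premultiplied "over" operation sketched in Python. This is illustrative only, not KL and not Stage's API; the pass names are assumptions.

```python
# Illustrative sketch of combining an offline pass with a realtime pass
# using the standard premultiplied-alpha "over" operator -- the kind of
# per-pixel operation a user-defined KL pipeline could perform each frame.

def over(fg, bg):
    """Composite premultiplied foreground (r, g, b, a) over background."""
    r, g, b, a = fg
    R, G, B, A = bg
    return (r + R * (1 - a),
            g + G * (1 - a),
            b + B * (1 - a),
            a + A * (1 - a))

offline_pixel = (0.5, 0.25, 0.0, 0.5)   # e.g. a pixel from an offline beauty pass
realtime_pixel = (0.0, 0.0, 1.0, 1.0)   # e.g. a pixel from the realtime background
print(over(offline_pixel, realtime_pixel))  # (0.5, 0.25, 0.5, 1.0)
```

In a real pipeline this operation runs over whole image buffers rather than single pixels, but the math is the same.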
Many thanks to Paul Smith from The Imaginarium for allowing us to use this asset.
The Creation realtime renderer has been extended with a series of light and surface shaders that match (to a certain degree) the offline shaders, enabling realtime preview of lighting and surface shading. This tight relationship allows for rapid iteration and decision-making, with a high degree of parity with the offline result. As we continue development on our realtime renderer, this relationship will become even tighter.
All data is stored in parallel containers: there is one node for all of the lights, one node for all of the meshes, and so on. This guarantees extraordinary scalability and allows Stage to handle huge numbers of objects and lights. Furthermore, all of the communication is performed in parallel, so when meshes are moved into the renderer this happens in a multithreaded fashion.
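A minimal sketch of the "one node per object type" idea: all mesh transforms live in a single structure-of-arrays container, and updates are pushed in parallel over disjoint slices. The class and method names are invented for illustration and are not Stage's internals.

```python
# Hedged sketch: one container for *all* meshes (structure-of-arrays),
# updated over disjoint ranges by a thread pool -- illustrative only.
from concurrent.futures import ThreadPoolExecutor

class MeshContainer:
    """Holds the transforms of every mesh in the scene in parallel arrays."""
    def __init__(self, count):
        self.tx = [0.0] * count  # one slot per mesh, per axis
        self.ty = [0.0] * count
        self.tz = [0.0] * count

    def translate_range(self, start, stop, dx, dy, dz):
        # Each worker touches only its own contiguous slice, so no locking
        # is needed between workers.
        for i in range(start, stop):
            self.tx[i] += dx
            self.ty[i] += dy
            self.tz[i] += dz

count, step = 100_000, 25_000
meshes = MeshContainer(count)
# Push the updated transforms "into the renderer" in a multithreaded fashion.
with ThreadPoolExecutor(max_workers=4) as pool:
    for start in range(0, count, step):
        pool.submit(meshes.translate_range, start, start + step, 1.0, 0.0, 0.0)
print(meshes.tx[0], meshes.tx[-1])  # 1.0 1.0
```

Keeping one flat container per object type, rather than one node per object, is what makes both the bulk storage and the bulk transfer cheap to parallelize.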
The UI is fully procedural. The widgets are created from a JSON description pulled out of the offline renderer. This model is very flexible and allows Creation to integrate easily with other GUI libraries.
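To make the idea concrete, here is a sketch of turning a JSON parameter description into toolkit-agnostic widget specs. The JSON schema and the type-to-widget mapping below are invented for illustration; the real description comes from the offline renderer and its shape may differ.

```python
# Illustrative sketch: build widget specs procedurally from a JSON
# description, as a fully procedural UI might. Schema is hypothetical.
import json

description = json.loads("""
{"parameters": [
  {"name": "exposure", "type": "float", "default": 1.0},
  {"name": "samples",  "type": "int",   "default": 16},
  {"name": "use_gi",   "type": "bool",  "default": true}
]}
""")

# Hypothetical mapping from parameter type to a widget kind; a real
# integration would map to concrete widgets in the chosen GUI library.
WIDGET_FOR_TYPE = {"float": "Slider", "int": "SpinBox", "bool": "CheckBox"}

def build_widgets(desc):
    """Turn each described parameter into a toolkit-agnostic widget spec."""
    return [
        {"widget": WIDGET_FOR_TYPE[p["type"]],
         "label": p["name"],
         "value": p["default"]}
        for p in desc["parameters"]
    ]

for w in build_widgets(description):
    print(w["widget"], w["label"], w["value"])
```

Because the widgets are described as data rather than hard-coded, the same description can drive Qt, a web front-end, or any other GUI library.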
Since all of the rendering happens inside KL, interaction goes through the generalized event system, and the UI constructs its widgets from JSON. Stage can also be deployed via Creation: Orb for server-side rendering. This makes the application available to remote users: the rendering happens on the server and the user interface is presented through a browser.