Graphics

Overview

The graphics library covers only low-level rendering code. This document assumes that the reader is familiar with either DirectX 12 or Vulkan, and will not attempt to cover any fundamental topics.

Although Usagi targeted the 3DS for the majority of its development, this doesn't mean the graphics library is dated; the team manually wrote out GPU command lists rather than using any of the existing APIs in order to ensure we could design in a more modern manner. At this point the Usagi graphics code should appear familiar to anyone experienced with either of those modern APIs. It began life as a DirectX 11 style interface but has been updated to more closely match Vulkan (although some design choices from the former still persist in the code base). This approach worked out extremely well; pre-creating command lists for entire pipeline states allowed us to render far more models than with the middleware we had used on prior titles.

As the PC was not the priority, the Windows implementation originally used OpenGL behind the scenes. It has since been converted to use Vulkan, but it has not been extensively tested, and the API has only been modified enough to be functional (among other issues, it currently only supports a single thread and the code has not yet been validated).

To switch to building for OpenGL:

  • Add "_ogl" to Tools\bin\platform.win.rb and remove "_vulkan".
  • In Engine\Common_win\Common_ps.h comment out the USE_VULKAN define.

Co-ordinate system

Usagi uses a left-handed co-ordinate system:

  • +x: Right
  • +y: Up
  • +z: Forward

Co-ordinate systems are as controversial as the placement of { brackets. Our reasoning for adopting the classic DirectX co-ordinate system is consistency: depth is represented by a z buffer, the z buffer runs from 0 to 1, and x and y are the standard graph axes along the screen. Anyone raising this issue (or where to place those brackets) will be shushed in the spirit of getting some actual work done.

Classes

Below is a subset of the classes which may need some explanation.

  • GFXDevice - Responsible for managing displays, contexts, and the creation and management of all states.
  • GFXContext - Responsible for constructing command lists. You can either use one for the entire scene, or multiple for constructing on separate threads or for reusable commands such as the left and right eye in a 3D scene (however, the OGL implementation only supports one). *Note it is too heavyweight to extend to constructing command lists for only one or two draw calls; if you wanted to construct micro command lists in this fashion a refactor would be required.
  • Constant Set - Usagi only supports setting variables via constant buffers, and currently there is no assumption that the layout of the data on the CPU matches the GPU. As such you must declare the contents of any buffers using a ShaderConstantDecl, as sketched below this list. When uploading to the GPU, the alignment rules for the target GPU will be applied.
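
As a rough illustration, a ShaderConstantDecl pairs a CPU-side struct with a per-member layout table. The element macro and type enum names below are assumptions made for the sake of the sketch; check the engine headers for the real spellings:

struct MaterialData
{
	usg::Vector4f	vDiffuse;
	float		fSpecularPower;
};

// Hypothetical layout declaration mirroring MaterialData; the target
// GPU's alignment rules are applied when the buffer is uploaded
static const usg::ShaderConstantDecl g_materialDecl[] =
{
	SHADER_CONSTANT_ELEMENT(MaterialData, vDiffuse,        usg::CT_VECTOR_4, 1),
	SHADER_CONSTANT_ELEMENT(MaterialData, fSpecularPower,  usg::CT_FLOAT,    1),
	SHADER_CONSTANT_END()
};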

Device Creation

It is generally assumed that you will initialize your game by calling the function GameInit(), which in turn calls InitEngine().

If you do wish to initialize manually, you must call GFX::Initialise() followed by a call to InitDisplay or InitAllHardwareDisplays, depending on your target platform.
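
A minimal sketch of the manual path, assuming the display initialization hangs off the device (the exact owners and signatures of these calls are platform dependent and worth checking against the headers):

// Bring up the graphics system without going through GameInit()/InitEngine()
usg::GFX::Initialise();
// Assumed here: initialize every attached display; single-display
// platforms would call InitDisplay instead
pDevice->InitAllHardwareDisplays();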

Draw Loop

The simplest draw loop which only uses an immediate context might look like this:

   // Get the default context designed for non-reusable render calls
   usg::GFXContext* pImmContext = pDevice->GetImmediateCtxt();
   // Begin the frame
   pDevice->Begin();
   pImmContext->Begin(true);

   // Set the render target (render target contains a render pass which specifies which buffers should be cleared)
   pImmContext->SetRenderTarget(pTarget);

   // DO DRAWING CODE

   // Copy the finished frame from the render target to the display
   pImmContext->Transfer(pTarget, pDisplay);
   pDisplay->Present();
   // End the command list and the frame
   pImmContext->End();
   pDevice->End();

Note that you can also opt to render to the display's render target directly, but the usual use case is to have multiple temporary render targets for post process effects (also useful for dynamically adjusting the resolution).
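
For comparison, a hypothetical variant of the loop body that renders straight to the display; the GetRenderTarget accessor is an assumption, and skipping the Transfer step is only appropriate when no post processing is required:

   // Render directly into the display's own target rather than a temporary one
   pImmContext->SetRenderTarget(pDisplay->GetRenderTarget());
   // DO DRAWING CODE
   pDisplay->Present();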

Resource creation

All resources are created through the GFXDevice; for example, the minimum code required to create a pipeline state handle is shown below:

usg::PipelineStateDecl pipeline;
pipeline.renderPass = renderPass;
// Describe the vertex layout
pipeline.inputBindings[0].Init(usg::GetVertexDeclaration(usg::VT_POSITION));
pipeline.uInputBindingCount = 1;
// Set the effect
pipeline.pEffect = usg::ResourceMgr::Inst()->GetEffect(pDevice, "PositionEffect");
// Retrieve (or create) the matching pipeline state handle
hndl = pDevice->GetPipelineState(pipeline);

The PipelineStateDecl includes much more information, but all of the declaration structs in Usagi have sensible defaults applied in their constructors; there is, however, no sensible default vertex layout or device, so the above is the minimum required.

Sample draw call

   // Set the pipeline state
   pContext->SetPipelineState(m_pipeline);
   // Set the draw call specific descriptor set, bound to index 1 (by convention we use 0 for global)
   pContext->SetDescriptorSet(&m_descriptors, 1);
   // Set the vertex buffer for this draw call
   pContext->SetVertexBuffer(&m_vertexBuffer);
   // Draw using every index in the index buffer
   pContext->DrawIndexed(&m_indexBuffer);

Lighting

Usagi supports both forward and deferred shading (more information on these can be found in PostFX).
The supported light types are as follows:

  • DirLight (Directional)
  • PointLight (Point)
  • SpotLight (Spot)
  • ProjectionalLight (Projection)

Projection lights are simply cones which project a texture as a light; apart from projecting a texture, they behave much like spot lights. They can be useful for creating projectors, or simply for spot lights where a unique pattern is required.

All light types support shadows; directional lights use shadow cascades. As with everything else, the shadows were balanced for WiiU/Switch level hardware, and you may find you wish to increase the number of samples and/or the resolution for more modern hardware.

For information on spawning lights please see the scene documentation.

Effects

The old text files have been replaced with yml effect pak definitions:

Effects:
  - { name: DirBase, vert: deferred/directional, frag: deferred/directionalbase }
  - { name: PointNoSpec, vert: deferred/pointlight_pos, frag: deferred/pointlightnospec }
  - { name: PointPos, vert: deferred/pointlight_pos_only, frag: deferred/pos_only  }
  - { name: SpotPos, vert: deferred/spotlight_pos_only, frag: deferred/pos_only  }
  - name: DirExtraShadow
    vert: deferred/directional
    frag: deferred/directionalshadowpass  
    has_default: false
    define_sets:
      -  name: "1"
         defines: LIGHT_INDEX=1
      -  name: "2"
         defines: LIGHT_INDEX=2
      -  name: "3"
         defines: LIGHT_INDEX=3
      -  name: "4"
         defines: LIGHT_INDEX=4    

Effects in this format can specify multiple define sets for the same shaders. The resulting effect name is:

PakName.EffectName.DefineSetName

A default effect with no defines will also be created, named PakName.EffectName, unless has_default is set to false in the effect definition file.
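
For example, assuming the definitions above live in a pak named Deferred (the pak name is an assumption here), the LIGHT_INDEX=2 variant of DirExtraShadow would be requested as:

pEffect = usg::ResourceMgr::Inst()->GetEffect(pDevice, "Deferred.DirExtraShadow.2");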

As of 0.3 you also have the option of specifying the bindings within these files.

In order to do this, place the following line in your shader source after the includes:

// <<GENERATED_CODE>>
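
As a sketch of where the marker sits in a shader; the include path and the empty body here are made up for illustration:

#include "includes/global.inc"   // hypothetical shared include

// <<GENERATED_CODE>>

void main()
{
	// The generated block above declares the attributes, samplers and
	// constant buffers listed in the effect definition
}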

Then in the effect definition you can specify the constant buffers, input attributes and textures:

  Mouse:
    Attributes:
      - { index: 0, name: ao_position, hint: "position", type: vec3 } 
      - { index: 1, name: ao_size, hint: "size", type: vec2 }
      - { index: 2, name: ao_texcoordRng, hint: "tex", type: vec4 }
      - { index: 3, name: ao_color, hint: "color", type: vec4 }
    Samplers:
      - { index: 0, hint: "Color", default: "Textures/white" }     
    ConstantDefs:      
      - binding: Material
        shaderType: VS
        Variables:
          - { type: int, name: iFrameCount,      count: 1, default: 1} 
          - { type: int, name: iCurrentFrame,    count: 1, default: 0 }   

This is entirely optional; you can specify the attribute layouts and constant buffer layouts in C++ if you would rather:

static const DescriptorDeclaration g_descriptorGBuffer[] =
{
	// Two combined image samplers visible to the pixel shader,
	// bound at indices 0 and 1
	DESCRIPTOR_ELEMENT(0,	DESCRIPTOR_TYPE_COMBINED_IMAGE_SAMPLER, 1, SHADER_FLAG_PIXEL),
	DESCRIPTOR_ELEMENT(1,	DESCRIPTOR_TYPE_COMBINED_IMAGE_SAMPLER, 1, SHADER_FLAG_PIXEL),
	DESCRIPTOR_END()
};

static const VertexElement g_instanceVertex[] =
{
	// Per-instance position and color, each four floats, mapped to
	// attribute locations 1 and 2
	VERTEX_DATA_ELEMENT_NAME(1, Debug3D::SphereData, vPos, VE_FLOAT, 4, false),
	VERTEX_DATA_ELEMENT_NAME(2, Debug3D::SphereData, vColor, VE_FLOAT, 4, false),
	VERTEX_DATA_END()
};

But if you specify this data in an effect file, it saves you having to mirror the variables in two locations; you can instead use CustomEffectRuntime and its templated SetVariable function to set constant set data.
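
A rough sketch of driving the Mouse effect above from C++; the initialization and upload calls are assumptions about CustomEffectRuntime's interface:

// Hypothetical setup; the resource lookup and Init signature may differ
usg::CustomEffectRuntime effectRuntime;
effectRuntime.Init(pDevice, pCustomEffectRes);

// SetVariable is templated on the value type and matches variables
// declared in the effect definition file by name
effectRuntime.SetVariable("iFrameCount", 8);
effectRuntime.SetVariable("iCurrentFrame", 0);

// Assumed call to commit the modified constant data to the GPU
effectRuntime.GPUUpdate(pDevice);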

Shader Code

There is currently no intermediate shader language; we primarily targeted GLSL based platforms. In the cases where GLSL was not supported, we mirrored all of the required code in the appropriate language. The version of GLSL in use is 3.3.

There were some minor differences in the GLSL support between platforms, primarily to do with binding. When specifying binding locations for variables please use the following:

ATTRIB_LOC(0) in vec3 ao_position;   // Attribute bound to location 0
BUFFER_LAYOUT(1,  UBO_MATERIAL_ID) uniform Material // Buffer bound to descriptor set 1, at the material index (1)
SAMPLER_LOC(1, 0) uniform sampler2D sampler0; // Sampler bound to descriptor set 1, index 0

Currently, sets in buffer layouts are ignored, but they will be required for Vulkan support.

There is no define for output bindings, so use the standard GLSL layout keyword:

layout(location = 0) out vec4 po_vDiffuse;

Information about higher level rendering can be found in Scene, PostFX and Particles.