

John Sietsma edited this page Oct 10, 2018 · 6 revisions

DISCLAIMER: This is not official documentation. This is a feature in preview, so this documentation may go out of date quickly! See the Home page for documentation links.

If you can, just use the Shader Graph or the standard shaders. The LWRP supports basic legacy unlit shaders, so try yours before you port it. But if you're porting a complex existing shader, or the Shader Graph doesn't support what you'd like to do, then this may help. But you should probably use the Shader Graph!

There is a really great, well-commented physically based example shader from Felipe Lira that makes a good base for your own shader.

Shader Differences

Although the same shading language is used by both, there are some key differences in how to write shaders for the LWRP and the default renderer.


ShaderLab has SubShader and Pass Tags to control render order and light modes.

SRP adds a RenderPipeline tag. For the LWRP this must be "RenderPipeline" = "LightweightPipeline". This allows support for different pipelines within the same shader.

LightMode now has LightweightForward, SRPDefaultUnlit, ShadowCaster and DepthOnly modes for the LWRP, although it is possible to register new light modes in a custom render pass. See ScriptableRenderPass.cs for details.

If no "RenderPipeline" tag or "LightMode" tag is specified, the LWRP will still render the shader. This allows existing simple unlit shaders to work out of the box.
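Putting those tags together, the top of an LWRP-compatible shader might look like this (shader structure only; the pass name and render type are illustrative):

```hlsl
SubShader
{
    // Tell the SRP this SubShader targets the Lightweight pipeline.
    Tags { "RenderPipeline" = "LightweightPipeline" "RenderType" = "Opaque" }

    Pass
    {
        // The main forward pass in the LWRP.
        Tags { "LightMode" = "LightweightForward" }
        // ... HLSLPROGRAM block goes here ...
    }
}
```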


Unity has a number of different shader compilers and cross-compilers. SRP requires HLSLcc. On platforms that use the OpenGL ES graphics API (for example Android), HLSLcc is not used by default.

Use this pragma to force it on: #pragma prefer_hlslcc gles

DirectX 9 is not supported; use this pragma to exclude it: #pragma exclude_renderers d3d11_9x

Shader Language

Nothing has changed here, except that you should use HLSLPROGRAM/ENDHLSL instead of CGPROGRAM/ENDCG. The CG version automatically adds #include "HLSLSupport.cginc" to your shader. You can still use the default shader library, but it's probably best not to include files from it by default.
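Combining the pragmas above with the HLSL block, the start of a pass body might look like this (a sketch, not a complete pass; vert and frag are placeholder function names):

```hlsl
HLSLPROGRAM
// Force HLSLcc on OpenGL ES, and exclude DirectX 9-class hardware.
#pragma prefer_hlslcc gles
#pragma exclude_renderers d3d11_9x
#pragma vertex vert
#pragma fragment frag

#include "Packages/com.unity.render-pipelines.lightweight/ShaderLibrary/Core.hlsl"

// ... vertex and fragment functions ...
ENDHLSL
```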

Shader Libraries

The default renderer includes a built-in shader library, which is also available for download. It has a bunch of useful macros and functions for lighting, tessellation, color space conversion, position transformation, etc. It also has a library of shaders that can be used, or copied and modified.

Most commonly, shaders had #include "UnityCG.cginc", but extra or different includes can be used.

The SRP has a core library and a library for the LWRP and HDRP.

For the LWRP, most shaders use #include "Packages/com.unity.render-pipelines.lightweight/ShaderLibrary/Core.hlsl". That file in turn includes files from the core shader library.

Constant Buffers and UnityPerMaterial

Constant Buffers are used to store data that rarely changes on the GPU. They can be used in Unity to store shader variables. There are CBUFFER_START and CBUFFER_END pre-defined macros that enable constant buffers on platforms that support them.

The LWRP is aware of two constant buffers, called UnityPerObject and UnityPerMaterial. It binds these buffers once so they can be used across draw calls. This means that the renderer doesn't have to rebind constant buffers or call SetPass on materials. The default renderer batches per material, but with shaders that use the same constant buffer and no other shader properties, the LWRP can batch different materials that share the same constant buffer.

To take advantage of this extra batching, your shader should use the UnityPerMaterial constant buffer for shader properties, rather than loose shader variables. This is defined in various includes, such as LitInput.hlsl, or you can define it yourself in your shader like this:

CBUFFER_START(UnityPerMaterial)
float4 _MainTex_ST;
half4 _Color;
half _Cutoff;
half _Glossiness;
half _Metallic;
CBUFFER_END

Key Shader Library Differences

Although most of the shaders written for both the default and SRP renderers use CG/HLSL, they make heavy use of shader library macros, defines and functions. So a new shader library means there are a few key differences to look out for.

Space Transforms

The vertex shader transforms vertex positions from object space into clip space. This is done by multiplying the vertex by the world, view and projection matrices.

Shaders can multiply matrices directly, but most existing shaders call UnityObjectToClipPos(pos) to do that transformation.

In the LWRP the equivalent is TransformObjectToHClip(pos). However, there is also a function, GetVertexPositionInputs, which returns the world, view and clip space positions. If you don't use them, they get compiled out; if you need them, they're available to you.

So vertex transforms will now look something like this.

VertexPositionInputs vertexInput = GetVertexPositionInputs(input.positionOS.xyz);
output.vertex = vertexInput.positionCS;

There is also a similar concept for normals, tangents and bitangents.

VertexNormalInputs normalInput = GetVertexNormalInputs(input.normal);
output.normal = normalInput.normalWS;

UV Transforms

The vertex shader may pass the uv coordinates straight through to the fragment shader stage. The default shader library has a macro TRANSFORM_TEX which handles the texture tiling and offset uv transformation.

The Core shader library has the same macro in Macros.hlsl. The source comments indicate it is legacy, but all the shaders still use it. So continue to use it!
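A typical vertex-shader usage, assuming a _MainTex texture with its matching _MainTex_ST property declared in the shader:

```hlsl
// Applies the tiling (_MainTex_ST.xy) and offset (_MainTex_ST.zw) to the input UV.
output.uv = TRANSFORM_TEX(input.uv, _MainTex);
```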

Samplers and SamplerStates

You can still write sampler2D _MainTex; and tex2D(_MainTex, uv), but Unity has supported separate textures and sampler states for a while now.

Separate sampler states are only available in some graphics APIs. The default built-in shader library has macros for declaring samplers and sampling from them, while still supporting those older graphics APIs.

The naming of those has changed in the Core/LWRP library.

For the default renderer, to share a sampler you would write:

UNITY_DECLARE_TEX2D(_MainTex);
UNITY_DECLARE_TEX2D_NOSAMPLER(_SecondTex);

half4 color = UNITY_SAMPLE_TEX2D(_MainTex, uv);
color += UNITY_SAMPLE_TEX2D_SAMPLER(_SecondTex, _MainTex, uv);

In the LWRP you would write:

TEXTURE2D(_MainTex); SAMPLER(sampler_MainTex);
TEXTURE2D(_SecondTex);

half4 color = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, uv);
color += SAMPLE_TEXTURE2D(_SecondTex, sampler_MainTex, uv);


Fog

These comments from UnityCG.cginc show the default fog support:

//  multi_compile_fog Will compile fog variants.
//  UNITY_FOG_COORDS(texcoordindex) Declares the fog data interpolator.
//  UNITY_TRANSFER_FOG(outputStruct,clipspacePos) Outputs fog data from the vertex shader.
//  UNITY_APPLY_FOG(fogData,col) Applies fog to color "col". Automatically applies black fog when in forward-additive pass.
//  Can also use UNITY_APPLY_FOG_COLOR to supply your own fog color.

Note that you can still include UnityCG.cginc and use these macros. But the LWRP shaders support fog in the following way:

// In the Varyings struct, add an extra float to store the fog value
float3 uvMain : TEXCOORD1; // xy: uv0, z: fogCoord

// In the vertex function, compute the fog factor
output.uvMain.z = ComputeFogFactor(vertexInput.positionCS.z);

// In the fragment shader, mix the fog into the final color
color.rgb = MixFog(color.rgb, input.uvMain.z);


Lighting

Porting lit shaders is significantly more difficult. The default renderer has Surface Shaders, which automatically apply lighting, but the SRP doesn't support them.

A full guide to lighting is beyond the scope of this document.

Lighting.hlsl includes functions and macros for:

  • BRDF: DirectBDRF
  • GI: SampleSH, SampleLightmap, SAMPLE_GI, GlobalIllumination
  • Direct light: LightingPhysicallyBased, VertexLighting

LitForwardPass.hlsl is a good place to find sample usages of these functions.
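As a rough illustration of how these pieces fit together, a fragment function could combine the main light with a simple Lambert term. This is only a sketch, not the full PBR path that LitForwardPass.hlsl uses, and it assumes a Varyings struct with normalWS and uv fields and a _MainTex/sampler_MainTex declared as shown earlier:

```hlsl
half4 frag(Varyings input) : SV_Target
{
    // Fetch the main directional light (direction, color, attenuation).
    Light mainLight = GetMainLight();

    // Simple Lambert diffuse; the real lit shaders call LightingPhysicallyBased instead.
    half3 diffuse = LightingLambert(mainLight.color, mainLight.direction, input.normalWS);

    half4 albedo = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv);
    return half4(albedo.rgb * diffuse, albedo.a);
}
```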

Stylistic differences

The default shader library uses appdata_base (and variants) and v2f for its vertex input and output structs. The LWRP tends to use Attributes and Varyings.
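For example, a minimal pair of structs in the LWRP style (the exact fields and semantics here are chosen for illustration):

```hlsl
struct Attributes            // roughly the role of appdata_base
{
    float4 positionOS : POSITION;
    float3 normalOS   : NORMAL;
    float2 uv         : TEXCOORD0;
};

struct Varyings              // roughly the role of v2f
{
    float4 positionCS : SV_POSITION;
    float3 normalWS   : NORMAL;
    float2 uv         : TEXCOORD0;
};
```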
