Quick reference for developers already familiar with how shaders work.
Kage is the language used to write shaders in Ebitengine. It has a Go-like syntax and is internally translated to GLSL, HLSL or MSL at runtime as required. Only fragment shaders are supported at the moment.
- Supported types: `bool`, `int`, `float` (float32), `vec2`, `vec3`, `vec4` (float vectors), `mat2`, `mat3`, `mat4` and constants.
- Vectors support swizzling with `rgba`, `xyzw` and `stpq`. You can also index them with `[N]` directly.
- You can write helper functions, but there are no slices, maps, strings, structs, imports, switch, etc.
- Source textures are limited to 4 RGBA images per shader invocation.
Most of them can be applied to scalar types like `int` and `float`, but also to vectors, where they operate component-wise (e.g. `abs(vec2(-1, 0)) == vec2(1, 0)`).
Key single-argument functions:
```
len(vec)   // for vec2, vec3, vec4; same as in Go
length(x)  // mathematical length / magnitude of a vector
abs(x)
sign(x)    // returns -1, 0 or 1
sin(x); cos(x); tan(x) // plus asin, acos, atan, etc.
sqrt(x)
floor(x); ceil(x); fract(x)
```
Key two-argument functions:
```
mod(x, m)   // %
min(a, b); max(a, b)
pow(x, exp)
step(s, x)  // 0 if x < s, 1 otherwise
atan2(y, x) // classic angle := atan2(y - oy, x - ox)
dot(a, b); cross(a, b)   // dot and cross products (cross is vec3 only)
distance(pointA, pointB) // == length(pointA - pointB)
```
Key three-argument functions:
```
clamp(x, min, max)
mix(a, b, t) // lerp, linear interpolation
```
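If you come from the CPU side, the scalar semantics of `step`, `clamp` and `mix` can be sketched in plain Go (a reference sketch of the math only, not Kage's actual implementation):

```go
package main

import "fmt"

// step returns 0 if x < s, 1 otherwise.
func step(s, x float32) float32 {
	if x < s {
		return 0
	}
	return 1
}

// clamp constrains x to the [lo, hi] range.
func clamp(x, lo, hi float32) float32 {
	return min(max(x, lo), hi) // Go 1.21+ builtins
}

// mix linearly interpolates between a and b by factor t.
func mix(a, b, t float32) float32 {
	return a*(1-t) + b*t
}

func main() {
	fmt.Println(step(0.5, 0.3)) // 0: x is below the threshold
	fmt.Println(clamp(5, 0, 1)) // 1: clamped to the upper bound
	fmt.Println(mix(0, 10, 0.5)) // 5: halfway between a and b
}
```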
Full official reference.
Example Kage shader that generates a checkerboard pattern:
```
//kage:unit pixels
package main

func Fragment(targetCoords vec4, sourceCoords vec2, color vec4) vec4 {
	const CellSize = 32
	cellCoords := floor(targetCoords/CellSize)
	if mod(cellCoords.x + cellCoords.y, 2) == 0 {
		return vec4(1, 0, 1, 1) // magenta
	} else {
		return vec4(0, 0, 0, 1) // black
	}
}
```
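The cell-parity logic can be sanity-checked on the CPU with an equivalent Go sketch (a hypothetical helper, with plain float64 math standing in for the shader's vectors):

```go
package main

import (
	"fmt"
	"math"
)

// checkerIsMagenta mirrors the shader: floor the coordinates into
// cell indices and test the parity of their sum.
func checkerIsMagenta(x, y, cellSize float64) bool {
	cell := math.Floor(x/cellSize) + math.Floor(y/cellSize)
	return math.Mod(cell, 2) == 0
}

func main() {
	fmt.Println(checkerIsMagenta(0, 0, 32))   // true: top-left cell is magenta
	fmt.Println(checkerIsMagenta(32, 0, 32))  // false: horizontal neighbor is black
	fmt.Println(checkerIsMagenta(32, 32, 32)) // true: diagonal neighbor is magenta again
}
```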
(Make sure to configure your editor so you get syntax highlighting.)
For quick testing, you can put the code into a `shader.kage` file and run it with the following `main.go`:
```go
package main

import "github.com/tinne26/kage-desk/display"

func main() {
	display.SetTitle("kage/checkerboard")
	display.SetSize(512, 512)
	display.Shader("shader.kage")
}
```
The `//kage:unit pixels` line is a special directive similar to Go's compiler directives. Without it, you would be operating in texels mode. The general recommendation, and what virtually all advanced users do, is to use the newer pixels mode. All the information in this tutorial assumes pixels mode; if anything behaves differently under texels mode, I won't even bother telling you, so keep that in mind.
Tip
If you are too lazy, you can also use www.kageland.com, a very handy playground created by @tomlister to write and share Kage shaders from your browser. Try copy-pasting the shader code above and hitting `Run`!
If you want to compile and invoke a shader manually, here is some reference code. Basically, use 4 vertices to create a quad and set the vertex target coordinates. While `DrawRectShader()` also exists, I recommend focusing on `DrawTrianglesShader()` instead¹.
We will build upon the following template for the next examples:
```go
package main

import "time"
import _ "embed"
import "github.com/hajimehoshi/ebiten/v2"

//go:embed shader.kage
var shaderProgram []byte

func main() {
	// compile the shader
	shader, err := ebiten.NewShader(shaderProgram)
	if err != nil { panic(err) }

	// create game struct
	game := &Game{ shader: shader, startTime: time.Now() }

	// configure window and run game
	ebiten.SetWindowTitle("kage/load-and-invoke")
	ebiten.SetWindowSize(512, 512)
	err = ebiten.RunGame(game)
	if err != nil { panic(err) }
}

// Struct implementing the ebiten.Game interface.
// Reusing the vertices and options is advisable.
type Game struct {
	shader *ebiten.Shader
	vertices [4]ebiten.Vertex
	shaderOpts ebiten.DrawTrianglesShaderOptions
	startTime time.Time
}

func (self *Game) Update() error { return nil }

func (self *Game) Layout(_, _ int) (int, int) {
	return 512, 512 // fixed layout
}

// Core drawing function from where we call DrawTrianglesShader.
func (self *Game) Draw(screen *ebiten.Image) {
	// map the vertices to the target image
	bounds := screen.Bounds()
	self.vertices[0].DstX = float32(bounds.Min.X) // top-left
	self.vertices[0].DstY = float32(bounds.Min.Y) // top-left
	self.vertices[1].DstX = float32(bounds.Max.X) // top-right
	self.vertices[1].DstY = float32(bounds.Min.Y) // top-right
	self.vertices[2].DstX = float32(bounds.Min.X) // bottom-left
	self.vertices[2].DstY = float32(bounds.Max.Y) // bottom-left
	self.vertices[3].DstX = float32(bounds.Max.X) // bottom-right
	self.vertices[3].DstY = float32(bounds.Max.Y) // bottom-right

	// NOTE: here we will also map the vertices to
	// the source image in later examples.

	// triangle shader options
	if self.shaderOpts.Uniforms == nil {
		// initialize uniforms if necessary
		self.shaderOpts.Uniforms = make(map[string]any, 2)
		self.shaderOpts.Uniforms["Center"] = []float32{
			float32(screen.Bounds().Dx())/2,
			float32(screen.Bounds().Dy())/2,
		} // this will be passed as a vec2

		// link images if necessary (omit if nil)
		self.shaderOpts.Images[0] = nil
		self.shaderOpts.Images[1] = nil
		self.shaderOpts.Images[2] = nil
		self.shaderOpts.Images[3] = nil
	}

	// additional uniforms
	seconds := float32(time.Since(self.startTime).Seconds())
	self.shaderOpts.Uniforms["Time"] = seconds

	// draw shader
	indices := []uint16{0, 1, 2, 2, 1, 3} // map vertices to triangles
	screen.DrawTrianglesShader(self.vertices[:], indices, self.shader, &self.shaderOpts)
}
```
Tip
If you are too lazy to do all this when starting out but you still prefer your editor to kageland, notice that the `kage-desk/display` package also provides many utilities: quick setup, background color, two default images, high resolution and resizability options, `Time float`, `Cursor vec2` and `MouseButtons int` uniforms, F (fullscreen) and ESC shortcuts... If you are interested, check out the main.go and shade.kage files of the `learn/filled-circle` example for a quick reference.
The code from the previous section already shows how to link uniforms from the CPU side. Now let's see how to use them in the actual shader. We are going to make a shader where a pixel orbits around the center of the screen, at a rate of one revolution per minute (so, a clock that tracks seconds):
```
//kage:unit pixels
package main

var Center vec2 // technically this isn't necessary, will explain later
var Time float

func Fragment(targetCoords vec4, _ vec2, _ vec4) vec4 {
	const MarkerDistance = 160
	const Pi = 3.14159265

	// compute the position for the seconds marker
	secAngle := (mod(Time, 60)/60)*2*Pi
	secPos := Center + vec2(sin(secAngle)*MarkerDistance, -cos(secAngle)*MarkerDistance)

	// return the sum of contributions for the two dots in the screen
	centerMarker := vec4(1)*inDotMask(targetCoords.xy, Center, 2, 1.5)
	secondMarker := vec4(1)*inDotMask(targetCoords.xy, secPos, 2, 1.5)
	return centerMarker + secondMarker
}

// Returns 1 if the current position is within 'hardRadius' of 'target',
// between 1 and 0 if within 'hardRadius + softRadius', zero otherwise.
func inDotMask(current vec2, target vec2, hardRadius, softRadius float) float {
	return 1.0 - smoothstep(hardRadius, hardRadius + softRadius, distance(current, target))
}
```
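If the `inDotMask` math looks opaque, here is the same soft-circle falloff in plain Go (a CPU-side sketch with a hand-rolled `smoothstep` following the standard Hermite formula, not shader code):

```go
package main

import "fmt"

// smoothstep performs smooth Hermite interpolation between 0 and 1
// as x moves from edge0 to edge1 (the same formula GLSL specifies).
func smoothstep(edge0, edge1, x float64) float64 {
	t := (x - edge0) / (edge1 - edge0)
	if t < 0 { t = 0 }
	if t > 1 { t = 1 }
	return t * t * (3 - 2*t)
}

// inDotMask mirrors the shader helper: 1 inside hardRadius, a smooth
// falloff across softRadius, 0 outside.
func inDotMask(dist, hardRadius, softRadius float64) float64 {
	return 1.0 - smoothstep(hardRadius, hardRadius+softRadius, dist)
}

func main() {
	fmt.Println(inDotMask(0, 2, 1.5))    // 1: fully inside the dot
	fmt.Println(inDotMask(5, 2, 1.5))    // 0: fully outside
	fmt.Println(inDotMask(2.75, 2, 1.5)) // 0.5: halfway through the soft edge
}
```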
As you can see, adding uniforms is as simple as declaring exported variables at the start of the file with the correct names. Vectorial types are inferred from `[]float32` slices. Implicit conversions from `float64` and `int` to the shader's `float` also happen automatically, but using `float32` directly on the CPU side is probably the best practice.
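On the CPU side, the `Uniforms` field is just a `map[string]any`. A minimal sketch of values matching the `Center` and `Time` declarations of the clock shader (the uniform names come from that example, they are not required by Ebitengine):

```go
package main

import "fmt"

// makeUniforms builds values matching the shader's declarations:
// `var Center vec2` is inferred from a []float32 of length 2,
// and `var Time float` from a plain float32.
func makeUniforms() map[string]any {
	return map[string]any{
		"Center": []float32{256, 256},
		"Time":   float32(1.5),
	}
}

func main() {
	u := makeUniforms()
	fmt.Println(len(u["Center"].([]float32))) // 2 components => vec2
	fmt.Println(u["Time"].(float32))          // 1.5
}
```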
To sample a texture, you will typically use the `sourceCoords` input argument and the `imageSrc0At()` function, which expects a coordinate in pixels. As showcased in the load and invoke section, you can link up to 4 images in the shader options. The full collection of relevant image functions is the following:
- `imageSrcNAt()` (replace N with {0, 1, 2, 3}): source texture sampling. Sampling is always nearest; if you want to perform linear interpolation you will have to do it manually. Until we make something better, you can find some interpolation implementations here².
- `imageSrcNUnsafeAt()` (replace N with {0, 1, 2, 3}): like `imageSrcNAt()`, but doesn't check whether you go out of bounds. If you go out of bounds with `imageSrcNAt()`, you will get back `vec4(0)`. With the unsafe function, you could actually peek at the whole internal atlas.
- `imageSrcNSize()` (replace N with {0, 1, 2, 3}): returns the size in pixels of the requested source texture.
- `imageSrcNOrigin()` (replace N with {0, 1, 2, 3}): returns the origin of the requested source texture in pixels. This is not only relevant when working with subimages that might not start at (0, 0), but also any time you want to divide or multiply the coordinates. Since Ebitengine uses internal atlases, you have to subtract the origin to get the relative coordinates, transform them, and then add the origin back. Always keep that in mind!
- `imageDstSize()` and `imageDstOrigin()`: same idea as the two previous functions, but for the target texture instead of the sources. With these you could eliminate the `Center` uniform of the previous section, for example.
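The subtract-origin / transform / add-origin dance can be sketched on a single axis in plain Go (a hypothetical `repeatAxis` helper illustrating texture repeat, assuming nonnegative relative coordinates and a texture placed at atlas origin 64 with width 32):

```go
package main

import (
	"fmt"
	"math"
)

// repeatAxis wraps an atlas coordinate so the texture repeats:
// subtract the origin, wrap into [0, size), add the origin back.
func repeatAxis(coord, origin, size float64) float64 {
	relative := coord - origin          // atlas -> texture-local
	wrapped := math.Mod(relative, size) // the transform (repeat)
	return wrapped + origin             // texture-local -> atlas
}

func main() {
	// coordinate 100 falls 36px past the origin of a 32px-wide
	// texture, so it wraps back to origin + 4
	fmt.Println(repeatAxis(100, 64, 32)) // 68
}
```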
Note
Remember that colors are in RGBA format, with values in the `[0, 1]` range, and premultiplied alpha (color channel values can't exceed the alpha value, or weird stuff will happen). It's easy to slip up when you are used to working with [0, 255] RGBA on the CPU side.
If you are using DrawTrianglesShader(...)
, you also need to map the source texture to the target vertices:
```go
// (typically done after the DstX/DstY setup)
srcBounds := yourImage.Bounds()
self.vertices[0].SrcX = float32(srcBounds.Min.X) // top-left
self.vertices[0].SrcY = float32(srcBounds.Min.Y) // top-left
self.vertices[1].SrcX = float32(srcBounds.Max.X) // top-right
self.vertices[1].SrcY = float32(srcBounds.Min.Y) // top-right
self.vertices[2].SrcX = float32(srcBounds.Min.X) // bottom-left
self.vertices[2].SrcY = float32(srcBounds.Max.Y) // bottom-left
self.vertices[3].SrcX = float32(srcBounds.Max.X) // bottom-right
self.vertices[3].SrcY = float32(srcBounds.Max.Y) // bottom-right
```
...and don't forget to link your images too!
```go
self.shaderOpts.Images[0] = yourImage
```
If you need texture clamping / repeat, the snippets page has some code for it.
Footnotes

1. While `DrawRectShader()` can be handy in some situations, in many practical scenarios you will have to end up reaching for `DrawTrianglesShader()` anyway, so I'm of the opinion that you should spare yourself the cognitive overhead and simply ignore the function. If you really must know about the restrictions: all source images must have the same dimensions as the target area, and any scaling must be handled through `GeoM`. In my opinion, the combined use of `GeoM` with shaders can make texture sizes, sampling/interpolations and so on more surprising and confusing than operating directly with vertices. ↩

2. Many of those shaders use `SourceRelativeTextureUnitX` and `SourceRelativeTextureUnitY` uniforms, but that can often be replaced with `units := fwidth(sourceCoords)`. Otherwise, the classic bilinear interpolation with +/-0.5 can be found at src_bilinear.kage and doesn't require any uniforms. This is pretty much what Ebitengine does by default with `FilterLinear`, but with some extra clamping that you might or might not be interested in. ↩