glsl-fast-gaussian-blur

optimized single-pass blur shaders for GLSL
demo - source

Optimized separable Gaussian blurs for GLSL. This is adapted from Efficient Gaussian Blur with Linear Sampling.


The function blurs in a single direction. For correct results, the texture should use gl.LINEAR filtering, since the optimization relies on sampling between texel centers.

#pragma glslify: blur = require('glsl-fast-gaussian-blur')

uniform vec2 iResolution;
uniform sampler2D iChannel0;
uniform vec2 direction;

void main() {
  vec2 uv = vec2(gl_FragCoord.xy / iResolution.xy);
  gl_FragColor = blur(iChannel0, uv, iResolution.xy, direction);
}

The module provides three levels of "taps" (the number of pixels averaged for the blur) that can be required individually. The default is 9.

#pragma glslify: blur1 = require('glsl-fast-gaussian-blur/13')
#pragma glslify: blur2 = require('glsl-fast-gaussian-blur/9')
#pragma glslify: blur3 = require('glsl-fast-gaussian-blur/5')
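The tap counts come from the linear-sampling trick in the referenced article: with gl.LINEAR filtering, one fetch placed between two adjacent texels returns their weighted average, so a 9-tap discrete kernel needs only 5 texture reads. As a sketch (the mergePair helper is hypothetical, not part of this module), the 9-tap offsets and weights from the article can be reproduced from a binomial kernel:

```javascript
// Merge two adjacent discrete taps (offsets o1 < o2, weights w1, w2)
// into one linear-sampled tap: sampling at the weighted midpoint with
// gl.LINEAR returns (w1*t1 + w2*t2) / (w1 + w2), so a single fetch
// scaled by (w1 + w2) equals both discrete taps. (Hypothetical helper.)
function mergePair(o1, w1, o2, w2) {
  return { offset: (o1 * w1 + o2 * w2) / (w1 + w2), weight: w1 + w2 };
}

// Discrete weights from row 12 of Pascal's triangle with the two
// outermost coefficients on each side dropped, as in the article:
// 924, 792, 495, 220, 66 -- normalized by their total (4070).
const total = 924 + 2 * (792 + 495 + 220 + 66);
const w = [924, 792, 495, 220, 66].map(c => c / total);

const center = w[0];                       // ~0.2270270270
const inner = mergePair(1, w[1], 2, w[2]); // offset ~1.3846153846, weight ~0.3162162162
const outer = mergePair(3, w[3], 4, w[4]); // offset ~3.2307692308, weight ~0.0702702703
```

The 13- and 5-tap variants follow the same pairing construction with larger and smaller binomial rows.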

Since the blur is separable, you will need two passes to blur an image in both directions: one horizontal and one vertical. See the demo for an implementation.
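As a quick numerical illustration of why two 1D passes suffice, here is a standalone CPU sketch (blur1D is a hypothetical stand-in for the shader, using a tiny 3-tap kernel and clamp-to-edge sampling; it is not part of this module):

```javascript
// 3-tap binomial approximation of a gaussian; weights sum to 1.
const kernel = [0.25, 0.5, 0.25];

// One directional pass, mirroring the shader's `direction` argument:
// (dx, dy) = (1, 0) blurs horizontally, (0, 1) vertically.
function blur1D(img, w, h, dx, dy) {
  const clamp = (v, hi) => Math.min(Math.max(v, 0), hi);
  const out = new Float64Array(w * h);
  for (let y = 0; y < h; y++) {
    for (let x = 0; x < w; x++) {
      let acc = 0;
      for (let k = -1; k <= 1; k++) {
        const sx = clamp(x + k * dx, w - 1); // clamp-to-edge, like
        const sy = clamp(y + k * dy, h - 1); // gl.CLAMP_TO_EDGE
        acc += img[sy * w + sx] * kernel[k + 1];
      }
      out[y * w + x] = acc;
    }
  }
  return out;
}

// Two passes -- horizontal then vertical -- produce the full 2D blur,
// in O(n) fetches per pixel instead of O(n^2) for a 2D kernel.
function blur2D(img, w, h) {
  return blur1D(blur1D(img, w, h, 1, 0), w, h, 0, 1);
}
```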


Usage

Use npm to install the module and glslify to consume the function in your shaders.

npm install glsl-fast-gaussian-blur --save



vec4 blur(sampler2D image, vec2 uv, vec2 resolution, vec2 direction)

Blurs the image from the specified uv coordinate, using the given resolution (the size of the render target in pixels) and direction -- typically either [1, 0] (horizontal) or [0, 1] (vertical).

Returns the blurred pixel color.

Further Optimizations

This can be further optimized on some devices (notably PowerVR) by using non-dependent texture reads: calculate the texture coordinates in the vertex shader and pass them as varyings to the fragment shader. This is left as an exercise for the reader to keep this module simple.
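A sketch of that vertex-shader approach, assuming a full-screen quad with a position attribute and the 9-tap offsets and weights from the article (all names here are illustrative and not part of this module):

```glsl
// --- vertex shader (illustrative) ---
// Precompute all five sample coordinates so the fragment shader
// reads the texture at unmodified varyings (non-dependent reads).
attribute vec2 position;
uniform vec2 iResolution;
uniform vec2 direction;
varying vec2 vTap[5];

void main() {
  vec2 uv = position * 0.5 + 0.5;
  vec2 off1 = (direction * 1.3846153846) / iResolution;
  vec2 off2 = (direction * 3.2307692308) / iResolution;
  vTap[0] = uv;
  vTap[1] = uv + off1;
  vTap[2] = uv - off1;
  vTap[3] = uv + off2;
  vTap[4] = uv - off2;
  gl_Position = vec4(position, 0.0, 1.0);
}

// --- fragment shader (illustrative) ---
precision mediump float;
uniform sampler2D iChannel0;
varying vec2 vTap[5];

void main() {
  gl_FragColor =
      texture2D(iChannel0, vTap[0]) * 0.2270270270
    + (texture2D(iChannel0, vTap[1]) + texture2D(iChannel0, vTap[2])) * 0.3162162162
    + (texture2D(iChannel0, vTap[3]) + texture2D(iChannel0, vTap[4])) * 0.0702702703;
}
```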


License

MIT, see the license file for details.
