Skip to instructions and examples if you want to see CSS custom filters in action.
Skip to step-by-step explanation if you want to see how to create your own CSS filter effect.
CSS custom filters (formerly known as CSS shaders) are a new browser feature for applying user-created visual effects to elements of an HTML document.
Custom filters are part of the Filter Effects 1.0 specification. They complement the other part of the specification, which defines several common CSS filter effects already built into browsers (like blur, sepia, contrast or grayscale).
Built-in CSS filters offer functionality similar to filters in image manipulation applications - you get a predefined set of commonly used effects whose parameters you can control.
For example, here an image was converted to grayscale and then blurred:
(click on the images to see live example, built-in CSS filters should work on stable Chrome)
CSS custom filters, on the other hand, allow you to create completely new types of effects, where you control not just the parameters of an existing effect but also define the very nature of the effect using shaders.
For example here a map embedded into the document was transformed into a lit textured sphere:
(click on the images to see live example, see info on how to get CSS custom filters running)
Most notably, built-in filters can read the pixels of arbitrary DOM content, which means they can implement effects like blur; custom filters cannot. See limitations of custom filters.
Custom filter effects are specified by small programs called shaders.
Shaders define the 3D shape and look of graphical elements (shaders operate on polygonal meshes).
Shaders run directly in hardware on the graphics card. They can process a lot of data in parallel, which means they can be very fast, but they also often feel quite alien compared to typical sequential programs run on the CPU.
CSS custom filters use GLSL as a language for writing shaders.
If you are familiar with shaders from modern graphics programming using programmable pipeline (OpenGL, OpenGL ES, WebGL, DirectX) this should sound familiar - shaders used in CSS custom filters are exactly the same thing.
Shaders used in custom CSS filters come in two types: vertex shaders and fragment shaders.
Vertex shaders tell where things are. They let you move the vertices of the mesh around in 3D space, deforming and displacing objects.
Fragment shaders tell how object surfaces look. They let you "paint" over objects or modify how the existing pixels belonging to them look.
In general, to create a valid GPU program you need both a vertex and a fragment shader. For CSS custom filters, however, only one of them is required, as the browser will use a default pass-through shader for the missing one.
Modern browsers are themselves implemented using graphics acceleration.
HTML pages rendered by browsers are collections of textured rectangles corresponding to DOM elements.
With CSS custom filters you get a hook into the browser's own rendering pipeline, getting a chance to modify the shape and look of these content rectangles before they get drawn to the screen.
It's similar to how CSS 3D transforms work, just instead of being able to play only with parameters for a fixed predefined functionality, you can now run your own code against DOM content.
Every DOM element with a custom CSS filter effect will be converted into a triangular mesh with user-specified tessellation:
By default the mesh will have only two triangles (the minimum needed to form a rectangle).
This mesh will get a texture created from the DOM element content (this is what would be normally rendered to the screen), then your custom shaders will be applied to it.
The mesh triangle grid can be created in two ways (controlled from CSS):
A mesh with attached triangles is a single connected object where neighboring triangles share vertices. If you move any vertex, all connected triangles will deform (think of a sheet of fabric). This is the default.
A mesh with detached triangles is composed of many individually separated triangles. Every vertex belongs to only a single triangle, so you can break the mesh into individual components. The mesh can get holes, or in fact be completely remodelled in the vertex shader.
Mesh tessellation and connectivity have to stay the same throughout CSS transitions.
Vertex and fragment shaders can take input parameters of three types:
Uniforms are parameters with a single value for all vertices and all pixels of the mesh (for example, object color).
Attributes are per-vertex parameters; every vertex of the mesh gets its own value for each attribute (for example, vertex position).
Varyings are parameters passed from the vertex shader to the fragment shader. They are set per triangle vertex, and their values for points inside the triangle are interpolated by the GPU (for example, lighting).
The Filter Effects specification also allows for another type of input: textures. However, these are not yet implemented (and an attempt to use them will break shaders silently).
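Declared in GLSL, the three parameter types look like this (the names below are illustrative examples, not built-ins):

```glsl
// Uniform: one value shared by the whole mesh (can be set from CSS)
uniform vec4 objectColor;

// Attribute: one value per vertex (available in the vertex shader)
attribute vec4 vertexPosition;

// Varying: written per vertex in the vertex shader, then interpolated
// by the GPU across each triangle for the fragment shader
varying float lightIntensity;
```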
The browser provides some default built-in parameters, created and initialized for all elements with custom CSS filters applied to them.
attribute vec4 a_position;
attribute vec2 a_texCoord;
attribute vec2 a_meshCoord;
attribute vec3 a_triangleCoord;
Built-in attributes let you identify and locate individual vertices and triangles of the mesh grid.
uniform mat4 u_projectionMatrix;
uniform vec2 u_textureSize;
uniform vec4 u_meshBox;
uniform vec2 u_tileSize;
uniform vec2 u_meshSize;
Built-in uniforms provide information about the DOM element, common to the whole mesh.
varying vec2 v_texCoord;
The built-in varying provides texture coordinates when the effect uses the default shaders.
You can find precise definitions of these built-in parameters in the Filter Effects specification (attributes, uniforms, varyings).
Note that not all built-in parameters are implemented yet. For example, the u_textureSize and u_meshSize uniforms are missing. You can work around this by supplying these values as user-specified uniforms from CSS.
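A sketch of this workaround (the uniform name textureSize and the file name effect.fs are hypothetical): measure the element size yourself and pass it in as a custom uniform.

```css
.shaded {
    /* textureSize is a user-defined stand-in for the missing
       built-in u_textureSize (element width and height in pixels) */
    -webkit-filter: custom(none mix(url(effect.fs) normal source-atop),
        textureSize 512.0 256.0);
}
```

In the shader you then declare it like any other user-supplied uniform: `uniform vec2 textureSize;`.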
Also, the v_texCoord varying doesn't work yet, so you'll need to create your own varying for passing texture coordinates out of the a_texCoord attribute in the vertex shader.
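A minimal sketch of such a vertex shader (the varying name v_uv is our own choice, standing in for the broken built-in v_texCoord):

```glsl
precision mediump float;

attribute vec4 a_position;
attribute vec2 a_texCoord;

uniform mat4 u_projectionMatrix;

// Custom varying carrying texture coordinates to the fragment shader
varying vec2 v_uv;

void main() {
    v_uv = a_texCoord;
    gl_Position = u_projectionMatrix * a_position;
}
```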
You can track the progress of the CSS custom filters implementation in this WebKit "master" issue.
In addition to the built-in parameters, you can also provide custom uniform parameters for your effect via CSS style.
.shaded {
-webkit-filter: custom(url(distort.vs) mix(url(tint.fs) normal source-atop),
distortAmount 0.5, lightVector 1.0 1.0 0.0);
}
// Shader (vertex or fragment)
...
uniform float distortAmount;
uniform vec3 lightVector;
...
This is how you control the effect from the outside. Notably, values for uniforms coming from CSS will be interpolated when using CSS transitions.
Check here to see how many shader parameters are available on your system for WebGL (the numbers should be similar for CSS custom filters, minus some slots consumed by the built-in parameters).
CSS custom filters are applied in the same way as other CSS styling, via <style> elements. It looks like this:
<style>
.shader {
-webkit-filter:
custom(url(shaders/crumple.vs) mix(url(shaders/crumple.fs) normal source-atop),
50 50, amount 0, strength 0.2, lightIntensity 1.05);
}
</style>
Or alternatively, if you use just a fragment shader:
<style>
.shader {
-webkit-filter:
custom(none mix(url(shaders/tint.fs) normal source-atop), amount 0);
}
</style>
The analogous syntax with none in place of the fragment shader, for using just a vertex shader, doesn't seem to work yet. For such cases you will need to supply a dummy pass-through fragment shader.
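Such a dummy pass-through fragment shader can be as small as this sketch, which sets css_ColorMatrix to the identity matrix so the original pixels come through unchanged:

```glsl
precision mediump float;

void main() {
    // Identity color matrix: leave the DOM element pixels untouched
    css_ColorMatrix = mat4(1.0);
}
```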
You give the browser links to GLSL shader source files and specify their parameters via CSS style. The browser takes care of compiling the shaders into an actual running binary program and applying it to the HTML content.
Once you have existing shaders, applying them to HTML content is simple. You just apply them like any other CSS styling.
<div class="shader">
You can use CSS transitions with custom CSS filters, for example to have some dynamic changes happen to the effect when hovering over page element.
-webkit-transition: -webkit-filter ease-in-out 1s;
CSS custom filter parameters will then be interpolated in the same way as built-in CSS properties.
This post reflects the state as of September 2012. If you read it later and something doesn't work exactly as advertised, it's a good idea to search the web a bit and try to find more recent running examples for the exact syntax / behavior.
At this moment, CSS custom filters work only in a recent Chrome Canary build (the exact version at the time of writing this article was 24.0.1278.0).
You can download Chrome Canary here:
CSS custom filters should work on Windows and OSX (I use Windows 7).
Even if your OS and browser support them, you may still not be able to get them working if your GPU is too old / underpowered or has incompatible drivers (try the WebGL compatibility check; the hardware requirements should be pretty similar).
Right now CSS custom filters are not enabled by default, you'll have to run Chrome with a command line option:
chrome --enable-css-shaders
On Windows you can edit the Chrome Canary shortcut and add the --enable-css-shaders parameter to its "target" property.
Alternatively you may create a batch file calling chrome executable with a command line option.
In newer Chrome Canary builds this flag is also available via the user interface at the special "flags" url:
chrome://flags/
Copy and paste this into the url bar and look for "Enable CSS Shaders".
If you see spherical planets, congratulations, it's working ;).
If you see just rectangular images, something went wrong ;S.
Please note that some older examples found randomly on the web may be broken. For example, effects created for the early custom Chromium build from May 2012 that's on GitHub do not work in the latest Chrome Canary.
In such cases you can try adapting them to the current syntax. Also beware of these current quirks:
Shaders broken in one tab can kill shaders in all other tabs (and the shaders will stay broken, immune to page reloads).
Shaders may use an only partially rendered DOM element texture upon the first page load. In such cases close the tab and open the page again; reloading will not help.
Shaders used in custom CSS filters can't read DOM content texture pixels at all, and they also can't write to screen pixels directly.
This limitation came as a response to the possibility of timing attacks, where a rogue 3rd-party shader embedded in a website could be used to read the website's content (via different shader code paths executed for different source pixel colors).
The only way you can interact with the pixels of the original DOM element content is thus via blending of your computed colors with the original colors, using the css_ColorMatrix or css_MixColor built-in fragment shader variables.
You can also use gl_FragColor for writing solid color values, but this doesn't currently work.
The downside of the CSS shader approach is that it prohibits many interesting types of applications; the upside is that CSS shaders can be applied to any arbitrary content.
Another limitation, at least in the current implementation, seems to be the number of triangles per single tessellated mesh.
You can't have a mesh with much more than about 20,000 triangles. This suggests rendering using indexed triangles with 16-bit integer indices, limiting the number of vertices in a single mesh to 65,536 (with detached triangles, at 3 vertices per triangle, that's 65,536 / 3 ≈ 21,845 triangles).
All you need is a browser and text editor. Shaders used for CSS custom filters are just regular text files.
The workflow is the same as for other HTML + CSS content authoring: you make the changes, reload the page in the browser, and the changes will be reflected in the page.
There is a tool from Adobe called CSS FilterLab, which allows you to more easily tweak and modify effect parameters.
For WebGL, caching of already compiled binary shaders is being implemented in Chrome; this is something that could eventually help here as well.
A simple no-op vertex shader looks like this:
precision mediump float;
attribute vec4 a_position;
uniform mat4 u_projectionMatrix;
void main() {
gl_Position = u_projectionMatrix * a_position;
}
A simple no-op fragment shader looks like this:
precision mediump float;
void main() {
float r = 1.0;
float g = 1.0;
float b = 1.0;
float a = 1.0;
css_ColorMatrix = mat4( r, 0.0, 0.0, 0.0,
0.0, g, 0.0, 0.0,
0.0, 0.0, b, 0.0,
0.0, 0.0, 0.0, a );
}
Precision specifiers have no effect on desktop OpenGL; they only matter on mobile devices using OpenGL ES. However, they must be present for the current implementation to work.
And shaders can be applied to HTML content like this:
<style>
.shader {
-webkit-filter: custom(url(simple.vs) mix(url(simple.fs) normal source-atop), 1 1);
}
</style>
<body>
<div class="shader"> Hello world! </div>
</body>
This is a minimal CSS style using both fragment and vertex shaders, applied to a mesh with the simplest tessellation of 2 attached triangles (1 column x 1 row).
How to create a custom effect?
Here we'll show how to create a bit simpler version of the sphere effect, similar to the one used here.
We start from a plane textured with DOM content (left image). We want to warp this plane into a sphere and have it shaded as if it were a 3D object lit by a directional light (right image).
Find a full running example here.
All mesh deformations take place in the vertex shader.
First we need to find a way to warp a rectangular plane into the shape of a sphere; we will take care of shading later.
We start with a simple rectangular mesh, lying on a 2D plane, with vertex positions spread uniformly over the plane. We'll need to find some mapping of these 2D positions onto a 3D sphere.
We can get the vertex coordinates of the original plane via the built-in a_position attribute provided by the browser.
We need to declare the variable before it can be used:
attribute vec4 a_position;
We will store the position attribute in a local variable, which will be modified later on (attributes are read-only):
vec4 position = a_position;
One common operation in computer graphics is texturing via UV mapping where a rectangular image is wrapped around a mesh using 2D texture coordinates. This sounds very similar to what we need.
One of the built-in mesh attributes provided by the browser is the two-component vector a_texCoord with the texture coordinates of the vertex.
attribute vec2 a_texCoord;
Texture coordinates are called U and V; they are in the 0 .. 1 range for each axis and map mesh vertices along the image width and height.
To get the X, Y, Z coordinates corresponding to our U and V coordinates, we'll use the transformation between the spherical and Cartesian coordinate systems:
x = r * sin( θ ) * cos( φ )
y = r * sin( θ ) * sin( φ )
z = r * cos( θ )
The spherical coordinate system uses these coordinates: radius r, inclination θ and azimuth φ.
We will let the user specify the radius via a CSS-supplied uniform:
uniform float sphereRadius;
And we will map U and V coordinates to azimuth and inclination:
vec3 computeSpherePosition( vec2 uv, float r ) {
vec3 p;
float fi = uv.x * PI * 2.0;
float th = uv.y * PI;
p.x = r * sin( th ) * cos( fi );
p.y = r * sin( th ) * sin( fi );
p.z = r * cos( th );
return p;
}
vec3 sphere = computeSpherePosition( a_texCoord, sphereRadius );
Now we can blend between the original plane position and the new sphere position we just computed, using the GLSL built-in function mix (linear interpolation between two values).
uniform float amount;
We will use the user-supplied uniform parameter amount to control the blending (uniform parameters from CSS can be interpolated by the browser using CSS transitions).
position.xyz = mix( position.xyz, sphere, amount );
Finally, we can displace the mesh vertices by writing into the built-in GLSL variable gl_Position, transforming our computed position by the browser-supplied u_projectionMatrix uniform:
gl_Position = u_projectionMatrix * position;
The shading computation is done jointly in the vertex and fragment shaders.
We will compute per-vertex lighting in the vertex shader and pass it as a varying parameter into the fragment shader, where we will use it to modify the colors of the DOM texture.
For shading we will use a simple Lambertian diffuse reflectance model.
The light reflection is calculated as the dot product of the surface's normal vector and a normalized light-direction vector.
We will need the position of the light and a surface normal for each vertex.
We will let the user supply the light position as a 3-component vector uniform lightPosition, passed in via CSS:
uniform vec3 lightPosition;
We will normalize it using built-in GLSL normalize function:
vec3 lightPositionNormalized = normalize( lightPosition );
Next we will need to compute normals for both plane and sphere shapes.
Surface normals are usually computed on the CPU side, as you need mesh connectivity and access to all triangle vertices at once (shaders see just one individual vertex at a time).
Here, however, we deal with simple geometrical shapes so we can compute normals analytically.
For the plane normals we will cheat a bit: a proper plane normal would face out of the monitor, perpendicular to the XY plane.
However, this would give the undeformed DOM element a darker color for some light positions. That would be accurate lighting, just not consistent with other elements on the page which don't have shaders applied.
Instead we will use a vector facing towards the light as a fake plane normal:
vec3 planeNormal = lightPositionNormalized;
This will make undeformed element always fully lit.
For the sphere normals we will use a simple analytical formula. Sphere normals at the surface are just normalized vectors from the sphere center to the sphere surface:
vec3 sphereNormal = normalize( position.xyz );
To approximate the normal of our intermediate blended shape, we will simply blend between the plane and sphere normals and normalize the result:
vec3 normal = normalize( mix( planeNormal, sphereNormal, amount ) );
Finally we can compute the lighting according to the Lambertian reflectance formula, using the built-in GLSL dot product function and clamping negative values with the built-in max function:
float light = max( dot( normal, lightPositionNormalized ), 0.0 );
And as the last step in vertex shader, we will pass lighting intensity to fragment shader via varying:
varying float v_light;
v_light = light;
The fragment shader will be very simple, as all the heavy lifting was already done in the vertex shader.
We will read lighting intensity coming from varying:
varying float v_light;
And use it to modulate color coefficients (we will leave alpha untouched):
float r, g, b;
r = g = b = v_light;
This will be used to initialize the css_ColorMatrix built-in output variable:
css_ColorMatrix = mat4( r, 0.0, 0.0, 0.0,
0.0, g, 0.0, 0.0,
0.0, 0.0, b, 0.0,
0.0, 0.0, 0.0, 1.0 );
In our case this will result in final color equivalent to this:
gl_FragColor = vec4( r, g, b, 1.0 ) * sourceColor;
precision mediump float;
// Built-in attributes
attribute vec4 a_position;
attribute vec2 a_texCoord;
// Built-in uniforms
uniform mat4 u_projectionMatrix;
// Uniforms passed in from CSS
uniform float amount;
uniform float sphereRadius;
uniform vec3 lightPosition;
// Varyings
varying float v_light;
// Constants
const float PI = 3.1415;
// Map UV coordinates onto a sphere
vec3 computeSpherePosition( vec2 uv, float r ) {
vec3 p;
float fi = uv.x * PI * 2.0;
float th = uv.y * PI;
p.x = r * sin( th ) * cos( fi );
p.y = r * sin( th ) * sin( fi );
p.z = r * cos( th );
return p;
}
// Main
void main() {
vec4 position = a_position;
// Map plane to sphere using UV coordinates
vec3 sphere = computeSpherePosition( a_texCoord, sphereRadius );
// Blend plane and sphere
position.xyz = mix( position.xyz, sphere, amount );
// Set vertex position
gl_Position = u_projectionMatrix * position;
// Compute lighting
vec3 lightPositionNormalized = normalize( lightPosition );
vec3 planeNormal = lightPositionNormalized;
vec3 sphereNormal = normalize( position.xyz );
vec3 normal = normalize( mix( planeNormal, sphereNormal, amount ) );
float light = max( dot( normal, lightPositionNormalized ), 0.0 );
// Pass in varyings
v_light = light;
}
precision mediump float;
varying float v_light;
void main() {
float r, g, b;
r = g = b = v_light;
css_ColorMatrix = mat4( r, 0.0, 0.0, 0.0,
0.0, g, 0.0, 0.0,
0.0, 0.0, b, 0.0,
0.0, 0.0, 0.0, 1.0 );
}
.shader {
-webkit-filter: custom(url(sphere.vs) mix(url(sphere.fs) normal source-atop),
16 32, amount 1, sphereRadius 0.35, lightPosition 0.0 0.0 1.0);
-webkit-transition: -webkit-filter ease-in-out 1s;
}
.shader:hover {
-webkit-filter: custom(url(sphere.vs) mix(url(sphere.fs) normal source-atop),
16 32, amount 0, sphereRadius 0.35, lightPosition 0.0 0.0 1.0);
}
Debugging shaders used in CSS custom filters is difficult. At least for now there is no error output to the Chrome console (unlike WebGL, where you get the results of shader compilation, so you can for example see where you made a typo).
On Mac you should get some error output printed to the terminal window from which you started Chrome, but on Windows there is no terminal output.
One possible workaround is to use the GL shader validation plugin for the Sublime Text editor, created recently by @aerotwist and @brendankenny.
Unfortunately, similarly to WebGL, there is no way to see what's going on on the GPU side :S.
There are a few tricks you can use, though, to get at least a bit of insight into shader performance.
For example, if you want to check whether the performance of your effect is fragment shader bound, try changing the size of the DOM element and see if performance improves with smaller elements.
Try to avoid effects applied to full-screen elements. At least for now the performance is not there yet (there could be a bottleneck in page compositing).
If you want to see whether the performance of your effect is vertex shader bound, try it with different mesh tessellations.
Bear in mind that if you apply your effect to many highly tessellated elements on a single page, you can ramp up the overall triangle count quite fast.