Description
Background
The FragmentProgram API allows developers to provide new color sources to use with their applications. For example, a gradient that interpolates in a different color space or a pixelation image rendering effect. We've seen a ton of cool things with this API, and we want to make it more useful.
In the flutter_shaders package (https://github.com/jonahwilliams/flutter_shaders/), I created (hacked together) a widget that emulates support for applying shaders as compositing effects. For example, you might use this to apply a glitch effect to the entire application:
(Video: video_2023-06-04_05-18-21.mp4, from jonahwilliams/flutter_shaders#21)
An example of what the code might look like:
import 'package:flutter/material.dart';
import 'package:flutter_shaders/flutter_shaders.dart';

class SampledText extends StatelessWidget {
  const SampledText({super.key, required this.text, required this.value});

  final String text;
  final double value;

  @override
  Widget build(BuildContext context) {
    return ShaderBuilder((context, shader, child) {
      return AnimatedSampler((image, size, canvas) {
        // Bind the pixelation amount and the size of the sampled child.
        shader.setFloatUniforms((uniforms) {
          uniforms
            ..setFloat(value)
            ..setFloat(value)
            ..setSize(size);
        });
        // Bind the snapshot of the child as the sampled texture.
        shader.setImageSampler(0, image);
        canvas.drawRect(
          Rect.fromLTWH(0, 0, size.width, size.height),
          Paint()..shader = shader,
        );
      }, child: Text(text, style: TextStyle(fontSize: 20)));
    }, assetKey: 'packages/flutter_shaders/shaders/pixelation.frag');
  }
}
With the associated fragment shader:
// Copyright 2013 The Flutter Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#version 460 core
precision highp float;
#include <flutter/runtime_effect.glsl>
layout(location = 0) uniform vec2 uPixels;
layout(location = 1) uniform vec2 uSize;
layout(location = 2) uniform sampler2D uTexture;
out vec4 fragColor;
void main() {
  vec2 uv = FlutterFragCoord().xy / uSize;
  vec2 puv = round(uv * uPixels) / uPixels;
  fragColor = texture(uTexture, puv);
}
Problems
There are a number of problems/limitations with this API that can't be fixed without new engine features:
- Unlike "regular" composition (ImageFiltered, opacity, blurs), the framework needs to manage the creation of the render target and the texture lifecycle. This is actually pretty tricky to manage, and we still probably have bugs here: toImageSync memory leak Android #131524
- The way that toImageSync works, while good for caching, isn't great when we plan to composite on each frame, as we end up processing more or less two raster frames per UI frame, which can cause churn in places like the glyph atlas cache.
- The AnimatedSampler cannot apply backdrop effects, limiting the use cases compared to normal compositing: Pixelation Shader not working with Backdrop filter. jonahwilliams/flutter_shaders#26
- A minor point, but the fragment shaders used with AnimatedSampler are suboptimal, as they require computing UVs in the fragment stage instead of the vertex stage.
Overview
Add a new constructor to the ui.ImageFilter type which accepts a developer-provided fragment shader (see discussion below for semantics). This image filter object would be usable wherever image filters are today, and to the framework it would work more or less the same as other filters.
Widget build(BuildContext context) {
  return ImageFiltered(
    imageFilter: ui.ImageFilter.fromFragmentProgram(myProgram),
    child: ...,
  );
}
// or
Widget build(BuildContext context) {
  return BackdropFilter(
    filter: ui.ImageFilter.fromFragmentProgram(myProgram),
    ...
  );
}
Detailed Design
Similar to the runtime effect, we'll provide a standard vertex shader and a set of fragment shader inputs. The vertex stage will handle computing the texture UVs and provide these to the fragment shader in place of coordinates. This ensures that the engine can correctly manage the MVP while the fragment shader need only sample from the texture.
The FlutterTextureUV will provide the computed UVs, similar to how FlutterFragCoord provides the positions for color source shaders.
FlutterFragCoord will be used exactly as it is now, except that the engine will need to provide 1) the texture binding and 2) the texture size.
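As a rough illustration, a filter fragment shader under this design might look like the sketch below. FlutterTextureUV, the uTexture binding, and the uniform layout here are assumptions based on the proposal above, not a finalized API:
#version 460 core
precision highp float;
#include <flutter/runtime_effect.glsl>
// Assumed binding: the engine supplies the sampled child/backdrop layer.
layout(location = 0) uniform sampler2D uTexture;
out vec4 fragColor;
void main() {
  // Assumed helper: UVs computed in the standard vertex stage, so no
  // per-fragment division by the layer size is needed.
  vec2 uv = FlutterTextureUV();
  fragColor = texture(uTexture, uv);
}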
TBD: should we provide texture size as a uniform, or is there good cross-platform support for a query like https://registry.khronos.org/OpenGL-Refpages/gl4/html/textureSize.xhtml ? The reason we need this is that the developer will not know the size of the child layer that is being sampled from.
TBD: how can we let developers communicate that a particular shader should not be compiled for Skia?
Providing other uniform data
Additional uniform data could be provided via the ui.ImageFilter.fromFragmentProgram constructor, or via setters on the constructed object. I don't have a strong opinion here.
var filter = ui.ImageFilter.fromFragmentProgram(program, floats: [0.5], textures: [otherImage]);
// or
var filter = ui.ImageFilter.fromFragmentProgram(program);
filter.setFloat(0, 0.5);
// slot 0 taken by input.
filter.setTexture(1, otherImage);
In order to be usable by the engine, the shader must have at least one sampled texture uniform. We can validate this at runtime and reject invalid shaders using the attached metadata.
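As a sketch of how that validation might surface to developers (the error type and the point of failure are assumptions, not settled API):
// A shader that declares no sampler2D uniform cannot consume the filter input.
final program = await ui.FragmentProgram.fromAsset('shaders/no_sampler.frag');
try {
  ui.ImageFilter.fromFragmentProgram(program); // proposed constructor
} on ArgumentError {
  // Rejected: the shader metadata reports zero sampled texture uniforms.
}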
Limitations on Skia backend
Skia has limited support for filtering with SkRuntimeEffect (see https://api.skia.org/SkRuntimeEffect_8h_source.html ). While we could construct some limited color filter or SkBlender objects from it, I don't believe we'd be able to realize arbitrary image filters.
Since we cannot implement a vertex stage in Skia, even if we attempted to polyfill the functionality with an AnimatedSampler-like approach, developers would need to write two different kinds of shaders.
Web implementation
Similar to the above, since there is no Impeller web backend, this would not work on the web either. While regular image filters could be implemented with a hack similar to the AnimatedSampler object, there is no good equivalent for BackdropFilter.
Alternatives Considered
Construction from asset key
We could also let the engine manage loading/caching the fragment program as well. This might look something like:
var imageFilter = await ui.ImageFilter.fromFragmentProgram('shaders/foo.frag');
I don't really think this is any easier to use than the API proposed above, and it would still need to be async so that we don't accidentally jank when loading the asset/creating the PSO.
It wouldn't be any easier to create multiple image filters, since there is no clone API.
Construction from fragment shader
The FragmentProgram/FragmentShader distinction normally comes down to the shader itself, versus the shader plus some particular uniform values. If we accepted a FragmentShader, we'd introduce the possibility of users accidentally sharing the same instance and re-binding uniforms meant for a different filter.
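For illustration, assuming a hypothetical fromFragmentShader constructor, the hazard would look something like this:
final shader = myProgram.fragmentShader();
final pixelate = ui.ImageFilter.fromFragmentShader(shader); // hypothetical constructor
// Uniforms intended for a second filter get written to the same shared instance...
shader.setFloat(0, 32.0);
final blur = ui.ImageFilter.fromFragmentShader(shader); // hypothetical constructor
// ...and whether `pixelate` now observes 32.0 depends on when the filter
// captured the uniform state, which is exactly the ambiguity we want to avoid.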
Support via custom layer type
We could instead add a new layer type to the framework/engine, rather than construct a ui.ImageFilter object. This seems like an obvious way to do it, but it has a number of drawbacks. First of all, it requires a ton of code to implement a new flow layer, whereas this layer would be more or less a copy of the image filter or backdrop filter layer. Most parts of the backend don't have to care what the contents of an image filter are, so it's much simpler to encapsulate it.