WebGPU - Problems Creating WGSL Shader to Resize Image

UPDATE (Sep 1)

The reason I was attempting to resize the position points buffer was to allow me to increase the resolution of the Normal Map and to add more detail.

I have since found an easier way to add detail in the first example of the TSL Guide, which discusses adding a detail texture to the diffuse texture. Although I wasn’t able to get the example in the Guide to work, I was able to create a “detail texture” by adding a command to make the texture repeat 32x. I then combined the “detail texture” with the lower-resolution diffuse texture using the “mul” command. The result is a diffuse texture that has good detail.
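In TSL terms, the fix boils down to something like this (a minimal sketch rather than my exact code; the file names and the 32x factor are placeholders):

	import * as THREE from 'three/webgpu';
	import { texture, uv } from 'three/tsl';

	// Lower-resolution diffuse texture plus a small high-frequency detail texture (placeholder files)
	const diffuseMap = new THREE.TextureLoader().load('diffuse.jpg');
	const detailMap = new THREE.TextureLoader().load('detail.jpg');
	detailMap.wrapS = detailMap.wrapT = THREE.RepeatWrapping;

	// Sample the detail texture with the UVs scaled by 32 so it repeats 32x,
	// then multiply it into the base diffuse color with "mul"
	const material = new THREE.MeshStandardNodeMaterial();
	material.colorNode = texture(diffuseMap).mul(texture(detailMap, uv().mul(32)));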

After making the original post, I realized there were a couple of problems with my post. First, as noted below, the “error” shown in the picture was not an error, but an expected result of my not interpolating the position values when resizing the position buffer. Second, I had forgotten that my Normal Map shader was already resizing the texture.

In conclusion, the shader below may actually be a good way to resize a buffer. But, if you are simply trying to add more detail to a texture, there are easier ways to do that.

Thanks to everyone who participated in this discussion.

ORIGINAL POST

I am trying to resize a series of points that are saved in vec4 texture format.
Thus, if the source texture is 12,22,37 … and the destination texture is 2x larger, I would expect that the destination values are 12,12,22,22,37,37 … This is something like a simple pixel resizer.

I am using this wgsl shader to perform that action.

this.compBigr = wgslFn(`
	fn computeWGSL(
		u_tsiz: f32, // size of destination
		r_disp: texture_2d<f32>, // source
		w_bigr: texture_storage_2d<rgba32float,write>, // destination
		u_indx: u32,
		u_mult: u32 // multiplier
	) -> void {
		// Compute vUv (special)
		var posX = f32(u_indx) % u_tsiz; // x coordinate - destination
		var posY = f32(u_indx) / u_tsiz; // y coordinate - destination
		var idxD  = vec2i(i32(posX),i32(posY)); // index - destination
		var idxS  = vec2i(i32(posX)/i32(u_mult),i32(posY)/i32(u_mult)); // index - source
		//
		var input = textureLoad(r_disp,idxS,0);
		textureStore(w_bigr,idxD,input);
	}
`);
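(For context, a wgslFn compute with this signature is dispatched along these lines in TSL; this is a simplified illustration, and the node/uniform wrapping may differ slightly from my actual setup. this.Mlt and this.Res are the multiplier and base resolution.)

	import { texture, textureStore, instanceIndex, float, uint } from 'three/tsl';

	const dstSize = this.Mlt * this.Res; // destination width/height
	this.compBigrNode = this.compBigr({
		u_tsiz: float(dstSize),
		r_disp: texture(this.dispMapTexture),      // source displacement texture
		w_bigr: textureStore(this.bigrMapTexture), // destination storage texture
		u_indx: instanceIndex,                     // linear index of this invocation
		u_mult: uint(this.Mlt)                     // upscale factor
	}).compute(dstSize * dstSize);                 // one invocation per destination texel

	renderer.computeAsync(this.compBigrNode);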

The resized values are then fed to a normal shader which computes the normal values, which I expect to be 2x bigger.

However, instead of smooth values, I am getting a series of little squares, like this:

The problem could be with the normal shader. But since this resizing should be a simple exercise, I want to first make sure that my resize shader looks correct.

Any suggestions?

Is your input texture also rgba?

EDIT: sorry, I just reread your post; your initial texture is vec4.

To compute a normal, the texture is sampled at 3 points, and the differences between those points form the components of the normal.
It looks like the sample radius of that normal shader’s sampling is too small by ~5x. Does the normal shader have any settings like ‘radius’?
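For reference, a typical finite-difference normal looks roughly like this (a generic sketch, not the actual shader in question, assuming a single height channel and a ‘radius’ in texels):

	fn computeNormal(heightTex: texture_2d<f32>, idx: vec2i, radius: i32, scale: f32) -> vec3f {
		// Three taps: the texel itself plus one neighbour to the right and one above
		let h  = textureLoad(heightTex, idx, 0).y;
		let hR = textureLoad(heightTex, idx + vec2i(radius, 0), 0).y;
		let hU = textureLoad(heightTex, idx + vec2i(0, radius), 0).y;
		// The height differences along x and y become the slopes of the normal
		return normalize(vec3f((h - hR) * scale, (h - hU) * scale, f32(radius)));
	}

If the radius is small relative to the resized texture, neighbouring taps can land on duplicated texels and the differences collapse to zero.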

Sorry, I think I created some of the confusion. The points are x,y,z displacement values with an extra unused value.

So the values being duplicated are vec4 values. So I should have said that if the rgba values are vec4(12,22,37,1), vec4(13,30,40,1) …., I would expect that the results would be vec4(12,22,37,1), vec4(12,22,37,1), vec4(13,30,40,1), vec4(13,30,40,1).

I assume that the textureLoad and textureStore are reading vec4 rgba values. However, in the var input line, I am not sure what the last “0” is for.

Since I am using full vector displacement (xyz), instead of just y, the normal shader is sampling 4 points (up, down, left, right). If the resize shader is acting as expected, I would think that … [light bulb appears above head] … the normal shader SHOULD be creating flat areas because it is comparing some identical values.

I am actually increasing the size of the texture by 4, so I am not sure that matches the pattern I am seeing. But I still need to see if that is what is causing the lines. So, I apparently need to figure out how to create interpolated values after all. I am not sure if the xyz values are signed or unsigned, so (for now) I will assume that they are unsigned.

Is there a simple shader command to create interpolated values or some example I can refer to? Otherwise, I assume that I will need to look ahead to the next x and y values, divide the difference by the resize factor (e.g. 4), and add that increment to each of the next (3) values.
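What I have in mind is roughly the following (an untested sketch in the same wgslFn style as above; edge texels would still need clamping):

this.compBigr = wgslFn(`
	fn computeWGSL(
		u_tsiz: f32, // size of destination
		r_disp: texture_2d<f32>, // source
		w_bigr: texture_storage_2d<rgba32float,write>, // destination
		u_indx: u32,
		u_mult: u32 // multiplier
	) -> void {
		var dstX = i32(f32(u_indx) % u_tsiz);
		var dstY = i32(f32(u_indx) / u_tsiz);
		var idxD = vec2i(dstX, dstY);
		// Position of this destination texel in source-texel units
		var srcX = f32(dstX) / f32(u_mult);
		var srcY = f32(dstY) / f32(u_mult);
		var ix = i32(srcX);
		var iy = i32(srcY);
		var fx = fract(srcX); // fractional offset between source texels (0..1)
		var fy = fract(srcY);
		// Load the four surrounding source texels
		var p00 = textureLoad(r_disp, vec2i(ix,     iy    ), 0);
		var p10 = textureLoad(r_disp, vec2i(ix + 1, iy    ), 0);
		var p01 = textureLoad(r_disp, vec2i(ix,     iy + 1), 0);
		var p11 = textureLoad(r_disp, vec2i(ix + 1, iy + 1), 0);
		// Bilinear blend of the four vec4 values
		var interp = mix(mix(p00, p10, fx), mix(p01, p11, fx), fy);
		textureStore(w_bigr, idxD, interp);
	}
`);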

You can set the filtering on the source texture to LinearFilter?


Thanks, but changing that did not seem to help. Here is my setup for those two textures:

	// Source (need MipMap enabled since used for vertex displacement) 
	this.dispMapTexture = new THREE.StorageTexture(this.Res,this.Res);
	this.dispMapTexture.type = THREE.FloatType;
	this.dispMapTexture.magFilter = THREE.LinearFilter;
	this.dispMapTexture.minFilter = THREE.LinearMipMapLinearFilter;
	this.dispMapTexture.generateMipmaps = true;
	this.dispMapTexture.wrapS = this.dispMapTexture.wrapT = THREE.RepeatWrapping;
	// Destination
	this.bigrMapTexture = new THREE.StorageTexture(this.Mlt*this.Res,this.Mlt*this.Res);
	this.bigrMapTexture.type = THREE.FloatType;
	this.bigrMapTexture.magFilter = THREE.LinearFilter;
	this.bigrMapTexture.minFilter = THREE.LinearMipMapLinearFilter;
	this.bigrMapTexture.generateMipmaps = true;
	this.bigrMapTexture.wrapS = this.bigrMapTexture.wrapT = THREE.RepeatWrapping;

What I am trying to do is to increase the apparent resolution of my display by changing the size of the normal map but not the displacement map. Creating the displacement map involves many computations. I was hoping that I could enhance the larger displacement map using some simple add-on adjustments and use the result to create a more detailed normal map. For example, I might be able to add a static map with random distortions to create the illusion of more detail. Upon reflection, perhaps I should just go ahead and do that now to see if that eliminates the flat areas.
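If I do go that route, the resize shader would grow into something like this (a rough sketch; r_detl and the 0.05 scale factor are made up for illustration):

	fn computeWGSL(
		u_tsiz: f32, // size of destination
		r_disp: texture_2d<f32>, // source displacement
		r_detl: texture_2d<f32>, // hypothetical small texture of static random offsets
		w_bigr: texture_storage_2d<rgba32float,write>, // destination
		u_indx: u32,
		u_mult: u32 // multiplier
	) -> void {
		var posX = f32(u_indx) % u_tsiz;
		var posY = f32(u_indx) / u_tsiz;
		var idxD = vec2i(i32(posX), i32(posY));
		var idxS = vec2i(idxD.x / i32(u_mult), idxD.y / i32(u_mult));
		var input = textureLoad(r_disp, idxS, 0);
		// Tile the detail map across the destination and add a small random offset
		var detlSize = vec2i(textureDimensions(r_detl, 0));
		var idxT = vec2i(idxD.x % detlSize.x, idxD.y % detlSize.y);
		var detail = textureLoad(r_detl, idxT, 0);
		var output = input + vec4f(detail.xyz * 0.05, 0.0); // leave the unused w channel alone
		textureStore(w_bigr, idxD, output);
	}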