Hi, I am using three.js r179.1 with WebGPU on a hobby project built around a virtual texture system. While implementing it, I ran into an issue where a value passed to the TSL compiler seems to have been replaced by my sampler node. The image below shows the normal, expected value.
Here the value looks very different: it is not a texture, but the sampler instead of the texture node.
Instead, it builds the texture node using the sampler's input.
this.uvSamp = sampler({
  filter: THREE.LinearFilter,
  wrapS: THREE.RepeatWrapping,
  wrapT: THREE.RepeatWrapping
});
The value passed here is this:

Here is my snippet:
import * as THREE from 'three/webgpu';
import {
  storage, texture, uniform, globalId,
  uint, float, vec2, vec4, uv, sampler, wgslFn, code
} from 'three/tsl';

export class FeedbackPass {
  constructor(opts) {
    this.renderer = opts.renderer;
    this.mesh = opts.mesh;
    this.pageTable = opts.pageTable;

    const m = opts.manifest;
    this.vW = (m.virtualWidth | 0) >>> 0;
    this.vH = (m.virtualHeight | 0) >>> 0;
    this.tile = (m.tileSize | 0) >>> 0;
    this.mips = (m.mipCount | 0) >>> 0;
    this.res = (opts.res | 0) || 512;
    this.everyN = (opts.everyN | 0) || 8;
    this.lodBias = Number.isFinite(opts.lodBias) ? opts.lodBias : 0.0;

    // ---- A) UV capture target ----
    this.rt = new THREE.RenderTarget(this.res, this.res, { depthBuffer: false, stencilBuffer: false });
    this.rt.texture.name = 'vt_uv_rt';

    // uniforms & nodes
    this.uVW = uint(this.vW);
    this.uVH = uint(this.vH);
    this.uTile = uint(this.tile);
    this.uMipC = uint(this.mips);
    this.uBias = float(this.lodBias);
    this.uRes = uint(this.res);

    // This is the line that causes the TSL error.
    this.uvTexNode = texture(this.rt.texture);
    this.uvSamp = sampler({
      filter: THREE.LinearFilter,
      wrapS: THREE.RepeatWrapping,
      wrapT: THREE.RepeatWrapping
    });

    // ---- C) Compute kernel ----
    const kernel = wgslFn(`
      fn kernel(
        uvTex    : texture_2d<f32>,
        uvSamp   : sampler,
        bits_in  : ptr<storage, array<u32>, read_write>,
        vW       : u32,
        vH       : u32,
        tile     : u32,
        mipCount : u32,
        bias     : f32,
        res      : u32,
        gid      : vec3<u32>
      ) -> void {
        // ... (rest of the kernel code)
      }
    `, [...]);

    this.kernelNode = kernel(
      this.uvTexNode,
      this.uvSamp,
      this.bitsNode,
      this.uVW,
      this.uVH,
      this.uTile,
      this.uMipC,
      this.uBias,
      this.uRes,
      globalId
    ).compute(this.res * this.res);
  }
}
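For what it's worth, my reading of the TSL source (which may be wrong) is that sampler() expects the texture, or texture node, it should derive the sampler from, rather than a filter/wrap options object. That would explain why it "builds the texture node using the sampler's input": the options object gets wrapped as if it were a texture. A minimal sketch under that assumption, with filtering and wrapping configured on the THREE texture itself:

```javascript
import * as THREE from 'three/webgpu';
import { texture, sampler } from 'three/tsl';

// Assumption: sampler() takes the texture to sample from; the filter and
// wrap modes live on the THREE texture, not in an options object.
const rt = new THREE.RenderTarget(512, 512, { depthBuffer: false, stencilBuffer: false });
rt.texture.minFilter = THREE.LinearFilter;
rt.texture.magFilter = THREE.LinearFilter;
rt.texture.wrapS = THREE.RepeatWrapping;
rt.texture.wrapT = THREE.RepeatWrapping;

const uvTexNode = texture(rt.texture);   // texture_2d<f32> binding
const uvSampNode = sampler(rt.texture);  // sampler binding derived from the same texture
```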
Maybe I could put together a jsfiddle (with Gemini's help), sorta like this. The error dump only appears when I call dispatch on the compute shader; while debugging, it errors out but nothing shows up in the console.
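If the sampler binding is what trips up the compiler, one possible workaround (an untested sketch) is to avoid the sampler entirely: in a compute kernel the render target can be read with textureLoad using integer texel coordinates, so the kernel needs only the texture binding:

```wgsl
// Hypothetical kernel variant: read the UV render target with textureLoad,
// so no sampler parameter (and no sampler() node on the JS side) is needed.
fn kernel(
  uvTex    : texture_2d<f32>,
  bits_in  : ptr<storage, array<u32>, read_write>,
  vW       : u32,
  vH       : u32,
  tile     : u32,
  mipCount : u32,
  bias     : f32,
  res      : u32,
  gid      : vec3<u32>
) -> void {
  // Recover 2D texel coordinates from the 1D dispatch index.
  let x = gid.x % res;
  let y = gid.x / res;
  if (y >= res) { return; }

  // Integer texel fetch at mip level 0 -- no sampler required in compute.
  let uvSample = textureLoad(uvTex, vec2<i32>(i32(x), i32(y)), 0);
  // ... (rest of the kernel, using uvSample)
}
```

The corresponding kernel(...) call would then drop this.uvSamp from the argument list. This sidesteps filtering entirely, which may be acceptable for a feedback pass that only needs the raw UV values per pixel.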