How Do I Add Displacement Maps to Node Materials?

I am converting my materials to node materials, which generally involves replacing material properties with their node equivalents. For example, a material like this:

material = new THREE.MeshPhysicalMaterial({
	color: 0x0000ff,
	metalness: 1.0,
	roughness: 0.7,
	normalMap: NormalMapAddress, // a previously loaded THREE.Texture
});

becomes

material = new MeshPhysicalNodeMaterial({
	colorNode: color(0x0000ff),
	metalnessNode: float(1.0),
	roughnessNode: float(0.7),
	normalNode: normalMap(texture(NormalMapAddress)), // same texture, wrapped in nodes
});

To use these nodes, you also have to import the related items from Nodes.js, such as color, float, texture, normalMap and MeshPhysicalNodeMaterial.
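
For reference, the imports for the example above would look something like this (the exact module path varies with the three.js version; three/examples/jsm/nodes/Nodes.js is one common location, and some setups alias it as three/nodes):

import {
	MeshPhysicalNodeMaterial,
	color, float, texture, normalMap,
} from 'three/examples/jsm/nodes/Nodes.js';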

However, I have been unable to use this approach to add a Displacement Map.

Ideally, I should be able to use the same protocol that I used to load a Normal Map. But using

	displacementNode: displacementMap(texture(DisplacementMapAddress)),

does not work. And when I try to import displacementMap from Nodes.js, I get an error message informing me that this item does not exist. I have also searched through the three.js Node directory for any reference to displacement and none appears to exist.

Is it still possible to add a displacement map to a node material? If so, what is the protocol?

(If the answer involves creating and referencing a shader, I would not mind, because I am using a shader to create the displacement map anyway.)

In NodeMaterial you would use positionNode. Displacement is just a simple case; it can do much more, including full vertex animation now.
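
For example, the classic displacementMap behavior (offset each vertex along its normal by a scaled texture value) can be reproduced along these lines. This is a minimal sketch; DisplacementMapAddress and the 0.5 scale factor are placeholders, not part of any official API:

import {
	MeshPhysicalNodeMaterial, positionLocal, normalLocal,
	texture, uv, float,
} from 'three/examples/jsm/nodes/Nodes.js';

const material = new MeshPhysicalNodeMaterial();

// Height from the displacement texture (red channel), scaled by a strength factor.
const height = texture(DisplacementMapAddress, uv()).r.mul(float(0.5));

// Offset each vertex along its local normal - the same thing displacementMap does.
material.positionNode = positionLocal.add(normalLocal.mul(height));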

Thanks! You are right, that should yield an even better result. Because the displacement map only displaced vertices vertically (along the normal), I was having to modify the material shaders (using onBeforeCompile) to accept horizontal displacements.

And, yes, the displacement map will be constantly changing, so I will be attempting to animate the vertices.

So, in general terms, I should only need to get the address of the position array one time and then, for each iteration, copy the xyz values from the displacement map to the position array and request a recomputation of the position array?

Or to speed things up, can I give the address of the position array to the program creating the displacement map and have it write the values directly to the array?

If you’re updating the vertex positions on every frame, then I’d consider one of:

a) directly modify the vertex buffer every frame (fine and easy for lower poly counts; see the sketch after this list)
b) precompute the position of each vertex for each frame, upload it as one big vertex animation texture, and use positionNode to read position of each vertex from a specific row of the texture, animating across columns of the texture over time.
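
A minimal sketch of option (a), assuming mesh is the water mesh and getDisplacement() is a hypothetical function returning a Float32Array of xyz offsets read back from the simulation:

const posAttr = mesh.geometry.attributes.position;
const base = posAttr.array.slice(); // keep a copy of the undisplaced positions

function updateVertices() {
	const disp = getDisplacement(); // hypothetical: Float32Array, 3 floats per vertex
	for (let i = 0; i < posAttr.array.length; i++) {
		posAttr.array[i] = base[i] + disp[i];
	}
	posAttr.needsUpdate = true; // tell three.js to re-upload the buffer
}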


With “b” are you suggesting that the precomputation be performed en masse, using a compute shader?

“Vertex animation textures” (a common term in Unity and Houdini, worth a Google search) are a standard way of precomputing the positions of vertices at each frame, often in a DCC tool like Houdini.
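
At render time, reading such a texture through positionNode could look roughly like this sketch, where vatTexture, FRAME_COUNT and VERTEX_COUNT are hypothetical, the texture is assumed to store one vertex per row and one frame per column, and the exact exports (vertexIndex, timerGlobal) may differ between three.js versions:

import {
	MeshStandardNodeMaterial, texture, vertexIndex, timerGlobal, float, vec2,
} from 'three/examples/jsm/nodes/Nodes.js';

const material = new MeshStandardNodeMaterial();

// u scans across columns over time (one full loop per FRAME_COUNT seconds);
// v selects the row belonging to this vertex.
const u = timerGlobal().div(float(FRAME_COUNT)).fract();
const v = float(vertexIndex).div(float(VERTEX_COUNT));

// Each vertex fetches its precomputed position for the current frame.
material.positionNode = texture(vatTexture, vec2(u, v)).xyz;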

You don’t need compute shaders, although perhaps that’d be helpful in WebGPU. But if the transformation of the vertex can be expressed completely in a shader, then you could skip uploading a texture at all and just do some custom positionNode with the complete logic, no texture required.
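
The no-texture route mentioned above, where the motion is simple enough to express directly in the node graph, could be sketched like this (a trivial sine wave standing in for real wave logic):

import {
	MeshStandardNodeMaterial, positionLocal, timerGlobal, sin, vec3,
} from 'three/examples/jsm/nodes/Nodes.js';

const material = new MeshStandardNodeMaterial();

// Animate height purely in the shader: y = 0.25 * sin(2x + t).
const t = timerGlobal();
const height = sin(positionLocal.x.mul(2.0).add(t)).mul(0.25);
material.positionNode = positionLocal.add(vec3(0.0, height, 0.0));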


Oh, this is what you meant, Phil.

Phil means a displacement texture that is the result of a complex phase-space transformation (an IFFT). The vertex positions cannot simply be calculated individually, nor with a single shader: his displacement texture is the result of the Cooley-Tukey algorithm, which unfortunately requires a lot of f32 textures serving as efficient data storage for the huge number of intermediate steps.

But with the final displacement textures (there are several), I do exactly what you suggested, don. I calculate the final vertex position from these data textures in a shader and pass it on to positionNode.

I suspect that Phil was looking for an approach that would work without such a shader. I think I can help you with that, @phil.


Yes, as @Attila_Schroeder has noted, I am trying to update the vertices using a WebGL ocean simulation that creates a displacement map and a normal map.

And I will also want to update to WebGPU so that I can use the superior WebGPU version that @Attila_Schroeder has been creating over the past several months.

Yes, this initial effort was intended to be partly for my benefit and partly for the benefit of anyone else who wants to use the WebGL version of the ocean shader with nodes.
Of course, that may soon be obsolete once people start switching to WebGPU.

Now that I have had a chance to re-read your responses and review some code: is the gist of what you are both saying that this is not a texture issue, in the sense that node textures do not themselves change vertices (via displacement maps or otherwise), and that all changes to vertices are made by changing values within the geometry's position array?

There are many ways to do vertex animation – textures, vertex buffers, custom vertex shaders, transform feedback, compute shaders, etc. I’m trying to share some of those options so you’re aware of them. But I don’t really know what you’re doing in any detail, so I’m not sure I can get more specific than this - you can use textures, or not, it’s up to you.

It turns out that I have a slightly dated version of the program that I am trying to modify on CodePen.

The program is the iFFT Wave Generator, a demo of an Ocean Module that I updated and modified. In this demo, the module is actually included in the program, starting at line 253 (which makes this one of the longest CodePen examples), but you can ignore that whole module.

The Module uses the wav_ variable (line 41), which includes the addresses for the displacement (Dsp) and normal (Nrm) maps (lines 50-51). Initializing the Ocean Module loads those sub-variables with the addresses of the animated displacement and normal maps it creates (at lines 422-423, if you are interested).

The material I was trying to change to a node material is WtrMat, defined at line 189. (This material is applied to a segmented flat-plane geometry to create the animated mesh.) The normalMap is added to the material at line 192. I initially added the displacementMap to the material at line 191. But since that only gave vertical displacement, I created the onBeforeCompile at line 201, which allows for both vertical and horizontal displacement.
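
For readers following along, the onBeforeCompile trick described above looks roughly like this sketch; the uniform name dispMap is hypothetical, and wav_.Dsp stands in for the module's displacement texture:

WtrMat.onBeforeCompile = (shader) => {
	shader.uniforms.dispMap = { value: wav_.Dsp };
	shader.vertexShader = shader.vertexShader
		.replace('#include <common>',
			'#include <common>\nuniform sampler2D dispMap;')
		// begin_vertex defines "transformed"; adding the full xyz offset to it
		// allows horizontal as well as vertical displacement.
		.replace('#include <begin_vertex>',
			'#include <begin_vertex>\ntransformed += texture2D( dispMap, uv ).xyz;');
};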

When converting to node materials, I was expecting that I could at least do something like line 191, and hoping that I could do something like line 201.

So maybe you can still do animation using the node texture?
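
Per the earlier replies, the texture can still drive the vertices, just through positionNode rather than a displacementNode. A sketch, assuming wav_.Dsp is the module's animated displacement texture and that it stores xyz offsets:

import {
	MeshPhysicalNodeMaterial, positionLocal, texture, uv,
} from 'three/examples/jsm/nodes/Nodes.js';

const WtrMat = new MeshPhysicalNodeMaterial();

// Sample the animated displacement texture on the GPU each frame and
// add the full xyz offset - horizontal and vertical - to each vertex.
const disp = texture(wav_.Dsp, uv()).xyz;
WtrMat.positionNode = positionLocal.add(disp);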