Depth order lost with shader material in Volume Rendering

Hello everyone, I’m trying to implement volume rendering of medical images (DICOM & NIFTI), using a custom ShaderMaterial to do the rendering. Everything works, but when I add other objects (ROIs), the depth order is lost. Do you have any suggestions? Thanks in advance.

Here is an example where I’ve just added a box to check that…

Here is the correct result with surface rendering (without the ShaderMaterial):

Here are the relevant pieces of my code:
this._renderer = new THREE.WebGLRenderer({
  canvas: this.canvas,
  antialias: true,
  logarithmicDepthBuffer: true, // changes how depth is encoded for every object in the scene
  preserveDrawingBuffer: true,
});

this._shaderMaterial = new THREE.ShaderMaterial({
  uniforms: currUniforms,
  vertexShader: vertex,
  fragmentShader: fragments,
  side: THREE.FrontSide,
});

Here are my shaders. The `#include` directives pull in chunks from the three.js shader library (see three.js/src/renderers/shaders/ShaderChunk at dev · neeh/three.js · GitHub).

Vertex:
export const vertex = `

#include <common>
#include <logdepthbuf_pars_vertex>

struct Ray {
  vec3 origin;
  vec3 dir;
};

out vec3 _normal;
out Ray vRay;
out mat4 _modelViewProjectionMatrix;
uniform vec3 uVolScale;
#include <clipping_planes_pars_vertex>

void main()
{
  vec3 pos1 = position * uVolScale;
  _modelViewProjectionMatrix = projectionMatrix * modelViewMatrix; // == projectionMatrix * (viewMatrix * modelMatrix)
  gl_Position = _modelViewProjectionMatrix * vec4(pos1, 1.0);
  vec3 eyePos = cameraPosition / uVolScale;
  vRay.dir = position - eyePos;
  vRay.origin = eyePos;
  _normal = vec3(normal);
  #include <logdepthbuf_vertex>
}`;

Fragment:

void main()
{
  #include <logdepthbuf_fragment>
  Ray ray;
  ray.origin = vRay.origin;
  ray.dir = normalize(vRay.dir);
  vec2 bounds = computeNearFar(ray);
  if (bounds.x > bounds.y) discard;
  bounds.x = max(bounds.x, 0.0);
  float near = bounds.x;
  float far = bounds.y;
  vec3 rayStart = ray.origin + near * ray.dir;
  vec3 rayStop = ray.origin + far * ray.dir;

  // Transform from object space to texture coordinate space:
  rayStart = 0.5 * (rayStart + 1.0);
  rayStop = 0.5 * (rayStop + 1.0);
  vec3 dir = rayStop - rayStart;
  float len = length(dir);
  dir = normalize(dir);
  vec3 deltaDir = dir * uStepSize;
  float offset = wang_hash(int(gl_FragCoord.x + 640.0 * gl_FragCoord.y));
  vec3 samplePos = rayStart + deltaDir * offset;

  vec4 acc = vec4(0.0);
  switch (uRenderStyle) {
    case 0: acc = CalculateOpacityShadingModelNone(samplePos, deltaDir, len); break;
    case 1: acc = CalculateOpacityShadingModelIllustrative(samplePos, deltaDir, len, dir); break;
    case 2: acc = CalculateOpacityShadingModelMIP(samplePos, deltaDir, len, dir); break;
  }
  if (acc.a < 0.1)
    discard;
  if (acc.a < 1.0)
    acc.rgb = mix(uClearColor, acc.rgb, acc.a);
  gl_FragColor = acc;
}

As far as the GL layer is concerned, the depth value is the depth of the primitive that contains your raymarching shader (presumably a plane or cube).

You’re not doing anything in your raymarching code to also produce a depth value that is consistent with the sample you are outputting.

afaik, you will need to compute a depth value and output it via gl_FragDepth, or manually generate a depth buffer in a separate pass.
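In sketch form (names here are illustrative: `uMVP` would be the model-view-projection matrix made available to the fragment stage, and `hitPos` the object-space position your march treats as the surface, e.g. the first sample above your opacity threshold):

    // Project the hit position the same way the vertex stage does, then
    // convert NDC z to the [0, 1] window-space depth used by the depth test:
    vec4 clipPos = uMVP * vec4(hitPos, 1.0);
    float ndcZ = clipPos.z / clipPos.w;
    gl_FragDepth = 0.5 * ndcZ + 0.5;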

Have a look at the recently added Data3DTexture class; it could give some insights.
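For instance, a minimal sketch of uploading a volume with it (assuming `data` is a Uint8Array of width * height * depth intensity values; all names are illustrative):

    const volumeTexture = new THREE.Data3DTexture(data, width, height, depth);
    volumeTexture.format = THREE.RedFormat;       // single intensity channel
    volumeTexture.minFilter = THREE.LinearFilter; // trilinear interpolation when sampling
    volumeTexture.magFilter = THREE.LinearFilter;
    volumeTexture.unpackAlignment = 1;            // tightly packed rows
    volumeTexture.needsUpdate = true;
    // then bind it to the sampler3D uniform (uVolume in the code above)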

Hi,
many thanks for your replies.

@Lawrence3DPK I’m already using DataTexture3D to load the medical data. Thanks

@manthrax: you’re right, I forgot to post the piece of code where the fragment depth is calculated.
I’m trying to do the same as here:

Here is my complete code:
Vertex:

export const vertex = /*glsl*/ `
/**
Already defined by three.js:
uniform mat4 modelMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat3 normalMatrix;
uniform vec3 cameraPosition;
// (position, normal and uv are vertex attributes, not uniforms)
in vec3 position;
in vec3 normal;
in vec2 uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
 */
struct Ray {
  vec3 origin;
  vec3 dir;
};

out vec3 _normal;
out Ray vRay;
out mat4 _modelViewProjectionMatrix;

uniform vec3 uVolScale; // was missing: needed for the eye position below

void main()
{
  _modelViewProjectionMatrix = projectionMatrix * viewMatrix * modelMatrix;
  gl_Position = _modelViewProjectionMatrix * vec4(position, 1.0);
  vec3 eyePos = cameraPosition / uVolScale;
  vRay.dir = position - eyePos;
  vRay.origin = eyePos;
  _normal = vec3(normal);
}`;

Fragment:

export const fragments = /* glsl */ `layout(location = 2) out float fragDepth;
    precision highp float;
    precision highp sampler3D;
    uniform float uOpacity;
    uniform sampler3D uVolume;
    uniform sampler2D uMapColor;

    struct Ray
    {
      vec3 origin;
      vec3 dir;
    };

    in Ray vRay;
    in vec3 _normal;

    in mat4 _modelViewProjectionMatrix;

    uniform float uStepSize, uSliceSize;
    uniform int uRenderStyle;
    uniform vec3 uClearColor; // used below when compositing against the background
    vec2 computeNearFar(Ray ray);
    float getSample(vec3 modelPosition);
    float getSample(float x, float y, float z);
    vec4 GetColorByPosition(vec3 samplePosition);
    vec4 CalculateOpacityShadingModelNone(vec3 samplePos, vec3 delta, float len);
    vec4 CalculateOpacityShadingModelIllustrative(vec3 samplePos, vec3 delta, float len, vec3 dir);
    vec4 CalculateOpacityShadingModelMIP( vec3 samplePos, vec3 delta, float len, vec3 dir);
    vec4 apply_colormap(float val);
    vec3 GetNormal(vec3 loc, vec3 step);
    vec4 GetGradientNormal(vec3 P, vec3 Delta);

 
    // Wang hash: cheap per-pixel pseudo-random value, used below to jitter the ray start and hide banding
    float wang_hash(int seed)
    {
      seed = (seed ^ 61) ^ (seed >> 16);
      seed *= 9;
      seed = seed ^ (seed >> 4);
      seed *= 0x27d4eb2d;
      seed = seed ^ (seed >> 15);
      return float(seed % 2147483647) / float(2147483647);
    }
    void main()
    {
      Ray ray;
      ray.origin = vRay.origin;
      ray.dir = normalize(vRay.dir);
      vec2 bounds = computeNearFar(ray);
      if (bounds.x > bounds.y) discard;
      bounds.x = max(bounds.x, 0.0);
      float near = bounds.x;
      float far = bounds.y;
      vec3 rayStart = ray.origin + near * ray.dir;
      vec3 rayStop = ray.origin + far * ray.dir;

      // Transform from object space to texture coordinate space:
      rayStart = 0.5 * (rayStart + 1.0);
      rayStop = 0.5 * (rayStop + 1.0);
      vec3 dir = rayStop - rayStart;
      float len = length(dir);
      dir = normalize(dir);

      vec3 deltaDir = dir * uStepSize;
      float offset = wang_hash(int(gl_FragCoord.x + 640.0 * gl_FragCoord.y));
      vec3 samplePos = rayStart + deltaDir * offset;
      vec4 acc = vec4(0.0);
      vec4 colorSample = vec4(0.0); // 'acc' was declared twice in the original
      float opCorr = uStepSize / uSliceSize; // opacity-correction exponent for the current step size
      float stepSizex2 = 2.0 * uStepSize;
      float lenAcc = 0.0;
      float lenAcc = 0.0;
      while (lenAcc < len)
      {
        colorSample = GetColorByPosition(samplePos);
        // Opacity correction: rescale alpha to the actual step size
        colorSample.a = 1.0 - pow((1.0 - colorSample.a), opCorr);
        if (colorSample.a > 0.01 && lenAcc > stepSizex2)
        {
          colorSample.rgb *= colorSample.a;
          acc = (1.0 - acc.a) * colorSample + acc; // front-to-back compositing
          if (acc.a >= 0.95)
          {
            break; // early ray termination
          }
        }
        samplePos += deltaDir; // was 'delta', which is not defined here
        lenAcc += uStepSize;
      }

      if (acc.a < 0.1)
        discard;
      if (acc.a < 1.0)
        acc.rgb = mix(uClearColor, acc.rgb, acc.a);

      gl_FragColor = vec4(acc.rgb, uOpacity); // acc;

      // calculate fragDepth
      vec4 clipSpace = _modelViewProjectionMatrix * vec4(samplePos, 1.0);
      float ndc_depth = clipSpace.z / clipSpace.w;
      fragDepth = 0.5 * ndc_depth + 0.5;
    }`;

Unfortunately it does not work. The same code works in a C# application.

I don’t understand where the problem is. If you have a suggestion, please let me know.

Thanks in advance.

Shouldn’t

      fragDepth = 0.5 * ndc_depth + 0.5;

be

gl_FragDepth = 0.5 * ndc_depth + 0.5;

(I’m hazy on the semantics, but your setup may imply attempting to write depth to a separate render target, which isn’t what you want.)
?
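For what it’s worth, here is a sketch of that change applied to the end of the shader above, with two caveats I’m inferring only from the posted code: samplePos is in texture space [0, 1] at that point, so it probably needs mapping back to the object space that _modelViewProjectionMatrix expects; and with logarithmicDepthBuffer: true on the renderer, the other objects’ depths are encoded logarithmically, so a linearly derived gl_FragDepth may still not compare correctly against them.

    // Sketch, assuming the volume box spans [-1, 1] in object space,
    // as the earlier 0.5 * (x + 1.0) remap suggests:
    vec3 objPos = samplePos * 2.0 - 1.0;          // texture space -> object space
    vec4 clipSpace = _modelViewProjectionMatrix * vec4(objPos, 1.0);
    float ndc_depth = clipSpace.z / clipSpace.w;  // NDC depth in [-1, 1]
    gl_FragDepth = 0.5 * ndc_depth + 0.5;         // window-space depth in [0, 1]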

You’re welcome in advance. :smiley:
