three.js point clouds, BufferGeometry and incorrect transparency - javascript

The problem: I have a point cloud with quite a lot of data points (around one million). When I apply transparency to the rendered points, the transparency somehow does not show what is behind the rendered points.
As you can see in the example of the marked point, it does not show what it should; it is as if there is a problem with the depth buffering.
I use three.js to create a point cloud using the following "setup":
The renderer:
this.renderer = new THREE.WebGLRenderer({
    canvas: this.canvas,
    antialias: true
});
The material:
this.pointMaterial = new THREE.ShaderMaterial( {
    uniforms: {
        time: { type: "f", value: 1.0 }
    },
    vertexShader: document.getElementById('vertexShader').textContent,
    fragmentShader: document.getElementById('fragmentShader').textContent,
    transparent: true
});
The vertex shader:
attribute float size;
attribute float opacity;
attribute vec3 color;
varying vec3 vColor;
varying float vOpacity;
void main() {
    vColor = color;
    vOpacity = opacity;
    vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
    gl_PointSize = size * (500.0 / length(mvPosition.xyz));
    gl_Position = projectionMatrix * mvPosition;
}
The fragment shader:
uniform float time;
varying vec3 vColor;
varying float vOpacity;
void main() {
    gl_FragColor = vec4(vColor, vOpacity);
}
The geometry (where I left out the part where I populate the arrays):
var bufferGeometry = new THREE.BufferGeometry();
var vertices = new Float32Array(vertexPositions.length * 3);
var colors = new Float32Array(vertexColors.length * 3);
var sizes = new Float32Array(vertexSizes.length);
var opacities = new Float32Array(vertexOpacities.length);
bufferGeometry.addAttribute('position', new THREE.BufferAttribute(vertices, 3));
bufferGeometry.addAttribute('color', new THREE.BufferAttribute(colors, 3));
bufferGeometry.addAttribute('size', new THREE.BufferAttribute(sizes, 1));
bufferGeometry.addAttribute('opacity', new THREE.BufferAttribute(opacities, 1));
this.points = new THREE.Points(bufferGeometry, this.pointMaterial);
this.scene.add(this.points);
I tried this with the built-in points material, and the same thing happens:
this.pointMaterial = new THREE.PointsMaterial({
    size: this.pointSize,
    vertexColors: THREE.VertexColors,
    transparent: true,
    opacity: 0.25
});
Is this a bug, expected behaviour, or am I doing something wrong?

The way the alpha blending equation works is that the source colour (the fragment currently being drawn) is blended over the destination colour (whatever is already in the framebuffer). This means you need to render your transparent geometry sorted from back to front, so that geometry in front correctly blends over the geometry already drawn behind it.
If all you have is transparent geometry, then you can simply disable depth testing, render in back-to-front order, and it will work. If you also have opaque geometry, you need to first render all opaque geometry normally, then disable depth writing (not testing), render the transparent geometry in back-to-front order, and finally re-enable depth writing.
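A minimal sketch of that recipe applied to the question's code. The index-based drawing and the sortPoints helper are illustrative assumptions, not part of the original post, and the sketch assumes the points object has no transform of its own:
// render transparent points without writing depth, so they blend instead of occluding each other
this.pointMaterial.depthWrite = false;

// draw through an index so points can be reordered without touching the attribute arrays
var count = vertexPositions.length;
var order = new Uint32Array(count);
bufferGeometry.setIndex(new THREE.BufferAttribute(order, 1));

function sortPoints(camera) {
    var positions = bufferGeometry.attributes.position.array;
    var dist = new Float32Array(count);
    var indices = [];
    var v = new THREE.Vector3();
    for (var i = 0; i < count; i++) {
        v.fromArray(positions, i * 3);
        dist[i] = camera.position.distanceToSquared(v);
        indices.push(i);
    }
    indices.sort(function (a, b) { return dist[b] - dist[a]; }); // farthest first
    order.set(indices);
    bufferGeometry.index.needsUpdate = true;
}
Call sortPoints(camera) once before the first render and again whenever the camera moves. With a million points you may want to sort less often (or on a worker), or settle for depthWrite = false alone, which already removes most of the artifacts.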
Here are some answers to similar questions if you're interested in learning a bit more.

Related

Threejs fade in/out mesh shader material

I'm learning Three.js and GLSL shaders. I want to fade a mesh in/out when adding it to or removing it from the scene. My mesh is a TubeGeometry (which joins two points on a sphere, taken from this tutorial at the end) and a ShaderMaterial. My mesh configuration looks like this.
const tubeSegments = 20;
const path = new CatmullRomCurve3(points);
// points is an array of vec3
const geom = new TubeGeometry(path, tubeSegments, 0.01, 8, false);
const material = new ShaderMaterial({
    vertexShader,
    fragmentShader,
    side: DoubleSide,
    uniforms: {
        time: {
            value: mesh_fragments_time.get(),
        },
        color: {
            value: new Vector3(1, 1, 0),
        },
    },
});
const mesh = new Mesh(geom, material);
The vertexShader:
varying vec2 vertexUV;
varying vec3 vertexNormal;
void main() {
    vertexUV = uv;
    vertexNormal = normalize(normalMatrix * normal);
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1);
}
The fragmentShader:
varying vec2 vertexUV;
uniform float time;
uniform vec3 color;
void main() {
    float dash = sin(vertexUV.x * 60. - time);
    if (dash > 0.) discard;
    gl_FragColor = vec4(color, 0);
}
Here mesh_fragments_time.get() (the time uniform) is a number that changes in my requestAnimationFrame loop and makes the object dashed.
I've tried adding opacity to the shader material and changing it after adding the mesh to the scene, but it doesn't work.
I suppose I have to do that inside the fragment shader, but I don't know how. Can someone help me?
three.js r147
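One way to get the fade, sketched under the assumption that the material is made transparent and a new uniform (here called uOpacity, an invented name) drives the alpha:
const material = new ShaderMaterial({
    transparent: true, // without this, the alpha written in the shader is ignored
    side: DoubleSide,
    vertexShader,
    fragmentShader,
    uniforms: {
        time: { value: mesh_fragments_time.get() },
        color: { value: new Vector3(1, 1, 0) },
        uOpacity: { value: 0.0 }, // 0 = invisible, 1 = fully visible
    },
});
In the fragment shader, declare uniform float uOpacity; and use it instead of the hard-coded 0 alpha:
gl_FragColor = vec4(color, uOpacity);
Then tween uOpacity toward 1 in the requestAnimationFrame loop to fade the mesh in, or toward 0 to fade it out (removing the mesh once it reaches 0).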

Weird Line When Wrapping Image in Fragment Shader

I have made a simple fragment shader using THREE.js. It reads each pixel's coordinate, adds a value to the x component (wrapping to 0 if it goes above 1), and returns the color of a background image at this new location. This has the effect of shifting the background image over and wrapping the part that goes off-screen. The problem is that a dark line sometimes appears where the edge of the image is shifted over:
Here is my code:
var vertexShader = `
    varying vec2 vUv;
    void main() {
        vUv = uv;
        gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
`;
var fragmentShader = `
    varying vec2 vUv;
    uniform sampler2D background;
    void main() {
        float x = mod(vUv.x + 0.47, 1.0);
        float y = vUv.y;
        gl_FragColor = texture(background, vec2(x, y));
    }
`;
$(document).ready(function() {
    var plotElement = $("#plot");
    var scene = new THREE.Scene();
    var renderer = new THREE.WebGLRenderer();
    var camera = new THREE.OrthographicCamera(-0.5, 0.5, 0.5, -0.5, 0, 1000);
    renderer.setSize(500, 500);
    plotElement.append(renderer.domElement);
    var background = new THREE.TextureLoader().load('https://wearablewearyphase.davidbrock1.repl.co/background.png');
    var material = new THREE.ShaderMaterial({
        vertexShader: vertexShader,
        fragmentShader: fragmentShader,
        uniforms: {
            background: {
                value: background
            },
        },
    });
    var geom = new THREE.PlaneBufferGeometry(1, 1);
    var mesh = new THREE.Mesh(geom, material);
    scene.add(mesh);
    camera.position.z = 1; // was camera.z, which has no effect
    function render() {
        requestAnimationFrame(render);
        renderer.render(scene, camera);
    }
    render();
});
<!DOCTYPE html>
<html>
<head>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/r118/three.min.js"></script>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.5.1/jquery.min.js"></script>
</head>
<body>
    <div id="plot"></div>
</body>
</html>
I am quite sure that the line is not in the original image:
The boundary should not be visible at all because the left and right sides are the same color. Furthermore, the line only appears for some shift distances (49% and 47%, but not 50% or 48%, for example). I have found that if I make the output bigger than the background image then the line disappears, but I would prefer not to do this.
What is causing this line and how can I prevent it?
Note:
I just used the "mod" function as an example. Originally, I had another program (another shader) calculate x and y coordinates for every pixel and save them in another texture as the red and green components. The fragment shader then looked these coordinates up in the image.
I ran into this problem while trying to create an animation like this. The lines started appearing all over the screen and did not look good.
This is happening because the GPU samples the texture using coordinate derivatives taken across neighbouring pixels (for filtering and mipmap selection). At the column where mod() wraps, the coordinate jumps from almost 1.0 back to 0.0, so the sampler behaves as if your entire texture were squeezed between two adjacent pixels and picks a tiny, far-away mip level for that column, which shows up as a dark line.
The easiest way to circumvent this behavior is to set your texture's wrapping mode to THREE.RepeatWrapping and get rid of the mod(), so that as the texture coordinate goes above 1.0 it automatically starts over from the left again.
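A minimal sketch of that fix, using the variables from the question:
// let the sampler wrap horizontally instead of wrapping the coordinate in the shader
background.wrapS = THREE.RepeatWrapping;
The fragment shader then samples past 1.0 directly, with no mod():
gl_FragColor = texture(background, vec2(vUv.x + 0.47, vUv.y));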

Three.js Texture scaling issue when using THREE.ShaderMaterial to map a renderTarget.texture to a quad

A codepen demonstrating this issue is here: https://codepen.io/lilrooness/pen/QWjdjgP
I'm rendering noise to a render target and then using that render target to texture a quad that I render to the screen.
When using the render target as a texture, I'm encountering a problem with the texture's size.
It works fine if I use new THREE.MeshBasicMaterial({ map: renderTarget.texture })
but when I use my own material
var renderMaterial = new THREE.ShaderMaterial({
    uniforms: {
        tex: { type: 'sampler2D', value: renderTarget.texture }
    },
    fragmentShader: pixelateFragmentShader(),
    vertexShader: standardVertexShader()
})
I get a very small texture that's clamped.
This is the vertex shader that I use for both renders:
varying lowp vec3 vUv;
void main() {
    vUv = position;
    vec4 modelViewPosition = modelViewMatrix * vec4(position, 1.0);
    gl_Position = projectionMatrix * modelViewPosition;
}
This is the rendering function (I'm using the same camera for both renders):
function animate() {
    time += 0.1;
    quad.material.uniforms.time.value = time;
    requestAnimationFrame(animate);
    renderer.setRenderTarget(renderTarget);
    renderer.render(bufferScene, camera);
    renderer.setRenderTarget(null);
    renderer.render(scene, camera);
}
Again, this works fine if I use the MeshBasicMaterial. What am I doing wrong?
The problem was that in my vertex shader I was setting
vUv = position;
While this was the desired behaviour for the Perlin noise effect, I was re-using the same vertex shader to render the render target texture.
I changed it to:
vUv = uv;
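For reference, a sketch of the corrected vertex shader; note that the varying also has to become a vec2, since three.js's built-in uv attribute is two-dimensional:
varying vec2 vUv;
void main() {
    vUv = uv;
    vec4 modelViewPosition = modelViewMatrix * vec4(position, 1.0);
    gl_Position = projectionMatrix * modelViewPosition;
}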

Vertex Colors are changing to white

I'm working with THREE.js points, and sometimes I need them to have a different color per point. Sometimes I'm also modifying their alpha value, so I had to write my own shader programs.
In JavaScript I have the following code:
let materials;
if (pointCloudData.colors !== undefined) {
    geometry.colors = pointCloudData.colors.map(hexColor => new THREE.Color(hexColor));
    // If the point cloud has a color for each point...
    materials = new THREE.ShaderMaterial({
        vertexColors: THREE.VertexColors,
        vertexShader: document.getElementById('vertexshader').textContent,
        fragmentShader: document.getElementById('fragmentshader').textContent,
        transparent: true,
    });
} else {
    // Set color for the whole cloud
    materials = new THREE.ShaderMaterial({
        uniforms: {
            unicolor: { value: pointCloudData.color },
        },
        vertexShader: document.getElementById('vertexshader').textContent,
        fragmentShader: document.getElementById('fragmentshader').textContent,
        transparent: true,
    });
}
const pointCloud = new THREE.Points(geometry, materials);
I am basically setting the mesh color to a uniform value unless per-point colors are defined, in which case I set vertexColors on the material and colors on the geometry. I also checked the values stored in geometry.colors, and they are correct RGB values in the range [0, 1].
My Vertex Shader code:
attribute float size;
attribute float alpha;
varying float vAlpha;
varying vec3 vColor;
void main() {
    vAlpha = alpha;
    #ifdef USE_COLOR
        vColor = color;
    #endif
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_PointSize = size * ( 300.0 / -mvPosition.z );
    gl_Position = projectionMatrix * mvPosition;
}
And my Fragment shader code:
uniform vec3 unicolor;
varying vec3 vColor;
varying float vAlpha;
void main() {
    #ifdef USE_COLOR
        gl_FragColor = vec4(vColor, vAlpha);
    #else
        gl_FragColor = vec4(unicolor, vAlpha);
    #endif
}
Again, I am checking whether the vertex color is set and then passing it to the fragment shader, which then sets the color per point.
For some reason, the vertices are all white when setting the color per point (screenshot: the white pixels should be green/red). I'm far from an advanced WebGL user, and any help would be appreciated. Am I doing something wrong that I'm not aware of?
You are creating a custom ShaderMaterial and using this pattern in your vertex shader:
#ifdef USE_COLOR
vColor = color;
#endif
Consequently, you need to specify the material.defines like so:
var defines = {};
defines[ "USE_COLOR" ] = "";

// points material
var shaderMaterial = new THREE.ShaderMaterial( {
    defines: defines,
    uniforms: uniforms,
    vertexShader: document.getElementById( 'vertexshader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentshader' ).textContent,
    transparent: true
} );
You do not need to set vertexColors: THREE.VertexColors. That is just a flag used by built-in materials to alert the renderer to set the defines for you.
three.js r.85
OK, I think I figured it out since it's working now.
I had to set the colors as geometry attributes:
const colors = new Float32Array(n * 3);
for (let i = 0; i < n; i += 1) {
    new THREE.Color(pointCloudData.colors[i]).toArray(colors, i * 3);
}
geometry.addAttribute('colors', new THREE.BufferAttribute(colors, 3)); // three floats per color
I also used the suggestion provided by WestLangley: I removed the vertexColors: THREE.VertexColors part from the material definition and set the define as well:
materials = new THREE.ShaderMaterial({
    defines: {
        USE_COLOR: '',
    },
    vertexShader: document.getElementById('vertexshader').textContent,
    fragmentShader: document.getElementById('fragmentshader').textContent,
    transparent: true,
});
Then in my vertex shader I added:
attribute vec3 colors;
to get the colors passed from the JavaScript. The rest is the same; I just passed the colors to the fragment shader using the same code as in the question above.

Passing an attribute variable : black screen

I am trying to pass an attribute variable with three.js to the vertex shader, then the vertex shader should pass it to the fragment shader through a varying variable.
Vertex shader:
attribute vec4 color;
varying vec4 outColor;
void main()
{
    outColor = color;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
Fragment shader:
varying vec4 outColor;
void main() {
    gl_FragColor = outColor;
}
Let's say I have a cube with 8 vertices: if there is a different color for each vertex, the cube should be drawn by interpolating the vertex colors, and in the middle of a face it should have a mixed color. This is the JavaScript code snippet where I initialize the attributes:
var colors = [];
for (var i = 0; i < geometry.vertices.length; i++) {
    colors.push(new THREE.Vector4(0.0, 1.0, 0.0, 1.0));
}
var attributes = {
    color: { type: "v4", value: colors }
};
var material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    attributes: {},
    vertexShader: document.getElementById("vertexShader").textContent,
    fragmentShader: document.getElementById("fragmentShader").textContent
});
For now this should draw a completely green cube. The problem is that the instruction in the vertex shader outColor=color; messes up everything: I just see a black screen. If I replace this instruction with outColor=vec4(0.0,1.0,0.0,1.0);, I see a correctly drawn green cube on the screen.
Here is the full source code.
Try passing attributes instead of {} to the THREE.ShaderMaterial constructor.
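A sketch of that fix against the legacy (pre-r72) three.js API the question uses, where custom attributes were passed to the material rather than set on the geometry:
var material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    attributes: attributes, // was {}, so the shader's color attribute never received any data
    vertexShader: document.getElementById("vertexShader").textContent,
    fragmentShader: document.getElementById("fragmentShader").textContent
});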
