I have made a simple fragment shader using THREE.js. It reads each pixel's coordinate, adds a value to the x component (wrapping to 0 if it goes above 1), and returns the color of a background image at this new location. This has the effect of shifting the background image over and wrapping the part that goes off-screen. The problem is that a dark line sometimes appears where the edge of the image is shifted over:
Here is my code:
var vertexShader = `
varying vec2 vUv;
void main() {
vUv = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
`;
var fragmentShader = `
varying vec2 vUv;
uniform sampler2D background;
void main() {
float x = mod(vUv.x + 0.47, 1.0);
float y = vUv.y;
gl_FragColor = texture(background, vec2(x, y));
}
`;
$(document).ready(function() {
var plotElement = $("#plot");
var scene = new THREE.Scene();
var renderer = new THREE.WebGLRenderer();
var camera = new THREE.OrthographicCamera(-0.5, 0.5, 0.5, -0.5, 0, 1000);
renderer.setSize(500, 500);
plotElement.append(renderer.domElement);
var background = new THREE.TextureLoader().load('https://wearablewearyphase.davidbrock1.repl.co/background.png');
var material = new THREE.ShaderMaterial({
vertexShader: vertexShader,
fragmentShader: fragmentShader,
uniforms: {
background: {
value: background
},
},
});
var geom = new THREE.PlaneBufferGeometry(1, 1);
var mesh = new THREE.Mesh(geom, material);
scene.add(mesh);
camera.position.z = 1;
function render() {
requestAnimationFrame(render);
renderer.render(scene, camera);
}
render();
});
<!DOCTYPE html>
<html>
<head>
<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/r118/three.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.5.1/jquery.min.js"></script>
</head>
<body>
<div id="plot"></div>
</body>
</html>
I am quite sure that the line is not in the original image:
The boundary should not be visible at all because the left and right sides are the same color. Furthermore, the line only appears for some shift distances (49% and 47%, but not 50% or 48%, for example). I have found that if I make the output bigger than the background image then the line disappears, but I would prefer not to do this.
What is causing this line and how can I prevent it?
Note:
I just used the "mod" function as an example. Originally, I had another program (another shader) calculate x and y coordinates for every pixel and save them in another texture as the red and green components. The fragment shader then looked these coordinates up in the image.
I ran into this problem while trying to create an animation like this. The lines started appearing all over the screen and did not look good.
This is happening because of how the texture is sampled, not because of anything wrong with your math. The GPU picks a mipmap level from how much the sampled coordinate changes between neighbouring pixels. At the seam, mod() makes the coordinate jump from nearly 1.0 straight back to 0.0, so for that one pixel column the GPU thinks it has to squeeze the entire texture into a single pixel and samples the smallest, averaged-out mip level, which shows up as the dark line.
The easiest way to circumvent this behavior is to set your texture's wrapping mode to THREE.RepeatWrapping and get rid of the mod(): as the coordinate goes above 1.0, sampling automatically wraps around to the left again, and the coordinate itself stays continuous so no extreme mip level gets chosen.
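Applied to the code in the question, the fix amounts to dropping the mod() from the fragment shader (a sketch, untested against the original snippet):

```glsl
// No mod(): with THREE.RepeatWrapping set on the texture, coordinates
// above 1.0 wrap automatically, and the sampled coordinate stays
// continuous across the old seam, so no tiny mip level is selected.
varying vec2 vUv;
uniform sampler2D background;
void main() {
    gl_FragColor = texture(background, vec2(vUv.x + 0.47, vUv.y));
}
```

together with background.wrapS = THREE.RepeatWrapping; on the JavaScript side, right after the TextureLoader call.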
Related
I'm learning Three.js and GLSL shaders. I want to fade a mesh in/out when adding it to or removing it from the scene. My mesh is a TubeGeometry (which joins two points on a sphere, taken from the end of this tutorial) and a ShaderMaterial. My mesh configuration looks like this:
const tubeSegments = 20;
const path = new CatmullRomCurve3(points);
// Points is an array of vec3
const geom = new TubeGeometry(path, tubeSegments, 0.01, 8, false);
const material = new ShaderMaterial({
vertexShader,
fragmentShader,
side: DoubleSide,
uniforms: {
time: {
value: mesh_fragments_time.get(),
},
color: {
value: new Vector3(1, 1, 0),
},
},
});
const mesh = new Mesh(
geom,
material
);
The vertexShader:
varying vec2 vertexUV;
varying vec3 vertexNormal;
void main(){
vertexUV=uv;
vertexNormal=normalize(normalMatrix * normal);
gl_Position=projectionMatrix*modelViewMatrix*vec4(position,1);
}
The fragment shader:
varying vec2 vertexUV;
uniform float time;
uniform vec3 color;
void main () {
float dash = sin(vertexUV.x*60. - time);
if (dash>0.) discard;
gl_FragColor=vec4(color,0);
}
Where mesh_fragments_time.get() (the time uniform) is a number that changes in my requestAnimationFrame loop and makes the object look dashed.
I've tried adding opacity to the shader material and changing it after adding the mesh to the scene, but it doesn't work.
I suppose I have to do it inside the fragment shader, but I don't know how. Can someone help me?
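One way to do it (a sketch, untested against this setup; the fade uniform is a made-up name) is to drive the alpha from a uniform and let three.js do the blending: set transparent: true on the ShaderMaterial, add the uniform, and tween its value in the same requestAnimationFrame loop that updates time:

```glsl
varying vec2 vertexUV;
uniform float time;
uniform float fade;  // hypothetical uniform: 0.0 = faded out, 1.0 = fully visible
uniform vec3 color;
void main () {
    float dash = sin(vertexUV.x * 60. - time);
    if (dash > 0.) discard;
    gl_FragColor = vec4(color, fade);
}
```

Note that the original shader writes alpha 0 (vec4(color, 0)) and the mesh still shows up only because ShaderMaterial has transparent: false by default, so the alpha channel is ignored; once transparent: true is set, the alpha actually takes effect.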
(three.js r147)
A codepen demonstrating this issue is here: https://codepen.io/lilrooness/pen/QWjdjgP
I'm rendering noise to a render target and then using that render target to texture a quad that I render to the screen.
When using the render target as a texture, I'm encountering a problem with the texture's size.
It works fine if I use new THREE.MeshBasicMaterial({ map: renderTarget.texture })
but when I use my own material
var renderMaterial = new THREE.ShaderMaterial({
uniforms: {
tex: { type: 'sampler2D', value: renderTarget.texture }
},
fragmentShader: pixelateFragmentShader(),
vertexShader: standardVertexShader()
})
I get a very small texture that's clamped
This is the vertex shader that I use for both renders:
varying lowp vec3 vUv;
void main() {
vUv = position;
vec4 modelViewPosition = modelViewMatrix * vec4(position, 1.0);
gl_Position = projectionMatrix * modelViewPosition;
}
This is the rendering function (I'm using the same camera for both renders):
function animate() {
time += 0.1;
quad.material.uniforms.time.value = time;
requestAnimationFrame(animate);
renderer.setRenderTarget(renderTarget);
renderer.render(bufferScene, camera);
renderer.setRenderTarget(null);
renderer.render(scene, camera);
}
Again, this works fine if I use the MeshBasicMaterial. What am I doing wrong?
The problem was that, in my vertex shader, I was setting
vUv = position;
While this was desired behaviour for the Perlin-noise effect, I was re-using the same vertex shader to render the render-target texture.
I changed it to:
vUv = uv;
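For reference, the corrected vertex shader for the screen pass might look like this (a sketch; note the varying becomes vec2, since the built-in uv attribute has two components):

```glsl
varying vec2 vUv;  // vec2 now, matching the uv attribute
void main() {
    vUv = uv;
    vec4 modelViewPosition = modelViewMatrix * vec4(position, 1.0);
    gl_Position = projectionMatrix * modelViewPosition;
}
```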
I am working with shaders in Three.js for the first time, and I can't tweak some of the code that is essential to it other than changing the RGBA value. For example, I can't place it inside a .html() method. What I'm going for is applying 10 different shader colors.
varying vec3 vNormal;
void main() {
float intensity = pow( 0.4 - dot( vNormal, vec3( 0.0, 0.0, 1.0 ) ), 4.0 );
gl_FragColor = vec4( 0, 0, 255, 1.0 ) * intensity;
//gl_FragColor = vec4( 0, 0, 255, 1.0 ) is the RGBA value
}
As of right now, I can't do anything with that line but leave it untouched so it'll work. I can only copy the ID of its script tag, tweak the RGBA value, and refer it to different <script> elements. But I don't want to do that ten times; my code needs to be efficient.
The entire code necessary is inside this fiddle.
How can and would you change that code so you can easily call a shader color?
You need to add a uniform for the colour.
HERE you can see the first one I did when starting with Three.js shaders.
but basically it's something like...
HTML/shaders:
<script id="vertexShader" type="x-shader/x-vertex">
void main() {
gl_Position = projectionMatrix * modelViewMatrix * vec4(position,1.0);
}
</script>
<script id="fragmentShader" type="x-shader/x-fragment">
uniform vec3 diffuse;
void main() {
gl_FragColor = vec4(diffuse.x, diffuse.y, diffuse.z, 1.0);
}
</script>
Create the shader in JS...
var uniforms = {
diffuse: { type: "c", value: new THREE.Color(0xeeeeee) }
};
var vertexShader = document.getElementById('vertexShader').text;
var fragmentShader = document.getElementById('fragmentShader').text;
material = new THREE.ShaderMaterial({
uniforms : uniforms,
vertexShader : vertexShader,
fragmentShader : fragmentShader,
});
To change the color at runtime, update uniforms.diffuse.value, e.g. uniforms.diffuse.value.set(0xff0000).
My example does not include your vNormal and intensity, but you get the idea.
Also, I have my shaders in HTML, but they obviously could just be javascript variables/Strings.
Check this version of your fiddle
You may find THIS page of other experiments of mine interesting.
The problem: I have a point cloud with quite a lot of data points (around one million). When I apply transparency to the rendered points, the transparency somehow does not show what is behind the rendered points
As you can see in the example of the marked point, it does not show what it should, it is as if there is a problem with the buffering.
I use three.js to create a point cloud using the following "setup":
The renderer:
this.renderer = new THREE.WebGLRenderer({
canvas: this.canvas,
antialias: true
});
The material:
this.pointMaterial = new THREE.ShaderMaterial( {
uniforms: {
time: { type: "f", value: 1.0 }
},
vertexShader: document.getElementById('vertexShader').textContent,
fragmentShader: document.getElementById('fragmentShader').textContent,
transparent: true
});
The vertex shader:
attribute float size;
attribute float opacity;
attribute vec3 color;
varying vec3 vColor;
varying float vOpacity;
void main() {
vColor = color;
vOpacity = opacity;
vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
gl_PointSize = size * (500.0 / length(mvPosition.xyz));
gl_Position = projectionMatrix * mvPosition;
}
The fragment shader:
uniform float time;
varying vec3 vColor;
varying float vOpacity;
void main() {
gl_FragColor = vec4(vColor, vOpacity);
}
The geometry (where I left out the part where I populate the arrays):
var bufferGeometry = new THREE.BufferGeometry();
var vertices = new Float32Array(vertexPositions.length * 3);
var colors = new Float32Array(vertexColors.length * 3);
var sizes = new Float32Array(vertexSizes.length);
var opacities = new Float32Array(vertexOpacities.length);
bufferGeometry.addAttribute('position', new THREE.BufferAttribute(vertices, 3));
bufferGeometry.addAttribute('color', new THREE.BufferAttribute(colors, 3));
bufferGeometry.addAttribute('size', new THREE.BufferAttribute(sizes, 1));
bufferGeometry.addAttribute('opacity', new THREE.BufferAttribute(opacities, 1));
this.points = new THREE.Points(bufferGeometry, this.pointMaterial);
this.scene.add(this.points);
I tried this with the built-in points material, and the same thing happens:
this.pointMaterial = new THREE.PointsMaterial({
size: this.pointSize,
vertexColors: THREE.VertexColors,
transparent: true,
opacity: 0.25
});
Is this a bug, expected behaviour, or am I doing something wrong?
The way the alpha-blending equation works is that the incoming fragment (the source colour) is blended over whatever is already in the framebuffer (the destination colour). This means you need to render your transparent geometry sorted from back to front, so that geometry in front correctly blends with the geometry behind it.
If all you have is transparent geometry then you can just disable depth testing, render in reverse depth sorted order, and it will work. If you have opaque geometry as well then you need to first render all opaque geometry normally, then disable depth writing (not testing) and render transparent geometry in reverse depth sorted order, then re-enable depth writing.
Here are some answers to similar questions if you're interested in learning a bit more.
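To see why the order matters, here is the standard "over" blend written out in plain JavaScript (a standalone sketch; the blend function is a hypothetical helper, not part of three.js):

```javascript
// Standard "over" blending: the incoming (source) colour is combined with
// what is already in the framebuffer (the destination colour).
function blend(src, srcAlpha, dst) {
  return src.map((channel, i) => channel * srcAlpha + dst[i] * (1 - srcAlpha));
}

const background = [0, 0, 0];            // opaque black framebuffer
const red = [1, 0, 0], redAlpha = 0.5;   // two half-transparent layers
const blue = [0, 0, 1], blueAlpha = 0.5;

// Drawing red first, then blue over it:
const redThenBlue = blend(blue, blueAlpha, blend(red, redAlpha, background));
// Drawing blue first, then red over it:
const blueThenRed = blend(red, redAlpha, blend(blue, blueAlpha, background));

console.log(redThenBlue); // [0.25, 0, 0.5] (blue dominates)
console.log(blueThenRed); // [0.5, 0, 0.25] (red dominates)
```

The two draw orders produce different colours, which is exactly why transparent geometry has to be depth-sorted before blending.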
I am trying to pass an attribute variable with three.js to the vertex shader, then the vertex shader should pass it to the fragment shader through a varying variable.
Vertex shader:
attribute vec4 color;
varying vec4 outColor;
void main()
{
outColor= color;
gl_Position= projectionMatrix * modelViewMatrix * vec4(position,1.0);
}
Fragment shader:
varying vec4 outColor;
void main() {
gl_FragColor = outColor;
}
This way let's say that I have a cube with 8 vertices: if there is a different color for each vertex, then the cube should be drawn by interpolating the color of each vertex, and in the middle of a face it should have a mixed color. This is the javascript code snippet where I initialize the attributes:
var colors= [];
for(var i=0; i<geometry.vertices.length; i++) {
colors.push(new THREE.Vector4(0.0,1.0,0.0,1.0));
}
var attributes = {
color: {type:"v4", value: colors}
};
var material= new THREE.ShaderMaterial({
uniforms: uniforms,
attributes: {},
vertexShader: document.getElementById("vertexShader").textContent,
fragmentShader: document.getElementById("fragmentShader").textContent
});
For now this should draw a completely green cube. The problem is that the instruction in the vertex shader outColor=color; messes up everything: I just see a black screen. If I replace this instruction with outColor=vec4(0.0,1.0,0.0,1.0);, I see a correctly drawn green cube on the screen.
Here is the full source code.
Try passing attributes instead of {} to the THREE.ShaderMaterial constructor (i.e. attributes: attributes), so the custom color attribute actually reaches the vertex shader.