I am trying to pass an attribute variable to the vertex shader with three.js; the vertex shader should then pass it on to the fragment shader through a varying variable.
Vertex shader:
attribute vec4 color;
varying vec4 outColor;

void main() {
    outColor = color;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
Fragment shader:
varying vec4 outColor;

void main() {
    gl_FragColor = outColor;
}
Say I have a cube with 8 vertices: if each vertex has a different color, the cube should be drawn by interpolating the vertex colors, so the middle of a face shows a mixed color. This is the JavaScript code snippet where I initialize the attributes:
var colors = [];
for (var i = 0; i < geometry.vertices.length; i++) {
    colors.push(new THREE.Vector4(0.0, 1.0, 0.0, 1.0));
}

var attributes = {
    color: { type: "v4", value: colors }
};

var material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    attributes: {},
    vertexShader: document.getElementById("vertexShader").textContent,
    fragmentShader: document.getElementById("fragmentShader").textContent
});
For now this should draw a completely green cube. The problem is that the instruction in the vertex shader outColor=color; messes up everything: I just see a black screen. If I replace this instruction with outColor=vec4(0.0,1.0,0.0,1.0);, I see a correctly drawn green cube on the screen.
Here is the full source code.
Try passing attributes instead of {} to the THREE.ShaderMaterial constructor.
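For example, a minimal sketch (assuming the legacy three.js API used in the question, where THREE.ShaderMaterial still accepts an attributes option):

var material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    attributes: attributes, // pass the object defined above instead of {}
    vertexShader: document.getElementById("vertexShader").textContent,
    fragmentShader: document.getElementById("fragmentShader").textContent
});

Note that recent three.js releases removed the attributes option from ShaderMaterial; with a BufferGeometry the custom attribute is added to the geometry instead.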
A codepen demonstrating this issue is here: https://codepen.io/lilrooness/pen/QWjdjgP
I'm rendering noise to a render target and then using that render target to texture a quad that I render to the screen.
When using the render target as a texture, I'm encountering a problem with the texture's size.
It works fine if I use new THREE.MeshBasicMaterial({ map: renderTarget.texture }), but when I use my own material:
var renderMaterial = new THREE.ShaderMaterial({
    uniforms: {
        tex: { type: 'sampler2D', value: renderTarget.texture }
    },
    fragmentShader: pixelateFragmentShader(),
    vertexShader: standardVertexShader()
});
I get a very small texture that's clamped
This is the vertex shader that I use for both renders:
varying lowp vec3 vUv;

void main() {
    vUv = position;
    vec4 modelViewPosition = modelViewMatrix * vec4(position, 1.0);
    gl_Position = projectionMatrix * modelViewPosition;
}
This is the rendering function (I'm using the same camera to render both times):
function animate() {
    time += 0.1;
    quad.material.uniforms.time.value = time;

    requestAnimationFrame(animate);

    renderer.setRenderTarget(renderTarget);
    renderer.render(bufferScene, camera);
    renderer.setRenderTarget(null);
    renderer.render(scene, camera);
}
Again, this works fine if I use the MeshBasicMaterial. What am I doing wrong?
The problem was that in my vertex shader I was setting
vUv = position;
While this was the desired behaviour for the Perlin noise effect, I was re-using the same vertex shader for the pass that renders the render target texture to the screen quad.
I changed it to:
vUv = uv;
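A minimal sketch of a separate vertex shader for the screen-quad pass (the variable name is an assumption; note the varying becomes a vec2, since uv is a vec2):

const screenQuadVertexShader = `
    varying vec2 vUv;

    void main() {
        vUv = uv;
        gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
`;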
I'm trying to use a custom shader in three.js to make a single texture animate and expand. The problem is that when I multiply vUv by a number to make the texture expand, the result is the opposite of what I expect: the bigger the number, the smaller the drawn texture appears. For example, when I multiply by 0.1, the result becomes 10 times bigger, and when I multiply by 10.0, it becomes 10 times smaller.
Here is my shader code (simplified to make the problem clear):
// vertex shader
varying vec2 vUv;
uniform float uFixAspect;

void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
// fragment shader
precision mediump float;

uniform float time;
uniform vec2 resolution;
uniform sampler2D uTex;
varying vec2 vUv;

void main() {
    // I want the result to be 10 times smaller than the original,
    // but it draws 10 times bigger than the original
    vec2 newvUv = vUv * mat2(0.1, 0.0, 0.0, 0.1);
    gl_FragColor = texture2D(uTex, newvUv);
}
And this is my three.js code:
const loader = new THREE.TextureLoader();
loader.load(
    "./assets/textures/tex.png",
    tex => {
        const geo = new THREE.PlaneGeometry(2, 2);
        const mat = new THREE.ShaderMaterial({
            uniforms: {
                uTex: { value: tex },
                time: { type: "f", value: 0.1 },
                resolution: { type: "vec2", value: new THREE.Vector2(512, 512) }
            },
            vertexShader: vert,
            fragmentShader: frag,
        });
        const shaderObj = new THREE.Mesh(geo, mat);
        marker.add(shaderObj);
    }
);
Is there a problem in my code, or is this an issue with three.js?
Thank you.
[...] when I multiply vUv with certain number to make it expand, the bigger the number is, the smaller the result paint appears [...]
Of course, because you scale the texture coordinates used for the lookup, but not the texture itself.
You want to "scale down" the texture. The texture keeps the same size, so you have to take the texels from an "upscaled" position.
Use the reciprocal of the scale factor:
float scale = 1.0 / 0.1; // reciprocal scale
vec2 newvUv = vUv * mat2(scale, 0.0, 0.0, scale);
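Put into a full fragment shader string, that looks roughly like this (a sketch, keeping the uTex uniform and vUv varying from the question):

const frag = `
    precision mediump float;

    uniform sampler2D uTex;
    varying vec2 vUv;

    void main() {
        // Sampling at coordinates scaled by 1.0 / 0.1 = 10.0 makes the
        // texture appear 10 times smaller on the plane.
        float scale = 1.0 / 0.1;
        vec2 newvUv = vUv * mat2(scale, 0.0, 0.0, scale);
        gl_FragColor = texture2D(uTex, newvUv);
    }
`;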
I'm working with THREE.js points, and sometimes I need them to have a different color per point. Sometimes I'm also modifying their alpha value, so I had to write my own shader programs.
In JavaScript I have the following code:
let materials;
if (pointCloudData.colors !== undefined) {
    geometry.colors = pointCloudData.colors.map(hexColor => new THREE.Color(hexColor));
    // If the point cloud has a color for each point...
    materials = new THREE.ShaderMaterial({
        vertexColors: THREE.VertexColors,
        vertexShader: document.getElementById('vertexshader').textContent,
        fragmentShader: document.getElementById('fragmentshader').textContent,
        transparent: true,
    });
} else {
    // Set a color for the whole cloud
    materials = new THREE.ShaderMaterial({
        uniforms: {
            unicolor: { value: pointCloudData.color },
        },
        vertexShader: document.getElementById('vertexshader').textContent,
        fragmentShader: document.getElementById('fragmentshader').textContent,
        transparent: true,
    });
}

const pointCloud = new THREE.Points(geometry, materials);
I am basically setting the color to a uniform value unless per-point colors are defined, in which case I set the colors on the geometry and enable vertexColors. I also checked the values stored in geometry.colors, and they are correct RGB values in the range [0, 1].
My Vertex Shader code:
attribute float size;
attribute float alpha;

varying float vAlpha;
varying vec3 vColor;

void main() {
    vAlpha = alpha;

    #ifdef USE_COLOR
        vColor = color;
    #endif

    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_PointSize = size * ( 300.0 / -mvPosition.z );
    gl_Position = projectionMatrix * mvPosition;
}
And my Fragment shader code:
uniform vec3 unicolor;

varying vec3 vColor;
varying float vAlpha;

void main() {
    #ifdef USE_COLOR
        gl_FragColor = vec4(vColor, vAlpha);
    #else
        gl_FragColor = vec4(unicolor, vAlpha);
    #endif
}
Again, I am checking whether the vertex color is set and then passing it to the fragment shader, which then sets the color per point.
For some reason, the vertices are all white when setting the color per point (screenshot: the white pixels should be green/red). I'm far from an advanced user in WebGL, and any help would be appreciated. Am I doing something wrong that I'm not aware of?
You are creating a custom ShaderMaterial and using this pattern in your vertex shader:
#ifdef USE_COLOR
vColor = color;
#endif
Consequently, you need to specify the material.defines like so:
var defines = {};
defines[ "USE_COLOR" ] = "";

// points material
var shaderMaterial = new THREE.ShaderMaterial( {
    defines: defines,
    uniforms: uniforms,
    vertexShader: document.getElementById( 'vertexshader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentshader' ).textContent,
    transparent: true
} );
You do not need to set vertexColors: THREE.VertexColors. That is just a flag used by built-in materials to alert the renderer to set the defines for you.
three.js r.85
OK, I think I figured it out since it's working now.
I had to set the colors as geometry attributes:
const colors = new Float32Array(n * 3);
for (let i = 0; i < n; i += 1) {
    new THREE.Color(pointCloudData.colors[i]).toArray(colors, i * 3);
}
// three RGB components per point, so the item size is 3
geometry.addAttribute('colors', new THREE.BufferAttribute(colors, 3));
I also used the suggestion provided by WestLangley: I removed the vertexColors: THREE.VertexColors part from the material definition and set the define as well:
materials = new THREE.ShaderMaterial({
    defines: {
        USE_COLOR: '',
    },
    vertexShader: document.getElementById('vertexshader').textContent,
    fragmentShader: document.getElementById('fragmentshader').textContent,
    transparent: true,
});
Then in my Vertex shader I added:
attribute vec3 colors;
to get the colors passed from the JavaScript. The rest is the same, I just passed the colors to the fragment shader using the same code as in the posted question above.
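For illustration, a minimal sketch of forwarding such a custom attribute to the fragment shader (using the colors attribute name added above; point size and alpha handling omitted):

const vertexShader = `
    attribute vec3 colors;
    varying vec3 vColor;

    void main() {
        vColor = colors;
        gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
`;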
I am working with shaders in Three.js for the first time, and I can't tweak some of the code that is essential to it other than just changing the RGBA value. For example, I can't place it inside a .html() method. What I'm going for is applying 10 different shader colors.
varying vec3 vNormal;

void main() {
    float intensity = pow( 0.4 - dot( vNormal, vec3( 0.0, 0.0, 1.0 ) ), 4.0 );
    // vec4( 0, 0, 255, 1.0 ) is the RGBA value
    gl_FragColor = vec4( 0, 0, 255, 1.0 ) * intensity;
}
As of right now, I can't do anything with that line other than leave it untouched so it'll work. I could only copy its script tag, tweak the RGBA value, and reference the copies as different <script> elements by ID, but I don't want to do that ten times. My code needs to be efficient.
The entire code necessary is inside this fiddle.
How would you change that code so that a shader color can easily be set?
You need to add a uniform for the colour.
HERE you can see the first one I did when starting with Three.js shaders.
but basically it's something like...
HTML/shaders:
<script id="vertexShader" type="x-shader/x-vertex">
void main() {
gl_Position = projectionMatrix * modelViewMatrix * vec4(position,1.0);
}
</script>
<script id="fragmentShader" type="x-shader/x-fragment">
uniform vec3 diffuse;
void main() {
gl_FragColor = vec4(diffuse.x, diffuse.y, diffuse.z, 1.0);
}
</script>
Create the shader in JS...
var uniforms = {
    diffuse: { type: "c", value: new THREE.Color(0xeeeeee) }
};

var vertexShader = document.getElementById('vertexShader').text;
var fragmentShader = document.getElementById('fragmentShader').text;

material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: vertexShader,
    fragmentShader: fragmentShader,
});
To change the color, change uniforms.diffuse.value.
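For example (a sketch; the palette, geometry and scene names are placeholders), you can reuse the same shader source and give each mesh its own uniforms object, or recolour one material at runtime:

var palette = [0xff0000, 0x00ff00, 0x0000ff]; // hypothetical list of colours
palette.forEach(function (hex) {
    var mesh = new THREE.Mesh(geometry, new THREE.ShaderMaterial({
        uniforms: { diffuse: { type: "c", value: new THREE.Color(hex) } },
        vertexShader: vertexShader,
        fragmentShader: fragmentShader
    }));
    scene.add(mesh);
});

// or recolour an existing material at runtime:
material.uniforms.diffuse.value.set(0x00ffff);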
My example does not include your vNormal and intensity, but you get the idea.
Also, I have my shaders in HTML, but they obviously could just be javascript variables/Strings.
Check this version of your fiddle
You may find THIS page of other experiments of mine interesting.
The problem: I have a point cloud with quite a lot of data points (around one million). When I apply transparency to the rendered points, the transparency somehow does not show what is behind the rendered points.
As you can see in the example of the marked point, it does not show what it should; it is as if there were a problem with the buffering.
I use three.js to create a point cloud using the following "setup":
The renderer:
this.renderer = new THREE.WebGLRenderer({
    canvas: this.canvas,
    antialias: true
});
The material:
this.pointMaterial = new THREE.ShaderMaterial( {
    uniforms: {
        time: { type: "f", value: 1.0 }
    },
    vertexShader: document.getElementById('vertexShader').textContent,
    fragmentShader: document.getElementById('fragmentShader').textContent,
    transparent: true
});
The vertex shader:
attribute float size;
attribute float opacity;
attribute vec3 color;

varying vec3 vColor;
varying float vOpacity;

void main() {
    vColor = color;
    vOpacity = opacity;
    vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
    gl_PointSize = size * (500.0 / length(mvPosition.xyz));
    gl_Position = projectionMatrix * mvPosition;
}
The fragment shader:
uniform float time;

varying vec3 vColor;
varying float vOpacity;

void main() {
    gl_FragColor = vec4(vColor, vOpacity);
}
The geometry (where I left out the part where I populate the arrays):
var bufferGeometry = new THREE.BufferGeometry();
var vertices = new Float32Array(vertexPositions.length * 3);
var colors = new Float32Array(vertexColors.length * 3);
var sizes = new Float32Array(vertexSizes.length);
var opacities = new Float32Array(vertexOpacities.length);
bufferGeometry.addAttribute('position', new THREE.BufferAttribute(vertices, 3));
bufferGeometry.addAttribute('color', new THREE.BufferAttribute(colors, 3));
bufferGeometry.addAttribute('size', new THREE.BufferAttribute(sizes, 1));
bufferGeometry.addAttribute('opacity', new THREE.BufferAttribute(opacities, 1));
this.points = new THREE.Points(bufferGeometry, this.pointMaterial);
this.scene.add(this.points);
I tried this with the built-in points material, and the same thing happens:
this.pointMaterial = new THREE.PointsMaterial({
    size: this.pointSize,
    vertexColors: THREE.VertexColors,
    transparent: true,
    opacity: 0.25
});
Is this a bug, expected behaviour, or am I doing something wrong?
With standard alpha blending, the incoming (source) colour is blended over whatever is already in the framebuffer (the destination). This means you need to render your transparent geometry sorted from back to front, so that geometry in front correctly blends over the geometry behind it.
If all you have is transparent geometry, then you can just disable depth testing, render in back-to-front order, and it will work. If you have opaque geometry as well, then you need to first render all opaque geometry normally, then disable depth writing (not testing), render the transparent geometry in back-to-front order, and finally re-enable depth writing.
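In three.js terms, the second case corresponds roughly to material flags like this (a minimal sketch; note that three.js sorts whole objects, so per-point depth sorting inside a single THREE.Points would still have to be done manually):

this.pointMaterial = new THREE.ShaderMaterial({
    uniforms: {
        time: { type: "f", value: 1.0 }
    },
    vertexShader: document.getElementById('vertexShader').textContent,
    fragmentShader: document.getElementById('fragmentShader').textContent,
    transparent: true,
    depthWrite: false // the transparent points no longer write to the depth buffer
});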
Here are some answers to similar questions if you're interested in learning a bit more.