For performance reasons I need to display hundreds of moving tetrahedrons in a scene. Therefore I use InstancedBufferGeometry, which requires a custom shader.
The scene also contains objects with regular (non-buffer) geometry and some lights (I prepared a boiled-down snippet: https://jsfiddle.net/negan/xs3k1yz4/ ).
My problem is that the tetrahedrons are not shaded in a way that plausibly fits the lighting of the rest of the scene. The reason is probably the primitive shader I built:
<script id="vertexShader" type="x-shader/x-vertex">
attribute vec3 offset;
attribute vec4 color;
varying vec4 vColor;
varying vec3 vNormal;
void main() {
vColor = color;
vNormal = normalMatrix * vec3(normal);
gl_Position = projectionMatrix *
modelViewMatrix *
vec4(position*1.0+offset,1.0);
}
</script>
<script id="fragmentShader" type="x-shader/x-fragment">
varying vec4 vColor;
varying vec3 vNormal;
void main() {
float di = 0.4*dot(vNormal,normalize(vec3(1.,1.,0.)))+0.4*dot(vNormal,normalize(vec3(1.,0.,1.)));
di = di+0.2;
vec4 vColor2= vColor*vec4(1.0,1.,1.,0.2)*di;
gl_FragColor = vColor2;// adjust the alpha
}
</script>
Is there a way to make the custom shader pick up the lights I defined in the scene? The shader also lights the faces in a way that gives no impression of directed light. I'd rather have each face lit evenly than have the color interpolated from the vertices, but I was unable to achieve that.
Any pointer or help is appreciated.
Here's a gist of the full fragment and vertex shader source for a Three.js MeshPhongMaterial shader, as of r83.
Three.js builds shaders using a string concatenation system, so figuring out a shader's final source just by reading the Three.js codebase is almost impossible.
The above Gist was generated by installing the shader editor Chrome extension, going to a Three.js example page that uses a MeshPhongMaterial (like this one), and using the shader editor to inspect the full source of the running shader program.
Three.js passes all default uniforms, like lighting data, to all shader programs, so if you create a custom shader with the above Gist code, lights, bones, etc., will all work automatically.
I would take the full source code and add your logic in manually, literally adding the result of your calculations to the existing gl_FragColor = ...
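For reference, here is a minimal sketch of how such a full-source shader could be wired up; the shader strings are assumed to come from the Gist, and lights: true is what tells three.js to supply the lighting uniforms to a ShaderMaterial:
var material = new THREE.ShaderMaterial({
    uniforms: THREE.UniformsUtils.merge([
        THREE.UniformsLib.lights,
        { diffuse: { value: new THREE.Color(0xff0000) } }
    ]),
    vertexShader: phongVertexSource,     // assumed: the vertex source from the Gist
    fragmentShader: phongFragmentSource, // assumed: the fragment source from the Gist
    lights: true // have three.js fill in the light uniforms each frame
});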
In the future, my tool ShaderFrog will let you automatically add any Three.js shader feature (lights, bones, morph targets, etc) to any custom shader. ShaderFrog can already combine any shaders together automatically, but I haven't done the manual work needed to fully support Three features yet.
There's an InstancedMesh module for three.js. It allows mesh instancing without using shader materials; it patches the existing material.
https://github.com/pailhead/three-instanced-mesh
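Note that newer three.js releases (r109+) also ship a built-in THREE.InstancedMesh that works with the standard lit materials. A minimal sketch, assuming the per-instance translations live in a hypothetical offsets array of THREE.Vector3:
// Sketch: built-in instancing with a lit material ("offsets" is an assumed array)
const mesh = new THREE.InstancedMesh(tetraGeometry, new THREE.MeshPhongMaterial(), offsets.length);
const dummy = new THREE.Object3D();
offsets.forEach((offset, i) => {
    dummy.position.copy(offset);
    dummy.updateMatrix();
    mesh.setMatrixAt(i, dummy.matrix); // per-instance transform
});
mesh.instanceMatrix.needsUpdate = true;
scene.add(mesh);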
Using ThreeJS I've made a setup that displays a model with a complex material. I'm using an alpha texture to make semi-transparent surfaces, like glass. However, when shadows are cast, the shadow is removed completely instead of using the alpha texture as a gradient.
Basically I want an effect similar to this BabylonJS example: https://playground.babylonjs.com/#P1RZV0
When casting shadows I'm using the MeshDepthMaterial in order to let the light cast shadows through transparent surfaces, like this:
var customDepthMaterial = new THREE.MeshDepthMaterial( {
    depthPacking: THREE.RGBADepthPacking,
    alphaMap: alphaTex,
    alphaTest: 1,
    skinning: true
} );
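For completeness, the depth material only takes effect once it is assigned to the shadow-casting mesh; a minimal sketch (the mesh variable name is assumed):
// Sketch: customDepthMaterial must be set on the mesh that casts the shadow
mesh.customDepthMaterial = customDepthMaterial;
mesh.castShadow = true;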
This works fine; however, as mentioned, shadows should not disappear completely but rather use the alpha texture as an indicator of how strong the shadow should be. The more opaque a surface is, the stronger its shadow should be, and vice versa.
You can take a look at the project here: https://jsfiddle.net/miger/2c7no6mj/
Apparently BabylonJS makes it as simple as setting the parameter .transparencyShadow = true;, and I've not found anything similar in ThreeJS. It seems no official solution has been implemented in ThreeJS yet; the only thing I found was this discussion about it: https://github.com/mrdoob/three.js/issues/10600
I've already implemented some GLSL code in order to get selective bloom working. I'm guessing that to implement the shadow effect I'm looking for I need to do something with the shaders, but I don't know what.
These are the shaders I'm using so far:
<script type="x-shader/x-vertex" id="vertexshader">
varying vec2 vUv;
void main() {
vUv = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
</script>
<script type="x-shader/x-fragment" id="fragmentshader">
uniform sampler2D baseTexture;
uniform sampler2D bloomTexture;
varying vec2 vUv;
void main() {
gl_FragColor = ( texture2D( baseTexture, vUv ) + vec4( 1.0 ) * texture2D( bloomTexture, vUv ) );
}
</script>
And this is how the selective bloom is implemented using .onBeforeCompile:
child.material.onBeforeCompile = shader => {
    shader.uniforms.globalBloom = uniforms.globalBloom;
    shader.fragmentShader = `
        uniform float globalBloom;
        ${shader.fragmentShader}
    `.replace(
        `#include <dithering_fragment>`,
        `#include <dithering_fragment>
        if (globalBloom > 0.5){
            gl_FragColor = texture2D( emissiveMap, vUv );
        }
        `
    );
};
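For context, a globalBloom uniform like this is normally toggled between two render passes; a minimal sketch of that loop, with the composer names assumed from the usual selective-bloom examples:
// Sketch: emissive-only render for the bloom pass, then the normal pass (names assumed)
uniforms.globalBloom.value = 1;
bloomComposer.render();  // captures only the emissive contribution
uniforms.globalBloom.value = 0;
finalComposer.render();  // combines base + bloom via the shaders above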
I'm really bad with shaders. There have been some solutions in the above-mentioned discussion, but I have way too little experience with shaders and GLSL to know what I need to do.
How can I implement soft/gradient shadow casting using an alpha texture as strength indicator?
I'm converting a Shadertoy to a local Three.js project, and can't get it to render. You can try out the full snippet here.
I think the problem may lie in how I'm converting the iResolution variable. As I understand it, the built-in Shadertoy global variable iResolution contains the pixel dimensions of the window. Here is how iResolution is used in the original Shadertoy:
vec2 uv = fragCoord.xy / iResolution.y;
vec2 ak = abs(fragCoord.xy / iResolution.xy-0.5);
In converting this Shadertoy into a local Three.js-based script, I have tried two approaches to handling iResolution:
1) Loading the window dimensions as a Vector2 and sending them into the shader as the uniform vec2 uResolution:
vec2 uv = gl_FragCoord.xy / uResolution.y;
vec2 ak = abs(gl_FragCoord.xy / uResolution.xy-0.5);
This solution sticks closest to the design of the original Shadertoy, but alas nothing renders.
2) The second approach comes from this SO Answer and converts the uv coordinates to xy absolute coordinates:
vec2 uvCustom = -1.0 + 2.0 * vUv;
vec2 ak = abs(gl_FragCoord.xy / uvCustom.xy-0.5);
In this one, I admit I don't fully understand how it works, and my use of uvCustom in the second line may not be correct.
In the end, nothing renders onscreen except a Three.js CameraHelper I'm using. Otherwise, the screen is black and the console shows no errors for the Javascript or WebGL. Thanks for taking a look!
For starters, you don't even need to do this division. If you are using a full-screen quad (PlaneBufferGeometry), you can render it with just the UVs:
vec2 uv = gl_FragCoord.xy / uResolution.y; // what the Shadertoy computes
vec2 vUv = varyingUV;                      // what the quad's interpolated UV already gives you
// uv == vUv; (sort of)
Your vertex shader can look something like this:
varying vec2 varyingUV;

void main(){
    varyingUV = uv;
    // the quad's positions already span clip space, so no matrices are needed
    gl_Position = vec4( position.xy, 0., 1. );
}
If you make a new THREE.PlaneGeometry(2,2,1,1), this should render as a full-screen quad.
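For completeness, a minimal sketch of the surrounding JavaScript, assuming the uniform is named uResolution and the shader sources are in vertexShader/fragmentShader strings:
// Sketch: full-screen quad driven by a resolution uniform (names assumed)
const uniforms = {
    uResolution: { value: new THREE.Vector2(window.innerWidth, window.innerHeight) }
};
const quad = new THREE.Mesh(
    new THREE.PlaneGeometry(2, 2, 1, 1),
    new THREE.ShaderMaterial({ uniforms, vertexShader, fragmentShader })
);
scene.add(quad);
window.addEventListener('resize', () => {
    uniforms.uResolution.value.set(window.innerWidth, window.innerHeight);
});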
This question, where the OP reported a warning being issued when executing the instruction gl.enableVertexAttribArray(shaderProgram.textureCoordAttribute), got me thinking: what if I want to have a scene with two (let's assume: flat) shapes in it, one with a texture and one with a uniform color? In the fragment shader's main, if I uncomment the second instruction and comment out the first, like so:
void main(void) {
    //gl_FragColor = vColor;
    gl_FragColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
}
the warning is still being issued, but this time in regards to the instruction gl.enableVertexAttribArray(shaderProgram.vertexColorAttribute);.
If I leave both uncommented, it still complains about the vertexColorAttribute because clearly it's being overridden.
So how can I have both? Am I to use two different fragment shaders? If so, how can it be done?
The proper way would be to have two separate shaders, since you want to render either textured or colored geometry.
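Switching between the two programs per draw call is cheap; a minimal sketch, assuming two already-compiled programs and hypothetical draw helpers:
// Sketch: one shader program per shape (program and helper names are assumed)
gl.useProgram(texturedProgram);
drawTexturedShape();
gl.useProgram(coloredProgram);
drawColoredShape();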
At the cost of memory and execution time, you may always have both color and texture attributes in your vertex data and use some kind of blending, depending on your use case.
As you describe it, you could use simple additive blending, since most WebGL implementations bind a black pixel [0,0,0,255] texture by default when no texture is bound.
Thus having something like
void main(void) {
    gl_FragColor = vColor + texture2D(uSampler, vTexCoord);
}
would render the color when no texture is bound and correctly render the bound texture when the color is [0,0,0,0].
See Wikipedia for an overview of various blend modes.
As an even more expensive solution, you could use a uniform as a flag to indicate which "method" you want and branch inside the shader accordingly. This still requires all vertex attributes to be available, though, and branching is quite inefficient on some platforms.
precision mediump float;

uniform int uMethod;        // 1 = color only, 2 = texture only, anything else = both
uniform sampler2D uSampler;
varying vec4 vColor;
varying vec2 vTexCoord;

void main(void) {
    if (uMethod == 1)
        gl_FragColor = vColor;
    else if (uMethod == 2)
        gl_FragColor = texture2D(uSampler, vTexCoord);
    else
        gl_FragColor = vColor + texture2D(uSampler, vTexCoord);
}
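The flag is then set from JavaScript before each draw call; a minimal sketch:
// Sketch: pick the shading path per draw call
var uMethodLoc = gl.getUniformLocation(program, "uMethod");
gl.uniform1i(uMethodLoc, 2); // 2 = textured path in the shader above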
Say I have this character and I want to allow the user to select it, so when it's selected I want to show an outline around it.
The character is an Object3D with some meshes.
I tried to clone it and set a back-side material, but it did NOT work: each cube in the shape was rendered with its back side separately, so the outline was wrong.
Do I need to create another mesh for the outline, or is there an easier way?
What #spassvolgel wrote is correct;
What I suspect needs to be done is something like this:
1. First the background needs to be rendered
2. Then, on a separate transparent layer, the character model with a flat color, slightly bigger than the original
3. On another transparent layer, the character with its normal material / texture
4. Finally, the character layer needs to go on top of the outline layer, and the combined result needs to be placed in the bg
You just create multiple scenes and combine them with sequential render passes:
renderer.autoClear = false;
. . .
renderer.render(scene, camera); // the entire scene
renderer.clearDepth();
renderer.render(scene2, camera); // just the selected item, larger, in a flat color
renderer.render(scene3, camera); // the selected item again
three.js r129
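For reference, a minimal sketch of how the flat-color copy in scene2 might be built; the selectedMesh name and the 1.05 scale factor are assumptions:
// Sketch: enlarged flat-color clone of the selected mesh for the outline pass
const outlineMesh = new THREE.Mesh(
    selectedMesh.geometry,
    new THREE.MeshBasicMaterial({ color: 0xffa500 })
);
outlineMesh.position.copy(selectedMesh.position);
outlineMesh.quaternion.copy(selectedMesh.quaternion);
outlineMesh.scale.copy(selectedMesh.scale).multiplyScalar(1.05); // slightly bigger
scene2.add(outlineMesh);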
A generic solution that applies to geometries of any complexity might be to apply a fragment shader via the ShaderMaterial class in three.js. Not sure what your experience level is, but if you need it, an introduction to shaders can be found here.
A good example where shaders are used to highlight geometries can be found here. In their vertex shader, they calculate the normal for a vertex and a parameter used to express the intensity of a glow effect:
uniform vec3 viewVector;
uniform float c;
uniform float p;
varying float intensity;

void main()
{
    vec3 vNormal = normalize( normalMatrix * normal );
    vec3 vNormel = normalize( normalMatrix * viewVector );
    intensity = pow( c - dot( vNormal, vNormel ), p );
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
These parameters are passed to the fragment shader where they are used to modify the color values of pixels surrounding the geometry:
uniform vec3 glowColor;
varying float intensity;

void main()
{
    vec3 glow = glowColor * intensity;
    gl_FragColor = vec4( glow, 1.0 );
}
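For reference, a minimal sketch of how these two shaders are typically wired into a ShaderMaterial; the uniform values and script element IDs are assumptions:
// Sketch: glow material wrapping the shaders above (parameter values assumed)
const glowMaterial = new THREE.ShaderMaterial({
    uniforms: {
        c: { value: 1.0 },
        p: { value: 1.4 },
        glowColor: { value: new THREE.Color(0x00ffff) },
        viewVector: { value: camera.position }
    },
    vertexShader: document.getElementById('glowVertexShader').textContent,
    fragmentShader: document.getElementById('glowFragmentShader').textContent,
    side: THREE.BackSide,
    blending: THREE.AdditiveBlending,
    transparent: true
});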
I found something on gamedev.stackexchange.com that could be useful. They talk of a stencil buffer. I have no idea how to apply this to THREE.js, though:
https://gamedev.stackexchange.com/questions/59361/opengl-get-the-outline-of-multiple-overlapping-objects
You can get good results by rendering your outlined object(s) to a texture that is (ideally) the size of your destination framebuffer, then rendering a framebuffer-sized quad using that texture and having the fragment shader blur or do other image transforms. I have an example here that uses raw WebGL, but you can make a custom ShaderMaterial without too much trouble.
I haven't found the answer yet, but I wanted to demonstrate what happens when I create multiple meshes and put another mesh behind each of them with
side: THREE.BackSide
http://jsfiddle.net/GwS9c/8/
as you can see, it's not the desired effect. I would like a clean outline behind ALL three meshes that doesn't overlap. My shader-programming skills are practically non-existent, but on most online resources people say to use this approach of cloning the meshes.
I'm trying to use an FBO in a material in THREE.js. I have a GPU-based fluid simulation which outputs its final visualisation to a framebuffer object, which I would like to use to texture a mesh. Here's my simple fragment shader:
varying vec2 vUv;
uniform sampler2D tDiffuse;

void main() {
    gl_FragColor = texture2D( tDiffuse, vUv );
}
I am then trying to use a simple THREE.ShaderMaterial:
var material = new THREE.ShaderMaterial( {
    uniforms: { tDiffuse: { type: "t", value: outputFBO } },
    //other stuff... which shaders to use etc
} );
But my mesh just appears black, albeit with no errors to the console. If I use the same shader and shader material, but supply the result of THREE.ImageUtils.loadTexture("someImageOrOther") as the uniform to the shader, it renders correctly, so I assume the problem is with my FBO. Is there some convenient way of converting from an FBO to a Texture2D in WebGL?
EDIT:
After some more experimentation it would appear that this isn't the problem. If I pass the FBO to a different shader I wrote that just outputs the texture to the screen then it displays fine. Could my material appear black because of something like lighting/normals?
EDIT 2:
The UVs and normals are coming straight from THREE, so I don't think it can be that. Part of the problem is that most shader errors aren't reported, so I have difficulty debugging in that regard. If I could just map the WebGLTexture somehow, that would make everything easier, perhaps like this
var newMaterial = new THREE.MeshLambertMaterial({ map : outputFBO.texture });
but of course that doesn't work. I haven't been able to find any documentation that suggests THREE can read directly from WebGLTextures.
By poking a little into the source of WebGLRenderer (look at https://github.com/mrdoob/three.js/blob/master/src/renderers/WebGLRenderer.js#L6643 and after), you may try to create a three.js texture with a dummy picture, then replace the __webglTexture data member of that texture with your own WebGLTexture.
Also, you may need to set the __webglInit data member of the texture object to true so that the init code is not executed (because otherwise __webglTexture is overwritten by a call to _gl.createTexture()).
If you don't mind using the Three.js data structures, here's how you do it:
Three.js use framebuffer as texture
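Note: in current three.js releases a WebGLRenderTarget exposes its color attachment as .texture, so the approach from EDIT 2 now works directly; a minimal sketch, assuming outputFBO is a THREE.WebGLRenderTarget:
// Sketch: render-target texture used directly as a material map
var material = new THREE.MeshLambertMaterial({ map: outputFBO.texture });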