How do I assemble proper uniforms for Three.js shaders?

I am trying to figure out how to properly use Three.js' built in ShaderChunks for lighting and fog and such, and I figured a good first step was to just copy one of the ShaderLib shaders' setup. So to start with I used:
customMaterial = new ShaderMaterial({
    lights: true,
    uniforms: UniformsUtils.merge( [
        UniformsLib.common,
        UniformsLib.specularmap,
        UniformsLib.envmap,
        UniformsLib.aomap,
        UniformsLib.lightmap,
        UniformsLib.emissivemap,
        UniformsLib.fog,
        UniformsLib.lights,
        {
            emissive: { value: new Color( 0x000000 ) },
            diffuse: { value: new Color( 1, 1, 1 ) }
        }
    ] ),
    vertexShader: document.getElementById("vertexShader").textContent,
    fragmentShader: document.getElementById("fragmentShader").textContent
});
The shader code is copied directly from meshlambert_vert.glsl and meshlambert_frag.glsl, and the uniforms section above is based on this entry in the ShaderLib.
However, I am rendering my test scene from two different cameras/renderers at once, and I immediately noticed an issue. Changing one camera's perspective changes the second camera's lighting angle, for objects with this customMaterial applied.
I assume this is due to these UniformsLib objects being referenced elsewhere?
I'm not sure what I should be passing here instead, nor why this doesn't work but the standard material does. I guess I'm skipping a step, but I don't understand what it might be.
Here is a codepen where I have isolated the problem as much as I can. Now it is almost a direct copy of the ShaderLib source. At this point I'm thinking this is a pass-by-reference where it should have been a copy, somewhere inside the WebGLRenderer. https://codepen.io/cl4ws0n/pen/dVewdZ
For whatever it's worth, I also tried adding a second scene and moving the objects between them. That didn't fix it, nor did separate objects in separate scenes sharing the material.

WebGLRenderer has some hardcoded logic for certain materials. In this case it's looking for a flag called isMeshLambertMaterial: https://github.com/mrdoob/three.js/blob/r87/src/renderers/WebGLRenderer.js#L1780
So try setting isMeshLambertMaterial: true, isMeshBasicMaterial: false in your material.
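As a minimal sketch of that suggestion (customMaterial being the ShaderMaterial from the question):

// Flag the custom ShaderMaterial so WebGLRenderer's material-specific
// uniform refresh logic treats it like a Lambert material.
customMaterial.isMeshLambertMaterial = true;
customMaterial.isMeshBasicMaterial = false;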

As Matjaz Drolc indicated in his comments, this is a bug in three r87: the needsUpdate flags are not functioning properly for ShaderMaterials. If you force an update before each render pass in my codepen link, it renders correctly. See the fork https://codepen.io/cl4ws0n/pen/qPYwzp
I have opened an issue on the repo; if anyone would like to track its progress it can be found here.
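The workaround, sketched roughly (renderer1/renderer2 and camera1/camera2 stand in for whatever your two views are called), looks like this:

// Force the material to be refreshed before each render pass so both
// views get their own up-to-date light uniforms.
function animate() {
    requestAnimationFrame( animate );

    customMaterial.needsUpdate = true;
    renderer1.render( scene, camera1 );

    customMaterial.needsUpdate = true;
    renderer2.render( scene, camera2 );
}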

Related

Three.js smooth shading appearing flat

I'm loading .stl files, applying MeshStandardMaterial without touching the flatShading property as by default it is false.
The result looks very flat to me. If I try setting flatShading: true the result is the same.
I've tried everything I could think of, including the calls below, but have run out of ideas - any suggestions would be welcome, thanks.
geometry.computeVertexNormals();
geometry.computeBoundingBox();
geometry.computeBoundingSphere();
geometry.normalizeNormals();
STLLoader always returns a non-indexed buffer geometry (unconnected triangle soup). That means the geometry's faces share no vertices, so BufferGeometry.computeVertexNormals() cannot produce the normals required for smooth shading.
Recomputing the bounding volumes and using BufferGeometry.normalizeNormals() are also unrelated to this issue.
You can try to solve this by ensuring the asset comes with normals that allow smooth shading, or by giving BufferGeometryUtils.mergeVertices() a try, which produces an indexed geometry by merging vertices.
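A rough sketch of the second option (assuming the BufferGeometryUtils helper from the three.js examples is available in your build; 'model.stl' is a placeholder path):

// Merge duplicated vertices so faces share them, then recompute normals.
var loader = new THREE.STLLoader();
loader.load( 'model.stl', function ( geometry ) {
    var indexed = THREE.BufferGeometryUtils.mergeVertices( geometry );
    indexed.computeVertexNormals(); // now produces smooth, averaged normals
    var mesh = new THREE.Mesh( indexed, new THREE.MeshStandardMaterial() );
    scene.add( mesh );
} );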

resizing individual models in a single geometry

I have a 3D model of my home town. I would like to use real time data to change the height of the buildings. In my first try, I loaded the buildings as individual meshes and called scene.add(buildingMesh) during setup.
var threeObjects = [];
var buildingMesh = new THREE.Mesh(geometry, material);
threeObjects.push(buildingMesh);

$.each(threeObjects, function(i, buildingMesh)
{
    buildingMesh.rotation.x += -3.1415 * 0.5;
    buildingMesh.castShadow = true;
    buildingMesh.receiveShadow = true;
    scene.add(buildingMesh);
});
This is too slow, as my dataset consists of roughly 10,000 buildings.
So I took the approach of merging all the meshes' geometries into a single geometry and wrapping that in one mesh to be added to the scene:
singleGeometry.merge(buildingMesh.geometry, buildingMesh.matrix); //in a loop
var faceColorMaterial = new THREE.MeshLambertMaterial( { color: 0xffffff, vertexColors: THREE.VertexColors } );
combinedMesh = new THREE.Mesh(singleGeometry, faceColorMaterial);
scene.add(combinedMesh);
Just to make a proof of concept, I'm trying to change the height of a building when I click it. Alas, this is not working.
By adding a new id field, I can get a reference to the faces and vertices and change the color of the building, but I cannot, for the life of me, get them to change height.
In my first version, I would just use something like:
buildingMesh.scale.z=2;
But as I have no meshes anymore, I'm kinda lost.
Can anybody help?
disclaimer: I'm new to Three.js, so my question might be stupid...hope it's not :)
If you combine all of your buildings into a single geometry, you're obliterating everything that makes the buildings distinct from each other. Now you can't tell building A from building B because it's all one big geometry, and geometry at its basic level is literally just arrays of points and polygons with no way of telling any of it apart. So I think it's the wrong approach to merge it all together.
Instead, you should take advantage of three.js's efficient scene graph architecture. You had the right idea at first to just add all the buildings to a single root Object3D ("scene"). That way you get all the efficiencies of the scene graph but can still individually address the buildings.
To make it load more efficiently, instead of creating the scene graph in three.js every time you load the app, you should do it ahead of time in a 3D modeling program. Build the parent/child relationships there, and export it as a single model containing all of the buildings as child nodes. When you import it into three.js, it should retain its structure.
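As a rough sketch of what that individual addressing could look like (buildingData and the naming scheme here are made up for illustration):

// Keep each building as its own child so it stays individually addressable.
var city = new THREE.Object3D();
buildingData.forEach( function ( data ) {
    var mesh = new THREE.Mesh( data.geometry, material );
    mesh.name = data.id; // remember which building this is
    city.add( mesh );
} );
scene.add( city );

// Later, when real-time data arrives for one building:
var building = city.getObjectByName( 'building-42' );
if ( building ) building.scale.z = 2;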
JCD: That was not quite the question I asked.
But anyhow, I found a solution to the problem.
What I did was merge all the geometries, but instead of using the standard clone in geometry.merge() I kept a shallow reference. That made it possible to use the reference in threeObjects to find the correct building, resize that part of the geometry using Mesh.scale, and then set geometry.verticesNeedUpdate = true;
For further optimization, I split the model into 5 different geometries and only updated the geometry that contained the building.
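The general idea, sketched here with hypothetical vertexStart/vertexCount bookkeeping recorded while merging rather than the exact shallow-reference trick described above, is to touch only the vertices belonging to one building and then flag the merged geometry for re-upload:

// Rescale only the vertex range that belongs to one building (legacy
// THREE.Geometry API), then tell three.js to re-upload the vertex buffer.
for ( var i = building.vertexStart; i < building.vertexStart + building.vertexCount; i++ ) {
    singleGeometry.vertices[ i ].z *= 2;
}
singleGeometry.verticesNeedUpdate = true;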

Threejs - Draw a transparent texture with MeshDepthMaterial

So I'm trying to get a depth-of-field effect on some pixel art I made.
For that I need a texture with the depth information. So I render my texture on a plane using MeshDepthMaterial, but all I get is a grey rectangle; it completely ignores the alpha data in the texture (which is only 1 or 0, nothing in between).
Of course I'm using
depthMaterial.transparent = true;
depthMaterial.alphaTest = 0.5;
depthMaterial.needsUpdate = true; //just to be sure
Just for your interest, this is my pixel art and how it is rendered: http://imgur.com/a/TLQOe
MeshDepthMaterial does not read the surface texture at all.
Instead you would need to either override the value of the 'depth' ShaderChunk to include UVs and your alpha-testable texture, or add an extra ShaderMaterial to do the work yourself. Depending on the overall needs of your application, one approach may be better than the other. If no non-textured objects need to cast shadows, the former would be the simplest to maintain. If they do, you'll need to do more work managing which render targets get what and how.
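A minimal sketch of the second option, i.e. a small ShaderMaterial that writes depth as grayscale and discards fully transparent texels (spriteTexture and the 0.5 threshold are assumptions):

var depthAlphaMaterial = new THREE.ShaderMaterial( {
    uniforms: { map: { value: spriteTexture } },
    vertexShader: `
        varying vec2 vUv;
        void main() {
            vUv = uv;
            gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
        }
    `,
    fragmentShader: `
        uniform sampler2D map;
        varying vec2 vUv;
        void main() {
            if ( texture2D( map, vUv ).a < 0.5 ) discard; // manual alpha test
            gl_FragColor = vec4( vec3( gl_FragCoord.z ), 1.0 ); // (non-linear) depth as grayscale
        }
    `
} );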

Loaded OBJ in Three.js doesn't receive shadows even though the property is set to true

I'm currently diving into Three.js and came across my first real issue when switching from native primitives to imported OBJ objects.
I have a simple model of a dodecahedron that I UV-mapped in Cinema4D r15 and exported as OBJ file. When using this model over a DodecahedronGeometry, the geometry is not lit by my directional light anymore, but it is if I use the primitive.
See this JSFiddle: https://jsfiddle.net/xm3ttmxw/1/
See desired result here (using a primitive): https://jsfiddle.net/84hbs7ed/1/
As you can see, I'm setting the receiveShadow property to true for all meshes in the OBJ. I activated shadow maps for the light and renderer. The directional light is following the camera and pointing towards the origin (center of the dodecahedron). The ambient light seems to work fine.
Any idea? Thanks in advance
After some more research, it appears that the problem comes from a lack of vertex normals (OBJLoader doesn't compute those). The solution is to compute the vertex normals on the fly, as shown here: https://stackoverflow.com/a/28568522/3446439
object.traverse( function( child ) {
    if ( child instanceof THREE.Mesh ) {
        child.geometry.computeVertexNormals(); // add this
    }
} );
Thanks Sebastian Baltes!

Creating a room with Three.js and cannon.js

I'm just getting started with Three.js and cannon.js and I've been trying to create a simple room for a while with no success. I'm working off of this example and I've been trying to add walls and a ceiling. What is the easiest way to do this? Right now I have
// wall?
wallGeometry = new THREE.PlaneGeometry( 300, 300 );
wallGeometry.applyMatrix( new THREE.Matrix4().makeRotationX( Math.PI));
wallMesh = new THREE.Mesh( wallGeometry, material );
wallMesh.castShadow = false;
wallMesh.receiveShadow = true;
scene.add(wallMesh);
But it's lit up weird and I don't bump into it... And if I try to add it through cannon.js, I get an invisible wall that I can't see. Can anyone point me in the right direction?
THREE.js itself includes no physics, so when you create objects in it, it will never make you 'bump' into them by itself. You have to code such features yourself or use another library for that (as you are already trying to do with cannon.js).
Next: THREE.js does not light your scene for you by default. If you want dynamic lighting and shadows, all lights must be provided by you, and objects that should react to lights must use MeshLambertMaterial or MeshPhongMaterial; for shadows you will have to set a couple of properties and use specific types of lights.
In your example, you don't seem to define the wall material anywhere - this might be why the walls are invisible.
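For the 'bump into it' part, here is a rough sketch of pairing a lit, visible wall with a static cannon.js body (world is assumed to be the CANNON.World from the example you linked; sizes and positions are placeholders):

// A lit material so the wall actually reacts to the lights in the scene.
var wallMaterial = new THREE.MeshLambertMaterial( { color: 0x888888 } );
wallMesh = new THREE.Mesh( wallGeometry, wallMaterial );
scene.add( wallMesh );

// Static physics body (mass 0) roughly matching the wall, so cannon.js
// blocks the player instead of letting them walk through it.
var wallShape = new CANNON.Box( new CANNON.Vec3( 150, 150, 0.5 ) ); // half-extents
var wallBody = new CANNON.Body( { mass: 0 } );
wallBody.addShape( wallShape );
wallBody.position.set( 0, 150, -150 ); // wherever the visible wall sits
world.addBody( wallBody ); // world.add( wallBody ) in older cannon.js builds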
Check these out and steal the pieces of code you need (and it seems like you will need a lot :-)
http://threejs.org/examples/
http://stemkoski.github.io/Three.js/index.html
