What are the properties of three.js emissive materials - javascript

Coming from a background in the Unity game engine, I'm working on a simple demonstration in three.js and am confused by the behaviour of THREE.MeshPhongMaterial.
create_ring() {
  // creates a ring mesh from the class's input data
  const material = new THREE.MeshPhongMaterial({
    color: this.color,
    emissive: this.color,
    emissiveIntensity: 1.6
  });
  const ring_geo = new THREE.TorusGeometry(this.radius, this.thickness, 16, 100);
  // translate in space
  ring_geo.translate(5, 5, 0);
  // build the mesh and return it
  const ring_mesh = new THREE.Mesh(ring_geo, material);
  ring_mesh.receiveShadow = true;
  ring_mesh.castShadow = true;
  ring_mesh.name = "ring";
  return ring_mesh;
}
I was under the impression that the material would cast a nice gentle pool of light onto the floor geometry. Having researched the problem, I'm not sure whether I need advice on implementing this as a shader feature, or whether I'm simply misunderstanding the limits and behaviour of materials in three.js. Below is an example of what is possible with a material's emissive option in Unity.

There's more than just an emissive material in the Unity screenshot above. The objects around the light were probably also marked as static, which Unity uses to "bake" the glow onto them while compiling the application. There may also be a "bloom" post-processing effect creating the dynamic glow the camera sees around the object.
Because three.js runs on the web and does not have an offline compilation step, these additional effects have to be configured manually. You can look at the three.js bloom example for help adding a bloom effect to a scene. Baking the light onto surrounding objects would generally be done in Blender, with the result then loaded into three.js as part of the base color texture or as a lightmap.
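For reference, a minimal bloom setup using the post-processing classes shipped with the three.js examples could look something like the sketch below. It assumes you already have a renderer, scene and camera; the strength/radius/threshold numbers are just starting values to tune.
import * as THREE from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { UnrealBloomPass } from 'three/examples/jsm/postprocessing/UnrealBloomPass.js';

// renderer, scene and camera are assumed to exist already
const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));

// arguments: resolution, strength, radius, threshold
const bloomPass = new UnrealBloomPass(
  new THREE.Vector2(window.innerWidth, window.innerHeight),
  1.5, 0.4, 0.85
);
composer.addPass(bloomPass);

// render through the composer instead of the renderer in the animation loop
function animate() {
  requestAnimationFrame(animate);
  composer.render();
}
animate();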

Related

Three.js - how to create custom shapes

I'm using Three.js and trying to create some custom shapes, similar to ones that appear in a project from one of the agencies using three.js:
three.js featured project example
How did they generate these boxes with holes inside? (In that example the boxes basically have only borders around them and are empty inside.)
As far as I can see in the code (I was trying to figure it out myself) they use BoxGeometry, but I have no idea how to accomplish that. Does anyone know, or can anyone give me any directions? It would be really helpful, as I'm stuck with this and have no idea how to create them.
In THREE.js, Meshes represent any kind of 3D object. They combine a Geometry and a Material. Generally, to create a mesh you call
var mesh = new THREE.Mesh( geometry, material );
If you use any of the built-in materials (MeshBasicMaterial, MeshLambertMaterial, etc.) they have a wireframe boolean property that gives you exactly this look.
var geometry = new THREE.BoxGeometry( x, y, z ),
    material = new THREE.MeshBasicMaterial( {
        wireframe: true, // this makes the object render as a wireframe
        color: 0xffffff  // you can alter other properties too
    } );
var box = new THREE.Mesh( geometry, material );
// You can also change it later
box.material.wireframe = false;

Viewport for a group of meshes in three.js

What is the best way/practice for creating a viewport for a group of meshes in three.js?
In my case I have a THREE.Group with a lot of THREE.Mesh instances. My goal is to create a viewport for this group, inside which the meshes will be visible.
One solution I see is to use local clipping planes. threejs example
But I'm concerned that I would have to assign clipping planes to every THREE.Mesh material rather than set them once on the THREE.Group.
I would also need to recalculate the clipping planes whenever I move or rotate the THREE.Group.
You could look into the stencil buffer:
webgl.stencilFunc()
webgl.stencilOp()
With or without three.js, the principle is the same:
1. disable depth write
2. disable depth test
3. disable color write
4. enable the stencil operation (write a value to the stencil buffer)
5. draw an invisible shape that writes to the stencil buffer (you probably want a screen-space quad)
6. re-enable 1, 2 and 3
7. change the stencil operation (only draw where the stencil buffer is 1, for example)
8. draw your group
9. depending on when you're doing this, you could change the stencil op here
10. and then draw the rest of the scene where the buffer is 0 (outside of the shape from step 5)
Three.js does not have stencil abstractions (unless they've been implemented recently), so there is no "magical" property, like transparent, that manages a bunch of WebGL state under the hood. You have to manage this state yourself, which means getting the WebGL context and calling WebGL operations on it manually.
There are many ways to do this.
var myScreenSpaceQuad = new THREE.Mesh( new THREE.PlaneBufferGeometry( 2, 2, 1, 1 ), myQuadShaderMaterial );
var scene1 = new THREE.Scene();
var scene2 = new THREE.Scene();
var sceneMask = new THREE.Scene();
sceneMask.add( myScreenSpaceQuad );
//...
function myRenderFunction() {
  // set up the gl stencil state (write to the stencil buffer)
  //...
  myRenderer.render( sceneMask, myCamera );
  // change the stencil state (only draw where the buffer was written)
  //...
  myRenderer.render( scene1, myCamera );
  // some more stencil changes if needed
  //...
  myRenderer.render( scene2, myCamera );
}
I'll try to write a working example. For a screen-space quad you can take a look at this.
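In the meantime, here is a rough sketch of the stencil flow from the numbered list above, using the raw WebGL context obtained from the renderer. Treat it as an outline rather than a drop-in solution: it assumes a WebGLRenderer created with { stencil: true } and autoClear left disabled, and three.js's own internal state caching may fight these raw calls.
var gl = myRenderer.getContext();

function myRenderFunction() {
  myRenderer.clear( true, true, true );

  // steps 1-5: write the mask shape into the stencil buffer only
  gl.enable( gl.STENCIL_TEST );
  gl.colorMask( false, false, false, false );
  gl.depthMask( false );
  gl.disable( gl.DEPTH_TEST );
  gl.stencilFunc( gl.ALWAYS, 1, 0xff );
  gl.stencilOp( gl.KEEP, gl.KEEP, gl.REPLACE );
  myRenderer.render( sceneMask, myCamera );

  // steps 6-8: draw the group only where the stencil buffer is 1
  gl.colorMask( true, true, true, true );
  gl.depthMask( true );
  gl.enable( gl.DEPTH_TEST );
  gl.stencilFunc( gl.EQUAL, 1, 0xff );
  gl.stencilOp( gl.KEEP, gl.KEEP, gl.KEEP );
  myRenderer.render( scene1, myCamera );

  // steps 9-10: draw the rest of the scene where the stencil buffer is 0
  gl.stencilFunc( gl.EQUAL, 0, 0xff );
  myRenderer.render( scene2, myCamera );
}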

Calculating light for custom geometry in Three.js

I've created a custom geometry and I've used geometry.computeFaceNormals() to get the lighting right. So far so good.
The problem comes when I animate the geometry. In the animation loop I call geometry.computeFaceNormals() again, but the faces' lighting doesn't change.
Here is a fiddle with the example:
You are updating the vertices of your geometry, so typically, your normals must also be updated for the shading to be correct.
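If you do take that route, then with the legacy THREE.Geometry API of that era the recomputed data also needs its update flags set before it reaches the GPU. A minimal sketch, assuming the geometry from your animation loop (the flag names come from the old Geometry docs):
// in the animation loop, after moving vertices (legacy THREE.Geometry, ~r.80)
geometry.computeFaceNormals();
geometry.verticesNeedUpdate = true;
geometry.normalsNeedUpdate = true;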
However, since you want flat shading, there is another solution.
MeshPhongMaterial generates flat shading using the OES_standard_derivatives extension. This means that geometry normals do not have to be specified or updated when vertices change.
All you have to do is use the following pattern, and flat shading will "just work" -- provided the extension is supported.
var material = new THREE.MeshPhongMaterial( {
color: 0xFFFFFF,
shading: THREE.FlatShading
} );
three.js r.80

Rendering lots of similar but not identical meshes in a scene

We have 1 geometry that gets attached to every mesh in our scene.
var geometry = new three.PlaneGeometry(1, 1, 1, 1);
Everything has a texture that we generate and cache, and we use it to create a new material and mesh for each object.
this.material = new three.MeshLambertMaterial({
transparent: true,
emissive: 0xffffff
});
// get the cached texture
this.material.map = this.getTexture(this.attributes);
this.shape = new three.Mesh(geometry, this.material);
Afterwards we add these shapes into various Object3Ds in order to move large groups of shapes around.
This all works great on nicer devices with up to 5000 circles, but then our framerate starts to drop. On weaker devices it is dramatically slower, even with, say, 100 meshes. We know that merging geometries can speed things up; however, we only have a single geometry that is shared. Is it possible to merge meshes instead? Does that even make sense? Note: these shapes are interactive (movable/clickable). What are our options?
Other notes:
We are using Ejecta on mobile devices, which is great at low mesh counts but not so great beyond 100 meshes. I don't think it's Ejecta's fault, but rather our lack of knowledge about how to optimize! Even on desktop our app has suspiciously high CPU usage.
Figured it out! We went from being able to render 5k things at 60fps to 100k things at approx 40fps.
We followed what most people are saying out there about merging meshes, but it took some experimentation to really understand what was happening and getting multiple textures/materials to work.
// the merged geometry that will hold every circle (assumed to be a plain three.Geometry)
var sceneGeometry = new three.Geometry();
for (var i = 0; i < 100000; i++) {
  // creates a mesh from the geometry and material from the question and returns an object
  var circle = ourCircleFactory.create();
  circle.shape.updateMatrix();
  sceneGeometry.merge(circle.shape.geometry, circle.shape.matrix, circle.cachedMaterialIndex);
}
var finalMesh = new three.Mesh(sceneGeometry, new three.MeshFaceMaterial(cachedMaterials));
scene.add(finalMesh);
That code will create 1 geometry per cached material. cachedMaterialIndex is something we created to cache textures and indicate which material to use.
It is likely that this code will create one geometry per combination of material and geometry; e.g. if you have 5 geometries that are interchangeable with 5 materials, you will end up with 25 merged geometries. It seems that it doesn't matter how many objects you have on screen. Note: we were getting 15 fps with 5000 separate geometries, so I think this is a fairly cheap solution.

Three.js outlines

Is it possible to have a black outline on my 3D models with three.js?
I would like graphics that look like Borderlands 2 (toon shading + black outlines).
I'm sure I'm coming in late; let's hope this solves someone's question later.
Here's the deal: you don't need to render everything twice, and the overhead is actually not substantial. All you need to do is duplicate the mesh and set the duplicate's material side to THREE.BackSide. No double passes. You render two meshes instead, with most of the outline's geometry culled by WebGL's back-face culling.
Here's an example:
var scene = new THREE.Scene();
//Create main object
var mesh_geo = new THREE.BoxGeometry(1, 1, 1);
var mesh_mat = new THREE.MeshBasicMaterial({color : 0xff0000});
var mesh = new THREE.Mesh(mesh_geo, mesh_mat);
scene.add(mesh);
//Create outline object
var outline_geo = new THREE.BoxGeometry(1, 1, 1);
//Notice the second parameter of the material
var outline_mat = new THREE.MeshBasicMaterial({color : 0x00ff00, side: THREE.BackSide});
var outline = new THREE.Mesh(outline_geo, outline_mat);
//Scale the object up to have an outline (as discussed in previous answer)
outline.scale.multiplyScalar(1.5);
scene.add(outline);
For more details on backface culling, check out: http://en.wikipedia.org/wiki/Back-face_culling
The above approach works well if you want to add an outline to objects without adding a toon shader, and thus without losing "realism".
Toon shading by itself supports edge detection; Borderlands developed its 'cel' shader to achieve exactly this effect.
In cel shading you can either use the object-duplication method (done at the [low] pipeline level) or use image-processing filters for edge detection. This is where the performance tradeoff between the two techniques comes in.
More info on cel: http://en.wikipedia.org/wiki/Cel_shading
Cheers!
Yes, it is possible, but not in a simple out-of-the-box way. For toon shading there are even shaders included in /examples/js/ShaderToon.js.
For the outlines, I think the most commonly suggested method is to render in two passes. The first pass renders the models in black at a slightly larger scale; the second pass renders at normal scale with the toon shaders. This way the larger black models show through as an outline. It's not perfect, but I don't think there's an easy way out. You might have more success searching for "three.js hidden line rendering": while it gives a different look, a somewhat similar method is used to achieve it.
It's an old question, but here is what I did.
I created an outlined cel-shader for my CG course. Unfortunately it takes 3 rendering passes; I'm currently trying to figure out how to remove one of them.
Here's the idea:
1) Render a normal-depth image to a texture.
In the vertex shader you do what you normally do: transform the position to screen space and the normal to screen space.
In the fragment shader you calculate the depth of the pixel and output the normal as the colour, with the depth as the alpha value:
float ndcDepth = (2.0 * gl_FragCoord.z - gl_DepthRange.near - gl_DepthRange.far) / (gl_DepthRange.far - gl_DepthRange.near);
float clipDepth = ndcDepth / gl_FragCoord.w;
2) Render the scene to a texture with cel shading. I changed the scene's override material for this.
3) Make a quad, render both textures onto it, and have an orthographic camera look at it. The cel-shaded texture is just rendered onto the quad, but on the normal-depth texture you run edge detection, and wherever an edge is detected the pixel is drawn black.
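A rough sketch of that three-pass flow in three.js terms might look like the code below. The materials (normalDepthMaterial, celMaterial, edgeCompositeMaterial) are placeholders standing in for the shaders described in steps 1-3, and the render-target handling uses the current setRenderTarget API, so treat this as an outline under those assumptions rather than my original code.
// normalDepthMaterial writes normals + depth, celMaterial does the cel shading,
// edgeCompositeMaterial samples the tCel / tNormalDepth uniforms and blackens detected edges;
// their GLSL is not shown here
var normalDepthTarget = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight );
var celTarget = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight );

// full-screen quad + ortho camera for the final composite pass
var quadScene = new THREE.Scene();
var quadCamera = new THREE.OrthographicCamera( -1, 1, 1, -1, 0, 1 );
edgeCompositeMaterial.uniforms.tCel.value = celTarget.texture;
edgeCompositeMaterial.uniforms.tNormalDepth.value = normalDepthTarget.texture;
quadScene.add( new THREE.Mesh( new THREE.PlaneGeometry( 2, 2 ), edgeCompositeMaterial ) );

function render() {
  // pass 1: normal-depth image into a texture
  scene.overrideMaterial = normalDepthMaterial;
  renderer.setRenderTarget( normalDepthTarget );
  renderer.render( scene, camera );

  // pass 2: cel-shaded scene into a texture
  scene.overrideMaterial = celMaterial;
  renderer.setRenderTarget( celTarget );
  renderer.render( scene, camera );

  // pass 3: composite both textures on the quad, edge-detecting the normal-depth texture
  scene.overrideMaterial = null;
  renderer.setRenderTarget( null );
  renderer.render( quadScene, quadCamera );
}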
