I'm working on some WebGL software for generating 3D models and am relying on dynamic geometry. I've observed some very bizarre behavior that I've been able to isolate in this jsfiddle.
It seems that any new faces added after a geometry instance has been added to the scene will not be rendered (properly). In wireframe mode (as in the example), the new geometry is not rendered at all. When using textured materials, I also observed that sometimes new geometry is not rendered, depending on the angle of the camera.
Here's a video of that in action.
Back to the jsfiddle, I used an existing three.js code sample (misc_exporter_obj.html) as a scaffold but on line 7 I made a generic function to add a triangle to the geometry. The addGeometry function is called on startup, and if you uncomment line 36 you can see what the expected result should have been:
var material = new THREE.MeshBasicMaterial( { wireframe : true} );
geometry = new THREE.Geometry();
addTriangle(-50, -50, 50, -50, 50, 50);
//addTriangle(-50, -50, -50, 50, 50, 50); // UNCOMMENT TO TEST WHAT FINAL OUTPUT SHOULD LOOK LIKE.
scene.add( new THREE.Mesh( geometry, material ) );
And as per the threejs guide on how to update things, lines 43-47 attempt to add a new triangle when you click the "transform triangle" button by setting the verticesNeedUpdate and elementsNeedUpdate flags:
function addTriangleFace(){
    addTriangle(-50, -50, -50, 50, 50, 50);
    geometry.verticesNeedUpdate = true;
    geometry.elementsNeedUpdate = true;
}
Am I doing this wrong? Or should I submit a bug report?
Thanks.
Disappearing Mesh Update:
I may have discovered the cause of the weird behavior that was causing my mesh to be erased based on camera orientation. This answer suggests that Three.js may have thought that the mesh was not inside the camera's frustum.
I'm guessing the new vertices were not being considered when trying to determine whether the object was in the frustum, so I just disabled culling since the object being drawn is the main object in the scene.
You want to add faces to an existing geometry.
Since buffers can't be resized, the best solution is to switch to BufferGeometry, preallocate sufficiently-sized buffers, and set the drawRange. See this SO answer. This answer, too.
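Here is a minimal sketch of that approach (not the original fiddle code; the addTriangle signature, the z = 0 assumption, and the buffer size are assumptions, and older releases such as r91 used addAttribute instead of setAttribute): preallocate a position buffer, draw nothing at first, and extend the draw range whenever a triangle is added.
// Preallocate room for up to MAX_TRIANGLES triangles (an assumed limit).
var MAX_TRIANGLES = 1000;
var positions = new Float32Array(MAX_TRIANGLES * 3 * 3); // 3 vertices * (x, y, z)

var geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.setDrawRange(0, 0); // nothing drawn yet

var triangleCount = 0;

// Mirrors the addTriangle(x1, y1, x2, y2, x3, y3) helper from the question,
// with z assumed to be 0 for every vertex.
function addTriangle(x1, y1, x2, y2, x3, y3) {
    var offset = triangleCount * 9;
    positions.set([x1, y1, 0, x2, y2, 0, x3, y3, 0], offset);
    triangleCount++;
    geometry.setDrawRange(0, triangleCount * 3);       // draw one more triangle
    geometry.attributes.position.needsUpdate = true;   // re-upload the buffer
}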
If you add vertices, you will need to recompute the bounding sphere for frustum culling to work correctly.
geometry.computeBoundingSphere();
Or, as you said, you can disable frustum culling:
mesh.frustumCulled = false;
three.js r.91
Related
Whenever I add a narrow 3D object like the one below to the scene, I encounter some unwanted artifacts, like a repeating texture on the object's surface. It's worth mentioning that everything looks fine until I switch the object's receive shadow property to true.
To be more precise, I created a box geometry with a size of (0.35, 0.02, 0.15), then I made a MeshStandardMaterial and fed both geometry and material to a THREE.Mesh. The lighting consists of an ambient light and a directional light.
Ideally, the object should look like this:
Here is the code for the lighting, object, and material:
let ambientLight = new THREE.AmbientLight(0xffffff, 0.5);
let directionalLight = new THREE.DirectionalLight(0xffffff, 0.5);
directionalLight.castShadow = true;
directionalLight.position.set(-20, 20, 32);
scene.add(ambientLight);
scene.add(directionalLight);

let box = new THREE.BoxGeometry(0.02, 0.15, 0.35);
let material = new THREE.MeshStandardMaterial({
    color: 'white',
    shadowSide: THREE.FrontSide,
    side: THREE.DoubleSide
});
let mesh = new THREE.Mesh(box, material);
mesh.receiveShadow = true;
mesh.castShadow = true;
scene.add(mesh);
This is known as shadow acne. It happens when light hits a surface at a shallow angle. You'll probably need to make small modifications to the LightShadow.bias property. Quoting from the documentation:
Shadow map bias, how much to add or subtract from the normalized depth when deciding whether a surface is in shadow. The default is 0. Very tiny adjustments here (in the order of 0.0001) may help reduce artifacts in shadows.
Try something like: directionalLight.shadow.bias = 0.0001; and start from there, making small adjustments until the shadow acne isn't noticeable.
There's also a second parameter named LightShadow.normalBias that you could tweak.
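For example, a minimal sketch of those adjustments (the bias value comes from the suggestion above; the normalBias value is just an assumed starting point, not a definitive number):
directionalLight.shadow.bias = 0.0001;     // tiny depth offset to reduce self-shadowing
directionalLight.shadow.normalBias = 0.02; // offsets the surface along its normal before the shadow depth test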
I am trying to add some lines into the scene of a 3D model in a Forge Viewer application I am building. I want to draw some bounding boxes around certain objects; I have used the following guide as a baseline:
Getting bounding boxes of each component in the viewer
At the moment I am just using the drawLines function as I already have the coordinate data for the object I want to draw a box around from elsewhere in my code. However, when scene.add is called, the following error appears in console:
WebGL: INVALID_OPERATION: drawArrays: no buffer is bound to enabled attribute
I have looked up this error and can't find anything that helps. It seems the issue may be that my application already adds meshes to the scene, and when it goes to add lines it uses the same shader, which does not have its attributes set up correctly to deal with lines. This is just a guess, though; I really have no idea what exactly is causing the error, or what I can do differently to fix it. I have tried various types of THREE.js objects, using sceneAfter, etc., but still cannot draw lines into the scene.
Aha, managed to get it working! To fix it I had to use createOverlayScene and addOverlay to add the line geometry to the scene instead of using scene.add, and had to remove matman().addMaterial.
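For reference, a minimal sketch of that overlay approach, assuming the viewer instance is available as viewer and the line geometry and material already exist (the overlay name is just a placeholder):
// Create a named overlay scene once, then add the line object to it.
var overlayName = 'my-lines-overlay';
viewer.impl.createOverlayScene(overlayName);

var lines = new THREE.Line(geometry, lineMaterial);
viewer.impl.addOverlay(overlayName, lines);
viewer.impl.invalidate(true); // force a re-render so the overlay becomes visible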
You need to make a new Material like below.
var lineMaterial = new THREE.LineBasicMaterial({
    color: new THREE.Color(0xFF0000),
    transparent: true,
    depthWrite: false,
    depthTest: true,
    linewidth: 10,
    opacity: 1.0
});

var lines = new THREE.Line(geometry, lineMaterial);
scene.add(lines);
If you're calling the drawLines function directly, make sure to use the same material type as the tutorial:
let material = new THREE.LineBasicMaterial({ color: 0xffff00, linewidth: 2 });
viewer.impl.matman().addMaterial('MyLineMaterial', material, true);
drawLines([{x:0,y:0,z:0}, {x:10,y:10,z:10}], material);
Try making a new scene. Here is just the code (sorry for my bad English):
const geometry = new THREE.Geometry();
geometry.vertices.push(new THREE.Vector3(0, 0, 0));
geometry.vertices.push(new THREE.Vector3(100, 100, 100));

var material = new THREE.LineBasicMaterial({
    color: 0x0000ff,
    linewidth: 2
});

var lines = new THREE.Line(geometry, material);
viewer.impl.scene.add(lines);
You need to make sure that your scene can create lines.
I am trying to get collision detection between meshes I lay out in my Three.js scene. I am confused about how the Raycaster really works and whether I understand it correctly.
Here is a fiddle to describe what I am having a problem with:
// Add cube at y = 40
geometry = new THREE.CubeGeometry(20, 20, 20);
material = new THREE.MeshNormalMaterial();
mesh = new THREE.Mesh(geometry, material);
mesh.position.setY(40);
scene.add(mesh);

// Add ray
var origin = new THREE.Vector3(50, 0, 0),
    direction = new THREE.Vector3(-1, 0, 0),
    ray = new THREE.Raycaster(origin, direction),
    collisionResults = ray.intersectObjects([mesh]);

if (collisionResults.length !== 0) {
    alert('Ray collides with mesh. Distance: ' + collisionResults[0].distance);
}

// Add arrow to show ray
scene.add(new THREE.ArrowHelper(direction, origin, 50, 0x000000));
Not working:
http://jsfiddle.net/FredricBerling/LwfPL/1/
Working:
http://jsfiddle.net/FredricBerling/LwfPL/3/
Basically, the fiddle lays out a cube, and from the point (50, 0, 0) I shoot a ray in a "direction". The problem seems to be that it reports a "hit" even when it shouldn't.
I lay out an ArrowHelper to show where I suspect the Raycaster shoots its ray.
From other tests it seems like the direction in Raycaster is different from the one in ArrowHelper. Raycaster seems to shoot the ray toward the (0, 0, 0) of the scene. I am confused.
EDIT: Rob gave the answer. I needed to make sure the meshes were rendered so that the world matrices were applied. The fiddle is updated with the correct code that works for testing Raycaster as expected.
The apparent false positive you're seeing is due to the fact that even though you have set the box's position, it hasn't yet had its world transformation matrix updated. This normally only happens just before rendering.
If you move the raycast test to after the first render (or call updateMatrixWorld() manually), you won't get a hit.
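For example, a minimal sketch of forcing that update before the raycast, using the same objects as the question:
// Force the cube's world matrix to reflect its position before raycasting,
// instead of waiting for the first render to do it.
mesh.updateMatrixWorld(true);

var ray = new THREE.Raycaster(origin, direction);
var collisionResults = ray.intersectObjects([mesh]);
// collisionResults is now empty: the cube really sits at y = 40,
// which is off the ray's path along -x.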
I'm using Three.js and trying to create some custom shapes, similar to one that appears in a project from one of the agencies using three.js:
three.js featured project example
How did they generate these boxes with holes inside? (In that example the boxes basically have only borders around them and are empty inside.)
As I saw in the code (I was trying to figure it out myself), they use BoxGeometry, but I have no idea how to accomplish that. Does anyone know, or can anyone give me any directions? It would be really helpful, as I'm stuck with this and have no idea how to create them.
So in THREE.js, Meshes represent any kind of 3D object. They combine a Geometry with a Material. Generally, to create a mesh you call
var mesh = new THREE.Mesh( geometry, material );
If you use any of the built-in materials (MeshBasicMaterial, MeshLambertMaterial, etc.), they have a wireframe boolean property that provides this functionality.
var geometry = new THREE.BoxGeometry( x, y, z ),
    material = new THREE.MeshBasicMaterial( {
        wireframe: true, // This makes the object render as a wireframe
        color: 0xffffff  // You can alter other properties
    } );
var box = new THREE.Mesh( geometry, material );

// You can also change it later
box.material.wireframe = false;
One can easily create a THREE.BoxGeometry by passing the width, height, and depth as three separate arguments when constructing it.
I would like to create any and all THREE[types]() with no parameters and set the values after that.
Is there a way to set the dimensions/size of the box geometry after creation (possibly when it's already buried in a Mesh), other than scaling, etc.?
I couldn't find this in the documentation; if the feature exists, maybe this is just a documentation gap, and if it doesn't, it might be a major feature request if not a bug. Any thoughts on how to classify this?
If you want to scale a mesh, you have two choices: scale the mesh
mesh.scale.set( x, y, z );
or scale the mesh's geometry
mesh.geometry.scale( x, y, z );
The first method modifies the mesh's matrix transform.
The second method modifies the vertices of the geometry.
Look at the source code so you understand what each scale method is doing.
three.js r.73
When you instantiate a BoxGeometry object, or any other geometry for that matter, the vertices and related buffers are created on the spot using the parameters provided. As such, it is not possible to simply change a property of the geometry and have the vertices update; the entire object must be re-instantiated.
You will need to create your geometries as you have the parameters for them available. You can however create meshes without geometries, add them to a scene, and update the mesh's geometry property once you have enough information to instantiate the object. If not that, you could also set a default value at first and then scale to reach your target.
Technically, scaling only creates the illusion of an updated geometry, and the question did say (other than scaling). So, I would say a better approach is to reassign the geometry property of your mesh to a new geometry.
mesh.geometry = new THREE.BoxGeometry(newSize, newSize, newSize)
With this approach you can update any aspect of the geometry including width segments for example. This is especially useful when working with non box geometries like cylinders or spheres.
Here is a full working example using this approach:
let size = 10
let newSize = 20
// Create a blank geometry and make a mesh from it.
let geometry = new THREE.BoxGeometry()
let material = new THREE.MeshNormalMaterial()
let mesh = new THREE.Mesh(geometry, material)
// Adding this mesh to the scene won't display anything because ...
// the geometry has no parameters yet.
scene.add(mesh)
// Unless you intend to reuse your old geometry dispose of it...
// this will significantly reduce memory footprint.
mesh.geometry.dispose()
// Update the mesh geometry to a new geometry with whatever parameters you desire.
// You will now see these changes reflected in the scene.
mesh.geometry = new THREE.BoxGeometry(size, size, size)
// You can update the geometry as many times as you like.
// This can be done before or after adding the mesh to the scene.
mesh.geometry = new THREE.BoxGeometry(newSize, newSize, newSize)