Forge Viewer - Can't add lines to scene - javascript

I am trying to add some lines into the scene of a 3D model in a Forge Viewer application I am building. I want to draw some bounding boxes around certain objects; I have used the following guide as a baseline:
Getting bounding boxes of each component in the viewer
At the moment I am just using the drawLines function, since I already have the coordinate data for the object I want to box from elsewhere in my code. However, when scene.add is called, the following error appears in the console:
WebGL: INVALID_OPERATION: drawArrays: no buffer is bound to enabled attribute
I have looked up this error and can't find anything that helps. My guess is that my application already adds meshes to the scene, and when it goes to add lines it reuses the same shader, whose attributes are not set up to deal with lines. That is just a guess, though; I really have no idea what exactly is causing the error, or what I can do differently to fix it. I have tried various types of THREE.js objects, using sceneAfter, etc., but still cannot draw lines into the scene.

Aha, managed to get it working! To fix it I had to use createOverlayScene and addOverlay to add the line geometry to the scene instead of using scene.add, and I had to remove the matman().addMaterial call.
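For reference, the overlay wiring can look roughly like this. This is a sketch, not the asker's exact code: 'lines-overlay' is a made-up overlay name, and the viewer instance and line geometry are assumed to already exist.

```javascript
// Sketch of the overlay approach (assumes a Forge Viewer instance in `viewer`
// and a geometry holding the line vertices in `geometry`).
var lineMaterial = new THREE.LineBasicMaterial({
    color: new THREE.Color(0xFF0000),
    linewidth: 2
});

// 'lines-overlay' is a placeholder name for the overlay scene
viewer.impl.createOverlayScene('lines-overlay', lineMaterial);

var lines = new THREE.Line(geometry, lineMaterial);
viewer.impl.addOverlay('lines-overlay', lines);
viewer.impl.invalidate(true); // force a re-render so the overlay shows up
```

Because the overlay scene is rendered separately from the model scene, the lines no longer go through the model's shaders, which sidesteps the attribute mismatch in the error above.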

You need to make a new material, like below:
var lineMaterial = new THREE.LineBasicMaterial({
    color: new THREE.Color(0xFF0000),
    transparent: true,
    depthWrite: false,
    depthTest: true,
    linewidth: 10,
    opacity: 1.0
});
var lines = new THREE.Line(geometry, lineMaterial);
scene.add(lines);

If you're calling the drawLines function directly, make sure to use the same material type as the tutorial:
let material = new THREE.LineBasicMaterial({ color: 0xffff00, linewidth: 2 });
viewer.impl.matman().addMaterial('MyLineMaterial', material, true);
drawLines([{x:0,y:0,z:0}, {x:10,y:10,z:10}], material);

Try creating the lines in a new scene; here is just the code (sorry for my bad English):
const geometry = new THREE.Geometry();
geometry.vertices.push(new THREE.Vector3(0, 0, 0));
geometry.vertices.push(new THREE.Vector3(100, 100, 100));
var material = new THREE.LineBasicMaterial({
    color: 0x0000ff,
    linewidth: 2
});
var lines = new THREE.Line(geometry, material);
viewer.impl.scene.add(lines);
You need to make sure that the scene you add the lines to can actually render them.

Related

What are the properties of three.js emissive materials

I'm working on a simple demonstration in three.js and, coming from a background in the Unity game engine, I am confused by the behaviour of THREE.MeshPhongMaterial.
create_ring() {
    // creates a ring mesh from the class data
    const material = new THREE.MeshPhongMaterial({
        color: this.color,
        emissive: this.color,
        emissiveIntensity: 1.6
    });
    const ring_geo = new THREE.TorusGeometry(this.radius, this.thickness, 16, 100);
    // translate in space
    ring_geo.translate(5, 5, 0);
    // build the mesh and return it
    const ring_mesh = new THREE.Mesh(ring_geo, material);
    ring_mesh.receiveShadow = true;
    ring_mesh.castShadow = true;
    ring_mesh.name = "ring";
    return ring_mesh;
}
I was under the impression the materials would create a nice gentle pool of light on the floor geometry. Having researched the problem, either I need some advice on how to implement this as a shader feature, or I'm not understanding the limits and behaviour of materials in three.js. Below is an example of what is possible with a material's emissive option in Unity.
There's more than just an emissive material shown in the Unity screenshot above: the objects around the light were probably also marked as static, which Unity uses to "bake" the glow effect onto them while compiling the application. There could also be a "bloom" post-processing effect to create the dynamic glow seen by the camera around the object.
Because three.js runs on the web and does not have an offline compilation step, these additional effects have to be configured manually. You can see the three.js bloom example for some help adding the bloom effect to a scene. Baking the light onto surrounding objects would generally be done in Blender, and then loaded into three.js with the base color texture or a lightmap.
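To sketch that bloom setup: the usual recipe from the three.js examples is an EffectComposer with a RenderPass followed by an UnrealBloomPass. This is a minimal sketch assuming the ES-module builds of three.js; the strength/radius/threshold numbers are placeholders to tune, and renderer, scene, and camera are assumed to already exist.

```javascript
import * as THREE from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { UnrealBloomPass } from 'three/examples/jsm/postprocessing/UnrealBloomPass.js';

// assumes `renderer`, `scene`, and `camera` are already set up elsewhere
const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));

// arguments: resolution, strength, radius, threshold (placeholder values)
const bloomPass = new UnrealBloomPass(
    new THREE.Vector2(window.innerWidth, window.innerHeight),
    1.5, 0.4, 0.85
);
composer.addPass(bloomPass);

// in the animation loop, render through the composer instead of the renderer
function animate() {
    requestAnimationFrame(animate);
    composer.render();
}
animate();
```

With a threshold below the emissive brightness of the ring, only the bright ring blooms, which approximates the dynamic glow part of the Unity screenshot; the baked pool of light on the floor still has to come from a lightmap.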

Dynamically adding faces to a three.js geometry

I'm working on some webgl software for generating 3D models and am relying on dynamic geometry. I've observed some very bizarre behavior that I've been able to isolate in this jsfiddle.
It seems that any new faces added after a geometry instance has been added to the scene will not be rendered (properly). In wireframe mode (as in the example), the new geometry is not rendered at all. When using textured materials, I also observed that new geometry is sometimes not rendered, depending on the angle of the camera.
Here's a video of that in action.
Back to the jsfiddle, I used an existing three.js code sample (misc_exporter_obj.html) as a scaffold but on line 7 I made a generic function to add a triangle to the geometry. The addGeometry function is called on startup, and if you uncomment line 36 you can see what the expected result should have been:
var material = new THREE.MeshBasicMaterial({ wireframe: true });
geometry = new THREE.Geometry();
addTriangle(-50, -50, 50, -50, 50, 50);
// addTriangle(-50, -50, -50, 50, 50, 50); // UNCOMMENT TO TEST WHAT THE FINAL OUTPUT SHOULD LOOK LIKE.
scene.add(new THREE.Mesh(geometry, material));
And as per the threejs guide on how to update things, lines 43-47 attempt to add a new triangle when you click the "transform triangle" button by setting the verticesNeedUpdate and elementsNeedUpdate flags:
function addTriangleFace() {
    addTriangle(-50, -50, -50, 50, 50, 50);
    geometry.verticesNeedUpdate = true;
    geometry.elementsNeedUpdate = true;
}
Am I doing this wrong? Or should I submit a bug report?
Thanks.
Disappearing Mesh Update:
I may have discovered the cause of the weird behavior that was causing my mesh to be erased based on camera orientation. This answer suggests that Three.js may have thought that the mesh was not inside the camera's frustum.
I'm guessing the new vertices were not being considered when trying to determine whether the object was in the frustum, so I just disabled culling since the object being drawn is the main object in the scene.
You want to add faces to an existing geometry.
Since buffers can't be resized, the best solution is to switch to BufferGeometry, preallocate sufficiently-sized buffers, and set the drawRange. See this SO answer, and this one too.
If you add vertices, you will need to recompute the bounding sphere for frustum culling to work correctly.
geometry.computeBoundingSphere();
Or, as you said, you can disable frustum culling:
mesh.frustumCulled = false;
three.js r91
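A minimal sketch of the preallocate-and-drawRange approach: the buffer bookkeeping below is plain JavaScript, with the three.js wiring shown as comments since it depends on the running renderer. MAX_TRIANGLES and pushTriangle are made-up names, and note that the r91-era method was geometry.addAttribute, while newer releases call it setAttribute.

```javascript
// Preallocate room for a fixed maximum number of triangles, then widen the
// draw range as triangles are appended, instead of resizing the buffer.
const MAX_TRIANGLES = 1000;
const positions = new Float32Array(MAX_TRIANGLES * 3 * 3); // 3 vertices x (x, y, z)
let triangleCount = 0;

function pushTriangle(x1, y1, z1, x2, y2, z2, x3, y3, z3) {
    positions.set([x1, y1, z1, x2, y2, z2, x3, y3, z3], triangleCount * 9);
    triangleCount += 1;
}

pushTriangle(-50, -50, 0, 50, -50, 0, 50, 50, 0);

// three.js wiring (browser-side; use addAttribute instead of setAttribute on r91):
// const geometry = new THREE.BufferGeometry();
// geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
// geometry.setDrawRange(0, triangleCount * 3);     // draw only the filled part
// geometry.attributes.position.needsUpdate = true; // after each new triangle
// geometry.computeBoundingSphere();                // keeps frustum culling correct
```

Growing the draw range after each pushTriangle avoids both the stale-buffer problem and the stale bounding sphere that caused the disappearing mesh.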

Three.js - how to create custom shapes

I'm using three.js and trying to create some custom shapes, similar to ones that appear in a project from one of the agencies using three.js:
three.js featured project example
How did they generate these boxes with holes inside? (In that example the boxes basically have only borders around them and are empty inside.)
From what I saw in the code (I was trying to figure it out myself) they use BoxGeometry, but I have no idea how to accomplish that. Does anyone know, or can anyone give me any directions? It would be really helpful, as I'm stuck with this and have no idea how to create them.
So in three.js, Meshes represent any kind of 3D object. They combine a Geometry and a Material. Generally, to create a mesh you call
var mesh = new THREE.Mesh( geometry, material );
The built-in materials (MeshBasicMaterial, MeshLambertMaterial, etc.) have a wireframe boolean property (note the lowercase f) that gives this functionality.
var geometry = new THREE.BoxGeometry( x, y, z ),
    material = new THREE.MeshBasicMaterial({
        wireframe: true, // this makes the object appear as a wireframe
        color: 0xffffff  // you can alter other properties
    });
var box = new THREE.Mesh( geometry, material );
// You can also change it later
box.material.wireframe = false;

Overlaying texture onto STL loaded mesh

I'm looking for an efficient method of overlaying a texture to cover a mesh. I'm not an expert, more a novice, when it comes to three-dimensional mapping/objects. The image below shows how I would like the end product to look.
When attempting to apply the texture with the following code, the end result looks similar to the image below. I have not done any UV mapping; I believe my answer may lie here. As you can see from the image, it roughly takes the general shade of the picture, but I get the impression that the texture is being drawn between each vertex of the model rather than across its entirety.
var textureLoader = new THREE.TextureLoader();
var texture = textureLoader.load('resource/images/materials/Mahogany.jpg');
var STLLoader = new THREE.STLLoader();
STLLoader.load('test.stl', function (geometry1) {
    var meshMaterial = new THREE.MeshBasicMaterial({ map: texture });
    var mesh = new THREE.Mesh(geometry1, meshMaterial);
    mesh.scale.set(1, 1, 1);
    mesh.position.set(5, 20, 80);
    scene.add(mesh);
});
The cube has the correct texturing, whereas my STL loaded mesh does not.
Please ignore the rotation of the object in the above picture, I will move to unioning my objects together once I have fixed my texturing issues.
I'm fairly new at asking questions on here, so please do comment to help me expand my question if it's too general or not precise enough. Thank you.
You may use THREE.MeshPhongMaterial() instead of THREE.MeshBasicMaterial(). THREE.MeshPhongMaterial() will wrap the material around the outside of the object, so the material can follow the object's curves.

three.js applyMatrix after scale results in error

The problem:
I am looking to update the vertex positions on a mesh after scaling. I am doing this because I need to calculate the volume of the mesh. I am creating a cloned mesh to do this because I need to keep scale an active parameter on the original mesh.
I was using the answer at How to update vertices geometry after rotate or move object, which worked fine with three.js release r70 but has broken in release r72.
My code:
var volumeClone = new THREE.Mesh(this.mesh.geometry.clone(), new THREE.MeshBasicMaterial({ color: 0xff0000 }));
volumeClone.scale.z = heightScale;
volumeClone.updateMatrix();
volumeClone.geometry.applyMatrix(volumeClone.matrix);
volumeClone.matrix.identity();
volumeClone.geometry.verticesNeedUpdate = true;
volumeClone.scale.set(1, 1, 1);
console.log(calculateVolume(volumeClone));
Result:
In Chrome, the error is:
Uncaught TypeError: Cannot read property 'setFromPoints' of undefined
    at THREE.Geometry.computeBoundingBox (lib.min.js:3)
    at THREE.Geometry.applyMatrix
What I have tried:
I have attempted to investigate and isolate the issue within Geometry.js and Box3.js, as well as understand the programmatic flow, to find out why "this.boundingBox" is undefined, but I haven't found the issue.
Question:
Is the syntax correct? Was there an update to three.js in this area?
The answer turned out to be that ExtrudeGeometry does not contain the same properties as regular Geometry, so operating on it fails because particular properties are missing. The fix is to convert the BufferGeometry to regular Geometry before trying to work with it as such. I understand the performance benefits of using buffers, but in this case I can create a Geometry for measurements and trash it when I get a result.
So rather than "clone" geometry as shown in my question, I switch to creating the particular type Geometry from the buffer instance:
var volumeClone = new THREE.Mesh(new THREE.Geometry().fromBufferGeometry(this.mesh.geometry._bufferGeometry), new THREE.MeshBasicMaterial({ color: 0xff0000 }));
volumeClone.scale.z = heightScale;
volumeClone.updateMatrix();
volumeClone.geometry.applyMatrix(volumeClone.matrix);
volumeClone.matrix.identity();
volumeClone.geometry.verticesNeedUpdate = true;
volumeClone.scale.set(1, 1, 1);
console.log(calculateVolume(volumeClone));
This has the added benefit of working with any geometry type I have thrown at it thus far: extrusions, imported mesh, primitives...
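Since calculateVolume itself is not shown in the question, here is a hypothetical sketch of such a helper, written against a flat, non-indexed triangle position array rather than a mesh. It sums the signed volumes of tetrahedra formed by the origin and each triangle, which is valid for a closed, consistently wound mesh.

```javascript
// Hypothetical calculateVolume helper (not the asker's implementation).
// `positions` holds 9 floats per triangle: ax, ay, az, bx, by, bz, cx, cy, cz.
function calculateVolume(positions) {
    let volume = 0;
    for (let i = 0; i < positions.length; i += 9) {
        const ax = positions[i],     ay = positions[i + 1], az = positions[i + 2];
        const bx = positions[i + 3], by = positions[i + 4], bz = positions[i + 5];
        const cx = positions[i + 6], cy = positions[i + 7], cz = positions[i + 8];
        // signed volume of tetrahedron (origin, a, b, c) = dot(a, cross(b, c)) / 6
        volume += (ax * (by * cz - bz * cy) +
                   ay * (bz * cx - bx * cz) +
                   az * (bx * cy - by * cx)) / 6;
    }
    return Math.abs(volume); // valid for a closed, consistently wound mesh
}

const oneFace = new Float32Array([1, 0, 0, 0, 1, 0, 0, 0, 1]);
const v = calculateVolume(oneFace); // → 0.1666... (1/6) for this single triangle
```

For a BufferGeometry, the array would come from geometry.attributes.position.array once any index has been expanded, so the baked (post-applyMatrix) vertices are what get measured.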
