I have two models loaded in a scene using OBJLoader and the three.js library. Some of the models are billboarded via model.setRotationFromQuaternion( camera.quaternion );
My goal is to draw lines from a vertex on a billboard to a vertex on the corresponding model. Each line should connect the nearest points on the two models when the scene is first loaded. The models rotate freely, so the lines will need to update as they rotate, staying connected to the same initial vertices.
Think of it like the billboard is a label and the line is connected between the label and somewhere on the rotating model.
How can I achieve this?
Below is a snippet of my code. The issue is that every model's position is (0, 0, 0), so I need to know how to get the location of a vertex on both the label and the model and connect the two.
addLabelLines() {
    var geometry = new THREE.Geometry();
    for (var i = 0; i < this.labels.length; i++) {
        var currentLabel = this.labels[i];
        var modelMatchingLabel;
        // find matching model
        for (var j = 0; j < this.models.length; j++) {
            if (currentLabel.name.toLowerCase().indexOf(this.models[j].name) >= 0) {
                modelMatchingLabel = this.models[j];
                break;
            }
        }
        if (!modelMatchingLabel) {
            console.warn("no model matching label " + currentLabel.name);
            continue; // skip this label instead of aborting the whole loop
        }
        console.log('found model ' + modelMatchingLabel.name + " matches label " + currentLabel.name);
        geometry.vertices.push(currentLabel.position);
        geometry.vertices.push(modelMatchingLabel.position);
    }
    var material = new THREE.LineBasicMaterial({ color: 0x800080 });
    // LineSegments draws each pair of vertices as an independent segment
    var line = new THREE.LineSegments(geometry, material);
    scene.add(line);
}
Finding the closest points between two meshes in order to draw the line is your typical "nearest neighbour" problem. Take a look at this example; it shows a very efficient (k-d tree based) way of achieving what you're looking for:
https://threejs.org/examples/webgl_nearestneighbour.html
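For a label with only a handful of candidate anchor points, a brute-force scan is often enough before reaching for a k-d tree. A minimal sketch (the function name and flat `[x0, y0, z0, x1, y1, z1, ...]` layout are assumptions, not part of the linked example):

```javascript
// Find the pair of vertices (one from each flat position array) with the
// smallest distance. O(n*m), so only suitable for small vertex counts;
// the linked three.js example uses a k-d tree for large meshes.
function nearestVertexPair(positionsA, positionsB) {
  var best = { a: -1, b: -1, distSq: Infinity };
  for (var i = 0; i < positionsA.length; i += 3) {
    for (var j = 0; j < positionsB.length; j += 3) {
      var dx = positionsA[i] - positionsB[j];
      var dy = positionsA[i + 1] - positionsB[j + 1];
      var dz = positionsA[i + 2] - positionsB[j + 2];
      var d = dx * dx + dy * dy + dz * dz;
      if (d < best.distSq) {
        best = { a: i / 3, b: j / 3, distSq: d };
      }
    }
  }
  return best;
}
```

Once you have the two vertex indices, read their live world-space positions each frame (e.g. via mesh.localToWorld() on a copy of the local vertex) and write them back into the line's geometry with geometry.verticesNeedUpdate = true, so the line stays attached while the model rotates.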
I've looked for resources online, but I have not seen a way to extrude a colored image in Three.js. I'm trying to create something like a Minecraft item where the image is used to then create an extruded geometry. An example would be: https://minecraft.gamepedia.com/File:BowSpinning3.gif
I've tried looking at this resource: https://muffinman.io/three-js-extrude-svg-path/ but this only extrudes uncolored SVGs.
loader.load('./textures/diamondbleu.svg', function (data) {
    // Group we'll use for all SVG paths
    const svgGroup = new THREE.Group();
    // When importing SVGs, paths are inverted on the Y axis;
    // it happens in the process of mapping from 2D to 3D coordinates
    svgGroup.scale.y *= -1;
    const material = new THREE.MeshLambertMaterial();
    // Loop through all of the parsed paths
    data.paths.forEach((path, i) => {
        // Each path has an array of shapes
        const shapes = path.toShapes(true);
        shapes.forEach((shape, j) => {
            // Finally we can take each shape and extrude it
            const geometry = new THREE.ExtrudeGeometry(shape, {
                depth: 20,
                bevelEnabled: false
            });
            // Create a mesh and add it to the group
            const mesh = new THREE.Mesh(geometry, material);
            svgGroup.add(mesh);
        });
    });
    // Get the group's size
    const box = new THREE.Box3().setFromObject(svgGroup);
    const size = new THREE.Vector3();
    box.getSize(size);
    const yOffset = size.y / -2;
    const xOffset = size.x / -2;
    // Offset all of the group's elements, to center them
    svgGroup.children.forEach(item => {
        item.position.x = xOffset;
        item.position.y = yOffset;
    });
    svgGroup.position.set(0, blockSize * 75, 0);
    // Finally we add the SVG group to the scene
    scene.add(svgGroup);
});
Is there a way to modify the code to allow for colored SVGs? Thanks!
You can use the SVGLoader that's available in the "examples/jsm/loaders/" folder.
The docs outline how to render SVGs in 3D space. Your code sample is missing the part where the path loop creates a new material and assigns a color for each path:
var material = new THREE.MeshBasicMaterial( {
    color: path.color,
    side: THREE.DoubleSide,
    depthWrite: false
} );
Your code creates a single LambertMaterial with no color assigned and no lights in the scene. Lambert materials need lights to be illuminated, whereas BasicMaterial shows its color without needing any lights.
Look at the code in this demo for another example. Instead of using path.color, that demo finds the color by accessing path.userData.style.fill. Depending on your SVG file, you'll probably want the latter approach.
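Since the color can live in either place depending on the three.js revision and the SVG file, one way to handle both is a small helper (the function name is hypothetical, not part of SVGLoader):

```javascript
// Resolve the fill color of a parsed SVG path: prefer the style fill
// (path.userData.style.fill), fall back to path.color, then to a default.
function resolvePathColor(path, fallback) {
  if (path.userData && path.userData.style && path.userData.style.fill) {
    return path.userData.style.fill;
  }
  if (path.color !== undefined && path.color !== null) {
    return path.color;
  }
  return fallback;
}
```

Inside the data.paths.forEach loop you would then build one material per path, e.g. `new THREE.MeshBasicMaterial({ color: resolvePathColor(path, 0x000000), side: THREE.DoubleSide })`, instead of sharing a single uncolored Lambert material.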
In a scene I have added several mesh objects (cubes). An EdgeHelper has been made for each cube. The cubes move and rotate and the Edgehelpers move and rotate with them.
I would like to change the color of an EdgeHelper when its associated cube mesh is selected. (The selection method is not important.)
So, given a particular cube mesh, how do I find the associated EdgeHelper object?
When you create an EdgesHelper for a given mesh, all you have to do is add a new property to the mesh:
var mesh = new THREE.Mesh( ... );
var edgesHelper = new THREE.EdgesHelper( mesh );
mesh.edgesHelper = edgesHelper;
Now you can change the helper color like so:
mesh.edgesHelper.material.color.set( 0xff0000 );
three.js r.76
When you create meshes and EdgeHelpers you can assign them the same .name attribute:
mesh0.name = 0;
edgeHelper0.name = 0;
mesh1.name = 1;
edgeHelper1.name = 1;
...and so on
(Even better if you wrap this in a loop.)
Then, when a mesh is selected, you can read its .name attribute and pick the corresponding EdgeHelper.
I'm experimenting with Bjørn Sandvik's really great process for importing terrain data into a scene.
Check it out:
http://blog.thematicmapping.org/2013/10/terrain-building-with-threejs.html
var terrainLoader = new THREE.TerrainLoader();
terrainLoader.load('../assets/jotunheimen.bin', function (data) {
    var geometry = new THREE.PlaneGeometry(60, 60, 199, 199);
    for (var i = 0, l = geometry.vertices.length; i < l; i++) {
        geometry.vertices[i].z = data[i] / 65535 * 10;
    }
    var material = new THREE.MeshPhongMaterial({
        color: 0xdddddd,
        wireframe: true
    });
    var plane = new THREE.Mesh(geometry, material);
    scene.add(plane);
});
My intent is to use this to display elevation data from a time series, so multiple .bin files will be loaded to provide data representing a period of several years to show change over time.
I am having difficulty updating the geometry with new data. I think the problem is that the plane and geometry variables are defined inside the callback, so they are undefined in the global scope, and later references to them have no value.
Does anyone have an idea of how I can update this geometry with new data loaded using the TerrainLoader?
Anything you .add() to the scene is visible as an element of the scene.children array, so you can still reference your plane, and its geometry as plane.geometry. If the plane is the only object in the scene, it will probably be scene.children[0].geometry.
See this page: https://github.com/mrdoob/three.js/wiki/Updates for hints on how to let THREE know the geometry is changing
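A sketch of a later update, using the same 16-bit scaling as the loader callback above (the second filename is made up; keep a reference to `plane`, or find it via scene.children):

```javascript
// Convert raw 16-bit terrain samples to scene heights, exactly as the
// original callback does with data[i] / 65535 * 10.
function toHeights(data, scale) {
  var heights = new Array(data.length);
  for (var i = 0; i < data.length; i++) {
    heights[i] = data[i] / 65535 * scale;
  }
  return heights;
}

// Then a later load can rewrite the existing plane's vertices in place
// and flag the geometry as dirty (three.js-dependent, so shown commented):
// terrainLoader.load('../assets/next-year.bin', function (data) {
//   var heights = toHeights(data, 10);
//   for (var i = 0; i < plane.geometry.vertices.length; i++) {
//     plane.geometry.vertices[i].z = heights[i];
//   }
//   plane.geometry.verticesNeedUpdate = true;
// });
```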
I am creating a line and adding it to the scene, and it displays fine.
But when I try to create a mesh using the same coordinates that form the line, I get errors about duplicate points:
"Warning, unable to triangulate polygon!
Duplicate point 653.4789181355854: 204.0166729191409
Either or not infinite solutions!
Its finite solutions.
Either or not infinite solutions!
Too bad, not solutions."
The strange thing is that there are more than 4000 coordinates, and I'm sure none of them is repeated. (I even checked in Excel; only the start and end coordinates repeat, which I understand is expected.)
What can I do? Is there no way to create the mesh from the line's points without these errors appearing, or what other steps should I follow?
for (var x in features.features) {
    materialLinea[x] = new THREE.LineBasicMaterial({ color: "#FFFFFF" });
    array_extrude[x] = new Array();
    material[x] = new THREE.MeshBasicMaterial({ color: "#FF0000" });
    geometria[x] = new THREE.Geometry();
    for (var s in features.features[x].geometry.coordinates[0]) {
        var coord = features.features[x].geometry.coordinates[0][s];
        geometria[x].vertices.push(new THREE.Vector3(coord[0], coord[1], 0));
        array_extrude[x].push(new THREE.Vector3(coord[0], coord[1], 0));
    }
    line[x] = new THREE.Line(geometria[x], materialLinea[x]);
    scene.add(line[x]);
    object3d[x] = new THREE.Shape(array_extrude[x]);
    var extrusionSettings = { bevelEnabled: false, amount: 10 };
    figuraExtrude[x] = new THREE.ExtrudeGeometry(object3d[x], extrusionSettings);
    municipios[x] = new THREE.Mesh(figuraExtrude[x], material[x]);
    scene.add(municipios[x]);
}
You can merge the duplicate vertices on your geometry with Geometry.mergeVertices().
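A simplified sketch of what such a merge does, as a standalone function (the name and the rounding-based tolerance are assumptions, not the actual mergeVertices implementation): collapse points that round to the same bucket, so the triangulator never sees two identical coordinates.

```javascript
// Remove points that coincide within `decimals` decimal places.
// Points are [x, y] pairs, matching the 2D contour fed to THREE.Shape.
function dedupePoints(points, decimals) {
  var seen = {};
  var out = [];
  var f = Math.pow(10, decimals);
  for (var i = 0; i < points.length; i++) {
    var key = Math.round(points[i][0] * f) + '_' + Math.round(points[i][1] * f);
    if (!seen[key]) {
      seen[key] = true;
      out.push(points[i]);
    }
  }
  return out;
}
```

For the Shape built in the code above, also make sure the contour's closing point is not pushed twice: THREE.Shape closes the contour itself, so dropping the duplicated last coordinate often makes the triangulation warning go away.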
I'm using a web-worker to load a .json file of an animated 3D model. For each of the big arrays (vertices, normals, etc.), I'm transferring an Float32Array buffer back to the UI thread. Since such buffers are transferable objects, this will take (almost) zero time.
Now, it turns out that WebGL (and therefore, Three.js) use Float32Array buffers internally, too. This means I could probably load this 3D animation without copying anything, spending almost zero time in the main thread. Isn't that nice?
But it's not clear how to do that part: In the main thread, we have array buffers available for the vertices, normals (the main ones, and the 'morph' ones) and faces. How do I create a working Geometry (or BufferGeometry) from these, without translating or copying the data?
var scene,
vertices, normals, faces,
morphVertices, morphNormals; // <-- we have all these as typed arrays
var geometry = ...; // <-- insert code here
var material = new THREE.MeshLambertMaterial({ morphTargets: true });
var object3D = new THREE.MorphAnimMesh(geometry, material);
scene.add(object3D);
This answer gives a hint, but only point 7 seems relevant; it assumes there is already a Geometry instance, and it doesn't handle morph targets.
Here's an example based on the mesh loading portion of THREE.GLTF2Loader.
// Create BufferGeometry and assign vertices and normals.
var geometry = new THREE.BufferGeometry();
geometry.addAttribute( 'position', new THREE.BufferAttribute( vertices, 3 ) );
geometry.addAttribute( 'normal', new THREE.BufferAttribute( normals, 3 ) );
geometry.setIndex( new THREE.BufferAttribute( faces, 1 ) ); // index itemSize is 1
// Create material.
var material = new THREE.MeshStandardMaterial({
    morphTargets: true,
    morphNormals: true
});
// Set up morph target attributes. Morph attributes store absolute
// positions, so add the base attribute to each relative offset.
var posAttr = new THREE.BufferAttribute( morphVertices, 3 );
var normAttr = new THREE.BufferAttribute( morphNormals, 3 );
for (var i = 0; i < posAttr.array.length; i++) {
    posAttr.array[i] += geometry.attributes.position.array[i];
}
for (var j = 0; j < normAttr.array.length; j++) {
    normAttr.array[j] += geometry.attributes.normal.array[j];
}
// Assign morph target attributes.
geometry.morphAttributes.position = [ posAttr ];
geometry.morphAttributes.normal = [ normAttr ];
// Create Mesh.
var mesh = new THREE.Mesh(geometry, material);
mesh.updateMorphTargets();
// Apply 50/50 blend of morph targets and default position/normals.
mesh.morphTargetInfluences[0] = 0.5;
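The two loops in the answer convert relative morph offsets into absolute targets in place. If you'd rather keep the original arrays untouched, the same step as a standalone function (the name is made up for illustration):

```javascript
// Build an absolute morph target from a base attribute array and an array
// of relative offsets, without mutating either input.
function toAbsoluteMorph(base, offsets) {
  var out = new Float32Array(offsets.length);
  for (var i = 0; i < offsets.length; i++) {
    out[i] = base[i] + offsets[i];
  }
  return out;
}
```

You would then wrap the result in a BufferAttribute, e.g. `new THREE.BufferAttribute(toAbsoluteMorph(vertices, morphVertices), 3)`, and assign it to geometry.morphAttributes.position as above.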
three.js r86-dev.