Three JS No Visual Update After Manually Editing Geometry - javascript

So I have a Geometry (the scope of this code is THREE.Geometry.prototype) that I am dynamically editing. newData is an object of the form { faces: [array of face indexes], vertices: [array of vertex indexes] }. (These arrays keep the same length as the original face and vertex arrays and look like [null, null, null, "4", "5", null, null, ...].)
Using these arrays, I step through all the faces and vertices and push each one into one of two new arrays, effectively splitting the data into two groups. I also update the vertex indexes (pointers) stored on the faces.
In the end I know I've updated the geometry and it is correct, but the changes I make aren't getting displayed. I've tried .elementsNeedUpdate, which causes an error (no property 'a' of undefined in InitWebGlObjects... I looked there and couldn't see a reference to a).
I've tried verticesNeedUpdate; it does nothing.
I've also tried updateCentroids in combination with the flags above. It does nothing.
I've heard you can't resize the buffer. What is the buffer, and what determines its length? The number of vertices I give to a model?
I've seen "You can emulate resizing by pre-allocating larger buffer and then keeping unneeded vertices collapsed / hidden." It sounds like that may be what I'm doing? How can I collapse/ hide a vertice? I haven't seen any references to that.
Thanks for your time!
var oldVertices = this.vertices;
var oldFaces = this.faces;
var newVertices = [];
var newFaces = [];
var verticeChanges = [];
this.vertices = [];
this.faces = [];

// Split the vertices into the two groups and remember each vertex's new index.
for (var i in oldVertices) {
    var curAr = (newData.vertices[i]) ? newVertices : this.vertices;
    curAr.push(oldVertices[i]);
    verticeChanges[i] = curAr.length - 1;
}

// Split the faces the same way and re-point their vertex indexes.
for (var i in oldFaces) {
    var curAr = (newData.faces[i]) ? newFaces : this.faces;
    oldFaces[i].a = verticeChanges[oldFaces[i].a];
    oldFaces[i].b = verticeChanges[oldFaces[i].b];
    oldFaces[i].c = verticeChanges[oldFaces[i].c];
    curAr.push(oldFaces[i]); // assign the face to its group
}

console.log('Vertices Cut from', oldVertices.length, "to:", newVertices.length, 'and', this.vertices.length);
console.log('Faces Cut from', oldFaces.length, "to:", newFaces.length, 'and', this.faces.length);

I recently ran into this problem myself. I found that if I'm adding vertices and faces to the geometry, I need to set this.groupsNeedUpdate = true in order to tell the renderer to update its internal buffers.

This may be connected to the following point from this tutorial:
"I just wanted to quickly point out a quick gotcha for Three.js, which is that if you modify, for example, the vertices of a mesh, you will notice in your render loop that nothing changes. Why? Well, because Three.js (as far as I can tell) caches the data for a mesh as something of an optimisation. What you actually need to do is to flag to Three.js that something has changed so it can recalculate whatever it needs to. You do this with the following:
// set the geometry to dynamic so that it allows updates
sphere.geometry.dynamic = true;
// changes to the vertices
sphere.geometry.__dirtyVertices = true;
// changes to the normals
sphere.geometry.__dirtyNormals = true;"
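Putting those flags together, a minimal sketch for the legacy THREE.Geometry API (flag names changed between three.js releases, so treat this as an assumption to adapt to your version):
mesh.geometry.dynamic = true;            // keep the buffers writable
mesh.geometry.verticesNeedUpdate = true; // vertex positions changed
mesh.geometry.elementsNeedUpdate = true; // face indices changed
mesh.geometry.normalsNeedUpdate = true;  // normals need to be resent
mesh.geometry.groupsNeedUpdate = true;   // face/material groups changed, rebuild internal buffers
Note that the underlying buffers generally cannot grow after the first render, which is why the pre-allocate-and-collapse trick mentioned in the question exists.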

Related

Programmatically create skeleton in Three.js

I am loading a model of a mechanism (e.g. a robot arm) in Three.js. Sadly the models I am using don't have a skeleton, but I have the locations, axes and so on of the joints. In order to use e.g. inverse kinematic solvers like Three-IK, I want to create a skeleton from these parameters. Since I want to use many different models I would prefer to not create the skeletons by hand but in code.
I have been trying for over a week now to create a valid bone structure from these values that reflects the model, but nothing succeeded. For example, if I create a chain of bones using the positions of the joints I get a very long skeleton which in no way matches the positions I used.
let boneParent;
let bonepos = [];
let bones = [];

model.traverse(child => {
    switch (child.type) {
        case "joint":
            let p = new Vector3();
            child.getWorldPosition(p);
            bonepos.push(p);

            let bone = new Bone();
            boneParent && boneParent.add(p);
            bone.worldToLocal(p.clone());
            bone.position.copy(p);
            bone.rotation.copy(child.rotation);
            bone.scale.copy(child.scale);

            boneParent = bone;
            bones.push(bone);
            break;
    }
});

showPoints(scene, bonepos, 0xff0000);

const skeletonHelper = new SkeletonHelper(bones[0]);
skeletonHelper.visible = true;
scene.add(skeletonHelper);
The code above results in the screenshot below. The red markers are the positions I get from the robot joints, the line snaking into the distance is the skeleton as visualized by the SkeletonHelper.
So my question is this: it seems like I don't understand well enough how bones are handled in Three.js. How can I create a skeleton that reflects my existing model from its joint locations and orientations?
Thanks in advance!
child.getWorldPosition(p);
I'm afraid it's incorrect to apply the position in world space to Bone.position, which represents the position in local space.
boneParent = bone;
This line looks problematic, too. A bone can have multiple child elements. It seems to me that this use case is not considered in your code.
After some fiddling around I found a solution:
let root = new Bone();
let parent = root;
let pos = new Vector3();

for (let joint of robot.arm.movable) {
    let link = robot.getLinkForJoint(joint);
    link.getWorldPosition(pos);

    let bone = new Bone();
    parent.add(bone);
    parent.lookAt(pos);
    parent.updateMatrixWorld(); // crucial for worldToLocal!
    bone.position.copy(bone.worldToLocal(pos));

    parent = bone;
}
The important part is to call updateMatrixWorld() after lookAt() so that bone.worldToLocal() works correctly. Also, lookAt() saves a lot of matrix hassle :)
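To visualise the result, the root of that chain can then be added to the scene and passed to a SkeletonHelper, much like in the question (a sketch only; scene is assumed to exist):
scene.add(root);
root.updateMatrixWorld(true); // make sure all bone world matrices are current
const helper = new SkeletonHelper(root);
scene.add(helper);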

Change color of shape without recreating

I have a project using EaselJS in which I'm trying to simply change the color of a shape object. I have a couple of examples, but they both seem to indicate I would need to completely redraw the shape with another graphics.beginFill() call. This seems like complete overkill considering that I've already stored my shape object and am able to perform other manipulations on it.
I've also tried to use ColorFilter and ColorMatrix but have not had luck making a clean color change. In some cases the color essentially "overwrites" the detail of the shape.
I have an array of ShapeObject that I store the createjs.Shape() object in.
var ShapeObject = function()
{
    this.name;
    this.shape;
    this.rotation;
    this.color;
};

sObject = new ShapeObject();
myShape = new createjs.Shape();
sObject.shape = myShape;

shapes = new Array();
shapes.push(sObject);
Later I am able to retrieve the shape from the array and apply a filter, for example,
s = shapes[i];
filter = new createjs.ColorFilter(0,0,0,1, 255,128,0,0);
s.shape.filters = [ filter ];
Using this same example I'd like to avoid having to completely recreate the shape. I have tried the following but, while it changes the color, I lose all the other details that were originally applied to the shape.
s.shape.graphics.clear().beginFill(color);
Does anyone have an idea of how to simply change the color without completely recreating the original shape?
EDIT
Following the answer regarding .command and the createjs command blog post I have created the following.
var theShape = new createjs.Shape();
var fillCommand = theShape.graphics.beginFill("yellow").command;
theShape.graphics.setStrokeStyle(1)
    .beginStroke(createjs.Graphics.getRGB(0, 0, 0))
    .drawCircle(0, 0, 20)
    .moveTo(0, -20)
    .lineTo(0, 0)
    .moveTo(0, 0)
    .lineTo(20, 0);
fillCommand.style = "orange";
Despite being nearly identical to the examples, I am receiving the error Uncaught TypeError: Cannot set property 'style' of undefined at the last line above.
I wouldn't use ColorFilter or ColorMatrix; they will likely be much slower. Instead you could use the Graphics command object; check out the docs: http://www.createjs.com/docs/easeljs/classes/Graphics.html
var fillCommand = myGraphics.beginFill("red").command;
// ... later, update the fill style/color:
fillCommand.style = "blue";
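For context, a rough sketch of the full flow (the stage setup, the circle and the colours are my own assumptions, not code from the answer):
var stage = new createjs.Stage("demoCanvas"); // assumed canvas id
var circle = new createjs.Shape();
var fill = circle.graphics.beginFill("red").command; // keep a handle on the Fill command
circle.graphics.drawCircle(0, 0, 40);
stage.addChild(circle);
stage.update();

// later: change only the fill colour and redraw the stage
fill.style = "blue";
stage.update();
Note that the .command property only exists in more recent EaselJS builds, so an older library version is one possible (though unconfirmed) explanation for the "Cannot set property 'style' of undefined" error in the EDIT above.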

Three.js: Updating Geometries vs Replacing

I have a scene with lots of objects using ExtrudeGeometry. Each of these needs to update every frame, where the shape that is being extruded is changing, along with the amount of extrusion. The shapes are being generated using d3's voronoi algorithm.
See example.
Right now I am achieving this by removing every object from the scene and redrawing them each frame. This is very costly and causing performance issues. Is there a way to edit each mesh/geometry instead of removing from the scene? Would this help with performance? Or is there a more efficient way of redrawing the scene?
I'd need to edit both the shape of the extrusion and the amount of extrusion.
Thanks for taking a look!
If you're not changing the number of faces, you can use morph targets http://threejs.org/examples/webgl_morphtargets.html
You should:
1. Create your geometry.
2. Clone the geometry and make your modifications to it, such as the maximum length of your geometry pillar.
3. Set both geometries as morph targets to your base geometry, for example:
baseGeo.morphTargets.push(
    { name: "targetName", vertices: [ modifiedVertexArray ] }
);
After that, you can animate the mesh using mesh.updateMorphTargets().
See http://threejs.org/examples/webgl_morphtargets.html
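A rough sketch of how that could be wired up with the legacy geometry API (the material settings, the target name and the influence value are my assumptions, not part of the answer):
var material = new THREE.MeshLambertMaterial({ morphTargets: true }); // material must opt in to morph targets
baseGeo.morphTargets.push({ name: "extruded", vertices: modifiedVertexArray }); // array of Vector3, same length as baseGeo.vertices

var mesh = new THREE.Mesh(baseGeo, material);
mesh.updateMorphTargets();

// blend between the base shape (0.0) and the morph target (1.0)
mesh.morphTargetInfluences[0] = 0.5;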
So I managed to come up with a way of not having to redraw the scene every time and it massively improved performance.
http://jsfiddle.net/x00xsdrt/4/
This is how I did it:
Created a "template geometry" with ExtrudeGeometry using a dummy
10 sided polygon.
As before, created a bunch of "points", this time assigning each
point one of these template geometries.
On each frame, iterated through each geometry, updating each vertex
to that of the new one (using the voronoi alg as before).
If there are extra vertices left over, "bunch" them up into a single point. (see http://github.com/mrdoob/three.js/wiki/Updates.)
Looking at it now, it's quite a simple process. Before, the thought of manipulating each vertex seemed otherworldly to me, but it's not actually too tricky with simple shapes!
Here's how I did the iteration; polyColumn is just a 2-item array with the same polygon in each item:
// Set the vertex index
var v = 0;

// Iterate over both top and bottom of poly
for (var p = 0; p < polyColumn.length; p++) {

    // Iterate over half the vertices
    for (var j = 0; j < verts.length / 2; j++) {

        // create correct z-index depending on top/bottom
        if (p == 1) {
            var z = point.extrudeAmount;
        } else {
            var z = 0;
        }

        // If there are still legitimate verts
        if (j < poly.length) {
            verts[v].x = poly[j][0];
            verts[v].y = poly[j][1];
            verts[v].z = z;

        // If we've got extra verts, bunch them up in the same place
        } else {
            verts[v].x = verts[v - 1].x;
            verts[v].y = verts[v - 1].y;
            verts[v].z = z;
        }

        v++;
    }
}

point.mesh.geometry.verticesNeedUpdate = true;
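One extra, hedged note: with the legacy Geometry API the geometry usually also needs to be flagged as dynamic before the first render so the renderer keeps its buffers updatable:
point.mesh.geometry.dynamic = true; // set once, before the first render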

ThreeJS - adding objects in different order affects alpha / display

My program creates a dynamic number of point cloud objects with custom attributes that include the alpha value of each particle. This works fine; however, when the objects are nested within each other (say, spheres), the smaller (inner) ones are obscured by the bigger ones, even though their particles' alpha is set properly. When I reverse the order of adding the point-cloud objects to the scene, starting with the bigger ones and going down to the smaller ones, I can see the smaller ones through the bigger ones.
My question is whether there is a way to tell the renderer to update or recalculate the alpha values or re-render the smaller inner objects so that they show up?
I ran into the same problem as you did. I fixed it by calculating and setting the renderDepth for each mesh. For this you need the camera position and the center of your mesh.
You have probably already created meshes for each object. If you save all these meshes into an array, it's easier to calculate and set the renderDepth on those objects.
Here's an example how I did it.
updateRenderDepthOnRooms(cameraPosition: THREE.Vector3): void {
    var rooms: Room[] = this.getAllRooms();
    rooms.forEach((room) => {
        var roomCenter = getCenter(room.mesh.geometry);
        var renderDepth = 0 - roomCenter.distanceToSquared(cameraPosition);
        room.mesh.renderDepth = renderDepth;
    });
}

function getCenter(geometry: THREE.Geometry): THREE.Vector3 {
    geometry.computeBoundingBox();
    var bb = geometry.boundingBox;
    var offset = new THREE.Vector3();
    offset.addVectors(bb.min, bb.max);
    offset.multiplyScalar(0.5);
    return offset;
}
So, to get the center of your object, you can take the geometry from your mesh and use the getCenter(..) function from my example. Then you calculate the renderDepth with the Three.js function distanceToSquared(..) and set that renderDepth on your mesh.
That's it. Hope this will help you.
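As a usage sketch (scene3d here stands in for whatever object owns updateRenderDepthOnRooms, and camera, renderer and scene are your existing Three.js objects), the depths would be refreshed every frame before rendering:
function animate() {
    requestAnimationFrame(animate);
    scene3d.updateRenderDepthOnRooms(camera.position); // recompute depths as the camera moves
    renderer.render(scene, camera);
}
animate();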

OpenLayers ModifyFeature not saving new vertices

I'm just getting started with OpenLayers, and have hit a small snag - when I create a LineString and then try to modify it, I can move the existing vertices and drag the virtual vertices to create new ones. When I continue to add to the line though, only the changes to the existing vertices are saved - new vertices are discarded. Am I missing something? You can see an example of what I'm talking about here:
http://dev.darrenhall.info/temp/open-layers/modify-feature/
Click to add points, and use the dots to edit, then click to continue adding to see what I mean. Any help would be appreciated! Thanks!
Darren
After a quick look, your code looks more complex than it should be.
You manually push points into an array on click and generate a LineString from those points, but you don't listen for any changes made through the virtual vertices. I don't see why, in your addWayPoint function, you don't read the geometry of the feature from the layer rather than from your array of points.
Using the real feature geometry instead of your route.waypoints would be a good start.
In the end I decided not to use modifyFeature, and instead went for using vectors as handles and manually handling the dragging and line modification. You can see my workaround here:
http://dev.darrenhall.info/temp/open-layers/draw-route
The guys at Ordnance Survey came up with a (rather simple) fix for my code though, which repopulates the array from the vertices after modification:
function addWayPoint(e) {
    var position = osMap.getLonLatFromViewPortPx(e.xy);
    if (route.waypoints.length > 1) {
        layers.lines.layer.removeFeatures([layers.lines.feature]);
    }

    /* vvvvvvvvvvv start */
    /* Get the potentially modified feature */
    if (modifyFeature.feature) {
        route.waypoints = [];
        var vertices = modifyFeature.feature.geometry.getVertices();
        for (i = 0; i < vertices.length; i++) {
            //console.log(vertices[i]);
            route.waypoints.push(vertices[i]);
        }
    }
    /* ^^^^^^^^^^^ end */

    route.waypoints.push(new OpenLayers.Geometry.Point(position.lon, position.lat));

    var string = new OpenLayers.Geometry.LineString(route.waypoints);
    layers.lines.feature = new OpenLayers.Feature.Vector(string, null, styles.pink);
    layers.lines.feature.attributes['id'] = 1;
    layers.lines.layer.addFeatures([layers.lines.feature]);

    for (i = 0; i < layers.lines.layer.features.length; i++) {
        if (layers.lines.layer.features[i].attributes.id == 1) {
            modifyFeature.selectFeature(layers.lines.layer.features[i]);
        }
    }
}
