Three.js PolyhedronGeometry Vertices Position - javascript

So I am trying to create my own shape with PolyhedronGeometry
Rough sketch: (image not included)
I'm running into problems. Specifically I'm trying to attach this shape onto a sphere, so using some formulas I came up with the following vertices:
[ -0.6495190528383289, -0.09943689110435831, 0.36157613954394185,
0, -0.09943689110435831, 0.7433789778353299,
0.6495190528383289, -0.09943689110435831, 0.36157613954394185,
0.3897114317029973, -0.39774756441743325, 0.7231522790878837,
0, -0.5966213466261499, 0.5947031822682639,
-0.3897114317029973, -0.39774756441743325, 0.7231522790878837 ]
then for the face indices I have:
[ 5,4,1, 5,1,0, 2,1,3, 4,3,1, 0,1,2, 0,4,5, 2,3,4 ]
When I add spheres as debugging points, they appear at the right place, but no matter how I adjust the vertices/faces, some of the positions are incorrect:
Is order important for the faces?
Why does my polyhedron not draw correctly?

The problem is that the lengths of the original vectors are not equal to each other, so their projections do not all land on the same sphere. You may also be passing the wrong sphere radius to PolyhedronGeometry.
For example, the length of the first vector in the set:
(new THREE.Vector3(
-0.6495190528383289,
-0.09943689110435831,
0.36157613954394185 )
).length() === 0.75
The length of the last vector in the set:
(new THREE.Vector3(
-0.3897114317029973,
-0.39774756441743325,
0.7231522790878837 )
).length() === 0.9127033163903814
If you pass the length of the first vector as the radius to PolyhedronGeometry, the last vector ends up outside a sphere of that radius. If you pass the length of the last vector, the first vector ends up inside the sphere.
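A minimal plain-JavaScript sketch of the fix: rescale every vertex onto a common radius before handing them to PolyhedronGeometry. The vertex values are the ones from the question; the `normalizeToRadius` helper name and the target radius of 1 are my own assumptions, not from the original answer.

```javascript
// Vertices from the question (flat [x, y, z, x, y, z, ...] array)
const vertices = [
  -0.6495190528383289, -0.09943689110435831, 0.36157613954394185,
  0, -0.09943689110435831, 0.7433789778353299,
  0.6495190528383289, -0.09943689110435831, 0.36157613954394185,
  0.3897114317029973, -0.39774756441743325, 0.7231522790878837,
  0, -0.5966213466261499, 0.5947031822682639,
  -0.3897114317029973, -0.39774756441743325, 0.7231522790878837,
];

// Rescale each vertex so it lies exactly at the given radius from the origin.
function normalizeToRadius(verts, radius) {
  const out = [];
  for (let i = 0; i < verts.length; i += 3) {
    const x = verts[i], y = verts[i + 1], z = verts[i + 2];
    const len = Math.sqrt(x * x + y * y + z * z);
    const s = radius / len;
    out.push(x * s, y * s, z * s);
  }
  return out;
}

// Every vertex now has length 1, so passing `scaled` (plus the face
// indices) to PolyhedronGeometry with radius 1 projects them consistently.
const scaled = normalizeToRadius(vertices, 1);
```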

Related

Rotating a vector around a sphere (simulating sun) in three.js

I am trying to adapt this code (which is itself an implementation of this). I have gotten the general visualization and rendering to work, and I'm now trying to animate some realistic movement.
The position of the light source is determined by a normalized vector (for example THREE.Vector3(1, 0.75, 0), so the light appears to come from the top right). Now what I would like to do is have the vector rotate around the sphere in such a way that it seems like the sphere is orbiting the light source. But I can't figure out how to do this. I've tried updating/changing the position of the vector, but I'm not sure how to calculate the proper next x, y, z values. I've tried applying Euler angles and a rotation matrix, like so:
euler = new THREE.Euler(f,g,h, 'XYZ');
matrix = new THREE.Matrix4().makeRotationFromEuler(euler);
light = vector.applyMatrix4(matrix);
But here, I'm not sure how to get the correct values of f,g,h such that the light doesn't just wobble around the sphere.
Am I even on the right track?
Working example: http://jsfiddle.net/VsWb9/3890/
You are linearly increasing two of the three Euler angles; that is the issue.
For rotations about anything other than the x/y/z axes, quaternions are the best choice for understanding, avoiding issues, ease of coding, and computational cost.
They are fairly intuitive in three.js, too. A quaternion has four components: three derived from the rotation axis, and a fourth derived from the rotation angle.
var quat=new THREE.Quaternion();
//we set the axis around which the rotation will occur. It needs to be normalized
var axis=new THREE.Vector3(0,1,0).normalize();
//and the angle value (radians)
var angle=0;
//this is your light vector (=original position of your light)
var light=new THREE.Vector3(1,0,0).normalize();
Then, in your render loop, you change the angle value and tell the quaternion to use the axis above with the updated angle:
angle += .001; // (or angle -= if your axis points in the +y direction, like here)
quat.setFromAxisAngle(axis,angle);
And finally apply it :
light.applyQuaternion(quat);
Updated fiddle : http://jsfiddle.net/Atrahasis/0v93p2xy/
Note about normalized vectors :
a normalized vector is 1-unit length.
( 1 , 0 , 0 ),( 0 , 1 , 0 ),( 0 , 0 , 1 ) are native-normalized vectors (and unit vectors).
( 1 , .75 , 0 ) is not a normalized vector: its length is √(1²+.75²) = √1.5625 = 1.25 (Pythagoras)
for example ( 0.6614... , .75 , 0 ) is a normalized vector
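For reference, the arithmetic above can be checked in plain JavaScript (no three.js required); dividing each component by the length is exactly what THREE.Vector3's normalize() does:

```javascript
// Length of (1, 0.75, 0): sqrt(1^2 + 0.75^2) = sqrt(1.5625) = 1.25
const v = [1, 0.75, 0];
const len = Math.hypot(v[0], v[1], v[2]);

// Dividing every component by the length yields a unit-length vector.
const n = v.map(c => c / len); // [0.8, 0.6, 0]
```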

How to get 3D point coordinates given UV coordinates on a 3d plane object - Threejs

I'm trying to build a simple data visualisation and my weapon of choice is Three.js. I have a series of PlaneGeometry meshes on which I apply a transparent texture, dynamically created with a series of red squares drawn on it at different opacity values. My plan is to use those points to create other meshes (e.g. CylinderGeometry) and place them on top of the red squares, with a height value based on each red square's opacity value. So far I have managed to find the UV values for each square and store them in an array, but I'm stuck at converting those red-square UV coordinates to the 3D world coordinate system. I've found several resources describing the same concept applied to a sphere, where it is surprisingly straightforward, but no resources about applying the same concept to other meshes.
How can I get the 3D coordinates of those red square inside the texture?
EDIT: I think this is it:
function texturePosToPlaneWorld(planeOb, texcoord)
{
    // shift the UV origin from the corner (0..1) to the plane's center pivot
    var pos = new THREE.Vector3();
    pos.x = (texcoord.x - 0.5) * PLANESIZE;
    pos.y = (texcoord.y - 0.5) * PLANESIZE;
    // transform from the plane's local space into world space
    pos.applyMatrix4(planeOb.matrix);
    return pos;
}
It is used like this in the jsfiddle I made: http://jsfiddle.net/L0rdzbej/2/
var texcoord = new THREE.Vector2(0.8, 0.65);
var newpos = texturePosToPlaneWorld(planeOb, texcoord);
cubeOb.position.copy(newpos);
Planes are simple. The edge between vertices A and B -- the vector A->B -- defines the direction of 'x' in your texture, and A->C similarly defines the other direction in which the plane extends in 3D space, where the texture's 'y' is mapped onto the plane.
Let's assume your pivot point is in the middle, so that point is known in world space. Then, as UV goes from 0 to 1, a UV coordinate of (1.0, 0.5) would be half the full width of the plane away from the pivot in the plane's 'x' direction -- going from the middle all the way to the edge -- and exactly in the middle in the other direction, where V (the normalized texture y coordinate) is 0.5.
To get the coordinates of the plane's vertices in world space, you just multiply them by the object's transformation matrix.
Given that you know the size of your plane, you actually don't need to look at the vertices, since the orientation of the plane is already in the object matrix. You just need to shift your UV coordinates to the pivot in the middle (subtract 0.5) and multiply by the plane size to get the point in plane space. The matrix multiplication then converts that to world space.

Get face rotation Three.js

I am getting the intersections of mouse click with Three.js like this
me.vector.set(
    (event.clientX / window.innerWidth) * 2 - 1,
    -(event.clientY / window.innerHeight) * 2 + 1,
    0.5);
me.vector.unproject(app.camera);
me.ray.set(app.camera.position, me.vector.sub(app.camera.position).normalize());
var intersects = me.ray.intersectObjects(app.colliders, false);
So I get the intersections perfectly, with the following properties:
distance, face, faceIndex, object, point -- and then I execute a function.
The problem is the following:
I want to detect when I click a face of a cube that acts like a floor; in the following example it would be the gray face.
WebGL defines vertices and faces with coordinates, colors, and normals. A face normal is a normalized vector, perpendicular to the face plane (and generally pointing 'outside' the mesh). It defines the face's orientation and enables the calculation of lighting, for instance. In three.js you can access it via face.normal.
If your floor-like faces are strictly horizontal, then their normals are all precisely {x:0, y:1, z:0}. And since normals are normalized, simply checking whether face.normal.y === 1 also guarantees that x and z equal 0.
If your faces are not strictly horizontal, you may need to set a limit angle with the y-axis. You can calculate this angle with var angle=Math.acos(Yaxis.dot(faceNormal)) where Yaxis=new THREE.Vector3(0,1,0).
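As a sketch, the angle test could look like this in plain JavaScript (the `isFloorFace` helper name and the 0.1-radian threshold are illustrative assumptions; `normal` is assumed to be normalized, as face normals are):

```javascript
// A face counts as "floor-like" when its normal is within maxAngle
// radians of the world up axis (0, 1, 0).
function isFloorFace(normal, maxAngle) {
  // The dot product of `normal` with (0, 1, 0) is simply normal.y,
  // so the angle to the y-axis is acos(normal.y).
  const angle = Math.acos(normal.y);
  return angle <= maxAngle;
}

isFloorFace({ x: 0, y: 1, z: 0 }, 0.1); // true  -- perfectly horizontal face
isFloorFace({ x: 1, y: 0, z: 0 }, 0.1); // false -- vertical wall face
```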

Three.js - ExtrudeGeometry using depth and a direction vector

I want to extrude a shape into an ExtrudeGeometry, but the shape has to be extruded in a certain direction, which I have as a Vector3.
The shape is drawn in the x, y plane, and normally z is the extrude direction (extrusion depth). So a direction vector of (0,0,1) results in the default extrusion, but, for example, (0,0,-1) would extrude the shape in the opposite direction.
I first tried to use an extrude path to achieve this, but when using a path the shape is allowed to "spin" freely and the initial orientation is arbitrary. This is not what I need, the shape must stay oriented as is. You can read details on this here in my previous question.
I already came up with the idea of applying a matrix to the second half of the vertices of the resulting ExtrudeGeometry, but I cannot seem to get the geometry I want. Maybe it is my clumsy use of matrices, but I think the face normals are pointing inside out after this trick.
Note The direction vector will never be orthogonal to the z axis since this would give invalid shapes
So the question:
How do I get a reliable solution to extrude my shape into the given direction. Here an example. The shape is a square in the x,y plane (width and length 2000) the extrusion depth is also 2000 and three different vectors with a drawing of the expected result seen in 2D (front view) and 3D.
Extrude your geometry in the usual way by specifying an extrusion depth, and then apply a shear matrix to your geometry.
Here is how to specify a shear matrix that will tilt a geometry.
var matrix = new THREE.Matrix4();
var dir = new THREE.Vector3( 0.25, 1, 0.25 ); // you set this. a unit-length vector is not required.
var Syx = dir.x / dir.y,
Syz = dir.z / dir.y;
matrix.set( 1, Syx, 0, 0,
            0, 1,   0, 0,
            0, Syz, 1, 0,
            0, 0,   0, 1 );
geometry.applyMatrix4( matrix );
(The three.js coordinate system has the y-axis up -- unlike in your illustration. You will have to accommodate.)
three.js r.113
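To see numerically what the shear does, here is a plain-JavaScript sketch applying the same matrix entries to sample points (the `dir` value and the 2000-unit extrusion depth match the example above; the `shear` helper is illustrative, not three.js API):

```javascript
// Same Syx / Syz as in the three.js snippet above.
const dir = { x: 0.25, y: 1, z: 0.25 };
const Syx = dir.x / dir.y;
const Syz = dir.z / dir.y;

// The shear matrix maps (x, y, z) to:
//   x' = x + Syx * y,   y' = y,   z' = z + Syz * y
function shear(p) {
  return { x: p.x + Syx * p.y, y: p.y, z: p.z + Syz * p.y };
}

// A vertex on top of a 2000-deep extrusion is pushed along dir,
// while the base (y = 0) stays put:
shear({ x: 0, y: 2000, z: 0 }); // { x: 500, y: 2000, z: 500 }
shear({ x: 0, y: 0, z: 0 });    // unchanged
```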

Generating a Sphere with Voxel

I have been playing around with voxeljs, I'm new to 3D programming, and it says in the doc that this code generates the "sphere world":
generate: function(x,y,z) {
return x*x+y*y+z*z <= 20*20 ? 1 : 0 // sphere world
},
How is this actually generating a sphere? From my simple understanding, I think it's basically "looping" through each "chunk" in the 3D world? Any further explanation or a pointer to a good tutorial on this would be a huge help!
Your function says:
If the voxel at (x, y, z) is part of the sphere, return 1, else 0.
The author applies the sphere equation: your sphere is formed by the set of voxels (x, y, z) with x² + y² + z² ≤ 20².
That basically means a voxel is part of the sphere if its distance to the center (0, 0, 0) is no greater than the radius. The distance is calculated using the Pythagorean theorem. By squaring the radius (in your case 20), you can compare it to the squared distance without calculating a square root.
This is based on the distance formula in three-dimensional space, since you can define a sphere as every point within a certain distance of the center point.
The distance between any two objects is equal to the square root of (x1-x2)^2 + (y1-y2)^2 + (z1-z2)^2.
The above function flags each voxel that is within 20 units of the origin. Since the origin is (0, 0, 0), the distance formula simplifies to the square root of x1^2 + y1^2 + z1^2. The function also uses another optimization: it gets rid of the square root by comparing the squared distance to 20^2.
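The whole callback can be sketched and checked in plain JavaScript (the `RADIUS` constant is just the question's value of 20 pulled out for clarity):

```javascript
// A voxel is solid (1) when its squared distance from the origin is
// within RADIUS^2 -- no square root needed.
const RADIUS = 20;

function generate(x, y, z) {
  return x * x + y * y + z * z <= RADIUS * RADIUS ? 1 : 0;
}

generate(0, 0, 0);    // 1 -- the center is inside
generate(20, 0, 0);   // 1 -- exactly on the surface
generate(12, 12, 12); // 0 -- distance ~20.78, outside
```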
