In the following line of code
mesh = new THREE.Mesh(new THREE.SphereGeometry(500,60,40),
new THREE.MeshBasicMaterial({map:texture,overdraw:true}));
What are the values 60 and 40 and what is their effect on the sphere?
mesh.scale.x = -1;
What does the above statement do?
I have gone through many articles, but none explains the above, and even the three.js documentation gives the syntax for use rather than a description.
Take a look at the Three.js documentation:
http://threejs.org/docs/#Reference/Extras.Geometries/SphereGeometry
So 60 and 40 are the numbers of segments the sphere is divided into, horizontally (widthSegments) and vertically (heightSegments). Higher values give a smoother sphere at the cost of more triangles.
mesh.scale.x = -1; mirrors the mesh along the x-axis, effectively turning the sphere "inside-out" so its texture is visible from within.
In general, the scale factor for an axis multiplies each vertex's coordinate on that axis. So the scale on the x-axis multiplies the x-component of every vertex position.
Try to avoid negative scaling factors; they can lead to very undesirable effects (flipped face winding, broken lighting). It is also generally recommended to scale a mesh uniformly on all three axes, something like:
var factor = 2.0;
mesh.scale.set(factor, factor, factor);
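If the point of the negative scale is to view the textured sphere from the inside (the typical panorama setup), a gentler alternative is to render the material's back faces instead of mirroring the geometry. A minimal sketch, assuming the same texture variable as above:
mesh = new THREE.Mesh(
    new THREE.SphereGeometry(500, 60, 40),
    new THREE.MeshBasicMaterial({ map: texture, side: THREE.BackSide })
);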
TL;DR: Given a set of triangle vertices and a normal vector (all in unit space), how do I calculate X, Y, Z Euler rotation angles of the triangle in world space?
I am attempting to display a 3D model in HTML - with actual HTML tags and CSS transforms. I've already loaded an OBJ file into a JavaScript class instance.
The model is triangulated. My first aim is just to display the triangles as planes (HTML elements are rectangular) - I'll be 'cutting out' the triangle shapes with CSS clip-path later on.
I am really struggling to understand and get the triangles of the model rotated correctly.
I thought a rotation matrix could help me out, but my only experience with those is where I already have the rotation vector and I need to convert and send that to WebGL. This time there is no WebGL (or tutorials) to make things easier.
The following excerpt shows the creation/'rendering' of the faces. I'm using the face normal as the rotation, but I know this is wrong.
for (const face of _obj.faces) {
    const vertices = face.vertices.map(_index => _obj.vertices[_index]);

    const center = [
        (vertices[0][0] + vertices[1][0] + vertices[2][0]) / 3,
        (vertices[0][1] + vertices[1][1] + vertices[2][1]) / 3,
        (vertices[0][2] + vertices[1][2] + vertices[2][2]) / 3
    ];

    // Each vertex has a normal but I am just picking the first vertex' normal
    // to use as the 'face normal'.
    const normals = face.normals.map(_index => _obj.normals[_index]);
    const normal = normals[0];

    // HTML element creation code goes here; reference is 'element'.

    // Set face position (unit space)
    element.style.setProperty('--posX', center[0]);
    element.style.setProperty('--posY', center[1]);
    element.style.setProperty('--posZ', center[2]);

    // Set face rotation, converting to degrees also.
    const rotation = [
        normal[0] * toDeg,
        normal[1] * toDeg,
        normal[2] * toDeg,
    ];

    element.style.setProperty('--rotX', rotation[0]);
    element.style.setProperty('--rotY', rotation[1]);
    element.style.setProperty('--rotZ', rotation[2]);
}
The CSS first translates the face on X,Y,Z, then rotates it on X,Y,Z in that order.
I think I need to 'decompose' my triangles' rotation into separate axis rotations - i.e. rotate on X, then on Y, then on Z to get the correct rotation as per the model face.
I realise that the normal vector gives me an orientation but not a rotation around itself - I need to calculate that. I think I have to determine a vector along one triangle side and cross it with the normal, but this is something I am not clear on.
I have spent hours looking at similar questions on SO but I'm not smart enough to understand or make them work for me.
Is it possible to describe what steps to take without LaTeX equations? I'm good with pseudocode, but my math skills are severely lacking.
The full code is here: https://whoshotdk.co.uk/cssfps/ (view HTML source)
The mesh building function is at line 422.
The OBJ file is here: https://whoshotdk.co.uk/cssfps/data/model/test.obj
The Blender file is here: https://whoshotdk.co.uk/cssfps/data/model/test.blend
The mesh is just a single plane at an angle, displayed in my example (wrongly) in pink.
The world is setup so that -X is left, -Y is up, -Z is into the screen.
Thank You!
If you have a plane and want to rotate it to be in the same direction as some normal, you need to figure out the angles between that plane's normal vector and the normal vector you want. The Euler angles between two 3D vectors can be complicated, but in this case the initial plane normal should always be the same, so I'll assume the plane normal starts pointing towards positive X to make the maths simpler.
You also probably want to rotate before you translate, so that everything is easier since you'll be rotating around the origin of the coordinate system.
By taking the general 3D rotation matrix (all three axis rotation matrices multiplied together; you can find it on the Wikipedia page) and applying it to the vector (1,0,0), you get the equations relating the three angles a, b, and c needed to rotate that initial vector onto the vector (x,y,z). This results in:
x = cos(a)*cos(b)
y = sin(a)*cos(b)
z = -sin(b)
Then rearrange these equations to find a, b, and c, which are the three angles you need (the three values of the rotation array, respectively):
a = atan2(y, x)   (atan2 rather than atan(y/x), so the correct quadrant is chosen)
b = asin(-z)
c = 0
So in your code this would look like:
const rotation = [
Math.atan2(normal[1], normal[0]) * toDeg,
Math.asin(-normal[2]) * toDeg,
0
];
It may be that you need to use a different rotation matrix (if the order of the rotations is not what you expected) or a different starting vector (although you can just use this method and then do an extra 90 degree rotation if each plane actually starts in the positive Y direction, for example).
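As a quick sanity check, here is the same calculation wrapped into a small helper (plain JavaScript; the function name is just for illustration, and the normal is assumed to be unit length):
function eulerFromNormal(normal) {
    var toDeg = 180 / Math.PI;
    return [
        Math.atan2(normal[1], normal[0]) * toDeg, // a
        Math.asin(-normal[2]) * toDeg,            // b
        0                                         // c
    ];
}
// For example, a normal pointing along +Y gives a = 90, b = 0:
// eulerFromNormal([0, 1, 0]) -> [90, 0, 0]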
I am trying to adapt this code (which itself is an implementation of this). I have gotten the general visualization and rendering to work, but I'm now trying to animate some realistic movement.
The position of the light source is determined by a normalized vector (for example THREE.Vector3(1, 0.75, 0); the light will appear to be coming from the top right). Now what I would like to do is have the vector rotate around the sphere in such a way that it seems like the sphere is orbiting the light source. But I can't figure out how to do this. I've tried updating/changing the position of the vector, but I'm not sure how to calculate the proper next x, y, z values. I've tried applying Euler angles and a rotation matrix, like so:
euler = new THREE.Euler(f,g,h, 'XYZ');
matrix = new THREE.Matrix4().makeRotationFromEuler(euler);
light = vector.applyMatrix4(matrix);
But here, I'm not sure how to get the correct values of f,g,h such that the light doesn't just wobble around the sphere.
Am I even on the right track?
Working example: http://jsfiddle.net/VsWb9/3890/
You are linearly increasing two of the three Euler-angle coordinates; that is the issue.
For rotations about anything other than the x/y/z axes, quaternions are the best choice for clarity, avoiding issues, ease of coding, and computational cost.
They are rather intuitive in three.js too: you build one from a rotation axis and a rotation angle.
var quat = new THREE.Quaternion();
// we set the axis around which the rotation will occur. It needs to be normalized
var axis = new THREE.Vector3(0, 1, 0).normalize();
// and the angle value (radians)
var angle = 0;
// this is your light vector (= original position of your light)
var light = new THREE.Vector3(1, 0, 0).normalize();
Then, in your render loop, you change the angle value and set the quaternion from the axis above and the updated angle:
angle += 0.001; // (or angle -= 0.001 to orbit the other way around the +y axis used here)
quat.setFromAxisAngle(axis, angle);
And finally apply it :
light.applyQuaternion(quat);
Updated fiddle : http://jsfiddle.net/Atrahasis/0v93p2xy/
Note about normalized vectors :
A normalized vector has a length of 1 unit.
( 1 , 0 , 0 ), ( 0 , 1 , 0 ) and ( 0 , 0 , 1 ) are already normalized (they are the unit axis vectors).
( 1 , .75 , 0 ) is not a normalized vector; its length is √(1² + .75²) = 1.25 (Pythagoras).
( 0.6614... , .75 , 0 ), for example, is a normalized vector.
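If you want to keep the original direction from the question, you can simply normalize it; dividing by the length 1.25 gives:
var light = new THREE.Vector3(1, 0.75, 0).normalize();
// (1, 0.75, 0) / 1.25 -> (0.8, 0.6, 0)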
I'm trying to build some simple data visualisation, and my weapon of choice is Three.js. I have a series of PlaneGeometry meshes on which I apply a dynamically created transparent texture with a series of red squares drawn on it at different opacity values. My plan is to use those points to create other meshes (e.g. CylinderGeometry) and place them on top of the red squares, with a height value based on each red square's opacity value. So far I have managed to find the UV values for each square and store them in an array, but I'm stuck at converting those red-square UV coordinates to the 3D world coordinate system. I've found several resources describing the same concept applied to a sphere, and surprisingly it is pretty straightforward, but nothing about applying the same concept to other meshes.
How can I get the 3D coordinates of those red squares inside the texture?
EDIT: I think this is it:
function texturePosToPlaneWorld(planeOb, texcoord)
{
    var pos = new THREE.Vector3();
    pos.x = (texcoord.x - 0.5) * PLANESIZE;
    pos.y = (texcoord.y - 0.5) * PLANESIZE;
    pos.applyMatrix4(planeOb.matrix);
    return pos;
}
It is used like this in the jsfiddle I made: http://jsfiddle.net/L0rdzbej/2/
var texcoord = new THREE.Vector2(0.8, 0.65);
var newpos = texturePosToPlaneWorld(planeOb, texcoord);
cubeOb.position.copy(newpos);
Planes are simple. The edge between vertices A and B -- the vector A->B -- defines the direction of the texture's 'x' in 3D space, and A->C similarly defines the other direction, onto which the texture's 'y' is mapped.
Let's assume your pivot point is in the middle, so that point is known in world space. Then, as UV goes from 0 to 1, a UV coordinate of (1.0, 0.5), for example, lies half the plane's full width away from the pivot in the A->B direction (from the middle all the way to the edge), and stays in the middle in the other direction, since V (the normalized texture y coordinate) is 0.5.
To get the coordinates of the plane's vertices in world space, you just multiply them by the object's matrix.
Since you know the size of your plane, you don't actually need to look at the vertices: the orientation of the plane is already in the object matrix. So you just need to make your UV coordinates relative to the pivot in the middle (subtract 0.5) and multiply by the plane size to get the point in plane space. The matrix multiplication then converts that to world space.
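For reference, the A->B / A->C idea can also be written directly in terms of three corner vertices, in case the plane size is not known up front. A sketch, assuming A is the corner where UV = (0,0), B the corner where UV = (1,0), and C the corner where UV = (0,1), all given in world space:
function uvToWorld(A, B, C, uv) {
    var ab = new THREE.Vector3().subVectors(B, A); // texture 'x' direction
    var ac = new THREE.Vector3().subVectors(C, A); // texture 'y' direction
    return new THREE.Vector3()
        .copy(A)
        .addScaledVector(ab, uv.x)
        .addScaledVector(ac, uv.y);
}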
I am getting the intersections of a mouse click with Three.js like this:
me.vector.set(
    (event.clientX / window.innerWidth) * 2 - 1,
    -(event.clientY / window.innerHeight) * 2 + 1,
    0.5);

me.vector.unproject(app.camera);

me.ray.set(app.camera.position, me.vector.sub(app.camera.position).normalize());

var intersects = me.ray.intersectObjects(app.colliders, false);
So I get the intersections perfectly, with the following properties:
distance, face, faceIndex, object, point, and then I execute a function.
The problem is the following:
I want to detect when I click a face of a cube that acts like a floor; in the example below it would be the gray face.
Sorry about my English D:
WebGL defines vertices and faces with coordinates, colors, and normals. A face normal is a normalized vector, perpendicular to the face plane (and generally pointing 'outside' the mesh). It defines the face's orientation and enables lighting calculations, for instance. In three.js you can access it via face.normal.
If your floor-like faces are strictly horizontal, then their normals are all precisely {x:0, y:1, z:0}. And since normals are normalized, simply checking whether face.normal.y === 1 also guarantees that x and z equal 0.
If your faces are not strictly horizontal, you may need to set a limit angle with the y-axis. You can calculate this angle with var angle=Math.acos(Yaxis.dot(faceNormal)) where Yaxis=new THREE.Vector3(0,1,0).
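A minimal sketch of that check inside the click handler (the 10-degree tolerance and the intersects array from the question are assumptions):
var Yaxis = new THREE.Vector3(0, 1, 0);
var maxAngle = 10 * Math.PI / 180; // tolerance in radians

if (intersects.length > 0) {
    var angle = Math.acos(Yaxis.dot(intersects[0].face.normal));
    if (angle < maxAngle) {
        // the clicked face is (close to) a floor
    }
}
Note that face.normal is given in the object's local space; if the cube itself is rotated, transform the normal into world space before comparing it with the y-axis.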
So I'm working on a particle emitter with javascript and canvas.
And I want to be able to set the direction the particles are emitted in, based on an angle.
This can be done with this function:
y = Math.tan(45 * Math.PI/180);
This returns 1 if the angle is 45, etc.
But I don't exactly know how I should implement this, since pixels are calculated a little differently. Think of -1 as removing one pixel each step and 1 as adding one pixel.
If the angle is 45, Y is 1 and X is 1 which is correct.
But to get a pixel traveling at 315 degrees Y is -1 and X should be 1.
And at 225 degrees Y should be -1 (but is 1) and X should be -1.
What should the function look like for it to work like this?
Here is an image of how I'm thinking:
(The emitter is at the origin.)
Actually, it's simple:
var degrees = angle % 360;              // keep degrees around for the quadrant checks
var radians = degrees * Math.PI / 180;
var tangent = Math.tan(radians);
Since the tangent alone does not tell you the signs of x and y:
var section_x_positive = (degrees < 90 || degrees > 270) ? 1 : -1;
var section_y_positive = (degrees > 0 && degrees < 180) ? 1 : -1;
var x = Math.abs(tangent) * section_x_positive;
var y = Math.abs(tangent) * section_y_positive;
It sounds to me like your problem is that you're thinking about direction, which is a vector quantity, as if it were a scalar.
You need to remember that a 2D vector is represented by two components:
x = r · cos(θ)
y = r · sin(θ)
You can work in terms of unit vectors, so the magnitude r = 1.
So if you have a direction angle θ, measured in radians, increasing counterclockwise and starting from the positive x-axis, these two components give you the unit vector that points in the direction you want.
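In code that might look like the following (a minimal sketch; the function name and the speed variable are just for illustration):
function directionFromAngle(degrees) {
    var radians = degrees * Math.PI / 180;
    return { x: Math.cos(radians), y: Math.sin(radians) };
}

// e.g. 45 degrees -> { x: ~0.707, y: ~0.707 }, 315 degrees -> { x: ~0.707, y: ~-0.707 }
var dir = directionFromAngle(315);
// Each step, move the particle along the unit vector:
// particle.x += dir.x * speed;
// particle.y += dir.y * speed;
Depending on how your image orients the axes (canvas y grows downward), you may need to negate dir.y.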