Three.js - ExtrudeGeometry using depth and a direction vector - javascript

I want to extrude a shape and create an ExtrudeGeometry, but the shape has to be extruded in a certain direction. I have the direction in a Vector3.
The shape is drawn in the x, y plane, and normally z is the extrude direction (the extrusion depth). So a direction vector of (0, 0, 1) would result in the default extrusion, but (0, 0, -1), for example, would extrude the shape in the other direction.
I first tried to use an extrude path to achieve this, but when using a path the shape is allowed to "spin" freely and the initial orientation is arbitrary. This is not what I need; the shape must stay oriented as it is. You can read the details in my previous question.
I already came up with the idea of applying a matrix to the second half of the vertices of the resulting ExtrudeGeometry, but I cannot seem to get the geometry I want. Maybe it is my clumsy use of matrices, but I think the face normals end up pointing inside out after this trick.
Note: the direction vector will never be orthogonal to the z axis, since that would give invalid shapes.
So the question:
How do I get a reliable solution to extrude my shape in the given direction? Here is an example. The shape is a square in the x, y plane (width and length 2000), the extrusion depth is also 2000, and there are three different direction vectors with a drawing of the expected result, seen in 2D (front view) and in 3D.

Extrude your geometry in the usual way by specifying an extrusion depth, and then apply a shear matrix to your geometry.
Here is how to specify a shear matrix that will tilt a geometry.
var matrix = new THREE.Matrix4();
var dir = new THREE.Vector3( 0.25, 1, 0.25 ); // you set this. a unit-length vector is not required.

var Syx = dir.x / dir.y,
    Syz = dir.z / dir.y;

matrix.set( 1, Syx, 0, 0,
            0,   1, 0, 0,
            0, Syz, 1, 0,
            0,   0, 0, 1 );

geometry.applyMatrix4( matrix );
(The three.js coordinate system has the y-axis up -- unlike in your illustration. You will have to accommodate.)
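If you keep the shape in the x, y plane and extrude along z, as in the question, the same trick works with the shear expressed relative to the z axis. A minimal sketch, assuming dir.z > 0 (a dir with negative z would additionally need a mirror in z):

var dir = new THREE.Vector3( 0.25, 0.25, 1 ); // desired extrusion direction; need not be unit length

var Sxz = dir.x / dir.z, // shear x proportionally to z
    Syz = dir.y / dir.z; // shear y proportionally to z

var matrix = new THREE.Matrix4();
matrix.set( 1, 0, Sxz, 0,
            0, 1, Syz, 0,
            0, 0, 1,   0,
            0, 0, 0,   1 );

geometry.applyMatrix4( matrix );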
three.js r.113

Related

Three.js PolyhedronGeometry Vertices Position

So I am trying to create my own shape with PolyhedronGeometry
rough sketch:
I'm running into problems. Specifically I'm trying to attach this shape onto a sphere, so using some formulas I came up with the following vertices:
[ -0.6495190528383289, -0.09943689110435831, 0.36157613954394185,
0, -0.09943689110435831, 0.7433789778353299,
0.6495190528383289, -0.09943689110435831, 0.36157613954394185,
0.3897114317029973, -0.39774756441743325, 0.7231522790878837,
0, -0.5966213466261499, 0.5947031822682639,
-0.3897114317029973, -0.39774756441743325, 0.7231522790878837 ]
then for the face indices I have:
[ 5,4,1, 5,1,0, 2,1,3, 4,3,1, 0,1,2, 0,4,5, 2,3,4 ]
When I add spheres as debugging points, they appear at the right place, but no matter how I adjust the vertices/faces, some of the positions are incorrect:
Is order important for the faces?
Why does my polyhedron not draw correctly?
The problem is that the lengths of the original vectors are not equal to each other, so their projections do not lie on a sphere of the same radius. You may also be specifying the radius of the sphere incorrectly.
For example, the length of the first vector in the set:
( new THREE.Vector3(
    -0.6495190528383289,
    -0.09943689110435831,
    0.36157613954394185 )
).length() === 0.75
And the length of the last vector in the set:
( new THREE.Vector3(
    -0.3897114317029973,
    -0.39774756441743325,
    0.7231522790878837 )
).length() === 0.9127033163903814
If you set the radius of the PolyhedronGeometry equal to the length of the first vector, the last vector ends up outside the sphere of that radius. If you set the radius equal to the length of the last vector, the first vector ends up inside the sphere.
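One way to make the lengths consistent is to normalize every vertex onto the unit sphere before building the geometry, and let the radius parameter do the scaling. A minimal sketch, assuming the question's vertex array is in vertices and the face indices in indices:

for (var i = 0; i < vertices.length; i += 3) {
    // rescale each vertex so it sits exactly on the unit sphere
    var v = new THREE.Vector3( vertices[i], vertices[i + 1], vertices[i + 2] ).normalize();
    vertices[i]     = v.x;
    vertices[i + 1] = v.y;
    vertices[i + 2] = v.z;
}
var radius = 0.75; // the sphere radius you want the shape to sit on
var geometry = new THREE.PolyhedronGeometry( vertices, indices, radius, 0 );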

Rotating a vector around a sphere (simulating sun) in three.js

I am trying to adapt this code (which itself is an implementation of this). I have gotten the general visualization and rendering to work, but I'm now trying to animate some realistic movement.
The point of the light source is determined by a normalized vector (for example THREE.Vector3(1, 0.75, 0), which makes the light appear to come from the top right). Now what I would like to do is have the vector rotate around the sphere in such a way that it seems like the sphere is orbiting the light source, but I can't figure out how to do this. I've tried updating/changing the position of the vector, but I'm not sure how to calculate the proper next x, y, z values. I've tried applying Euler angles and a rotation matrix, like so:
euler = new THREE.Euler(f,g,h, 'XYZ');
matrix = new THREE.Matrix4().makeRotationFromEuler(euler);
light = vector.applyMatrix4(matrix);
But here, I'm not sure how to get the correct values of f,g,h such that the light doesn't just wobble around the sphere.
Am I even on the right track?
Working example: http://jsfiddle.net/VsWb9/3890/
You are linearly increasing two of the three Euler angle coordinates; that is the issue.
For rotations other than around the x/y/z axes, quaternions are the best option for understanding the problem, avoiding issues, coding, and computational cost.
They are rather intuitive, in three.js too. They are made of 4 coordinates: 3 for a rotation axis, and a fourth for the rotation value.
var quat = new THREE.Quaternion();
// we set the axis around which the rotation will occur; it needs to be normalized
var axis = new THREE.Vector3( 0, 1, 0 ).normalize();
// and the angle value (radians)
var angle = 0;
// this is your light vector (= original position of your light)
var light = new THREE.Vector3( 1, 0, 0 ).normalize();
Then, in your render loop, you change the angle value and tell the quaternion to use the axis above with the updated angle:
angle += .001; // (or angle -= if your axis points in the +y direction, like here)
quat.setFromAxisAngle( axis, angle );
And finally apply it:
light.applyQuaternion( quat );
Updated fiddle : http://jsfiddle.net/Atrahasis/0v93p2xy/
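Putting the pieces together, a minimal render-loop sketch (one assumption here: the quaternion is built once from a small fixed angle and applied to the vector every frame, so the rotation accumulates steadily):

var axis = new THREE.Vector3( 0, 1, 0 ).normalize();
var step = 0.001; // fixed per-frame rotation angle (radians)
var quat = new THREE.Quaternion().setFromAxisAngle( axis, step );
var light = new THREE.Vector3( 1, 0, 0 ).normalize();

function animate() {
    requestAnimationFrame( animate );
    light.applyQuaternion( quat ); // rotates light by `step` around the y axis
    // ...update your shader uniform / redraw with the new light vector here...
}
animate();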
Note about normalized vectors:
A normalized vector is 1 unit in length.
( 1 , 0 , 0 ), ( 0 , 1 , 0 ), ( 0 , 0 , 1 ) are natively normalized vectors (unit vectors).
( 1 , .75 , 0 ) is not a normalized vector; its length is √(1² + .75²) = 1.25 (Pythagoras).
For example, ( 0.6614... , .75 , 0 ) is a normalized vector.
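In three.js, Vector3 can do the normalization for you; a quick sketch:

var v = new THREE.Vector3( 1, 0.75, 0 );
v.length();    // 1.25 -- not unit length
v.normalize(); // scales the vector down to length 1
// v is now ( 0.8, 0.6, 0 )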

How to get 3D point coordinates given UV coordinates on a 3d plane object - Threejs

I'm trying to build some simple data visualisation, and my weapon of choice is Three.js. I have a series of PlaneGeometry meshes on which I apply a transparent texture, dynamically created with a series of red squares drawn on it at different opacity values. My plan is to use those points to create other meshes (e.g. CylinderGeometry) and place them on top of the red squares, with a height value based on each red square's opacity value. So far I have managed to find the UV values for each square and store them in an array, but I'm stuck at converting those red square UV coordinates to the 3D world coordinate system. I've found several resources describing the same concept applied to a sphere, where it is surprisingly straightforward, but no resources about applying the same concept to other meshes.
How can I get the 3D coordinates of those red squares inside the texture?
EDIT: I think this is it:
function texturePosToPlaneWorld( planeOb, texcoord )
{
    var pos = new THREE.Vector3();
    pos.x = ( texcoord.x - 0.5 ) * PLANESIZE;
    pos.y = ( texcoord.y - 0.5 ) * PLANESIZE;
    pos.applyMatrix4( planeOb.matrix );
    return pos;
}
It is used like this in the jsfiddle I made: http://jsfiddle.net/L0rdzbej/2/
var texcoord = new THREE.Vector2(0.8, 0.65);
var newpos = texturePosToPlaneWorld(planeOb, texcoord);
cubeOb.position.copy(newpos);
Planes are simple. The edge between vertices A and B -- the vector A->B -- defines the direction of 'x' in your texture, and A->C similarly defines the other direction in which the plane extends in 3d space, the one your texture's 'y' is mapped onto.
Let's assume your pivot point is in the middle, so it is known in world space. Then, since UVs go from 0 to 1, the UV coordinate (1.0, 0.5) would, for example, sit half the plane's full width away from the pivot in the u direction -- going from the middle all the way to the edge -- and in the middle vertically, where you have 0.5 in V (V being the normalized texture y coordinate).
To get the coordinates of the plane's vertices in world space, you just multiply them by the object's matrix.
Given that you know the size of your plane, you don't actually need to look at the vertices, because the orientation of the plane is already in the object matrix. You just need to shift your UV coordinates to the pivot in the middle (-0.5) and multiply by the plane size to get the point in plane space. The matrix multiplication then converts that to world space.
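The same idea works for planes that are not square. A sketch (assuming planeWidth/planeHeight match the PlaneGeometry constructor arguments and the object's world matrix is up to date):

function uvToWorld( planeOb, uv, planeWidth, planeHeight ) {
    // map UV [0..1] to plane-local coordinates, centered on the pivot
    var pos = new THREE.Vector3(
        ( uv.x - 0.5 ) * planeWidth,
        ( uv.y - 0.5 ) * planeHeight,
        0 );
    // transform from the plane's local space into world space
    return planeOb.localToWorld( pos );
}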

WebGL Vertex Space Coordinates

I am trying to draw a simple rectangle in WebGL (I use WebGL like a 2d API). The idea is to send attributes (the points) and transform them in the vertex shader to fit on the screen. But when I render with the vertex shader gl_Position = vec4( a_point, 0.0, 1.0 ); I don't see anything. I looked at WebGL Fundamentals for 2d WebGL, and it does not seem to work on my computer. There are rectangles, but I think they are not at the right coordinates!
Can you explain how to draw a rectangle in a custom coordinate system:
-width/2 < x < width/2
-height/2 < y < height/2
and then transform them in the vertex shader so they have the same position in each browser (Chrome, Firefox, Internet Explorer 11)? It seems very simple, but I have not reached my goal. I tried to apply a transformation to the vertices in the vertex shader too. Maybe I can use the viewport?
In WebGL, all coordinates are in the range from -1.00 to +1.00 (floating point). They automatically map to whatever canvas width and height you have. By default, you don't use absolute pixel numbers in WebGL.
If you set a vertex (point) at x = 0.00 and y = 0.00, it will be in the center of your screen. If one of the coordinates goes below -1 or above +1, it will be outside of your rendered canvas, and some pixels of your triangle won't even be passed to the fragment shader (a fragment is a pixel of your framebuffer).
This approach guarantees that all of your objects will have the same relative position and size, no matter how many pixels your canvas has.
If you want to have an object of a specific pixel size, you can pre-calculate it like this:
var objectWidth = OBJECT_WIDTH_IN_PIXEL / CANVAS_WIDTH;
var objectHeight = OBJECT_HEIGHT_IN_PIXEL / CANVAS_HEIGHT;
In some cases, as you will see below, it's better to know the half width and half height in the floating-point -1.00 to +1.00 space. To position this object's center at the center of your canvas, you need to set up your vertex data like this:
var vVertices = new Float32Array([
    -(objectWidth / 2.0), -(objectHeight / 2.0), 0.0, // Point 1, Triangle 1
    +(objectWidth / 2.0), -(objectHeight / 2.0), 0.0, // Point 2, Triangle 1
    -(objectWidth / 2.0), +(objectHeight / 2.0), 0.0, // Point 3, Triangle 1
    -(objectWidth / 2.0), +(objectHeight / 2.0), 0.0, // Point 4, Triangle 2
    +(objectWidth / 2.0), -(objectHeight / 2.0), 0.0, // Point 5, Triangle 2
    +(objectWidth / 2.0), +(objectHeight / 2.0), 0.0  // Point 6, Triangle 2
]);
The above vertex data sets up two triangles to create a rectangle in the center of your screen. Many of these things can be found in tutorials like WebGL Fundamentals by Greggman.
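If you would rather keep the vertex positions in pixel units, as in the question's -width/2..+width/2 system, you can do the conversion in the vertex shader instead. A sketch (the u_resolution uniform is an assumption; you would set it from JavaScript to the canvas size):

var vertexShaderSource = [
    'attribute vec2 a_point;    // position in pixels, -width/2..+width/2',
    'uniform vec2 u_resolution; // canvas size in pixels, set from JavaScript',
    'void main() {',
    '    // dividing by half the resolution maps pixel units into -1..+1 clip space',
    '    vec2 clip = a_point / (u_resolution * 0.5);',
    '    gl_Position = vec4(clip, 0.0, 1.0);',
    '}'
].join('\n');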
Please have a look at this post:
http://games.greggman.com/game/webgl-fundamentals/
It shows how to do 2d drawing with WebGL.
I guess you can easily adapt it to suit your need for custom 2d space coordinates.

3D normal/look-at vector from Euler angles

I'm working on a JavaScript/Canvas 3D FPS-like engine and desperately need a normal vector (or look-at vector, if you will) for near and far plane clipping. I have the x and y axis rotation angles and am able to do it easily with only one of them at a time, but I just can't figure out how to get both of them...
The idea is to use this vector to calculate a point in front of the camera; the near and far clipping planes must also be definable by constants, so the vector has to be normalized. I hoped that with only the angles it would be possible to get this vector's length to 1 without normalizing, but that's not the problem.
I don't have any roll (rotation around z axis) so it's that much easier.
My math looks like this:
zNear = 200; // near plane at an arbitrary 200 "points" away from the camera position

// normal calculated with only the y rotation angle (vertical axis)
normal = {
    x: Math.sin(rotation.y),
    y: 0,
    z: Math.cos(rotation.y)
};
Then clip a point in 3D space by testing the vector from the plane to it by means of a dot product.
nearPlane = {
    x: position.x + normal.x * zNear,
    y: position.y + normal.y * zNear,
    z: position.z + normal.z * zNear
};

// test a point at x, y, z against the near clipping plane
if (
    (nearPlane.x - x) * normal.x +
    (nearPlane.y - y) * normal.y +
    (nearPlane.z - z) * normal.z < 0
) {
    return;
}
// then project the 3D point to screen
When a point is behind the player, its projection coordinates are reversed (x = -x, y = -y), so nothing makes sense any more; that's why I'm trying to clip those points away.
I want that green arrow there, but in 3D.
After some intensive brain processing, I figured out that:
My original look-at vector was (0, 0, 1)
The z-rotation angle (roll) was always 0
There was no reason for the rotation matrix found on Wikipedia not to work
By applying the full rotation matrix to the (0, 0, 1) vector and taking into account that rz = 0, the solution I got was:
normal = {
    x: Math.cos(camera.rotation.x) * Math.sin(camera.rotation.y),
    y: -Math.sin(camera.rotation.x),
    z: Math.cos(camera.rotation.y) * Math.cos(camera.rotation.x)
};
And now everything works perfectly. The error was using only the separate x and y rotation matrices, instead of the full rotation matrix with rz = 0 substituted in, which changes it a little.
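As a sanity check, the same vector can be produced in three.js by rotating the default look direction with an Euler in 'YXZ' order (an assumption that corresponds to applying the x rotation first and then the y rotation, with rz = 0):

var euler = new THREE.Euler( camera.rotation.x, camera.rotation.y, 0, 'YXZ' );
var lookAt = new THREE.Vector3( 0, 0, 1 ).applyEuler( euler );
// lookAt.x === Math.cos(rotation.x) * Math.sin(rotation.y)
// lookAt.y === -Math.sin(rotation.x)
// lookAt.z === Math.cos(rotation.x) * Math.cos(rotation.y)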
