Cesium JS ellipsoid tangent plane calculation - javascript

Problem
I am a bit confused about how Cesium calculates the geodetic and geocentric surface normals. Planes generated from the calculated normals are not actually tangent to the ellipsoid surface at a given point; what's more, the plane created from the geodetic surface normal is exactly the same as the one generated from the geocentric surface normal.
Example & why I need this
In this Sandcastle, XYZ axes along with planes that should be tangent to the ellipsoid are drawn when you click on the globe surface.
What I need is to get an XY plane tangent to the ellipsoid at a given point and then project other points onto this plane. The problem is that, as far as I can see, the plane is not tangent to the ellipsoid at all.
By the way, when I use the built-in method Cesium.EllipsoidTangentPlane(), I also get a strange result: the plane is still not tangent to the ellipsoid and is also floating somewhere in space. Here is a sandcastle.
I'd appreciate any help, because I do not understand what is happening here.

The normal of the plane is defined in the local coordinate system of the plane, whose x, y, and z axes indicate the east, north, and up directions on the earth (world, or ECEF), respectively.
So if the normal vector of a plane is given in ECEF, you have to convert it into the plane's local coordinate system, like this:
// get the local coordinate system (east-north-up frame) of the plane
var transform = Cesium.Transforms.eastNorthUpToFixedFrame(clickedPoint);
// get the inverse matrix (ECEF -> local)
var inv = Cesium.Matrix4.inverseTransformation(transform, new Cesium.Matrix4());
// in this case the world normal coincides with the up direction (the local z axis);
// to avoid numerical error we extend the point slightly outward along it
var extendedWorldNormal = Cesium.Cartesian3.multiplyByScalar(clickedPoint, 1.001, new Cesium.Cartesian3());
// the result will be nearly (0, 0, 1)
var localNormal = Cesium.Matrix4.multiplyByPoint(inv, extendedWorldNormal, new Cesium.Cartesian3());
//var localNormal = new Cesium.Cartesian3(0, 0, 1);
Please check this sandcastle and this sample.
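If the goal is just a plane tangent to the ellipsoid at the clicked point, here is a minimal sketch (assuming a picked ECEF position named clickedPoint and an arbitrary ECEF point named somePoint) that builds the plane from the geodetic surface normal and projects a point onto it:
// the geodetic surface normal is the unit vector perpendicular to the
// ellipsoid surface at the given point
var ellipsoid = Cesium.Ellipsoid.WGS84;
var normal = ellipsoid.geodeticSurfaceNormal(clickedPoint, new Cesium.Cartesian3());
// plane through the clicked point with that normal; in ECEF this plane
// is tangent to the ellipsoid at clickedPoint
var tangentPlane = Cesium.Plane.fromPointNormal(clickedPoint, normal);
// project another ECEF point onto the tangent plane
var projected = Cesium.Plane.projectPointOntoPlane(tangentPlane, somePoint, new Cesium.Cartesian3());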

Related

3D model in HTML/CSS; Calculate Euler rotation of triangle

TLDR; Given a set of triangle vertices and a normal vector (all in unit space), how do I calculate the X, Y, Z Euler rotation angles of the triangle in world space?
I am attempting to display a 3D model in HTML - with actual HTML tags and CSS transforms. I've already loaded an OBJ file into a JavaScript class instance.
The model is triangulated. My first aim is just to display the triangles as planes (HTML elements are rectangular) - I'll be 'cutting out' the triangle shapes with CSS clip-path later on.
I am really struggling to understand and get the triangles of the model rotated correctly.
I thought a rotation matrix could help me out, but my only experience with those is where I already have the rotation vector and I need to convert and send that to WebGL. This time there is no WebGL (or tutorials) to make things easier.
The following excerpt shows the creation/'rendering' of the faces. I'm using the face normal as the rotation, but I know this is wrong.
for (const face of _obj.faces) {
    const vertices = face.vertices.map(_index => _obj.vertices[_index]);

    const center = [
        (vertices[0][0] + vertices[1][0] + vertices[2][0]) / 3,
        (vertices[0][1] + vertices[1][1] + vertices[2][1]) / 3,
        (vertices[0][2] + vertices[1][2] + vertices[2][2]) / 3
    ];

    // Each vertex has a normal but I am just picking the first vertex's normal
    // to use as the 'face normal'.
    const normals = face.normals.map(_index => _obj.normals[_index]);
    const normal = normals[0];

    // HTML element creation code goes here; reference is 'element'.

    // Set face position (unit space)
    element.style.setProperty('--posX', center[0]);
    element.style.setProperty('--posY', center[1]);
    element.style.setProperty('--posZ', center[2]);

    // Set face rotation, converting to degrees also.
    const rotation = [
        normal[0] * toDeg,
        normal[1] * toDeg,
        normal[2] * toDeg,
    ];

    element.style.setProperty('--rotX', rotation[0]);
    element.style.setProperty('--rotY', rotation[1]);
    element.style.setProperty('--rotZ', rotation[2]);
}
The CSS first translates the face on X,Y,Z, then rotates it on X,Y,Z in that order.
I think I need to 'decompose' my triangles' rotation into separate axis rotations - i.e. rotate on X, then on Y, then on Z to get the correct rotation as per the model face.
I realise that the normal vector gives me an orientation but not a rotation around itself - I need to calculate that. I think I have to determine a vector along one triangle side and cross it with the normal, but this is something I am not clear on.
I have spent hours looking at similar questions on SO but I'm not smart enough to understand or make them work for me.
Is it possible to describe what steps to take without Latex equations? I'm good with pseudo code but my Math skills are severely lacking.
The full code is here: https://whoshotdk.co.uk/cssfps/ (view HTML source)
The mesh building function is at line 422.
The OBJ file is here: https://whoshotdk.co.uk/cssfps/data/model/test.obj
The Blender file is here: https://whoshotdk.co.uk/cssfps/data/model/test.blend
The mesh is just a single plane at an angle, displayed in my example (wrongly) in pink.
The world is set up so that -X is left, -Y is up, -Z is into the screen.
Thank You!
If you have a plane and want to rotate it to be in the same direction as some normal, you need to figure out the angles between that plane's normal vector and the normal vector you want. The Euler angles between two 3D vectors can be complicated, but in this case the initial plane normal should always be the same, so I'll assume the plane normal starts pointing towards positive X to make the maths simpler.
You also probably want to rotate before you translate, so that everything is easier since you'll be rotating around the origin of the coordinate system.
By taking the general 3D rotation matrix (all three 3D rotation matrices multiplied together, you can find it on the Wikipedia page) and applying it to the vector (1,0,0) you can then get the equations for the three angles a, b, and c needed to rotate that initial vector to the vector (x,y,z). This results in:
x = cos(a)*cos(b)
y = sin(a)*cos(b)
z = -sin(b)
Then rearranging these equations to find a, b and c, which will be the three angles you need (the three values of the rotation array, respectively):
a = atan(y/x)
b = asin(-z)
c = 0
So in your code this would look like:
const rotation = [
    Math.atan2(normal[1], normal[0]) * toDeg,
    Math.asin(-normal[2]) * toDeg,
    0
];
It may be that you need to use a different rotation matrix (if the order of the rotations is not what you expected) or a different starting vector (although you can just use this method and then do an extra 90 degree rotation if each plane actually starts in the positive Y direction, for example).
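As a quick sanity check (a sketch under the same assumption that the untransformed plane normal points towards positive X), rotating (1, 0, 0) by the derived angles should reproduce the target normal:
// rotate the starting vector (1, 0, 0) by a (around Z) and b (around Y),
// i.e. evaluate the first column of the combined rotation matrix
function rotateUnitX(a, b) {
    return [
        Math.cos(a) * Math.cos(b),
        Math.sin(a) * Math.cos(b),
        -Math.sin(b)
    ];
}

const normal = [0.6, 0.0, 0.8]; // hypothetical unit normal
const a = Math.atan2(normal[1], normal[0]);
const b = Math.asin(-normal[2]);
console.log(rotateUnitX(a, b)); // ~[0.6, 0.0, 0.8]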

Create a triangle around a point, perpendicular to a normal

I'd like to get the points of a triangle around a point, where the face would point in the direction of a specified normal. I'll be using THREE.js to add them to a BufferGeometry.
Very crude drawing:
Here's the code I have so far:
//The XYZ location of a point:
var x = model.points[i*3];
var y = model.points[i*3+1];
var z = model.points[i*3+2];
//The normal vector direction:
var nx = model.normals[i*3];
var ny = model.normals[i*3+1];
var nz = model.normals[i*3+2];
How can I pick 3 more points around this point that are all perpendicular to the normal and the same distance from the point / each other?
THANKS!
1) Take the cross product of the normal with an arbitrary non-parallel vector. This gives you a vector perpendicular to the normal vector.
1.5) Normalize and scale the perpendicular vector to the desired size. The length of this vector will be the distance from the triangle's centroid to each of its vertices.
2) Rotate the perpendicular vector by 2PI/3 and 4PI/3 around the normal vector.
3) Add the 3 vectors to the center point (see the sketch below).
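A minimal sketch of these steps with THREE.js (the radius argument is an assumption - the desired centroid-to-vertex distance):
function triangleAroundPoint(center, normal, radius) {
    const n = normal.clone().normalize();
    // 1) cross the normal with any non-parallel vector to get a
    //    perpendicular direction
    const arbitrary = Math.abs(n.x) < 0.9
        ? new THREE.Vector3(1, 0, 0)
        : new THREE.Vector3(0, 1, 0);
    const perp = new THREE.Vector3().crossVectors(n, arbitrary);
    // 1.5) normalize and scale to the centroid-to-vertex distance
    perp.normalize().multiplyScalar(radius);
    // 2) rotate by 2PI/3 and 4PI/3 around the normal
    const v1 = perp.clone();
    const v2 = perp.clone().applyAxisAngle(n, 2 * Math.PI / 3);
    const v3 = perp.clone().applyAxisAngle(n, 4 * Math.PI / 3);
    // 3) add the three vectors to the center point
    return [v1, v2, v3].map(v => v.add(center));
}

// usage with the point and normal from the question
const corners = triangleAroundPoint(
    new THREE.Vector3(x, y, z),
    new THREE.Vector3(nx, ny, nz),
    0.5 // hypothetical size
);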
Note that there are infinitely many triangles that fit your criteria, even if we limit ourselves to equilateral triangles. This is because there is an entire plane perpendicular to the given vector <nx, ny, nz> through the given point (x, y, z). Read here to see how to derive the equation for that plane. From there, you will need to pick a point on the plane. Then you can calculate the other two points by rotating around the given point at (x, y, z).
You need to find the plane perpendicular to the normal and containing the point (there is only one), then pick any point in this plane at the specified distance and rotate it twice by 120 degrees around the central point.

Project visible pixels in one view onto another

In WebGL or in pure matrix math I would like to match the pixels in one view to another view. That is, imagine I take the pixel at x,y = 0,0. This pixel lies on the surface of a 3D object in my world. I then orbit around the object slightly. Where does that pixel that was at 0,0 now lie in my new view?
How would I calculate a correspondence between each pixel in the first view with each pixel in the second view?
The goal of all this is to run a genetic algorithm to generate camouflage patterns that disrupt a shape from multiple directions.
So I want to know what the effect of adding a texture over the object would be from multiple angles. I want the pixel correspondences because rendering all the time would be too slow.
To transform a point from world to screen coordinates, you multiply it by the view and projection matrices. So if you have a pixel on the screen, you can multiply its coordinates (in the range -1..1 for all three axes) by the inverse transforms to find the corresponding point in world space, then multiply that by the new view/projection matrices for the next frame.
The catch is that you need the correct depth (Z coordinate) if you want to find the movement of mesh points. For that, you can either trace a ray through that pixel and find its intersection with your mesh the hard way, or simply read the contents of the Z-buffer by rendering it to a texture first.
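For the unprojection step, here is a minimal sketch (using gl-matrix; all names are assumptions) that takes a pixel position plus the depth value read back from the Z-buffer and returns the world-space point:
import { vec4 } from 'gl-matrix';

function pixelToWorld(px, py, depth, width, height, invViewProj) {
    // pixel -> normalized device coordinates in [-1, 1]
    // (depth buffer values in [0, 1] map to NDC z the same way)
    const ndc = vec4.fromValues(
        (px / width) * 2 - 1,
        1 - (py / height) * 2, // flip: pixel y grows down, NDC y grows up
        depth * 2 - 1,
        1
    );
    // inverse view-projection: clip space -> world space
    const world = vec4.transformMat4(vec4.create(), ndc, invViewProj);
    // perspective divide
    return [world[0] / world[3], world[1] / world[3], world[2] / world[3]];
}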
A similar technique is used for motion blur, where the velocity of each pixel is calculated in a fragment shader. A detailed explanation can be found in GPU Gems 3, chapter 27.
I made a jsfiddle with this technique: http://jsfiddle.net/Rivvy/f9kpxeaw/126/
Here's the relevant fragment code:
// reconstruct normalized device coordinates
ivec2 coord = ivec2(gl_FragCoord.xy);
vec4 pos = vec4(v_Position, texelFetch(u_Depth, coord, 0).x * 2.0 - 1.0, 1.0);
// convert to previous frame
pos = u_ToPrevFrame * pos;
vec2 prevCoord = pos.xy / pos.w;
// calculate velocity
vec2 velocity = -(v_Position - prevCoord) / 8.0;
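On the JavaScript side, the u_ToPrevFrame matrix used above can be composed like this (a sketch with gl-matrix; the matrix names are assumptions and the actual fiddle may build it differently). It maps current-frame clip coordinates back through world space into the previous frame's clip coordinates:
import { mat4 } from 'gl-matrix';

const currViewProj = mat4.multiply(mat4.create(), currProjection, currView);
const prevViewProj = mat4.multiply(mat4.create(), prevProjection, prevView);

// invert the current view-projection: clip space -> world space
const invCurrViewProj = mat4.invert(mat4.create(), currViewProj);

// compose: current clip -> world -> previous clip
const toPrevFrame = mat4.multiply(mat4.create(), prevViewProj, invCurrViewProj);

// upload as the u_ToPrevFrame uniform
gl.uniformMatrix4fv(u_ToPrevFrameLoc, false, toPrevFrame);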

How to get 3D point coordinates given UV coordinates on a 3d plane object - Threejs

I'm trying to build a simple data visualisation and my weapon of choice is Three.js. I have a series of PlaneGeometry meshes on which I apply a transparent texture, dynamically created with a series of red squares drawn on it at different opacity values. My plan is to use those squares to create other meshes (e.g. CylinderGeometry) and place them on top of the red squares, with a height value based on each red square's opacity value. So far I have managed to find the UV values for each square and store them in an array, but I'm getting blocked at converting those red-square UV coordinates to the 3D world coordinate system. I've found several resources describing the same concept applied to a sphere, where it is surprisingly straightforward, but no resources about applying the same concept to other meshes.
How can I get the 3D coordinates of those red square inside the texture?
EDIT: I think this is it:
function texturePosToPlaneWorld(planeOb, texcoord)
{
    var pos = new THREE.Vector3();
    pos.x = (texcoord.x - 0.5) * PLANESIZE;
    pos.y = (texcoord.y - 0.5) * PLANESIZE;
    pos.applyMatrix4(planeOb.matrix);
    return pos;
}
It is used like this in the jsfiddle I made: http://jsfiddle.net/L0rdzbej/2/
var texcoord = new THREE.Vector2(0.8, 0.65);
var newpos = texturePosToPlaneWorld(planeOb, texcoord);
cubeOb.position.copy(newpos);
Planes are simple. For the edge between vertices A and B, the vector A->B defines the direction of 'x' in your texture, and A->C similarly defines the other direction the plane goes in 3D space, where the texture's 'y' is mapped onto the plane.
Let's assume your pivot point is in the middle, so that point is known in world space. Then, as UV coordinates go from 0 to 1, e.g. the UV coordinate (1.0, 0.5) would be halfway across the full width of the plane in the direction of the vector from your Plane object's pivot, going from the middle all the way to the edge - and in the middle in the other direction, where you have 0.5 in V (the normalized texture y coordinate).
To get the coordinates of the vertices of the plane in world space, you just multiply them by the orientation of the object.
Given that you know the size of your plane, you actually don't need to look at the vertices, as the orientation of the plane is already in the object matrix. So you just need to adapt your UV coordinate to the pivot in the middle (-0.5) and multiply by the plane size to get the point in plane space. Then the matrix multiplication converts that to world space.
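For reference, the same idea generalized to a plane with separate width and height might look like this (a sketch; the width/height parameters and the use of matrixWorld instead of matrix are assumptions):
function uvToWorld(planeOb, uv, width, height) {
    // move the UV origin to the pivot in the middle of the plane,
    // then scale to plane space
    var pos = new THREE.Vector3(
        (uv.x - 0.5) * width,
        (uv.y - 0.5) * height,
        0
    );
    // the object's world matrix carries the plane's orientation and
    // position, converting plane space to world space
    return pos.applyMatrix4(planeOb.matrixWorld);
}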

How to hide parts of 3D objects that stick out of the back of other (complex) 3D objects?

I'm rendering a complex 3D mesh with Three.js (an iliac bone). Then I'm rendering some simple spheres along with this mesh to mark certain points on the surface (where muscles would attach):
The problem is, the mesh is quite thin in some areas, and the markers will stick out the back.
Assume that the marker coordinates are always closer to the front face of the mesh than the back face, and that the spheres always show more surface area / volume on the front of the mesh than on the back:
How could I hide the parts that extrude out the back without manually intervening for specific markers?
Edit: Here's a (naive?) way I might do it. I would like feedback on the feasibility of the idea, and (some pointers to writing) actual code to do it:
for each marker sphere:
    find all faces of the mesh that intersect with the sphere
    compute all outward-facing normal vectors of those faces (vertex normals? face normals?)
    compute all distances from the center of the face to the center of the sphere
    add all those normal vectors, weighted by their respective distances
    given the (normalized?) result vector, hide the hemisphere pointing in that direction
I'm not sure how to code any of those steps. Nor am I sure if this is even a sensible approach.
Draw hemispheres instead of full spheres.
Use the phiStart and phiLength parameters of the SphereGeometry constructor.
The centers of the spheres will still be on the surface of the bone (a vertex).
The orientation of each sphere is given by the normal calculated at the sphere's origin.
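A minimal sketch of this approach (the marker size, material, and variable names are assumptions; with phiStart = 0 and phiLength = PI the hemisphere's dome points towards +Z):
var radius = 0.1; // hypothetical marker size
// half sweep around the vertical axis -> a hemisphere
var geometry = new THREE.SphereGeometry(radius, 16, 16, 0, Math.PI);
var marker = new THREE.Mesh(geometry, markerMaterial);
marker.position.copy(surfacePoint);
// rotate the hemisphere's +Z dome axis onto the surface normal
marker.quaternion.setFromUnitVectors(
    new THREE.Vector3(0, 0, 1),
    normal.clone().normalize()
);
scene.add(marker);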
Three.js already calculates the normals for a mesh in order to determine how light will bounce from the mesh. You can use the VertexNormalsHelper to display normals for your mesh:
var bone = ...; // bone mesh
var scene = ...; //your THREE.Scene
scene.add(new THREE.VertexNormalsHelper(bone));
The source code for VertexNormalsHelper can be found here: VertexNormalsHelper
You have to calculate the difference angles between the normal vector and the oZ axis, obtaining difX and difY. These are the amounts by which you must rotate your sphere around the X and Y axes to make it perpendicular to the local surface of the bone.
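A small sketch of that computation (a judgement call on axis order; assumes three.js' default 'XYZ' Euler order and a unit-length normal):
// tilts that carry the sphere's +Z axis onto the surface normal
var difY = Math.asin(normal.x);
var difX = Math.atan2(-normal.y, normal.z);
sphere.rotation.set(difX, difY, 0);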
