Compute UV coordinates for threecsg mesh - javascript

For a university project I created a ThreeCSG subtract in Three.js. I want to apply a texture to this mesh, but the missing UV coordinates after the CSG processing are causing me trouble. It needs to be a ThreeCSG operation, because that is a project requirement.
This is how the mesh looks now: screenshot link
I found some code here: THREE.js generate UV coordinate
That got me closer to the solution: the side faces now show the applied texture correctly: screenshot link
The top side has many weird faces. I tried to use THREE.SimplifyModifier to reduce the face count so I could calculate and set the UV coordinates myself, but I failed.
I thought a solution might be to "just iterate" over the top and bottom sides and "cut off" the texture at the border, the way it would be if the mesh were a cube. The mesh has about 350 faces; I could probably set the corner vertices myself, but it would be nice if the UV coordinates of the vertices in between could be calculated. However, I have no idea how to do this. I don't care about the side where the cylinder is cut off, because it will not be visible in the end.
Thank you so much!
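One way to realise the "cut off at the border" idea is a simple planar projection for the top and bottom faces: map each vertex's x/z into the mesh's bounding box, the way the texture would land on a cube's top face. A minimal plain-JavaScript sketch (the helper name is illustrative; in three.js you would write the result into geometry.attributes.uv):

```javascript
// Box-projection for up/down-facing faces: each vertex's (x, z) is
// normalised into [0, 1] across the bounding box, so the texture is
// projected straight down onto the top of the mesh.
function planarUVs(positions, minX, maxX, minZ, maxZ) {
  const uvs = [];
  for (let i = 0; i < positions.length; i += 3) {
    uvs.push(
      (positions[i] - minX) / (maxX - minX),     // u from x
      (positions[i + 2] - minZ) / (maxZ - minZ)  // v from z
    );
  }
  return uvs;
}

// Two vertices of a 10x10 footprint: one at a corner, one mid-edge.
const uv = planarUVs([0, 1, 0, 5, 1, 10], 0, 10, 0, 10);
// uv is [0, 0, 0.5, 1]
```

This automatically gives the in-between vertices sensible coordinates, since they fall between the corners' UVs by construction.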

The CSG library I'm maintaining preserves UV coordinates from cut geometry.
https://github.com/manthrax/THREE-CSGMesh
http://vectorslave.com/csg/CSGDemo.html
http://vectorslave.com/csg/CSGShinyDemo.html

Related

Why is the light moving in this WebGL example?

Here is an example of Gouraud interpolation and a Lambertian reflection model from a textbook.
https://jsfiddle.net/zhenghaohe/r73knp0h/6/
However, the textbook contains an error: it says the code should contain the following line, when in fact it does not.
vec3 light = vec3(uModelViewMatrix * vec4(uLightDirection, 0.0));
The weird thing is the example still seems to work.
I am aware that the sphere is rotating because of
mat4.rotate(modelViewMatrix, modelViewMatrix, angle * Math.PI / 180, [0, 1, 0]);
However, it seems to me that the light is also moving with the sphere, but I cannot find where in the code the light is being moved around.
Can someone please point me to the code where we also rotate the light?
The light does not rotate; it is fixed in a static position and direction. The problem here is that you do not seem to understand what a normal is and how it is used in computer graphics.
A computer model is a series of "vertices" that connect to form "faces" (usually triangles). When "realistic" light is introduced into a scene, an additional piece of information is necessary to determine how it should interact with each face of the model; this is called a "normal". A normal is a directional vector that generally forms a line perpendicular to a face, but it does not have to, and this will become important for your problem. This normal is used to compute how light interacts with that surface.
So you have three sets of data: the vertices, the indices (how the vertices come together to form faces), and the normals (computed automatically in your example). The problem arises when you make transformations to the model (like rotation) but do not perform similar transformations to the normals that were computed before the transformation.
Let's visualize this... say we have the following pyramid with one of its normals drawn to illustrate the problem:
Now when we start to rotate the pyramid but leave the normals' directions unchanged, we see that the angle between the normal and the face begins to change.
For things to work as expected we need to also rotate the normals so that the angle relative to the face does not change.
The angle of the light relative to the surface normal is what dictates how the surface is shaded. When you rotate the model, the normals begin pointing in "random" directions; this interferes with the light computation, and it appears as if the light is rotating, but it is not.
Obviously this is a very watered-down explanation of what is happening, but it should give you a basic understanding of what a normal is and why you need to apply transformations to normals as well.
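To make this concrete, here is a minimal plain-JavaScript sketch (no library assumed) of why the normal must turn with the model. For a pure rotation, the normal matrix equals the rotation itself (the inverse transpose of an orthonormal matrix is the matrix), so applying the model's rotation to the normal keeps it perpendicular to the face:

```javascript
// Rotate a vector around the Y axis, the same rotation the sphere
// undergoes via mat4.rotate(..., [0, 1, 0]) in the question.
function rotateY(v, angle) {
  const c = Math.cos(angle), s = Math.sin(angle);
  return [v[0] * c + v[2] * s, v[1], -v[0] * s + v[2] * c];
}

const faceNormal = [0, 0, 1];                    // normal before the model rotates
const rotated = rotateY(faceNormal, Math.PI / 2);
// rotated is ~[1, 0, 0]: the normal turned with the face. The angle
// between the fixed light direction and this normal now changes every
// frame, and that changing angle is what reads as "the light moving".
```

In the shader, this is exactly what multiplying the normal by the normal matrix (the inverse transpose of the model-view matrix) accomplishes.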

Three js rotate mesh towards a sphere

I'm kind of new to Three.js and I've been struggling with this for a while.
I have a 3d model facing a certain direction.
There is also a sphere around it, and before moving the mesh I want to animate its rotation so it will face the specified sphere.
So far I have managed to get the angle of rotation, but I suppose that is not the way to go.
This is what I use for rotating the object towards a specified point:
if(movementTarget) { playerModel.lookAt(movementTarget); }
and this is the content of movementTarget:
movementTarget = {x:154,y:55,z:35};
It seems like the model is not actually orienting towards the sphere but towards an empty spot; I am not sure what the issue is.
I have managed to solve the issue. The coordinate system used a global scale variable that amplified the distance between the objects. Calling lookAt() oriented the model in the correct direction, but the coordinates had not been multiplied by that scale, since they came straight from the server.
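For the animation part mentioned in the question, one manual alternative (a sketch, not the original code; the helper name yawToward is made up) is to compute the target yaw angle yourself and tween rotation.y towards it instead of snapping with lookAt():

```javascript
// Yaw (rotation around Y) that a model at `position` needs in order
// to face `target`, assuming the model faces +Z when rotation.y is 0.
function yawToward(position, target) {
  const dx = target.x - position.x;
  const dz = target.z - position.z;
  return Math.atan2(dx, dz);
}

const angle = yawToward({ x: 0, y: 0, z: 0 }, { x: 154, y: 55, z: 35 });
// animate playerModel.rotation.y towards `angle` (e.g. with a tween
// library) rather than snapping instantly with lookAt()
```

Note also that Object3D.lookAt expects a THREE.Vector3 (or three numbers); a plain { x, y, z } object literal may not behave as expected in all three.js versions.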

How do I make the Three.js camera look at the face of an object?

I have a sphere made of hexagons and pentagons, and I am trying to make the camera look directly at a particular hexagon, so that the hex is at the centre of the user's view and appears flat.
The hexagons are made using the hexasphere.js plugin (https://github.com/arscan/hexasphere.js/tree/master). I am able to extract information from a mesh object which makes up a hex. But I don't know how to take the object info and tell the camera where to go.
I have tried using the normal matrix of the mesh and finding the Euler angles, but I don't know what to do with them after that.
OK, I've found a solution. The hexasphere plugin provides the centre point of a face via hexasphereObj.tiles[i].centrePoint, which is a point object with a method project(radius, percent) that returns the coordinates of a point projected out from the centre of the hexasphere through the centre of the face.
I was then able to move the camera to this projected point and have it lookAt the centre of the hexasphere.
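That projection can also be reproduced in plain maths if you are not using the plugin's helper; projectFromCentre and the 1.5 factor below are illustrative assumptions, not part of hexasphere.js:

```javascript
// Scale a point outward from the sphere's centre, mimicking what
// centrePoint.project(radius, percent) describes: the result lies on
// the line from the sphere's centre through the face's centre.
function projectFromCentre(point, radius, percent) {
  const len = Math.hypot(point.x, point.y, point.z);
  const s = (radius * percent) / len;
  return { x: point.x * s, y: point.y * s, z: point.z * s };
}

const faceCentre = { x: 0, y: 0, z: 10 };             // hypothetical tile centrePoint
const camPos = projectFromCentre(faceCentre, 10, 1.5);
// camPos is { x: 0, y: 0, z: 15 }; place the camera there and call
// camera.lookAt(0, 0, 0) to view the hex face straight on
```

Because the camera sits on the face's outward axis and looks back through it, the hexagon fills the centre of the view and appears flat.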

Rounded Plane In THREE JS

Three.js can often seem angular and straight-edged. I haven't used it for very long and am struggling to understand how to curve the world, so to speak. I would imagine a renderer or something must be changed. The idea is to take a 2D map and turn it into a simple three-lane running game. If you look at the picture below from another similar game, how can I achieve the fish-eye effect?
I would do that kind of effect on a per-vertex basis, depending on the distance from the camera.
Also, a slightly tweaked perspective camera with a bigger vertical FOV would boost the "curviness" effect.
It's just a simple distortion effect that has been simulated in some way; the world probably isn't really curved. Hope this helps.
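A minimal sketch of that per-vertex idea, assuming positions is a flat [x, y, z, ...] array (like geometry.attributes.position.array) and the track runs along negative Z; the quadratic falloff and the strength value are assumptions:

```javascript
// Lower each vertex proportionally to the square of its distance from
// the camera along Z, faking a world that curves away over the horizon.
function curveWorld(positions, cameraZ, strength) {
  for (let i = 0; i < positions.length; i += 3) {
    const dz = positions[i + 2] - cameraZ;   // distance along the track
    positions[i + 1] -= strength * dz * dz;  // drop far-away vertices
  }
  return positions;
}

const verts = curveWorld([0, 0, 0, 0, 0, -10], 0, 0.01);
// the far vertex's y dropped from 0 to -1
```

In practice you would run this every frame and set needsUpdate on the position attribute, or do the same maths in a vertex shader so the CPU never touches the vertices.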
I'm sure there are many possible approaches... Here's one that creates a nice barrel-distortion effect.
You can do something like that by rendering a normal wide-angle camera to a texture, then projecting it onto a lens-shaped plane (even a sphere), with the actual on-screen render coming from a camera pointing at that.
I don't have the code available at the moment, but I should be able to dig it up in a few days if you're interested. Or you can just adapt one of the three.js examples: three.js includes some post-processing examples where the scene is first rendered into a texture, that texture is applied to a quad, and the quad is then photographed with an orthographic camera. You can modify such an example by changing the orthographic camera to a perspective one and distorting/changing the quad into something more appropriately shaped.
Taken to extremes, this approach can produce some pixelization / blocky artifacts.
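The distortion this render-to-texture approach simulates boils down to a radial remapping of texture coordinates. A self-contained sketch of the standard radial lens-distortion model (the coefficient k is an assumption):

```javascript
// Remap a UV coordinate radially about the image centre: each point is
// pushed away from the centre by a factor that grows with its squared
// radius, i.e. the classic model u' = u * (1 + k * r^2).
function distortUV(u, v, k) {
  const x = u * 2 - 1;                  // move origin to image centre
  const y = v * 2 - 1;
  const r2 = x * x + y * y;
  const f = 1 + k * r2;                 // radial distortion factor
  return [(x * f + 1) / 2, (y * f + 1) / 2];
}

const centre = distortUV(0.5, 0.5, 0.5);  // [0.5, 0.5]: centre unchanged
const edge = distortUV(1.0, 0.5, 0.5);    // [1.25, 0.5]: edge pushed outward
```

Applied as the UV lookup in a full-screen post-processing pass (in practice you would write the same maths in a fragment shader), this bends the rendered image without curving any geometry.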

webgl shadow mapping gl.DEPTH_COMPONENT

Hey, I'm trying to implement shadow mapping in WebGL using this example:
tutorial
What I'm trying to do is:
initialize the depth texture and framebuffer;
draw the scene to that framebuffer with a simple shader, then draw a new scene with a box that uses the depth texture as its texture, so I can view the depth map with another shader.
I think it looks OK with the color texture, but I can't get it to work with the depth texture; it renders all white.
I put the code on Dropbox:
source code
Most of it is in the files index.html, webgl_all.js and objects.js.
There are also some light shaders I'm not using at the moment.
I really hope somebody can help me.
Greetings from Denmark!
This could have several causes:
For common setups of the near and far planes, normalized depth values will be high enough to appear all white for most of the scene, even though they are not actually identical. (Remember that a depth texture has a precision of at least 16 bits, while your screen output has only 8 bits per color channel, so a depth texture may appear all white even when its values are not all identical.)
On some setups (e.g. desktop OpenGL), a texture may appear all white when it is incomplete, that is, when texture filtering is set to use mipmaps but not all mipmap levels have been created. This may be the same with WebGL.
You may have hit a bug in the browser's WebGL implementation.
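To test the first cause, convert a sampled depth value back to an eye-space distance before judging it. This is the standard perspective-depth linearization formula, not code from the linked project; near and far must match the camera that rendered the depth texture:

```javascript
// Stored depth is nonlinear: nearly the whole view range collapses
// into values close to 1.0, which looks all white on an 8-bit screen.
function linearizeDepth(d, near, far) {
  const zNdc = d * 2 - 1;  // [0, 1] depth -> [-1, 1] NDC z
  return (2 * near * far) / (far + near - zNdc * (far - near));
}

// With near = 0.1 and far = 100, a stored depth of 0.99 is only about
// 9.1 units into the scene; everything farther away squeezes into the
// top 1% of depth values, hence the white image.
const z = linearizeDepth(0.99, 0.1, 100);
```

Doing the same computation in the fragment shader that displays the depth map (and dividing by far to get a 0..1 value) usually reveals the expected grayscale gradient.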
