I loaded a Mesh from a JSON File, here is my current result:
my Project
It is an object I exported from Blender as JSON; I then used its vertices to create a geometry of points (THREE.Points), which is important for its look.
I am now looking for a way to "animate" the Points, so that the "surface" looks vivid / living. So basically it should be moving around a bit, something like this (without the rotation):
Link to animated Gif
I have ruled out displacementMap, as it does not work for PointsMaterial (or does someone know a workaround?).
Does anyone have hints or ideas? I thought of maybe morphing between 2-3 objects, but I am not sure whether this will work for a points mesh.
One approach to achieve your desired effect is to use morph target animation (also called vertex morphing). As you can see at the following example, three.js does support morph target animations with points.
https://threejs.org/examples/webgl_morphtargets_sphere.html
There is a lot of existing literature about vertex morphing, so it should be no problem to get familiar with this technique. I suggest you create your animations in Blender, export the model to glTF, and then load the file via GLTFLoader into your app, as shown in the example.
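The arithmetic behind a morph target is simple enough to sketch in a few lines. The function below is my own minimal illustration of the per-vertex blend (not three.js's actual implementation, which runs on the GPU):

```javascript
// Per-component morph blend: each animated position is the base position
// plus a weighted offset toward every morph target. Oscillating an
// influence between 0 and 1 makes the surface "breathe".
function morphVertex(base, targets, influences) {
  return base.map((v, i) =>
    targets.reduce((sum, target, k) => sum + influences[k] * (target[i] - v), v)
  );
}
```

In three.js terms, you would store the target positions in `geometry.morphAttributes.position`, build the `THREE.Points` from that geometry, and drive `points.morphTargetInfluences[0]` each frame, e.g. with `0.5 + 0.5 * Math.sin(time)`.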
Related
We've been working on drawing blueprints with Three.js. We want to convert our 3D object into manufacturing blueprints (top orthographic view on the drawings).
What we have:
What we want to export:
Has anybody dealt with this? I cannot find a workable solution on the internet. We have been struggling with this for a long time :\
You can just rotate your object and then download the image from the canvas:
How do you save an image from a Three.js canvas?
You can also change the object type from a full mesh to an armature so that it becomes transparent.
Another option, if you need some post-processing, is to export your 3D object to Blender (there are many exporter modules available).
Then in blender, write a script to create image in three projections you need.
I think you can automate this all quite quickly.
PS: if you need an STL exporter, will this fit your need?
https://threejs.org/examples/misc_exporter_stl.html
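For the top view specifically, a sketch of the underlying math may help: with Y as the up axis, an orthographic top projection simply discards the vertical coordinate (the function name is mine, not a three.js API):

```javascript
// Top orthographic projection: each 3D vertex (x, y, z) lands on the
// drawing plane at (x, z); the height y contributes nothing to the view.
function projectTopView(vertices) {
  return vertices.map(([x, , z]) => [x, z]);
}
```

In three.js, an `OrthographicCamera` positioned above the object and looking straight down produces exactly this view, which you can then save from the canvas as described above.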
In working with Three.js, I've run across several useful helper classes that really make displaying and/or modifying the scene much easier. There is one tool out there that I can't seem to find again. It is kind of like the AxisHelper; however, it shows a plane between the axes when you mouse over that area, allowing the user to move the object along the XY, XZ, or YZ plane, depending on which one you pick. I've drawn an example of what it adds to the object in order to help the user move the object along a plane. If anyone knows of this tool, or maybe an example of something that uses a utility like this, it would be great if you could point it out to me. Thanks.
I expect you are looking for TransformControls. There is a three.js example of its use here.
TransformControls is not part of the library -- it is part of the examples. You must include it explicitly in your project.
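A minimal sketch of the r80-era wiring (the script path may differ depending on where you keep the examples folder):

```javascript
// TransformControls is not in the core build, so include it after three.js:
//   <script src="three.min.js"></script>
//   <script src="examples/js/controls/TransformControls.js"></script>

var controls = new THREE.TransformControls(camera, renderer.domElement);
controls.attach(mesh);          // gizmo with axis arrows and plane handles
controls.setMode('translate');  // 'translate', 'rotate', or 'scale'
scene.add(controls);            // the control is an Object3D in this era
controls.addEventListener('change', render); // re-render while dragging
```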
three.js r.80
I'm working on porting an existing three.js project to WebVR + Oculus Rift. Basically, this app takes an STL file as input, creates a THREE.Mesh from it, and renders it on an empty scene. I managed to make it work in Firefox Nightly with the VREffect plugin to three.js and VRControls. The problem is that models rendered in VR aren't really 3D: when I move the HMD back and forth, the active 3D model doesn't get closer or farther, and I can't see different sides of it. The model looks more like a flat background image stuck in place. If I add a THREE.AxisHelper to the scene, it is transformed correctly when the HMD moves.
Originally, THREE.OrbitControls were used in the app and models were rotated and moved properly.
There's quite some amount of source code so I'll post some snippets on demand.
It turned out that, technically, there was no problem. The issue was essentially a mismatch between the scale of my models and the scale of the Oculus movements. When VRControls is used with default settings, it reports the position of the HMD as it reads it from the Oculus, in meters. So the range of movement of my head could barely exceed 1 m, whereas my models are, on average, a few dozen of their own units across. Putting them together in the same scene was like an ant looking at a giant model: naturally, the ant has to walk a while to see the other side of the model. That's why it did not seem like a 3D body.
Fortunately, VRControls has a scale property that can be used to adjust the scale of HMD movements. When I set it to about 30, everything worked pretty well.
Thanks to @brianpeiris's comment, I decided to check the coordinates of the model and the camera once again to make sure they were consistent with each other, and that led me to the solution.
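For reference, a sketch of what that property amounts to. The helper below is my own illustration of scaling the reported HMD position, not the VRControls source:

```javascript
// The HMD reports its position in meters; multiplying by `scale` converts
// head movement into the model's own units before it reaches the camera.
function applyHmdPosition(camera, hmdPosition, scale) {
  camera.position = {
    x: hmdPosition.x * scale,
    y: hmdPosition.y * scale,
    z: hmdPosition.z * scale,
  };
}
```

With the real controls it is a single line, `controls.scale = 30;`, so that 1 m of head movement covers 30 model units.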
I was making my own custom geometry with three.js (using typescript). Something was wrong with it, the object appeared all dark with Lambert material. I checked the three.js source code to see if I forgot to do something when creating the geometry. I saw these two lines appear at the end of the constructor of nearly every geometry class:
this.computeCentroids();
this.computeFaceNormals();
Adding the computeFaceNormals solved my problem. I remember something about normals having to do with lighting (so that makes sense).
But I don't know what the computeCentroids does, and where/why those centroids are needed. Can someone explain? Also do I need to call that function? What can happen if I don't?
computeCentroids calculates the centroid of each triangle in a mesh, not the center of the mesh itself.
Probably the easiest way to see their purpose is to search for .centroid in the three.js source code. AFAICS, they are not used for much apart from lighting, but then only if you're using CanvasRenderer.
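For intuition, here is the per-face math the two methods boil down to. These are stripped-down sketches of the computations, not the actual Geometry code:

```javascript
// computeCentroids: the centroid of a triangle is the mean of its corners.
function centroid(a, b, c) {
  return [0, 1, 2].map(i => (a[i] + b[i] + c[i]) / 3);
}

// computeFaceNormals: the face normal is the normalized cross product of
// two edge vectors; lighting uses it to tell which way the face points.
function faceNormal(a, b, c) {
  const u = [b[0] - a[0], b[1] - a[1], b[2] - a[2]];
  const v = [c[0] - a[0], c[1] - a[1], c[2] - a[2]];
  const n = [
    u[1] * v[2] - u[2] * v[1],
    u[2] * v[0] - u[0] * v[2],
    u[0] * v[1] - u[1] * v[0],
  ];
  const len = Math.hypot(n[0], n[1], n[2]);
  return n.map(x => x / len);
}
```

(A degenerate, zero-area triangle would divide by zero in this sketch; the library normalizes more defensively.)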
Three.js is commonly used with WebGL, but I am interested in using its CanvasRenderer, because of compatibility issues. However, I require textured models.
I have seen this one demo, and no others, showing that it is possible to have a textured mesh created in a 3D program and rendered with Three.js. That demo even has animations, but all I need is textured meshes.
I'd like to know if there is a way to do this without crafting my own solution. Specifically I'm looking for a way to export from something like Blender and be able to import it with Three.js using the Canvas renderer.
Also, I know the speed implications, I need simple low-poly output.
Have you considered using the Blender exporter?
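If you go that route, the legacy pipeline (current around the CanvasRenderer era; the exact API depends on your three.js revision) looks roughly like this:

```javascript
// Export from Blender with the three.js JSON exporter, then load the file.
// MeshLambertMaterial works with CanvasRenderer; keep the model low-poly.
var loader = new THREE.JSONLoader();
loader.load('model.json', function (geometry, materials) {
  var mesh = new THREE.Mesh(geometry, new THREE.MeshFaceMaterial(materials));
  scene.add(mesh);
});

var renderer = new THREE.CanvasRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
```

Note that CanvasRenderer interpolates textures affinely, so textured low-poly faces can look warped; subdividing the mesh a little reduces the distortion.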