I am new to Blender (using 2.70a) and I made a mesh from a bezier curve. [File: http://ivybaumgarten.com/3d/models/wiggler5.blend ]
Exporting it to three.js format gives just a shell of an export with nothing inside: http://pastebin.com/aVNnjE5f
Exporting it as Collada produces something that looks promising (http://pastebin.com/XFf7wMR8 ), but if I replace monster.dae in this example (http://threejs.org/examples/webgl_loader_collada.html) with my model, it doesn't show up. A plain cube exported the same way shows up fine.
In Blender, it also appears in Texture view, but when I switch over to Rendered, it's blank.
What am I doing wrong?
A friend pointed out that the export was empty because I hadn't selected anything before exporting. Selecting the model before exporting made the model show up in my three.js project.
I still don't know why Render is empty.
I don't know three.js or Collada, but if you want to export your mesh or a scene to show it on the web via WebGL, I suggest you use Babylon.js. I have already exported from Blender 2.69 and 2.71 and it works well.
I'm simply trying to add a model with a diffuse and bump texture to a simple scene in react-three-fiber.
I have literally no clue what I'm doing wrong.
Here's the sandbox: https://codesandbox.io/s/three-point-lighting-in-react-three-fiber-forked-qeqlx?file=/src/index.js
The model is a glTF moon that has the textures baked in. The moon is just a sphere, but I want to use the glTF model. Currently, the scene displays the background and lights, but not the model.
If you have any insight about this I would appreciate it immensely!
At first glance, it looks like you're importing GLTFLoader and moon, but you're never using them. That's what those squiggly yellow lines mean:
Make sure you actually use them, as outlined in the docs:
// moon is the imported .gltf asset URL; GLTFLoader comes from three's examples/jsm/loaders
const loader = new GLTFLoader();
loader.load(moon, function (gltf) {
  scene.add(gltf.scene);
});
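Since you're in react-three-fiber, an alternative is the useLoader hook, which wraps GLTFLoader for you. This is only a sketch under assumptions: "/moon.gltf" is a placeholder path, and the import path depends on your version (newer releases use "@react-three/fiber", older ones "react-three-fiber"):

// Sketch only: "/moon.gltf" is a placeholder for wherever your asset actually lives
import React, { Suspense } from "react";
import { useLoader } from "@react-three/fiber";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader";

function Moon(props) {
  // useLoader suspends until the glTF is ready, so wrap <Moon /> in <Suspense>
  const gltf = useLoader(GLTFLoader, "/moon.gltf");
  return <primitive object={gltf.scene} {...props} />;
}

// Usage inside your <Canvas>:
// <Suspense fallback={null}>
//   <Moon />
// </Suspense>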
Alright, I figured it out. Thanks to @Marquizzo
Issues:
The .gltf file was not in the correct directory. It needed to be in the public directory since I am using the "npx @react-three/gltfjsx" script.
The .gltf was incredibly large relative to the scene. In Cinema 4D it is perfectly sized, but in three.js it was around 99x too large.
The position of the mesh was behind the camera. I had no clue that you could use OrbitControls to move the camera around and manually find the object. I also positioned the object at [0,0,0] (see the sketch after this list).
The template scene I was using came from a tutorial that was almost a year old, and changes since then caused simple runtime bugs, so I updated the dependencies.
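A sketch of how the scale/position fixes look in JSX, reusing the hypothetical Moon component from the sketch in the answer above; the 0.01 factor and camera position are illustrative values, and OrbitControls is the helper from @react-three/drei:

// Illustrative values only; <Moon /> is the hypothetical loader component from the sketch above
import React, { Suspense } from "react";
import { Canvas } from "@react-three/fiber";
import { OrbitControls } from "@react-three/drei";

export default function App() {
  return (
    <Canvas camera={{ position: [0, 0, 5] }}>
      <ambientLight intensity={0.5} />
      <Suspense fallback={null}>
        {/* shrink the oversized model and keep it at the origin, in front of the camera */}
        <Moon scale={0.01} position={[0, 0, 0]} />
      </Suspense>
      <OrbitControls /> {/* orbit the camera to hunt for a misplaced object */}
    </Canvas>
  );
}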
Issues I still have:
Textures aren't loading from the baked-in .gltf file.
I increased the lighting, and it seems that the lighting isn't the issue.
Here's the fixed sandbox: fixed
What I learned:
Orbital Controls
Use of OBJLoader
useHelpers like CameraHelper
I loaded a mesh from a JSON file; here is my current result:
my Project
It is an object I exported from Blender as JSON; I then used its vertices to create a points geometry (THREE.Points), which is important for the look of it.
I am now looking for a way to "animate" the points so that the "surface" looks vivid / alive. Basically it should be moving around a bit, something like this (without the rotation):
Link to animated Gif
I have ruled out displacementMap, as this does not work for the PointsMaterial (or does someone know a workaround?)
Does anyone have hints or ideas? I thought of maybe morphing between 2-3 objects, but I am not sure if this will work for a points mesh.
One approach to achieve your desired effect is to use morph target animation (also called vertex morphing). As you can see in the following example, three.js does support morph target animations with points.
https://threejs.org/examples/webgl_morphtargets_sphere.html
There is a lot of existing literature about vertex morphing, so it should be no problem to get familiar with this technique. I suggest you create your animations in Blender, export the model to glTF and then load the file via GLTFLoader into your app, as shown in the example.
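A rough sketch of that workflow, under a few assumptions: "model.glb" is a placeholder file name, the exported mesh is assumed to carry morph targets (Blender shape keys), and scene, camera and renderer are assumed to be set up already. The mesh's geometry is reused for a THREE.Points object so the morph data drives the points, mirroring the linked example:

// Sketch only: "model.glb" is a placeholder; the mesh is assumed to have morph targets
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader";

const loader = new GLTFLoader();
loader.load("model.glb", (gltf) => {
  // grab the first mesh that has morph targets (assumption about the file's structure)
  let source;
  gltf.scene.traverse((child) => {
    if (!source && child.isMesh && child.morphTargetInfluences) source = child;
  });

  // reuse its geometry (which carries the morphAttributes) for a Points object;
  // older three.js releases may also need morphTargets: true on the material
  const points = new THREE.Points(
    source.geometry,
    new THREE.PointsMaterial({ size: 0.05, color: 0xffffff })
  );
  points.morphTargetInfluences = source.morphTargetInfluences;
  points.morphTargetDictionary = source.morphTargetDictionary;
  scene.add(points);

  // oscillate the first morph target so the surface slowly "breathes"
  function animate(time) {
    points.morphTargetInfluences[0] = (Math.sin(time * 0.001) + 1) / 2;
    renderer.render(scene, camera);
    requestAnimationFrame(animate);
  }
  requestAnimationFrame(animate);
});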
I have the Structure.IO sensor, which generates OBJ+MTL models that load correctly into 3D editors (Blender, Cinema 4D), but they are not showing in three.js. I've checked the loaded object and it has geometry and a material that seem to be correct, but it still shows nothing when I add it to the scene.
Here you have the model I'm trying to use:
https://drive.google.com/file/d/0B8Hv0HwLV830a2cwWFpZMEpKNlU/view
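For reference, a sketch of the standard MTLLoader + OBJLoader chain such a model would typically be loaded with; the file names are placeholders, and the module-style imports assume a recent three.js (older releases expose these as THREE.MTLLoader / THREE.OBJLoader):

// Sketch only: "model.mtl" / "model.obj" are placeholder paths for the files linked above
import { MTLLoader } from "three/examples/jsm/loaders/MTLLoader";
import { OBJLoader } from "three/examples/jsm/loaders/OBJLoader";

const mtlLoader = new MTLLoader();
mtlLoader.load("model.mtl", (materials) => {
  materials.preload();
  const objLoader = new OBJLoader();
  objLoader.setMaterials(materials); // apply the parsed MTL materials to the OBJ
  objLoader.load("model.obj", (object) => {
    scene.add(object); // scene is assumed to exist in the surrounding app
  });
});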
Thanks in advance.
When I export an animated skinned mesh from Maya using the Three.js exporter, it always appears to have a duplicate mesh combined with it that is not animated.
I've tried all the export settings I can think of and also made sure there is not another mesh being exported with it. I can reproduce the problem by skinning, animating, and exporting any mesh.
I compared my exported .js file and I can't see any difference between that file and this working three.js example:
http://threejs.org/examples/#webgl_skinning_simple
This problem does not exist using a previous release of Three.js (r73).
There appears to be a bug in r74 and r75.
I create very simple models in Blender containing some faces with color materials, then export them to Collada format and import them into Three.js. The problem is that some faces do not appear from some angles of view. I created this model from a plane and then extruded it while modeling.
Your faces are probably facing (hehe) the wrong way. Try inverting these faces (flipping their normals) in Blender and then exporting again.
Or make the material double-sided so it is drawn on both the inside and the outside (see the sketch below), but the first option is more efficient.
See this tutorial
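For the double-sided option, a minimal sketch on the three.js side; colladaScene stands for the scene object you get from the ColladaLoader callback in your code, so the traversal here is illustrative:

import * as THREE from "three";

// Make every material in the loaded Collada scene double-sided (illustrative traversal)
colladaScene.traverse((child) => {
  if (child.isMesh) {
    child.material.side = THREE.DoubleSide; // render back faces as well as front faces
  }
});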