I'm using Babylon.js v3 with Blender 2.79 to create product visualizations. It is often necessary to define more complex shaders in the JS code. I'm using lines like
scene.meshes[1].material.emissiveColor = new BABYLON.Color3(1, 0, 0);
to define the shaders after export. Usually every mesh can get its own shader this way. Unfortunately, in one case the shader of multiple meshes gets overwritten. Has anyone had a similar problem?
All meshes are named individually, and each has its own basic shader from Blender. They don't share any datablocks; no instancing or duplication was done. I'm thankful for every hint.
Edit
It seems the error occurs with the new version (3.0); updating to 3.1 fixes the problem but introduces errors with the arc-rotate camera: as soon as you click on the canvas to rotate the view, you can't release the mouse anymore. Are the latest stable releases buggy?
Edit 2
After some in-depth troubleshooting, we came to the conclusion that the 3.0 and 3.1 versions and/or their exporter plugins are faulty. Even in the simplest test scenes this error occurs, alongside other problems such as broken cameras and displaced geometry.
Be aware that by default, materials are shared for performance reasons, so this is probably not a bug but a feature.
If you want to change the material of a single mesh, you will first need to clone it.
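A minimal sketch of the cloning step, assuming the mesh from the question (the clone name "uniqueMat" is just a placeholder):
var mesh = scene.meshes[1];
// give this mesh its own copy of the shared material before editing it
mesh.material = mesh.material.clone("uniqueMat");
// this now affects only this mesh
mesh.material.emissiveColor = new BABYLON.Color3(1, 0, 0);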
Related
I have a general scene where I'd like to show different kinds of models. Depending on the source of the model, it sometimes contains MeshPhongMaterial, sometimes MeshStandardMaterial, and sometimes both.
I also have a specific lighting model with an AmbientLight, and a DirectionalLight that always points in the same direction as the camera, so you can clearly see whatever you are looking at.
To make MeshStandardMaterial look better, I've also added an environment map to the scene (not to the materials), and I was pretty satisfied with the result.
Here is the result with r130 (Phong material on the left, Standard material on the right):
After updating three.js to r131, the result looks like this:
I understand that environment maps are auto-converted to PMREM from r131 onward, and that this causes the change. I also understand that this is more correct than using a non-PMREM environment map, but now it messes up my scene.
In another topic it was recommended to remove the ambient and directional lights (because lighting now comes from the environment), but that results in this:
Now the object with the standard material looks fine, but the object with the Phong material is completely black. I've also lost the feature that the directional light always points where the camera looks.
Removing only the ambient light gives me this (still not what I want to achieve):
So basically my question is: although I know this is not physically correct, is there a way to apply an environment map that doesn't affect the lighting of the scene but does affect the reflections of standard materials?
Here you can find the code of the mentioned scene:
https://github.com/kovacsv/Online3DViewer/blob/dev/sandbox/three_envmap_issue/three_viewer.html
And here you can see it live:
https://raw.githack.com/kovacsv/Online3DViewer/dev/sandbox/three_envmap_issue/envmap_issue.html
So basically my question is: although I know this is not physically correct, is there a way to apply an environment map that doesn't affect the lighting of the scene but does affect the reflections of standard materials?
No, there isn't. MeshStandardMaterial and MeshPhysicalMaterial now require a stricter PBR workflow. As you pointed out, your previous setup was physically incorrect. This has been fixed, and there are no plans right now to allow the previous workflow again. Environment maps are considered IBLs, so conceptually they always affect the lighting, no matter how you parameterize the material.
The solution for your use case is to a) use Phong materials only, or b) update the lighting of your scene and accept the new look.
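If you go with b), note that MeshStandardMaterial exposes an envMapIntensity property you can use to tone the environment's contribution down rather than removing it. A minimal sketch, assuming envTexture is your equirectangular texture (variable names are placeholders):
envTexture.mapping = THREE.EquirectangularReflectionMapping;
scene.environment = envTexture; // auto-converted to PMREM from r131 onward
// scale how strongly the environment lights this particular material
standardMaterial.envMapIntensity = 0.5;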
I loaded a mesh from a JSON file; here is my current result:
my Project
It is an object I exported from Blender as JSON; I then used its vertices to create a points geometry (THREE.Points), which is important for the look of it.
I am now looking for a way to "animate" the points so that the "surface" looks vivid and alive. Basically it should move around a bit, something like this (without the rotation):
Link to animated Gif
I have ruled out displacementMap, as it does not work with PointsMaterial (or does someone know a workaround?).
Does anyone have hints or ideas? I thought of maybe morphing between two or three objects, but I am not sure whether this will work for a points mesh.
One approach to achieve your desired effect is morph target animation (also called vertex morphing). As you can see in the following example, three.js does support morph target animation with points:
https://threejs.org/examples/webgl_morphtargets_sphere.html
There is a lot of existing literature about vertex morphing, so it should be no problem to get familiar with the technique. I suggest you create your animations in Blender, export the model to glTF, and then load the file via GLTFLoader into your app, as shown in the example.
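A minimal loading sketch, assuming the export is called model.gltf and its first animation clip drives the morph targets (file name and clip index are placeholders):
var loader = new THREE.GLTFLoader();
var clock = new THREE.Clock();
var mixer;
loader.load('model.gltf', function (gltf) {
  scene.add(gltf.scene);
  // the mixer plays back the morph target animation baked in Blender
  mixer = new THREE.AnimationMixer(gltf.scene);
  mixer.clipAction(gltf.animations[0]).play();
});
function animate() {
  requestAnimationFrame(animate);
  if (mixer) mixer.update(clock.getDelta()); // advance the morph animation
  renderer.render(scene, camera);
}
animate();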
Although I have yet to touch Three.js, I know that it abstracts away much of the boilerplate that comes with WebGL.
Given that, and its learn-by-example style of documentation, which Three.js facility should I use to display 4 million points that will be mostly static but animate to new positions on an occasional click event?
I'm assuming a VBO or FBO would be needed, but how are these encapsulated in Three.js, if at all?
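For reference, my current understanding is that THREE.Points over a THREE.BufferGeometry is the closest encapsulation of a VBO; a sketch of what I mean (sizes and names made up):
// each attribute is a typed array that three.js uploads as a VBO
var positions = new Float32Array(4000000 * 3); // x, y, z per point
// ... fill positions ...
var geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
scene.add(new THREE.Points(geometry, new THREE.PointsMaterial({ size: 0.01 })));
// after mutating positions on a click, flag the buffer for re-upload:
geometry.attributes.position.needsUpdate = true;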
Thank you.
I have a simple indoor scene I've exported from Blender. It has a room with three spheres on the ceiling and the respective light sources inside them. Each one of the lights works well on its own, but when I add all of them to the scene, only one of them works! It sometimes works with two, but never with all three.
Here's my code for the lights:
// three ceiling lights: color, intensity, distance (falloff radius)
var luz_sala1 = new THREE.PointLight(0xFFFFFF, 0.5, 50.0);
luz_sala1.position.set(16.14323, 2.52331, 13.93375);
scene.add(luz_sala1);
var luz_sala2 = new THREE.PointLight(0xFFFFFF, 0.5, 50.0);
luz_sala2.position.set(27.70114, 2.52331, -6.20571);
scene.add(luz_sala2);
var luz_sala3 = new THREE.PointLight(0xFFFFFF, 0.5, 50.0);
luz_sala3.position.set(21.50580, 3.10719, -27.82775);
scene.add(luz_sala3);
If I set the distances to 0, it works well, but I need these lights to influence only the area they are in.
I've also tried THREE.SpotLight(0xFFFFFF, 0.5, 50.0, Math.PI, 0), but with the same result.
It looks as if the lights somehow cancel each other out when they share the same distance?
Please help, this is very confusing.
EDIT: I also have another section of the room with some spotlight models (about 4 of them), and I'm getting shader compilation errors when I add those 4 extra spotlights to the scene. After searching for the problem, I read that I need to set the maxLights property on the renderer. I set it to 10, but the problem persists: I can't have more than 4 lights in the scene. Is there anything else I can do?
EDIT 2: Here are some images. For reference, "luz_sala1" is the one closest to the TV, "luz_sala2" is the middle one, and "luz_sala3" is the one furthest away.
This one is with the code above (all three lights), except at 0.8 intensity:
http://www.mediafire.com/view/?s85qr4rplhort29
And this is with lights 2 and 3 turned on ("scene.add(luz_sala1);" commented out):
http://www.mediafire.com/view/?83qbbua9f8ee3b4
So, as you can see, two point lights work well together, but with three of them they seem to "add up" to the first?
The maxLights property having no effect is most likely due to your hardware, drivers, or ANGLE (the library that translates WebGL to Direct3D) not supporting enough varying vectors in shaders; each light requires one, and other things need them too. This might also be behind your general problem.
In order to have more lights, there are three options:
1. Try making your browser prefer native OpenGL over ANGLE (google for instructions), and make sure you have up-to-date OpenGL drivers installed.
2. Implement a deferred renderer. This is very common in the desktop world nowadays, but tricky, if not impossible, to implement with good performance in WebGL due to framebuffer limitations.
3. Implement a light manager that only ever enables a subset of the lights, disabling the rest. The simplest, though far from perfect, method is to pick the lights closest to the camera; see the sketch below.
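A minimal sketch of option 3, assuming the three lights from the question and an arbitrary cap of 4 active lights (both the cap and the selection rule are placeholders):
var MAX_ACTIVE_LIGHTS = 4;
var allLights = [luz_sala1, luz_sala2, luz_sala3];
// call once per frame, before renderer.render(scene, camera)
function updateLights(camera) {
  var sorted = allLights.slice().sort(function (a, b) {
    return a.position.distanceTo(camera.position) -
           b.position.distanceTo(camera.position);
  });
  for (var i = 0; i < sorted.length; i++) {
    // "disable" a light by zeroing its intensity
    sorted[i].intensity = i < MAX_ACTIVE_LIGHTS ? 0.5 : 0.0;
  }
}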
Also worth mentioning: currently, SpotLights are just PointLights that cast a shadow in one direction.
Three.js is commonly used with WebGL, but I am interested in using its CanvasRenderer because of compatibility issues. However, I require textured models.
I have seen this one demo, and no others, showing that it is possible to have a textured mesh created in a 3D program and rendered with Three.js. That demo even has animations, but all I need is textured meshes.
I'd like to know if there is a way to do this without crafting my own solution. Specifically, I'm looking for a way to export from something like Blender and import it with Three.js using the Canvas renderer.
Also, I know the speed implications; I need simple low-poly output.
Have you considered using the Blender exporter?
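A minimal sketch of that route, assuming the mesh was exported with the three.js Blender exporter to a file called model.js and loaded with the old JSONLoader API (both are assumptions; adjust to your version):
var renderer = new THREE.CanvasRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
var loader = new THREE.JSONLoader();
loader.load('model.js', function (geometry, materials) {
  // CanvasRenderer tends to show seams between faces; overdraw hides them
  materials.forEach(function (m) { m.overdraw = 0.5; });
  scene.add(new THREE.Mesh(geometry, new THREE.MeshFaceMaterial(materials)));
});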