Problems with lights (using three.js) - javascript

I have a simple indoor scene I've exported from Blender. It has a room with 3 spheres on the ceiling, and the respective light sources inside them. Each one of the lights works well on its own, but when I insert all of them into the scene, only one of them works! It sometimes works with two, but never with all three.
Here's my code for the lights:
// One PointLight per ceiling sphere: color, intensity, distance (falloff range).
luz_sala1 = new THREE.PointLight(0xFFFFFF, 0.5, 50.0);
luz_sala1.position.set(16.14323, 2.52331, 13.93375);
scene.add(luz_sala1);
luz_sala2 = new THREE.PointLight(0xFFFFFF, 0.5, 50.0);
luz_sala2.position.set(27.70114, 2.52331, -6.20571);
scene.add(luz_sala2);
luz_sala3 = new THREE.PointLight(0xFFFFFF, 0.5, 50.0);
luz_sala3.position.set(21.50580, 3.10719, -27.82775);
scene.add(luz_sala3);
If I set the distances to 0, it works well, but I need these lights to influence only the area they are in.
I've also tried THREE.SpotLight(0xFFFFFF, 0.5, 50.0, Math.PI, 0), but with the same result.
It looks like the lights somehow cancel each other out when they share the same distance?
Please help, this is very confusing.
EDIT: Also, I have another section of the room with some spotlight models (about 4 of them), but I'm getting shader compilation errors when I add those 4 extra spotlights to the scene. After searching for the problem, I saw that I need to set the maxLights property on the renderer. I set it to 10, but the problem still occurs; I can't have more than 4 lights in the scene. Is there anything else I can do?
EDIT 2: Here are some images. For reference, "luz_sala1" is the one closest to the TV, "luz_sala2" is the middle one, and "luz_sala3" is the one farthest away.
This one is with the code above (all 3 lights), except with 0.8 intensity.
http://www.mediafire.com/view/?s85qr4rplhort29
And this is with lights 2 and 3 turned on ("scene.add(luz_sala1);" commented out):
http://www.mediafire.com/view/?83qbbua9f8ee3b4
So, as you can see, two point lights work well together, but with three they seem to "add up" into the first one?

The maxLights property not having any effect is most likely due to your hardware, drivers, or ANGLE (the library that translates WebGL to Direct3D) not supporting enough varying vectors in the shaders - each light requires one, and other features consume them too. This might also be behind your general problem.
In order to have more lights, there are three options:
1. Check whether it helps to make your browser prefer native OpenGL over ANGLE (google for instructions). Make sure you have up-to-date OpenGL drivers installed, though.
2. Implement a deferred renderer. This is very common nowadays in the desktop world, but it's tricky, if not impossible, to implement with good performance in WebGL due to framebuffer limitations.
3. Implement a light manager that only ever uses some of the lights, disabling the rest. The simplest, though far from perfect, method would be to select the lights closest to the camera - see the sketch below.
Also worth mentioning: currently SpotLights are just PointLights that cast a shadow in one direction.
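As an illustration of the third option, here is a minimal sketch of a naive light manager that keeps only the nearest few lights switched on (the function and variable names are illustrative, not taken from the question's code):
// Call once per frame before rendering.
// allLights: array of THREE.PointLight, maxActive: how many may be on, onIntensity: their normal intensity.
function updateActiveLights(allLights, camera, maxActive, onIntensity) {
    // Sort a copy of the lights by distance to the camera, nearest first.
    var sorted = allLights.slice().sort(function (a, b) {
        return a.position.distanceTo(camera.position) - b.position.distanceTo(camera.position);
    });
    // Keep the nearest lights on and switch the rest off.
    for (var i = 0; i < sorted.length; i++) {
        sorted[i].visible = i < maxActive;
        sorted[i].intensity = i < maxActive ? onIntensity : 0.0;
    }
}
// Example: updateActiveLights([luz_sala1, luz_sala2, luz_sala3], camera, 2, 0.5);
Depending on the three.js version, changing which lights are enabled may also require flagging the affected materials with material.needsUpdate = true so the shaders get rebuilt for the new light count.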

Related

Environment map affects scene lighting after update to r131

I have a general scene where I'd like to show different kinds of models. Depending on the source of the model, sometimes the model contains MeshPhongMaterial, sometimes MeshStandardMaterial, and sometimes both of them.
I also have a specific lighting model with an AmbientLight, and a DirectionalLight that always points in the same direction as the camera, so you can clearly see what you are currently looking at.
To make MeshStandardMaterial look better I've also added an environment map to the scene (not to the materials), and I was pretty satisfied with the result.
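(For context, a minimal sketch of how such a camera-following DirectionalLight can be kept in sync each frame - the helper name is illustrative:)
// Aim the directional light along the camera's viewing direction every frame.
const lightDirection = new THREE.Vector3();
function alignLightWithCamera(light, camera) {
    camera.getWorldDirection(lightDirection);         // unit vector the camera looks along
    light.position.copy(camera.position);             // light sits at the camera...
    light.target.position.copy(camera.position).add(lightDirection);  // ...and points where it looks
    light.target.updateMatrixWorld();
}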
Here is the result with r130 (Phong material on the left, Standard material on the right):
After I update three.js to r131 the result looks something like this:
I understand that environment maps are auto-converted to PMREM from r131 onward, and that this causes the change. I also understand that this is more correct than using non-PMREM environment maps, but now it messes up my scene.
In another topic it was recommended to remove the ambient and directional lights (because the lighting now comes from the environment), but that results in this:
Now the object with the standard material looks fine, but the object with the Phong material is completely black. I've also lost my previous feature where the directional light always points where the camera looks.
By removing only the ambient light I get this (still not what I want to achieve):
So basically my question is: Although I know that this is not physically correct, is there a way to apply an environment map that doesn't affect the lighting of the scene, but affects reflections of standard materials?
Here you can find the code of the mentioned scene:
https://github.com/kovacsv/Online3DViewer/blob/dev/sandbox/three_envmap_issue/three_viewer.html
And here you can see it live:
https://raw.githack.com/kovacsv/Online3DViewer/dev/sandbox/three_envmap_issue/envmap_issue.html
So basically my question is: Although I know that this is not physically correct, is there a way to apply an environment map that doesn't affect the lighting of the scene, but affects reflections of standard materials?
No, there isn't. MeshStandardMaterial and MeshPhysicalMaterial now require a stricter PBR workflow. As you pointed out correctly, your previous setup was physically incorrect. This has been fixed, and there are no plans right now to allow the previous workflow again. Environment maps are intended to be used as IBL (image-based lighting), so conceptually they always affect the lighting, no matter how you parameterize the material.
The solution for your use case is to a) use Phong materials or b) update the lighting of your scene and accept the new look.
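If you go with option b), one practical knob is envMapIntensity, which scales how strongly the environment lights each standard material. A minimal sketch (environmentTexture stands in for the environment map you already load):
// Keep the environment as the light source, but tone it down on standard materials.
scene.environment = environmentTexture;   // assumed: your environment map (auto-PMREM'd since r131)
scene.traverse(function (object) {
    if (object.isMesh && object.material && object.material.isMeshStandardMaterial) {
        object.material.envMapIntensity = 0.4;   // illustrative value, tune to taste
    }
});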

babylon.js meshes get same material

I'm using BabylonJS V3 with Blender 2.79 to create product visualizations. Many times, it is necessary to define more complex shaders in the JS code. I'm using lines like
scene.meshes[1].material.emissiveColor = new BABYLON.Color3(1, 0, 0);
to define the shaders after export. Usually every mesh can get its own shader this way. Unfortunately, in one case the shader of multiple meshes gets overwritten. Has anyone had a similar problem?
All meshes are named individually, and they all have a basic (individual) shader from Blender. They don't share any datablocks, and no instancing or duplication was done. I'm thankful for every hint.
Edit
It seems the error occurs with the new version (3.0); updating to 3.1 fixes the problem, but introduces errors with the arc-rotate camera. As soon as you click on the canvas to rotate the view, you can't release the mouse anymore. Are the latest stable releases buggy?
Edit 2
After some in-depth troubleshooting we came to the conclusion that the 3.0 and 3.1 versions and/or their exporter plugins are faulty. Even in the simplest test scenes this error occurs, alongside other problems like broken cameras and displaced geometry.
Be aware that by default materials are shared for performance reasons, so this is probably not a bug but a feature.
If you want to change the material for a single mesh, you will first need to clone it.
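A minimal sketch of that cloning step (the mesh index and the new material name are placeholders):
// Clone the shared material so the change affects only this one mesh.
var mesh = scene.meshes[1];
mesh.material = mesh.material.clone("mesh1Material");   // any unique name will do
mesh.material.emissiveColor = new BABYLON.Color3(1, 0, 0);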

Why is my simple webgl demo so slow

I've been trying to learn WebGL using these awesome tutorials. My goal is to make a very simple 2D game framework to replace the canvas-based jawsJS.
I basically just want to be able to create a bunch of sprites and move them around, and then maybe some tiles later.
I put together a basic demo that does this, but I've hit a performance problem that I can't track down. Once I get to ~2000 or so sprites on screen, the frame rate tanks, and I can't work out why. Compared to this demo of the pixi.js WebGL framework, which starts losing frames at about ~30000 bunnies or so (on my machine), I'm a bit disappointed.
My demo (framework source) has 5002 sprites, two of which are moving, and the frame rate is in the toilet.
I've tried working through the pixi.js source to find out what they do differently, but it's 500 kloc and does so much more than mine that I can't work it out.
I found this answer that basically confirmed that what I'm doing is roughly right - my algorithm is pretty much the same as the one in the answer, but there must be more to it.
I have so far tried a few things - using just a single 'frame buffer' with a single shape defined, which then gets translated 5000 times, once for each sprite. This helped the frame rate a little, but nothing close to the pixi demo (and it meant that all sprites had to be the same shape!). I cut out all of the matrix maths for anything that doesn't move, so it's not that either. It all seems to come down to the drawArrays() function - it's just going really slowly for me, but only in my demo!
I've also tried removing all of the texture-based stuff, replacing the fragment shader with a simple block colour for everything instead. It made virtually no difference, so I eliminated dodgy texture handling as a culprit.
I'd really appreciate some help in tracking down what incredibly stupid thing I've done!
Edit: I'm definitely misunderstanding something key here. I stripped the whole thing right back to basics, changing the vertex and fragment shaders to something super simple:
attribute vec2 a_position;
void main() {
  gl_Position = vec4(a_position, 0, 1);
}
and:
void main() {
  gl_FragColor = vec4(0, 1, 0, 1); // green
}
then set the sprites up to draw to (0,0), (1,1).
With 5000 sprites, it takes about 5 seconds to draw a single frame. What is going on here?
A look at the frame calls using WebGL Inspector or the experimental canvas inspector in Chrome reveals a totally unoptimized rendering loop.
You can and should use one and the same vertex buffer to render all your geometry;
this way you can save the bindBuffer as well as the vertexAttribPointer calls.
You can also save 99% of your texture binds, as you're repeatedly rebinding one and the same texture. A texture remains bound as long as you do not bind something else to the same texture unit.
Having a state cache is helpful to avoid binding data that is already bound.
Take a look at my answer here about the GPU as a state machine.
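A minimal sketch of such a state cache applied to texture binds (plain WebGL; the names are illustrative):
// Remember what is bound to each texture unit and skip redundant gl.bindTexture calls.
var boundTextures = {};   // texture unit -> WebGLTexture currently bound there
function bindTextureCached(gl, unit, texture) {
    if (boundTextures[unit] === texture) {
        return;   // already bound, no GL call needed
    }
    gl.activeTexture(gl.TEXTURE0 + unit);
    gl.bindTexture(gl.TEXTURE_2D, texture);
    boundTextures[unit] = texture;
}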
Once your rendering loop is optimized, you can go ahead and consider the following things:
Use the ANGLE_instanced_arrays extension.
Avoid constructing data in your render loop.
Use an interleaved vertex buffer.
In some cases, not using an index buffer also increases performance.
Check if you can shave off a few GPU cycles in your shaders.
Break up your objects into chunks and do view-frustum culling on the CPU side.
The problem is probably this line in render: glixl.context.uniformMatrix3fv(glixl.matrix, false, this.matrix);
In my experience, passing uniforms for each model is very slow in WebGL, and I was unable to maintain 60 FPS after ~1,000 unique models. Unfortunately, there are no uniform buffers in WebGL to alleviate this problem.
I solved my problem by calculating all the vertex positions on the CPU and drawing them all with one drawArrays call. This should work as long as the vertex count isn't overwhelming. I can draw 2k moving and rotating cubes at 60 FPS. I don't recall exactly how many cubes you can draw at 60 FPS, but it is quite a bit higher than 2k. If that isn't fast enough, then you have to look into drawArraysInstanced: basically, store all the matrices in an array buffer and draw all your models with one drawArraysInstanced call with the correct offsets and so on.
EDIT: Also, to the OP, if you want to see how PIXI does the vertex-update rendering (NOT uniform instancing), see https://github.com/GoodBoyDigital/pixi.js/blob/master/src/pixi/renderers/webgl/utils/WebGLFastSpriteBatch.js.
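A rough sketch of that batching idea - transform every sprite's quad on the CPU, write it into one big buffer, and issue a single drawArrays call per frame (the sprite fields and buffer layout are illustrative; attribute pointers are assumed to be set up once at init):
// positions: Float32Array sized for all sprites (2 floats per vertex, 6 vertices per quad).
function fillBatch(sprites, positions) {
    var offset = 0;
    for (var i = 0; i < sprites.length; i++) {
        var s = sprites[i];   // assumed shape: { x, y, w, h }, already in clip space
        positions.set([
            s.x,       s.y,
            s.x + s.w, s.y,
            s.x,       s.y + s.h,
            s.x,       s.y + s.h,
            s.x + s.w, s.y,
            s.x + s.w, s.y + s.h
        ], offset);
        offset += 12;
    }
}
function drawBatch(gl, buffer, positions, spriteCount) {
    gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
    gl.bufferData(gl.ARRAY_BUFFER, positions, gl.DYNAMIC_DRAW);   // one upload per frame
    gl.drawArrays(gl.TRIANGLES, 0, spriteCount * 6);              // one draw call for everything
}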

Rendering 4 million points in Three.js

Although I've yet to touch Three.js, I know that it abstracts away much of the boilerplate that comes with WebGL.
Given that, and its learn-by-example style of documentation, what Three.js utility should I use for displaying 4 million points that will be mostly static, but animate to a new position on an uncommon click event?
I'm assuming the use of VBOs or FBOs would be needed, but how are these functionalities encapsulated in Three.js, if at all?
Thank you.
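For reference, the three.js building blocks usually suggested for this kind of workload are BufferGeometry (which wraps the underlying VBO) rendered as THREE.Points. A minimal sketch with illustrative sizes (recent three.js; older versions use addAttribute instead of setAttribute):
// All points live in one typed array, which three.js uploads as a single VBO.
const count = 4000000;
const positions = new Float32Array(count * 3);   // x, y, z per point; fill with your data
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
const points = new THREE.Points(geometry, new THREE.PointsMaterial({ size: 0.01 }));
scene.add(points);
// After moving points on the click event, flag the attribute for re-upload:
// geometry.attributes.position.needsUpdate = true;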

webgl shadow mapping gl.DEPTH_COMPONENT

Hey, I'm trying to implement shadow mapping in WebGL using this example:
tutorial
What I'm trying to do is:
initialize the depth texture and framebuffer;
draw the scene to that framebuffer with a simple shader, then draw a new scene with a box that uses the depth texture as its texture, so I can see the depth map using another shader.
I think it looks OK with the color texture, but I can't get it to work with the depth texture - it's all white.
I put the code on Dropbox:
source code
Most of it is in the files
index.html
webgl_all.js
objects.js
There are some light shaders I'm not using at the moment.
Really hope somebody can help me.
Greetings from Denmark.
This could have several causes:
For common setups of the near and far planes, normalized depth values will be high enough to appear all white for most of the scene, even though they are not actually identical. (Remember that a depth texture has a precision of at least 16 bits, while your screen output has only 8 bits per color channel, so a depth texture may appear all white even when its values are not all identical.)
On some setups (e.g. desktop OpenGL), a texture may appear all white when it is incomplete, that is, when texture filtering is set to use mipmaps but not all mipmap levels have been created. This may be the same in WebGL - see the sketch after this list.
You may have hit a bug in the browser's WebGL implementation.
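A minimal sketch of a depth-texture setup that avoids the mipmap-incompleteness trap mentioned above (WebGL 1 with the WEBGL_depth_texture extension; width and height are assumed to be defined elsewhere):
// Depth texture with no mipmaps, so filtering must not expect any.
var ext = gl.getExtension('WEBGL_depth_texture');   // required for depth textures in WebGL 1
var depthTexture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, depthTexture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);   // no mipmap filtering
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_COMPONENT, width, height, 0,
              gl.DEPTH_COMPONENT, gl.UNSIGNED_SHORT, null);
var framebuffer = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.TEXTURE_2D, depthTexture, 0);
// A color texture can be attached the same way (COLOR_ATTACHMENT0), as in the tutorial.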
