three.js extend a builtin shader - javascript

I have an existing, working scene with meshes that render perfectly when a MeshLambertMaterial or MeshPhongMaterial is set on them. The lights work, all the light uniforms are magically set up under the hood, and shadow mapping works great too.
Now I want to enhance the shader. In my situation the addition is a set of point lights with hardcoded parameters, but you can imagine a practical next step would be to make these configurable dynamically through uniforms. This is just an intermediate step before re-engineering the render pipeline to do deferred rendering.
Looking at renderers/shaders/ShaderLib.js in the three.js source, the shader for MeshLambertMaterial is called lambert there, and it is constructed mostly out of shader chunks.
I took that entire content verbatim and stuck it into my own ShaderMaterial, but my mesh now renders all black.
There is clearly some additional housekeeping I must do for a ShaderMaterial to behave properly with the lights I have set up through three.js (so that it behaves like the built-in materials do), but I can't figure out what it is. (I tried RawShaderMaterial, but all that did was produce a bunch of shader compiler errors about the #if variables not being declared.)
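For what it's worth, the missing housekeeping usually comes down to two things: the ShaderMaterial has to be created with lights: true so the renderer binds the scene's light uniforms, and it has to own its own copy of the built-in uniforms. A minimal sketch of that idea (API names as in recent three.js releases; verify against your version):

const lambert = THREE.ShaderLib['lambert'];
const material = new THREE.ShaderMaterial({
    lights: true, // ask the renderer to fill in the light uniforms each frame
    vertexShader: lambert.vertexShader,
    fragmentShader: lambert.fragmentShader,
    uniforms: THREE.UniformsUtils.clone(lambert.uniforms) // a private copy you can safely modify
});
material.uniforms.diffuse.value.set(0xff0000); // e.g. tint the material red

From here the copied shader source can be extended with extra point lights or custom uniforms.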

Related

Environment map affects scene lighting after update to r131

I have a general scene where I'd like to show different kinds of models. Depending on the source, a model sometimes contains MeshPhongMaterial, sometimes MeshStandardMaterial, and sometimes both.
I also have a specific lighting model with an AmbientLight, and a DirectionalLight that always points in the same direction as the camera, so you can clearly see whatever you are currently looking at.
To make MeshStandardMaterial look better I've also added an environment map to the scene (not the materials), and I was pretty satisfied with the result.
Here is the result with r130 (Phong material on the left, Standard material on the right):
After I update three.js to r131 the result looks something like this:
I understand that environment maps are auto-converted to PMREM from r131, and this causes the change. I also understand that this is more correct than using non PMREM environment maps, but now it messes up my scene.
In another topic it was recommended to remove the ambient and directional lights (because lighting now comes from the environment), but that results in this:
Now the object with standard material looks fine, but the object with phong material is completely black. I've also lost my previous feature that the directional light always points where the camera looks.
Removing only the ambient light gives this (still not what I want to achieve):
So basically my question is: Although I know that this is not physically correct, is there a way to apply an environment map that doesn't affect the lighting of the scene, but affects reflections of standard materials?
Here you can find the code of the mentioned scene:
https://github.com/kovacsv/Online3DViewer/blob/dev/sandbox/three_envmap_issue/three_viewer.html
And here you can see it live:
https://raw.githack.com/kovacsv/Online3DViewer/dev/sandbox/three_envmap_issue/envmap_issue.html
No, there isn't. MeshStandardMaterial and MeshPhysicalMaterial now require a stricter PBR workflow. As you pointed out correctly, your previous setup was physically incorrect. This has been fixed, and there are no plans right now to allow the previous workflow again. Environment maps are intended to be used as IBLs, so conceptually they always affect the lighting no matter how you parameterize the material.
The solution for your use case is to a) use phong materials, or b) update the lighting of your scene and accept the new style.
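As for the lost follow-the-camera behaviour: a DirectionalLight shines from its position towards its target (an object at the origin by default), so moving the light to the camera's position each frame restores the effect when the camera orbits the origin. A rough sketch, where directionalLight, camera and renderer stand in for the objects in your scene:

function animate() {
    requestAnimationFrame(animate);
    directionalLight.position.copy(camera.position); // light now points the way the camera looks
    renderer.render(scene, camera);
}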

"Liquify" Surface of Points Mesh in Three.js

I loaded a mesh from a JSON file; here is my current result:
my Project
It is an object I exported from Blender as JSON and then used its vertices to create a geometry of points (THREE.Points), which is important for the look of it.
I am now looking for a way to "animate" the points so that the "surface" looks vivid and alive. Basically it should move around a bit, something like this (without the rotation):
Link to animated Gif
I have ruled out displacementMap, as this does not work for the PointsMaterial (or does someone know a workaround?)
Does anyone have hints or ideas? I thought of maybe morphing between 2-3 objects, but I am not sure whether that works for a points mesh.
One approach to achieve the desired effect is morph target animation (also called vertex morphing). As the following example shows, three.js supports morph target animation with points:
https://threejs.org/examples/webgl_morphtargets_sphere.html
There is a lot of existing literature about vertex morphing, so it should be no problem to get familiar with the technique. I suggest you create your animations in Blender, export the model to glTF, and then load the file via GLTFLoader into your app, as shown in the example.
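The example essentially reuses the animated mesh's geometry (with its morphAttributes) for a THREE.Points object and lets the AnimationMixer drive the shared morph target influences. A sketch along those lines; the file name 'model.glb', the scene-graph layout and the point size are placeholder assumptions:

const loader = new THREE.GLTFLoader();
loader.load('model.glb', function (gltf) {
    const mesh = gltf.scene.children[0]; // the morphing mesh exported from Blender
    const points = new THREE.Points(mesh.geometry, new THREE.PointsMaterial({ size: 0.02 }));
    // share the influence arrays so the animation also drives the points
    points.morphTargetInfluences = mesh.morphTargetInfluences;
    points.morphTargetDictionary = mesh.morphTargetDictionary;
    mesh.material.visible = false; // hide the surface, show only the points
    mesh.add(points);
    scene.add(gltf.scene);
    mixer = new THREE.AnimationMixer(mesh);
    mixer.clipAction(gltf.animations[0]).play();
});

// in the render loop:
mixer.update(clock.getDelta());

Depending on your three.js version, you may also need to set morphTargets: true on the PointsMaterial.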

babylon.js meshes get same material

I'm using BabylonJS v3 with Blender 2.79 to create product visualizations. It is often necessary to define more complex shaders in the JS code. I'm using lines like
scene.meshes[1].material.emissiveColor = new BABYLON.Color3(1, 0, 0);
to define the shaders after export. Usually every mesh can get its own shader this way. Unfortunately, in one case the shader of multiple meshes gets overwritten. Has anyone had a similar problem?
All meshes are named individually, and they all have a basic (individual) shader from Blender. They don't share any datablocks, and no instancing or duplication was done. I'm thankful for every hint.
Edit
It seems the error occurs with the new version (3.0). Updating to 3.1 fixes the problem but introduces errors with the arc-rotate camera: as soon as you click on the canvas to rotate the view, you can't release the mouse anymore. Are the latest stable releases buggy?
Edit 2
After some in-depth troubleshooting we came to the conclusion that the 3.0 and 3.1 versions and/or their exporter plugins are faulty. This error occurs even in the simplest test scenes, alongside other problems such as broken cameras and displaced geometry.
Be aware that by default materials are shared for performance reasons, so this is probably not a bug but a feature.
If you want to change the material for a single mesh, you will first need to clone it.
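For example, with the setup from the question (the clone name is arbitrary):

var mesh = scene.meshes[1];
mesh.material = mesh.material.clone(mesh.name + "Material"); // per-mesh copy; the other meshes keep the shared original
mesh.material.emissiveColor = new BABYLON.Color3(1, 0, 0);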

Using a cubemap texture as diffuse source in place of a 2D one

I'm trying to project massive, 32k-resolution equirectangular maps onto spheres.
Since a 32k texture is out of reach for older graphics cards that support only 1k-2k textures, and scaling a 32k image down to 1k loses a tremendous amount of detail, I've resolved to split each source map into a cubemap by projecting it onto 6 cube faces, so that more detail can be displayed on older cards.
However, I'm having trouble actually displaying these cubemapped spheres with three.js. I can set the MeshPhongMaterial.envMap to my cubemap, but of course this makes the sphere mesh reflect the texture instead.
An acceptable result can be produced by using ShaderMaterial along with ShaderLib['cube'] to "fake" a skysphere of sorts. But this drops lighting, normal mapping, and all the other handy (and pretty) things MeshPhongMaterial can do.
If at all possible, I'd like to not have to write an entire shader from scratch for such a simple tweak (switching one texture2D call to textureCube). Is there a way to coerce three.js to read the diffuse term from a cubemap instead of a 2D texture, or a simple way to edit the shader three.js uses internally?
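One way to get exactly this tweak without writing a shader from scratch is material.onBeforeCompile, which lets you patch the built-in shader source before it compiles. A sketch of the idea; the uniform name diffuseCube and the varying vCubeDir are invented for this example, and using the normalized object-space position as the lookup direction (which suits a sphere) is an assumption:

const cubeTexture = new THREE.CubeTextureLoader().load([/* 6 face URLs */]);
const material = new THREE.MeshPhongMaterial();
material.onBeforeCompile = function (shader) {
    shader.uniforms.diffuseCube = { value: cubeTexture };
    shader.vertexShader = shader.vertexShader
        .replace('#include <common>', 'varying vec3 vCubeDir;\n#include <common>')
        .replace('#include <begin_vertex>', '#include <begin_vertex>\nvCubeDir = normalize(position);');
    shader.fragmentShader = shader.fragmentShader
        .replace('#include <common>', 'uniform samplerCube diffuseCube;\nvarying vec3 vCubeDir;\n#include <common>')
        .replace('#include <map_fragment>', 'diffuseColor *= textureCube(diffuseCube, vCubeDir);');
};

Because only the diffuse lookup is replaced, lighting, normal mapping and the rest of MeshPhongMaterial keep working.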

Conflict when using two or more shaders with different number of attributes

I'm trying to implement picking (using colors and readPixels) in a WebGL program of mine. When I start my program I create two separate shader programs: one for Phong shading and another that simply gives each shape a flat color, used to detect which shape has been clicked on.
The Phong shader has two attributes, vertex position and vertex normal. The picking one has only the position.
Now I discovered that, for some odd reason, when both of these shader programs exist in the same application and I'm using the picking one, my drawArrays call seems to fail. The last thing to happen is my gl.vertexAttribPointer call. I've been messing around and found out that when I check for active attrib arrays using:
gl.getVertexAttrib(index,gl.VERTEX_ATTRIB_ARRAY_ENABLED);
both 0 and 1 return true (this is when the picking shader is active via gl.useProgram(picking)).
Now, if I disable attribute 1 with gl.disableVertexAttribArray(1), everything works again. Another fix is to draw with the Phong shader first and then use the picking shader, and somehow that magically makes it OK. I'm guessing that in that case the vertex normal buffer attached while the Phong shader was in use somehow stays valid when I switch to the picking shader, so the drawArrays call works.
I'd like to know whether I'm using gl.enableVertexAttribArray wrong and should disable arrays when switching shaders, or something like that.
I've also tried creating the shader programs in different order with no success.
As you've probably figured out, useProgram doesn't affect anything but the program (shader code) that will run in the next draw call. You have to make sure that only the attributes used by the current program are enabled.
If you've wrapped your WebGL code somehow, a tip is to store the highest attribute location used by each program somewhere in your wrapper, then compare it against the previously used program's and enable/disable accordingly before the draw call.
It's hard to tell without seeing your code, but WebGL requires that every attribute that will be accessed has enough data to satisfy the draw call. So suppose you set up 2 attributes, each with 3 vertices of data, and draw 3 vertices. If you then switch shaders, give the first attribute 6 vertices of data while the second attribute still holds only 3, and attempt to draw 6 vertices, WebGL will fail the draw call if the shader you're currently drawing with accesses both attributes.
If you run Chrome 19 it should tell you in the JavaScript console if this was the issue.
OpenGL is a state machine. By selecting the picking shader you put OpenGL in a state where the additional attributes of the Phong shader no longer apply.
A lot of people fall into the bad habit of assuming there is some kind of "one-time initialization" in OpenGL. This is not the case. You're supposed to set all the state you need for a drawing operation right before that drawing operation, and ideally also revert the settings once you're done. This means that after binding a shader, you're also supposed to enable and bind the required shader inputs, i.e. vertex attributes.
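In the spirit of these answers, a small helper can make the enable/disable bookkeeping explicit when switching programs (a sketch; buffers still have to be bound and gl.vertexAttribPointer called per attribute afterwards):

function switchProgram(gl, program) {
    gl.useProgram(program);
    // collect the attribute locations this program actually uses
    var used = {};
    var count = gl.getProgramParameter(program, gl.ACTIVE_ATTRIBUTES);
    for (var i = 0; i < count; i++) {
        var info = gl.getActiveAttrib(program, i);
        var loc = gl.getAttribLocation(program, info.name);
        if (loc >= 0) used[loc] = true;
    }
    // enable exactly those, disable everything else
    var max = gl.getParameter(gl.MAX_VERTEX_ATTRIBS);
    for (var i = 0; i < max; i++) {
        if (used[i]) gl.enableVertexAttribArray(i);
        else gl.disableVertexAttribArray(i);
    }
}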
