Custom shader - Three.js - javascript

I am trying to use a custom shader with Three.js. I followed the many examples out there, but it doesn't work. My code is:
var vertex = "void main(){vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );gl_Position = projectionMatrix * mvPosition;}";
var fragment = "precision highp float;void main(void){gl_FragColor = vec4(0.0,1.0,0.0,1.0);}";
material = new THREE.ShaderMaterial({
vertexShader: vertex,
fragmentShader: fragment
});
var mesh = new THREE.Mesh(geometry,material);
…and everything is blank. But if I use this material:
material = new THREE.MeshBasicMaterial({ color: 0xff0000, wireframe: true });
…everything works perfectly. What's wrong?

I found the problem: I had to use:
renderer = new THREE.WebGLRenderer();
instead of:
renderer = new THREE.CanvasRenderer();
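For reference, a minimal sketch of the working setup (assuming a scene, camera and geometry already exist elsewhere); ShaderMaterial needs the WebGL renderer, since CanvasRenderer cannot compile GLSL and simply skips such meshes:
// ShaderMaterial requires WebGL
renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
material = new THREE.ShaderMaterial({
    vertexShader: vertex,     // the GLSL strings from the question
    fragmentShader: fragment
});
var mesh = new THREE.Mesh(geometry, material);
scene.add(mesh);
renderer.render(scene, camera);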

Related

Vertex Colors are changing to white

I'm working with THREE.js points and sometimes I need them to have a different color per point. Sometimes I'm also modifying their alpha values, so I had to write my own shader programs.
In JavaScript I have the following code:
let materials;
if (pointCloudData.colors !== undefined) {
geometry.colors = pointCloudData.colors.map(hexColor => new THREE.Color(hexColor));
// If the point cloud has color for each point...
materials = new THREE.ShaderMaterial({
vertexColors: THREE.VertexColors,
vertexShader: document.getElementById('vertexshader').textContent,
fragmentShader: document.getElementById('fragmentshader').textContent,
transparent: true,
});
} else {
// Set color for the whole cloud
materials = new THREE.ShaderMaterial({
uniforms: {
unicolor: { value: pointCloudData.color },
},
vertexShader: document.getElementById('vertexshader').textContent,
fragmentShader: document.getElementById('fragmentshader').textContent,
transparent: true,
});
}
const pointCloud = new THREE.Points(geometry, materials);
I am basically setting the whole cloud's color to a uniform value unless per-point colors are defined - in that case I set vertexColors and assign the colors to the geometry. I also checked the values stored in geometry.colors and they are correct RGB values in the range [0, 1].
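One thing worth double-checking in the uniform branch (an assumption on my part, since pointCloudData isn't shown): a vec3 uniform such as unicolor has to be something three.js knows how to upload, e.g. a THREE.Color or Vector3, rather than a raw hex number:
// hedged: only needed if pointCloudData.color is a plain hex number
materials.uniforms.unicolor.value = new THREE.Color(pointCloudData.color);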
My Vertex Shader code:
attribute float size;
attribute float alpha;
varying float vAlpha;
varying vec3 vColor;
void main() {
vAlpha = alpha;
#ifdef USE_COLOR
vColor = color;
#endif
vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
gl_PointSize = size * ( 300.0 / -mvPosition.z );
gl_Position = projectionMatrix * mvPosition;
}
And my Fragment shader code:
uniform vec3 unicolor;
varying vec3 vColor;
varying float vAlpha;
void main() {
#ifdef USE_COLOR
gl_FragColor = vec4(vColor, vAlpha);
#else
gl_FragColor = vec4(unicolor, vAlpha);
#endif
}
Again, I am checking whether the vertex color is set and then passing it to the fragment shader, which then sets the per-point color.
For some reason, the vertices are all white when setting the color per point (screenshot: the white pixels should be green/red). I'm far from an advanced WebGL user, so any help would be appreciated. Am I doing something wrong that I'm not aware of?
You are creating a custom ShaderMaterial and using this pattern in your vertex shader:
#ifdef USE_COLOR
vColor = color;
#endif
Consequently, you need to specify the material.defines like so:
var defines = {};
defines[ "USE_COLOR" ] = "";
// points material
var shaderMaterial = new THREE.ShaderMaterial( {
defines: defines,
uniforms: uniforms,
vertexShader: document.getElementById( 'vertexshader' ).textContent,
fragmentShader: document.getElementById( 'fragmentshader' ).textContent,
transparent: true
} );
You do not need to set vertexColors: THREE.VertexColors. That is just a flag used by built-in materials to alert the renderer to set the defines for you.
three.js r.85
OK, I think I figured it out since it's working now.
I had to set the colors as geometry attributes:
const colors = new Float32Array(n * 3);
for (let i = 0; i < n; i += 1) {
new THREE.Color(pointCloudData.colors[i]).toArray(colors, i * 3);
}
geometry.addAttribute('colors', new THREE.BufferAttribute(colors, 3)); // itemSize 3: one RGB triplet per point
I also used the suggestion provided by WestLangley: I removed the vertexColors: THREE.VertexColors part from the material definition and set the define as well:
materials = new THREE.ShaderMaterial({
defines: {
USE_COLOR: '',
},
vertexShader: document.getElementById('vertexshader').textContent,
fragmentShader: document.getElementById('fragmentshader').textContent,
transparent: true,
});
Then in my Vertex shader I added:
attribute vec3 colors;
to get the colors passed from the JavaScript. The rest is the same, I just passed the colors to the fragment shader using the same code as in the posted question above.
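For what it's worth, a more conventional variant of the same fix (a sketch, assuming the same point count n): name the attribute color with an item size of 3. With the USE_COLOR define set, three.js declares attribute vec3 color; in the generated shader prefix, so the existing vColor = color; line picks it up and no extra attribute declaration is needed:
const colors = new Float32Array(n * 3);
for (let i = 0; i < n; i += 1) {
    new THREE.Color(pointCloudData.colors[i]).toArray(colors, i * 3);
}
// built-in attribute name "color", one RGB triplet per point
geometry.addAttribute('color', new THREE.BufferAttribute(colors, 3));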

Render error with edges helper and semi transparent object

I am trying to create an earth with three.js like this example, which is an upgrade from this one. The problem I have is that even though in the code I add the sky to the scene first, then the earth, and finally the atmosphere, the renderer seems not to respect this order and responds with this
Now when I zoom in and get near the earth object the renderer works correctly, giving this response.
As you can see, the problem also affects the THREE.EdgesHelper: at a far zoom level parts of it get rendered even though they are behind the earth, but at a close zoom level it works perfectly. Any ideas how to overcome this? This is all the code used to create the spheres:
function earthView(){
if (!scene){
main();//here i create the controls,camera,scene,renderer etc
}
// create the geometry sphere stars
var geometry = new THREE.SphereGeometry(6371000000, 36, 36)
// create the material, using a texture of a starfield
var material = new THREE.MeshBasicMaterial()
material.map = THREE.ImageUtils.loadTexture('images/earthView/ESO_-_Milky_Way.jpg')
material.side = THREE.BackSide
// create the mesh based on geometry and material
var mesh = new THREE.Mesh(geometry, material)
mesh.position.set(0,0,-6371000)
scene.add(mesh)
//earth
var geometry = new THREE.SphereGeometry(6371000, 36, 36)
var material = new THREE.MeshPhongMaterial()
var earthMesh = new THREE.Mesh(geometry, material)
//earthMesh.position.set(0,-6371000,0)
//earthMesh.rotation.set(0,-Math.PI/2,0)
helper = new THREE.EdgesHelper( earthMesh );
helper.material.color.set( 0xffffff );
material.map = THREE.ImageUtils.loadTexture('images/earthView/earthmap1k.jpg')
material.bumpMap = THREE.ImageUtils.loadTexture('images/earthView/earthbump1k.jpg')
material.bumpScale = 100
material.specularMap = THREE.ImageUtils.loadTexture('images/earthView/earthspec1k.jpg')
scene.add(earthMesh);
scene.add( helper );
//atmosphere
var geometry = new THREE.SphereGeometry(7365000, 36, 36)
var material = new createAtmosphereMaterial()
material.uniforms.glowColor.value.set(0x00b3ff)
material.uniforms.coeficient.value = 0.1
material.uniforms.power.value = 2.0
//material.side = THREE.BackSide
var earthAtmo = new THREE.Mesh(geometry, material)
//earthAtmo.position.set(0,0,-6371000)
scene.add(earthAtmo);
/**
 * from http://stemkoski.blogspot.fr/2013/07/shaders-in-threejs-glow-and-halo.html
 * @return {[type]} [description]
*/
function createAtmosphereMaterial(){
var vertexShader = [
'varying vec3 vNormal;',
'void main(){',
' // compute intensity',
' vNormal = normalize( normalMatrix * normal );',
' // set gl_Position',
' gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
'}',
].join('\n')
var fragmentShader = [
'uniform float coeficient;',
'uniform float power;',
'uniform vec3 glowColor;',
'varying vec3 vNormal;',
'void main(){',
' float intensity = pow( coeficient - dot(vNormal, vec3(0.0, 0.0, 1.0)), power );',
' gl_FragColor = vec4( glowColor * intensity, 1.0 );',
'}',
].join('\n')
// create custom material from the shader code above
var material = new THREE.ShaderMaterial({
uniforms: {
coeficient : {
type : "f",
value : 1.0
},
power : {
type : "f",
value : 2
},
glowColor : {
type : "c",
value : new THREE.Color('blue')
},
},
vertexShader : vertexShader,
fragmentShader : fragmentShader,
side : THREE.FrontSide,
blending : THREE.AdditiveBlending,
transparent : true,
depthWrite : false,
});
return material
}
}
I use renderer.sortObjects = false when I define the renderer, so the objects get rendered in the order they are added to the scene.
Brief summary: get the helper and the atmosphere rendered like pic 2 in all zoom levels.
Update hint: 1. Might this be a problem with the graphics card? 2. Might the very large distances (in scene units) I am using be the problem?
This question has already been answered in this post. Setting logarithmicDepthBuffer: true did the job!
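For completeness, that flag can only be passed when the renderer is constructed; a minimal sketch:
renderer = new THREE.WebGLRenderer({
    antialias: true,
    logarithmicDepthBuffer: true // avoids depth-buffer precision problems at these huge scene distances
});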

Three.js: Best way to draw lots of triangles (50k +) with alpha color

I'm currently building a web application to visualize CAD / CAE data.
data is an array with all my triangles.
My first approach to draw the triangles is to create one geometry:
triangles = new THREE.Geometry();
And then add all triangles to it:
var lvar = 0;
for (var i = 0; i < data.triangles.length; i++) {
    triangles.vertices.push(new THREE.Vector3(data.triangles[i][0].x, data.triangles[i][0].y, data.triangles[i][0].z));
    triangles.vertices.push(new THREE.Vector3(data.triangles[i][1].x, data.triangles[i][1].y, data.triangles[i][1].z));
    triangles.vertices.push(new THREE.Vector3(data.triangles[i][2].x, data.triangles[i][2].y, data.triangles[i][2].z));
    triangles.faces.push(new THREE.Face3(lvar, lvar + 1, lvar + 2));
    triangles.faces[i].vertexColors[0] = new THREE.Color(0xFF0000);
    triangles.faces[i].vertexColors[0].setRGB(data.triangles[i][0].r, data.triangles[i][0].g, data.triangles[i][0].b);
    triangles.faces[i].vertexColors[1] = new THREE.Color(0x00FF00);
    triangles.faces[i].vertexColors[1].setRGB(data.triangles[i][1].r, data.triangles[i][1].g, data.triangles[i][1].b);
    triangles.faces[i].vertexColors[2] = new THREE.Color(0x0000FF);
    triangles.faces[i].vertexColors[2].setRGB(data.triangles[i][2].r, data.triangles[i][2].g, data.triangles[i][2].b);
    lvar += 3;
}
Add a material
material = new THREE.MeshBasicMaterial({
vertexColors: THREE.VertexColors,
side: THREE.DoubleSide,
transparent: true,
opacity: .99
});
and add the mesh to my scene object.
var mesh = new THREE.Mesh(triangles, material);
scene.add(mesh);
This approach works fine so far but because I only use one material I don't know how to add alpha to my vertexColors.
My second approach is to create a single geometry for every triangle.
for (var i = 0; i < data.triangles.length; i++) {
    var triangles = new THREE.Geometry();
    triangles.vertices.push(new THREE.Vector3(data.triangles[i][0].x, data.triangles[i][0].y, data.triangles[i][0].z));
    triangles.vertices.push(new THREE.Vector3(data.triangles[i][1].x, data.triangles[i][1].y, data.triangles[i][1].z));
    triangles.vertices.push(new THREE.Vector3(data.triangles[i][2].x, data.triangles[i][2].y, data.triangles[i][2].z));
    triangles.faces.push(new THREE.Face3(0, 1, 2));
    triangles.faces[0].vertexColors[0] = new THREE.Color(0xFF0000);
    triangles.faces[0].vertexColors[0].setRGB(data.triangles[i][0].r, data.triangles[i][0].g, data.triangles[i][0].b);
    triangles.faces[0].vertexColors[1] = new THREE.Color(0x00FF00);
    triangles.faces[0].vertexColors[1].setRGB(data.triangles[i][1].r, data.triangles[i][1].g, data.triangles[i][1].b);
    triangles.faces[0].vertexColors[2] = new THREE.Color(0x0000FF);
    triangles.faces[0].vertexColors[2].setRGB(data.triangles[i][2].r, data.triangles[i][2].g, data.triangles[i][2].b);
    var material = new THREE.MeshBasicMaterial({
        vertexColors: THREE.VertexColors,
        side: THREE.DoubleSide,
        transparent: true,
        opacity: .99
    });
    var mesh = new THREE.Mesh(triangles, material);
    scene.add(mesh);
}
With this approach I could add my alpha / opacity to the material for every facet.
Unfortunately I get low fps with the second approach.
Approach 1: 6000 triangles => 60 fps
Approach 2: 6000 triangles => 15 fps
Is there a way to draw lots of triangles (> 50,000) with alpha for every face / point and still stay around 60 fps (yes, I know it depends on the hardware, too)?
Edit:
Using Raycaster is essential for my project.
Edit 2:
After some testing I decided to go with the following approach:
Create one geometry like I did in my first approach
Replace the MeshBasicMaterial with ShaderMaterial
First I created my two shaders:
<script type="x-shader/x-vertex" id="vertexshader">
attribute vec3 customColor;
attribute float customOpacity;
varying vec3 vColor;
varying float vOpacity;
void main() {
vColor = customColor;
vOpacity = customOpacity;
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
</script>
<script type="x-shader/x-fragment" id="fragmentshader">
varying vec3 vColor;
varying float vOpacity;
void main() {
gl_FragColor = vec4( vColor, vOpacity);
}
</script>
Then I replaced my MeshBasicMaterial with a ShaderMaterial.
Defined my attributes:
attributes = {
customColor: { type: 'c', value: [] },
customOpacity: { type: 'f', value: []}
};
Created/filled my geometry, colors, and opacity:
geometry = new THREE.Geometry();
for (var i = 0, k, lvar = 0; i < data.triangles.length; i++) {
for (k = 0; k < 3; k++) {
geometry.vertices.push(new THREE.Vector3(data.triangles[i][k].x, data.triangles[i][k].y, data.triangles[i][k].z));
attributes.customColor.value[lvar + k] = new THREE.Color(THREE.ColorKeywords.black);
attributes.customColor.value[lvar + k].setRGB(data.triangles[i][k].r, data.triangles[i][k].g, data.triangles[i][k].b);
attributes.customOpacity.value[lvar + k] = 1.0;
}
geometry.faces.push(new THREE.Face3(lvar, lvar + 1, lvar + 2));
lvar += 3;
}
Created my ShaderMaterial:
var shaderMaterial = new THREE.ShaderMaterial({
attributes: attributes,
vertexShader: document.getElementById('vertexshader').textContent,
fragmentShader: document.getElementById('fragmentshader').textContent,
blending: THREE.NormalBlending,
depthTest: true,
transparent: true,
side: THREE.DoubleSide,
linewidth: 2
});
And finally my mesh:
var mesh = new THREE.Mesh(geometry, shaderMaterial);
scene.add(mesh);
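A note for anyone reading this with a newer three.js: THREE.Geometry and the ShaderMaterial attributes option were removed in later releases, so the same data would go into a BufferGeometry instead. A rough, untested sketch assuming the same data array and the same shaders:
const positions = new Float32Array(data.triangles.length * 9);
const customColor = new Float32Array(data.triangles.length * 9);
const customOpacity = new Float32Array(data.triangles.length * 3);
let v = 0;
for (let i = 0; i < data.triangles.length; i++) {
    for (let k = 0; k < 3; k++, v++) {
        const p = data.triangles[i][k];
        positions[v * 3] = p.x; positions[v * 3 + 1] = p.y; positions[v * 3 + 2] = p.z;
        customColor[v * 3] = p.r; customColor[v * 3 + 1] = p.g; customColor[v * 3 + 2] = p.b;
        customOpacity[v] = 1.0;
    }
}
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.setAttribute('customColor', new THREE.BufferAttribute(customColor, 3));
geometry.setAttribute('customOpacity', new THREE.BufferAttribute(customOpacity, 1));
// the vertex/fragment shaders above work unchanged; ShaderMaterial picks up
// geometry attributes by name, so no attributes option is passed to the material.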

Three.js light is not static

I need to create a static light (invariant to camera movement) and I need to get the actual position of the light in the fragment shader.
What I am doing now:
scene = new THREE.Scene();
camera = new THREE.PerspectiveCamera(60, canvas.width() / canvas.height(), 1, 10000);
camera.position.z = 2000;
camera.lookAt(0, 0, 0);
var light = new THREE.SpotLight(0xFFFFFF, 1);
light.position.set(0.5, 0.5, 0.1).normalize();
camera.add(light);
....
var lambertShader = THREE.ShaderLib['lambert'];
uniformsVolume = THREE.UniformsUtils.clone(lambertShader.uniforms);
....
materialVolumeRendering = new THREE.ShaderMaterial({
    uniforms: uniformsVolume,
    vertexColors: THREE.VertexColors,
    vertexShader: vertVolumeRendering,
    fragmentShader: fragVolumeRendering,
    lights: true
});
....
scene.add(camera);
Then in the fragment shader I set a uniform variable:
uniform vec3 spotLightPosition;
and compute the lighting for a voxel:
float dProd = max(0.0, dot(getGradient(posInCube), normalize(spotLightPosition - posInCube)));
voxelColored.rgb = voxelColored.rgb * dProd + voxelColored.rgb * 0.2;
The problem is that it doesn't work correctly. My idea is that I will move around the object (in reality, move the camera) while the light keeps shining from the same side (it should be static in the scene). At the moment the light is not static and behaves very strangely.
Any idea?
Somebody please...
Thanks a lot.
Tomáš
Try with PointLight instead. SpotLight is a bit trickier to use.
To make your light static in position, don't add it to your camera; add it to the scene instead:
scene.add(light);
To find the position, reference the variable you used for the light:
light.position
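To connect that back to the shader in the question, here is a small sketch (hedged - the PointLight values and the type: "v3" uniform style are my assumptions, matching the era of three.js used above) that keeps the light static in the scene and hands its position to the fragment shader:
var light = new THREE.PointLight(0xFFFFFF, 1);
light.position.set(500, 500, 100);
scene.add(light); // added to the scene, not to the camera, so it never follows the view

// pass the world-space position into the custom shader; since this stores a reference
// to light.position, the uniform stays in sync if the light is ever moved
uniformsVolume.spotLightPosition = { type: "v3", value: light.position };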

three.js skybox assigned to camera

I'm trying to make a skybox assigned to the player camera.
When the camera moves (the skybox moves with it), the texture gets stretched.
How do I get rid of this?
Code:
var textureCube = THREE.ImageUtils.loadTextureCube( urls );
textureCube.format = THREE.RGBFormat;
var shader = THREE.ShaderUtils.lib[ "cube" ];
shader.uniforms[ "tCube" ].value = textureCube;
cubematerial = new THREE.ShaderMaterial({
fragmentShader: shader.fragmentShader,
vertexShader: shader.vertexShader,
uniforms: shader.uniforms,
depthWrite: false,
side: THREE.BackSide
});
skyBox = new THREE.Mesh(new THREE.CubeGeometry(1000,1000,1000), cubematerial);
camera.add(skyBox);
So, after digging into the Three.js examples, I found a way to do this. http://learningthreejs.com/blog/2011/08/15/lets-do-a-sky/ is outdated. The approach used in the examples is to add the skybox to a second scene with a fixed camera and render both scenes. Look at the webgl_materials_cars.html example.
Also, because I use a third-person camera assigned to the character, I must copy the world rotation from the character camera to the skybox camera. This can be done at render time with:
function render(){
<...>
skyboxCamera.rotation.setEulerFromRotationMatrix( new THREE.Matrix4().extractRotation( camera.matrixWorld ), skyboxCamera.eulerOrder );
renderer.render(skyboxScene, skyboxCamera);
renderer.render(scene, camera);
<...>
}
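For context, a sketch of the setup that render loop assumes (the skyboxScene / skyboxCamera names and values here are mine, not from the example):
var skyboxScene = new THREE.Scene();
var skyboxCamera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 1, 10000);
skyboxScene.add(skyBox); // the cube mesh from the question, no longer parented to the player camera

renderer.autoClear = false; // clear once per frame manually, then draw both scenes on top of each other
function render() {
    renderer.clear();
    // ...copy the player camera's world rotation to skyboxCamera, as in the snippet above...
    renderer.render(skyboxScene, skyboxCamera);
    renderer.render(scene, camera);
}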
I know it's a closed question but I want to offer an alternative that does not require an additional scene, for future seekers:
start by reading and following this tutorial: http://learningthreejs.com/blog/2011/08/15/lets-do-a-sky/
now create the following shader (I added it to the three.js ShaderLib, but if you don't want to tamper with the three.js source code, add it outside):
'skybox': {
uniforms: { "tCube": { type: "t", value: null },
"tFlip": { type: "f", value: -1 } },
vertexShader: [
"varying vec3 vWorldPosition;",
THREE.ShaderChunk[ "logdepthbuf_pars_vertex" ],
"void main() {",
" vec4 worldPosition = modelMatrix * vec4( position, 1.0 );",
" vWorldPosition = worldPosition.xyz;",
" gl_Position = projectionMatrix * modelViewMatrix * vec4( position + cameraPosition, 1.0 );",
THREE.ShaderChunk[ "logdepthbuf_vertex" ],
"}"
].join("\n"),
fragmentShader: [
"uniform samplerCube tCube;",
"uniform float tFlip;",
"varying vec3 vWorldPosition;",
THREE.ShaderChunk[ "logdepthbuf_pars_fragment" ],
"void main() {",
" gl_FragColor = textureCube( tCube, vec3( tFlip * vWorldPosition.x, vWorldPosition.yz ) );",
THREE.ShaderChunk[ "logdepthbuf_fragment" ],
"}"
].join("\n")
},
create your skybox like this:
// urls is a list of textures to use
var cubemap = THREE.ImageUtils.loadTextureCube(urls);
cubemap.format = THREE.RGBFormat;
var shader = THREE.ShaderLib['skybox']; // init the skybox shader we created above
shader.uniforms['tCube'].value = cubemap; // apply textures to shader
// create shader material
var skyBoxMaterial = new THREE.ShaderMaterial( {
fragmentShader: shader.fragmentShader,
vertexShader: shader.vertexShader,
uniforms: shader.uniforms,
depthWrite: false,
side: THREE.BackSide
});
// create skybox mesh
var skybox = new THREE.Mesh(
new THREE.CubeGeometry(1000, 1000, 1000),
skyBoxMaterial
);
// THIS IS IMPORTANT! or the skybox will get culled after you move the camera too far..
skybox.frustumCulled = false;
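One last detail: because the vertex shader adds cameraPosition to every vertex, the box re-centers itself on the viewer each frame, so it just goes into the main scene rather than being parented to the camera:
scene.add(skybox); // no camera.add(skybox) needed - the shader keeps it centered on the camera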
