I am trying to create an earth with Three.js like this example, which is an upgrade from this one. The problem I have is that even though in the code I add the sky to the scene first, then the earth, and finally the atmosphere, the renderer seems not to respect this order and responds with this:
Now when I zoom in and get near the earth object, the renderer works correctly, giving this response.
The problem, as you can see, is also with the THREE.EdgesHelper: at a far zoom level it gets rendered in some parts even though it is behind the earth, but at a close zoom level it works perfectly. Any ideas how to overcome this? This is all the code used to create the spheres:
function earthView(){
    if (!scene){
        main(); // here I create the controls, camera, scene, renderer, etc.
    }
    // sky: create the geometry sphere for the stars
    var geometry = new THREE.SphereGeometry(6371000000, 36, 36);
    // create the material, using a texture of a starfield seen from the inside
    var material = new THREE.MeshBasicMaterial();
    material.map = THREE.ImageUtils.loadTexture('images/earthView/ESO_-_Milky_Way.jpg');
    material.side = THREE.BackSide;
    // create the mesh based on geometry and material
    var mesh = new THREE.Mesh(geometry, material);
    mesh.position.set(0, 0, -6371000);
    scene.add(mesh);

    // earth
    var geometry = new THREE.SphereGeometry(6371000, 36, 36);
    var material = new THREE.MeshPhongMaterial();
    var earthMesh = new THREE.Mesh(geometry, material);
    //earthMesh.position.set(0, -6371000, 0);
    //earthMesh.rotation.set(0, -Math.PI/2, 0);
    helper = new THREE.EdgesHelper(earthMesh);
    helper.material.color.set(0xffffff);
    material.map = THREE.ImageUtils.loadTexture('images/earthView/earthmap1k.jpg');
    material.bumpMap = THREE.ImageUtils.loadTexture('images/earthView/earthbump1k.jpg');
    material.bumpScale = 100;
    material.specularMap = THREE.ImageUtils.loadTexture('images/earthView/earthspec1k.jpg');
    scene.add(earthMesh);
    scene.add(helper);

    // atmosphere
    var geometry = new THREE.SphereGeometry(7365000, 36, 36);
    var material = createAtmosphereMaterial();
    material.uniforms.glowColor.value.set(0x00b3ff);
    material.uniforms.coeficient.value = 0.1;
    material.uniforms.power.value = 2.0;
    //material.side = THREE.BackSide;
    var earthAtmo = new THREE.Mesh(geometry, material);
    //earthAtmo.position.set(0, 0, -6371000);
    scene.add(earthAtmo);

    /**
     * from http://stemkoski.blogspot.fr/2013/07/shaders-in-threejs-glow-and-halo.html
     * @return {THREE.ShaderMaterial} the atmospheric glow material
     */
    function createAtmosphereMaterial(){
        var vertexShader = [
            'varying vec3 vNormal;',
            'void main(){',
            '    // pass the view-space normal to the fragment shader',
            '    vNormal = normalize( normalMatrix * normal );',
            '    // set gl_Position',
            '    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
            '}',
        ].join('\n');
        var fragmentShader = [
            'uniform float coeficient;',
            'uniform float power;',
            'uniform vec3 glowColor;',
            'varying vec3 vNormal;',
            'void main(){',
            '    float intensity = pow( coeficient - dot(vNormal, vec3(0.0, 0.0, 1.0)), power );',
            '    gl_FragColor = vec4( glowColor * intensity, 1.0 );',
            '}',
        ].join('\n');
        // create a custom material from the shader code above
        var material = new THREE.ShaderMaterial({
            uniforms: {
                coeficient: { type: 'f', value: 1.0 },
                power:      { type: 'f', value: 2.0 },
                glowColor:  { type: 'c', value: new THREE.Color('blue') },
            },
            vertexShader: vertexShader,
            fragmentShader: fragmentShader,
            side: THREE.FrontSide,
            blending: THREE.AdditiveBlending,
            transparent: true,
            depthWrite: false,
        });
        return material;
    }
}
I use renderer.sortObjects = false when I define the renderer, so the objects are rendered in the order they are added to the scene.
Brief summary: get the helper and the atmosphere rendered like picture 2 at all zoom levels.
Update hint: 1. Might this be a problem with the graphics card? 2. Might the very large distances I am using in scene units be the problem?
This question has already been answered in this post. Setting logarithmicDepthBuffer: true did the job!
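For reference, a minimal sketch of the renderer setup that did the trick (the antialias flag is just an assumption; logarithmicDepthBuffer is the key option):
renderer = new THREE.WebGLRenderer({
    antialias: true,
    logarithmicDepthBuffer: true // distributes depth precision logarithmically, which copes with planet-scale distances
});
renderer.sortObjects = false; // keep drawing in scene-graph order, as described above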
Related
I'm learning Three.js and GLSL shaders. I want to fade a mesh in/out when adding it to or removing it from the scene. My mesh is a TubeGeometry (which joins two points on a sphere, taken from the end of this tutorial) with a ShaderMaterial. My mesh configuration looks like this:
const tubeSegments = 20;
// points is an array of Vector3
const path = new CatmullRomCurve3(points);
const geom = new TubeGeometry(path, tubeSegments, 0.01, 8, false);
const material = new ShaderMaterial({
    vertexShader,
    fragmentShader,
    side: DoubleSide,
    uniforms: {
        time: { value: mesh_fragments_time.get() },
        color: { value: new Vector3(1, 1, 0) },
    },
});
const mesh = new Mesh(geom, material);
The vertex shader:
varying vec2 vertexUV;
varying vec3 vertexNormal;
void main(){
    vertexUV = uv;
    vertexNormal = normalize(normalMatrix * normal);
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
The fragment shader:
varying vec2 vertexUV;
uniform float time;
uniform vec3 color;
void main () {
    float dash = sin(vertexUV.x * 60. - time);
    if (dash > 0.) discard;
    gl_FragColor = vec4(color, 0);
}
Here mesh_fragments_time.get() (the time uniform) is a number that changes in my requestAnimationFrame loop and makes the object look dashed.
I've tried adding opacity to the shader material and changing it after adding the mesh to the scene, but it doesn't work.
I suppose I have to do it inside the fragment shader, but I don't know how. Can someone help me?
three.js r147
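For what it's worth, a minimal sketch of one approach, assuming a hypothetical fadeAlpha uniform tweened from the animation loop (the material also needs transparent: true so the alpha is actually blended):
// fragment shader: same dash logic, but the alpha comes from a fade uniform
const fadeFragmentShader = `
varying vec2 vertexUV;
uniform float time;
uniform vec3 color;
uniform float fadeAlpha; // hypothetical uniform: 0.0 = invisible, 1.0 = fully visible
void main () {
    float dash = sin(vertexUV.x * 60. - time);
    if (dash > 0.) discard;
    gl_FragColor = vec4(color, fadeAlpha);
}
`;
const material = new ShaderMaterial({
    vertexShader, // unchanged from the question
    fragmentShader: fadeFragmentShader,
    side: DoubleSide,
    transparent: true, // required so the alpha channel is blended
    uniforms: {
        time: { value: mesh_fragments_time.get() },
        color: { value: new Vector3(1, 1, 0) },
        fadeAlpha: { value: 0 }, // start invisible, ramp up after scene.add(mesh)
    },
});
// in the requestAnimationFrame loop: ramp toward 1 to fade in,
// toward 0 (then remove the mesh) to fade out
material.uniforms.fadeAlpha.value = Math.min(1, material.uniforms.fadeAlpha.value + 0.02);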
I'm working with THREE.js points, and sometimes I need them to have different per-point colors. Sometimes I also modify their alpha values, so I had to write my own shader programs.
In JavaScript I have the following code:
let materials;
if (pointCloudData.colors !== undefined) {
    // If the point cloud has a color for each point...
    geometry.colors = pointCloudData.colors.map(hexColor => new THREE.Color(hexColor));
    materials = new THREE.ShaderMaterial({
        vertexColors: THREE.VertexColors,
        vertexShader: document.getElementById('vertexshader').textContent,
        fragmentShader: document.getElementById('fragmentshader').textContent,
        transparent: true,
    });
} else {
    // Set one color for the whole cloud
    materials = new THREE.ShaderMaterial({
        uniforms: {
            unicolor: { value: pointCloudData.color },
        },
        vertexShader: document.getElementById('vertexshader').textContent,
        fragmentShader: document.getElementById('fragmentshader').textContent,
        transparent: true,
    });
}
const pointCloud = new THREE.Points(geometry, materials);
I am basically setting the cloud color to a uniform value unless per-point colors are defined; in that case I set vertexColors on the geometry. I also checked the values stored in geometry.colors and they are correct RGB values in the range [0, 1].
My vertex shader code:
attribute float size;
attribute float alpha;
varying float vAlpha;
varying vec3 vColor;
void main() {
    vAlpha = alpha;
    #ifdef USE_COLOR
        vColor = color;
    #endif
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_PointSize = size * ( 300.0 / -mvPosition.z );
    gl_Position = projectionMatrix * mvPosition;
}
And my fragment shader code:
uniform vec3 unicolor;
varying vec3 vColor;
varying float vAlpha;
void main() {
    #ifdef USE_COLOR
        gl_FragColor = vec4(vColor, vAlpha);
    #else
        gl_FragColor = vec4(unicolor, vAlpha);
    #endif
}
Again, I am checking whether the vertex color is set and then passing it to the fragment shader, which sets the color per point.
For some reason, the vertices are all white when setting the color per point (screenshot: the white pixels should be green/red). I'm far from an advanced user in WebGL, so any help would be appreciated. Am I doing something wrong that I'm not aware of?
You are creating a custom ShaderMaterial and using this pattern in your vertex shader:
#ifdef USE_COLOR
    vColor = color;
#endif
Consequently, you need to specify the material.defines like so:
var defines = {};
defines["USE_COLOR"] = "";
// points material
var shaderMaterial = new THREE.ShaderMaterial({
    defines: defines,
    uniforms: uniforms,
    vertexShader: document.getElementById('vertexshader').textContent,
    fragmentShader: document.getElementById('fragmentshader').textContent,
    transparent: true
});
You do not need to set vertexColors: THREE.VertexColors. That is just a flag used by built-in materials to alert the renderer to set the defines for you.
three.js r.85
OK, I think I figured it out since it's working now.
I had to set the colors as geometry attributes:
const colors = new Float32Array(n * 3);
for (let i = 0; i < n; i += 1) {
    new THREE.Color(pointCloudData.colors[i]).toArray(colors, i * 3);
}
geometry.addAttribute('colors', new THREE.BufferAttribute(colors, 3)); // three components (RGB) per point
I also used the suggestion provided by WestLangley: I removed the vertexColors: THREE.VertexColors part from the material definition and set the define as well:
materials = new THREE.ShaderMaterial({
    defines: {
        USE_COLOR: '',
    },
    vertexShader: document.getElementById('vertexshader').textContent,
    fragmentShader: document.getElementById('fragmentshader').textContent,
    transparent: true,
});
Then in my vertex shader I added:
attribute vec3 colors;
to get the colors passed from the JavaScript. The rest is the same, I just passed the colors to the fragment shader using the same code as in the posted question above.
The problem: I have a point cloud with quite a lot of data points (around one million). When I apply transparency to the rendered points, the transparency somehow does not show what is behind the rendered points.
As you can see in the example of the marked point, it does not show what it should; it is as if there is a problem with the buffering.
I use three.js to create a point cloud using the following "setup":
The renderer:
this.renderer = new THREE.WebGLRenderer({
    canvas: this.canvas,
    antialias: true
});
The material:
this.pointMaterial = new THREE.ShaderMaterial({
    uniforms: {
        time: { type: "f", value: 1.0 }
    },
    vertexShader: document.getElementById('vertexShader').textContent,
    fragmentShader: document.getElementById('fragmentShader').textContent,
    transparent: true
});
The vertex shader:
attribute float size;
attribute float opacity;
attribute vec3 color;
varying vec3 vColor;
varying float vOpacity;
void main() {
    vColor = color;
    vOpacity = opacity;
    vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
    gl_PointSize = size * (500.0 / length(mvPosition.xyz));
    gl_Position = projectionMatrix * mvPosition;
}
The fragment shader:
uniform float time;
varying vec3 vColor;
varying float vOpacity;
void main() {
    gl_FragColor = vec4(vColor, vOpacity);
}
The geometry (where I left out the part where I populate the arrays):
var bufferGeometry = new THREE.BufferGeometry();
var vertices = new Float32Array(vertexPositions.length * 3);
var colors = new Float32Array(vertexColors.length * 3);
var sizes = new Float32Array(vertexSizes.length);
var opacities = new Float32Array(vertexOpacities.length);
bufferGeometry.addAttribute('position', new THREE.BufferAttribute(vertices, 3));
bufferGeometry.addAttribute('color', new THREE.BufferAttribute(colors, 3));
bufferGeometry.addAttribute('size', new THREE.BufferAttribute(sizes, 1));
bufferGeometry.addAttribute('opacity', new THREE.BufferAttribute(opacities, 1));
this.points = new THREE.Points(bufferGeometry, this.pointMaterial);
this.scene.add(this.points);
I tried this with the built-in points material, and the same thing happens:
this.pointMaterial = new THREE.PointsMaterial({
    size: this.pointSize,
    vertexColors: THREE.VertexColors,
    transparent: true,
    opacity: 0.25
});
Is this a bug, expected behaviour, or am I doing something wrong?
The way the alpha blending equation works is that the fragment being drawn (the source colour) is blended over whatever is already in the framebuffer (the destination colour). This means you need to render your transparent geometry in sorted order from back to front, so that geometry in front correctly blends with the geometry behind it.
If all you have is transparent geometry, then you can just disable depth testing, render in back-to-front depth-sorted order, and it will work. If you have opaque geometry as well, then you need to first render all opaque geometry normally, then disable depth writing (not testing), render the transparent geometry in back-to-front depth-sorted order, and then re-enable depth writing.
Here are some answers to similar questions if you're interested in learning a bit more.
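Translated into Three.js terms, a rough sketch of the opaque-then-transparent recipe (the opaqueMesh, pointMaterial, and points names are assumptions):
// 1. opaque geometry draws first and writes depth as usual (the default)
opaqueMesh.renderOrder = 0;
// 2. transparent points draw afterwards: still depth-tested against the
//    opaque geometry, but without writing depth themselves
pointMaterial.depthWrite = false;
pointMaterial.depthTest = true;
points.renderOrder = 1;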
I'm trying to make a skybox assigned to the player camera.
When the camera moves (the skybox moves with it), the texture gets stretched.
How do I get rid of this?
Code:
var textureCube = THREE.ImageUtils.loadTextureCube(urls);
textureCube.format = THREE.RGBFormat;
var shader = THREE.ShaderUtils.lib["cube"];
shader.uniforms["tCube"].value = textureCube;
cubematerial = new THREE.ShaderMaterial({
    fragmentShader: shader.fragmentShader,
    vertexShader: shader.vertexShader,
    uniforms: shader.uniforms,
    depthWrite: false,
    side: THREE.BackSide
});
skyBox = new THREE.Mesh(new THREE.CubeGeometry(1000, 1000, 1000), cubematerial);
camera.add(skyBox);
So, after digging into the Three.js examples, I found a way to do this. http://learningthreejs.com/blog/2011/08/15/lets-do-a-sky/ is outdated. The approach used in the examples is to add the skybox to a second scene with a fixed camera and render both scenes. Look at the webgl_materials_cars.html example.
Also, because I use a third-person camera assigned to the character, I must copy the world rotation from the character camera to the skybox camera. This can be done on render with:
function render(){
    // <...>
    skyboxCamera.rotation.setEulerFromRotationMatrix(new THREE.Matrix4().extractRotation(camera.matrixWorld), skyboxCamera.eulerOrder);
    renderer.render(skyboxScene, skyboxCamera);
    renderer.render(scene, camera);
    // <...>
}
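One caveat, based on my assumption about what the elided parts contain: rendering two scenes back to back usually requires autoClear to be turned off and a single manual clear, otherwise the second render call wipes out the skybox pass:
renderer.autoClear = false; // clear manually instead of on every render() call
function render(){
    renderer.clear(); // clear colour and depth once per frame
    renderer.render(skyboxScene, skyboxCamera); // skybox first
    renderer.render(scene, camera); // main scene on top
}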
I know it's a closed question but I want to offer an alternative that does not require an additional scene, for future seekers:
start by reading and following this tutorial: http://learningthreejs.com/blog/2011/08/15/lets-do-a-sky/
now create the following shader (I added it to the three.js ShaderLib, but if you don't want to tamper with the three.js source code, add it outside):
'skybox': {
    uniforms: {
        "tCube": { type: "t", value: null },
        "tFlip": { type: "f", value: -1 }
    },
    vertexShader: [
        "varying vec3 vWorldPosition;",
        THREE.ShaderChunk["logdepthbuf_pars_vertex"],
        "void main() {",
        "    vec4 worldPosition = modelMatrix * vec4( position, 1.0 );",
        "    vWorldPosition = worldPosition.xyz;",
        "    gl_Position = projectionMatrix * modelViewMatrix * vec4( position + cameraPosition, 1.0 );",
        THREE.ShaderChunk["logdepthbuf_vertex"],
        "}"
    ].join("\n"),
    fragmentShader: [
        "uniform samplerCube tCube;",
        "uniform float tFlip;",
        "varying vec3 vWorldPosition;",
        THREE.ShaderChunk["logdepthbuf_pars_fragment"],
        "void main() {",
        "    gl_FragColor = textureCube( tCube, vec3( tFlip * vWorldPosition.x, vWorldPosition.yz ) );",
        THREE.ShaderChunk["logdepthbuf_fragment"],
        "}"
    ].join("\n")
},
create your skybox like this:
// urls is a list of textures to use
var cubemap = THREE.ImageUtils.loadTextureCube(urls);
cubemap.format = THREE.RGBFormat;

var shader = THREE.ShaderLib['skybox']; // init the skybox shader we created above
shader.uniforms['tCube'].value = cubemap; // apply textures to shader

// create shader material
var skyBoxMaterial = new THREE.ShaderMaterial({
    fragmentShader: shader.fragmentShader,
    vertexShader: shader.vertexShader,
    uniforms: shader.uniforms,
    depthWrite: false,
    side: THREE.BackSide
});

// create skybox mesh
var skybox = new THREE.Mesh(
    new THREE.CubeGeometry(1000, 1000, 1000),
    skyBoxMaterial
);

// THIS IS IMPORTANT! otherwise the skybox will get culled after you move the camera too far..
skybox.frustumCulled = false;
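Then add it to your main scene as usual; no second scene or camera is needed, since the vertex shader above re-centres the box on the camera every frame:
scene.add(skybox);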
I am trying to use a custom shader with Three.js. I tried to do it like in the many examples, but it doesn't work. My code is:
var vertex = "void main(){vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );gl_Position = projectionMatrix * mvPosition;}";
var fragment = "precision highp float;void main(void){gl_FragColor = vec4(0.0,1.0,0.0,1.0);}";
material = new THREE.ShaderMaterial({
    vertexShader: vertex,
    fragmentShader: fragment
});
var mesh = new THREE.Mesh(geometry, material);
…and everything is blank. But if I use this material:
material = new THREE.MeshBasicMaterial({ color: 0xff0000, wireframe: true });
…everything works perfectly. What's wrong?
I found the problem: I had to use:
renderer = new THREE.WebGLRenderer();
instead of:
renderer = new THREE.CanvasRenderer();
A ShaderMaterial contains GLSL code, which only WebGL can compile and run, so CanvasRenderer cannot render it.