I need to create a static light (invariant to camera movement), and I need to get the actual position of the light in the fragment shader.
What I am doing now:
scene = new THREE.Scene();
camera = new THREE.PerspectiveCamera(60, canvas.width() / canvas.height(), 1, 10000);
camera.position.z = 2000;
camera.lookAt(0, 0, 0);
var light = new THREE.SpotLight(0xFFFFFF, 1);
light.position.set(0.5, 0.5, 0.1).normalize();
camera.add(light);
....
var lambertShader = THREE.ShaderLib['lambert'];
uniformsVolume = THREE.UniformsUtils.clone(lambertShader.uniforms);
....
materialVolumeRendering = new THREE.ShaderMaterial({
    uniforms: uniformsVolume,
    vertexColors: THREE.VertexColors,
    vertexShader: vertVolumeRendering,
    fragmentShader: fragVolumeRendering,
    lights: true
});
....
scene.add(camera);
Then, in the fragment shader, I declare a uniform variable:
uniform vec3 spotLightPosition;
and compute the lighting for each voxel:
float dProd = max(0.0, dot(getGradient(posInCube), normalize(spotLightPosition - posInCube)));
voxelColored.rgb = voxelColored.rgb * dProd + voxelColored.rgb * 0.2;
The problem is that it doesn't work correctly. My idea is that I will move the object (in reality, the camera) while the light keeps shining from the same side, i.e. it stays static in the scene. At the moment the light is not static and behaves very strangely.
Any ideas?
Thanks a lot,
Tomáš
Try a PointLight instead; a SpotLight is a bit trickier to use.
To make your light static in position, don't add it to your camera, but instead add it to the scene:
scene.add(light);
To find the position, reference the variable you used for the light:
light.position
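You can then pass that world-space position into your shader as the `spotLightPosition` uniform each frame. For reference, the diffuse term your fragment shader computes is plain Lambert shading; here is a sketch of the same math in JavaScript (the vector helpers are mine, for illustration only):

```javascript
// Minimal vector helpers (illustration only, not part of three.js).
function dot(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }
function sub(a, b) { return [a[0] - b[0], a[1] - b[1], a[2] - b[2]]; }
function normalize(v) {
  var len = Math.sqrt(dot(v, v));
  return [v[0] / len, v[1] / len, v[2] / len];
}

// Lambert diffuse term, mirroring the fragment shader line:
//   dProd = max(0.0, dot(normal, normalize(lightPos - pos)))
function lambert(normal, lightPos, pos) {
  return Math.max(0.0, dot(normal, normalize(sub(lightPos, pos))));
}
```

A surface facing the light gets a factor of 1, a surface facing away gets 0, which is why the shader also adds the `* 0.2` ambient term so back-facing voxels are not completely black.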
The problem: I have a point cloud with quite a lot of data points (around one million). When I apply transparency to the rendered points, the transparency somehow does not show what is behind the rendered points.
As you can see in the example of the marked point, it does not show what it should, it is as if there is a problem with the buffering.
I use three.js to create a point cloud using the following "setup":
The renderer:
this.renderer = new THREE.WebGLRenderer({
canvas: this.canvas,
antialias: true
});
The material:
this.pointMaterial = new THREE.ShaderMaterial( {
uniforms: {
time: { type: "f", value: 1.0 }
},
vertexShader: document.getElementById('vertexShader').textContent,
fragmentShader: document.getElementById('fragmentShader').textContent,
transparent: true
});
The vertex shader:
attribute float size;
attribute float opacity;
attribute vec3 color;
varying vec3 vColor;
varying float vOpacity;
void main() {
vColor = color;
vOpacity = opacity;
vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
gl_PointSize = size * (500.0 / length(mvPosition.xyz));
gl_Position = projectionMatrix * mvPosition;
}
The fragment shader:
uniform float time;
varying vec3 vColor;
varying float vOpacity;
void main() {
gl_FragColor = vec4(vColor, vOpacity);
}
The geometry (where I left out the part where I populate the arrays):
var bufferGeometry = new THREE.BufferGeometry();
var vertices = new Float32Array(vertexPositions.length * 3);
var colors = new Float32Array(vertexColors.length * 3);
var sizes = new Float32Array(vertexSizes.length);
var opacities = new Float32Array(vertexOpacities.length);
bufferGeometry.addAttribute('position', new THREE.BufferAttribute(vertices, 3));
bufferGeometry.addAttribute('color', new THREE.BufferAttribute(colors, 3));
bufferGeometry.addAttribute('size', new THREE.BufferAttribute(sizes, 1));
bufferGeometry.addAttribute('opacity', new THREE.BufferAttribute(opacities, 1));
this.points = new THREE.Points(bufferGeometry, this.pointMaterial);
this.scene.add(this.points);
I tried this with the built-in points material, where the same thing happens:
this.pointMaterial = new THREE.PointsMaterial({
size: this.pointSize,
vertexColors: THREE.VertexColors,
transparent: true,
opacity: 0.25
});
Is this a bug, expected behaviour, or am I doing something wrong?
The way the alpha blending equation works is that the incoming (source) colour is blended over the colour already in the framebuffer (destination). This means you need to render your transparent geometry in sorted order from back to front, so that geometry in front correctly blends with the geometry behind it.
If all you have is transparent geometry, then you can just disable depth testing, render in reverse depth-sorted order, and it will work. If you have opaque geometry as well, then you need to first render all opaque geometry normally, then disable depth writing (not testing) and render the transparent geometry in reverse depth-sorted order, then re-enable depth writing.
Here are some answers to similar questions if you're interested in learning a bit more.
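The reverse depth sort can be sketched like this (plain JavaScript; the point and camera objects are hypothetical stand-ins for your per-point data):

```javascript
// Sort transparent points back-to-front: farthest from the camera first,
// so nearer points blend over the ones already drawn behind them.
function sortBackToFront(points, cameraPos) {
  function dist2(p) {
    var dx = p.x - cameraPos.x;
    var dy = p.y - cameraPos.y;
    var dz = p.z - cameraPos.z;
    return dx * dx + dy * dy + dz * dz; // squared distance avoids a sqrt
  }
  // slice() keeps the original array untouched
  return points.slice().sort(function (a, b) { return dist2(b) - dist2(a); });
}
```

For a three.js point cloud this would mean reordering the BufferGeometry attributes every time the camera moves, which is expensive for a million points; in practice, setting `depthWrite: false` on the material is often an acceptable compromise.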
I really hope you can help me with this question, as it has confused me for some time.
I have a three.js context.
There I create a custom material and let it render into a texture.
/* Texture render environment */
fbo_renderer_scene = new THREE.Scene();
fbo_renderer_ripple_update_scene = new THREE.Scene();
var fbo_texture_light = new THREE.DirectionalLight(0xFFFFFF, 1.5);
fbo_texture_light.position.set(0.0, 0.0, -1.0).normalize();
fbo_renderer_scene.add(fbo_texture_light);
fbo_renderer_ripple_update_scene.add(fbo_texture_light);
ripple_texture = new THREE.WebGLRenderTarget(width, height, render_target_params);
ripple_update_texture = new THREE.WebGLRenderTarget(width, height, render_target_params);
ripple_material = new THREE.ShaderMaterial(
{
uniforms:
{
texture1: { type: "t", value: bottom_plane_texture},
},
vertexShader: document.getElementById('drop_vert_shader').textContent,
fragmentShader: document.getElementById('drop_frag_shader').textContent,
depthWrite: false
});
fbo_renderer_camera = new THREE.OrthographicCamera(width / -2.0, width / 2.0, height / 2.0, height / -2.0, -10000, 10000);
var texture_mesh = new THREE.Mesh(bottom_plane_geometry, ripple_material);
fbo_renderer_scene.add(texture_mesh);
renderer.render(fbo_renderer_scene, fbo_renderer_camera, ripple_texture, true);
OK, fine: now I have everything in ripple_texture.
Now there is another shader which should update the ripple shape whenever Animate() gets called, and of course it should render the result back into ripple_texture.
But every time I try to do this, Chromium reports:
[WebGLRenderingContext-0x3d022469aa80] GL ERROR: GL_INVALID_OPERATION: glDrawElements: Source and destination textures of the draw are the same.
Firefox just shows a black canvas.
I guess this might be a timing issue, but I'm not sure, and I have no clue how to work around it, since three.js has no callback for render.
It's telling you what's wrong:
[WebGLRenderingContext-0x3d022469aa80] GL ERROR: GL_INVALID_OPERATION: glDrawElements: Source and destination textures of the draw are the same.
You can't read and render to the same texture at once because your render operations will trample the texture data you're reading from during rendering, giving you garbage output.
You'll need to have another texture that you render into that isn't the same as the one you're reading from.
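A common way to do this is "ping-pong" rendering: keep two render targets and alternate which one you write to while reading from the other. A minimal sketch of the bookkeeping (the function name and target objects are mine; in your code the targets would be the two WebGLRenderTargets you already create):

```javascript
// Hold two render targets; read from one while writing to the other,
// then swap their roles after each render pass.
function makePingPong(targetA, targetB) {
  var read = targetA;
  var write = targetB;
  return {
    get read()  { return read; },
    get write() { return write; },
    swap: function () { var t = read; read = write; write = t; }
  };
}
```

Each frame you would feed the `read` target into the shader's texture uniform, render into the `write` target, and then call `swap()`, so the shader never samples the texture it is currently rendering into.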
I have a sphere and light source (basically sun and earth). On the sphere, I'm using a greyscale heightmap for terrain texture so I am using three.js's ShaderTerrain.js. I'm also using a directional light source. My code:
scene = new THREE.Scene();
camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
lightCamera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
controls = new THREE.OrbitControls( camera, renderer.domElement );
lightControls = new THREE.TrackballControls(lightCamera, renderer.domElement);
light = new THREE.DirectionalLight(0xffffff, 1.5);
light.position.set(0,0,20);
lightCamera.position.z = 225;
lightCamera.add(light);
camera.add(lightCamera);
scene.add(camera);
var terrainShader = THREE.ShaderTerrain[ "terrain" ];
uniformsTerrain = THREE.UniformsUtils.clone(terrainShader.uniforms);
// displacement is heightmap (greyscale image)
uniformsTerrain[ "tDisplacement" ].value = THREE.ImageUtils.loadTexture('heightmap.jpg');
uniformsTerrain[ "uDisplacementScale" ].value = 15;
// diffuse is texture overlay
uniformsTerrain[ "tDiffuse1" ].value = THREE.ImageUtils.loadTexture('earth.jpg');
uniformsTerrain[ "enableDiffuse1" ].value = true;
var material = new THREE.ShaderMaterial({
uniforms: uniformsTerrain,
vertexShader: terrainShader.vertexShader,
fragmentShader: terrainShader.fragmentShader,
lights: true,
fog: false
});
var geometry = new THREE.SphereGeometry(100,100,100);
geometry.computeTangents();
scene.add(new THREE.Mesh(geometry, material));
With this code, the sphere is created just fine with the raised texture and everything.
The problem is that when I rotate the sphere, the light source doesn't appear to stay fixed. It rotates with the sphere and you end up with dark spots (rather than the light always coming from the front and keeping what the users sees illuminated).
If I create a simple sphere instead, like so:
geometry = new THREE.SphereGeometry(100,100,100);
material = new THREE.MeshLambertMaterial({color: 0x00ee00, wireframe: true, transparent: true, needsUpdate: true});
sphere = new THREE.Mesh(geometry, material);
scene.add(sphere);
Everything works perfectly. The light source stays fixed while the sphere/camera rotates.
I've also tried code that simply rotates the (first) sphere (the one using the ShaderMaterial) and not the camera (attaching a function to the mousemove event and simply setting sphere.rotation.x/y from the mouse position). This doesn't work either; when the sphere rotates, there are still shadows that appear to the user.
Not sure what I'm missing here.
Here's a jsfiddle: http://jsfiddle.net/Z5sS5/1/. Left click/drag to spin everything (camera + light), right click/drag to spin only the light. To see it working, keep basicSphere() uncommented. To see it not working, comment basicSphere() and uncomment terrainSphere().
I am rendering a sphere via three.js and when I apply a texture it works just fine.
However, the equation I'm using to make markers isn't something I can play around with.
How can I rotate a texture on a sphere so that I can align the image according to the marker positions? specifically in the x direction.
Problem: the markers should be over Kagoshima, Japan and Hong Kong, China.
Should be: (and no, I did not solve it; this is what it should look like, not how it is now.)
var geometry = new THREE.SphereGeometry(200, 40, 30);
shader = Shaders['earth'];
uniforms = THREE.UniformsUtils.clone(shader.uniforms);
uniforms['texture'].texture = THREE.ImageUtils.loadTexture('/images/world5.png');
texture.wrapS = THREE.RepeatWrapping; // This causes globe not to load
texture.offset.x = radians / ( 2 * Math.PI ); // causes globe not to load
material = new THREE.MeshShaderMaterial({
uniforms: uniforms,
vertexShader: shader.vertexShader,
fragmentShader: shader.fragmentShader
});
mesh = new THREE.Mesh(geometry, material);
mesh.matrixAutoUpdate = false;
scene.addObject(mesh);
To shift the texture by a certain number of radians of longitude, use this pattern:
texture.wrapS = THREE.RepeatWrapping; // You do not need to set `.wrapT` in this case
texture.offset.x = radians / ( 2 * Math.PI );
three.js r.58
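The offset arithmetic is just mapping a longitude shift in radians to the [0, 1) texture-coordinate range; here is a tiny helper for it (the function name is mine, not a three.js API):

```javascript
// Map a longitude shift in radians to a wrapS texture offset in [0, 1).
// Works for negative shifts too, thanks to the floor-based wrap.
function longitudeToOffsetX(radians) {
  var offset = radians / (2 * Math.PI);
  return offset - Math.floor(offset); // wrap into [0, 1)
}
```

With `texture.wrapS = THREE.RepeatWrapping` set, assigning the result to `texture.offset.x` rotates the image around the sphere without touching the geometry.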
To shift the texture, have a look at this question; this is another solution that worked for me.
I am trying to use a custom shader with three.js. I tried to do it like in the many examples, but it doesn't work. My code is:
var vertex = "void main(){vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );gl_Position = projectionMatrix * mvPosition;}";
var fragment = "precision highp float;void main(void){gl_FragColor = vec4(0.0,1.0,0.0,1.0);}";
material = new THREE.ShaderMaterial({
vertexShader: vertex,
fragmentShader: fragment
});
var mesh = new THREE.Mesh(geometry,material);
…and everything is blank. But if I use this material :
material = new THREE.MeshBasicMaterial({ color: 0xff0000, wireframe: true });
…everything works perfectly. What's wrong?
I found the problem: I had to use:
renderer = new THREE.WebGLRenderer();
instead of:
renderer = new THREE.CanvasRenderer();