I'm using threejs to display an object and OrbitControls to control movement of the scene with my mouse. My scene also includes a DirectionalLight.
When the scene first renders, the DirectionalLight lights my object the way I expect. However, when I rotate the scene with my mouse, the DirectionalLight source does not update and 'shine' the light from the new angle as I expect.
Can someone explain how I can update my scene so the light updates based on my rotation with OrbitControls?
If you want the relationship between the light direction and the camera view to remain constant, I suggest that you add the light to the CAMERA, not the SCENE.
So instead of, say, scene.add(dirLight), you use camera.add(dirLight). Note that the camera must then itself be added to the scene (scene.add(camera)) so the light is part of the scene graph.
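A minimal sketch of that idea (the helper name is my own; THREE stands for the loaded three.js module):

```javascript
// Make the DirectionalLight ride along with the camera so the lighting
// angle stays fixed relative to the view while OrbitControls rotates.
function attachHeadlight(THREE, scene, camera) {
  const dirLight = new THREE.DirectionalLight(0xffffff, 1);
  // Parented to the camera at its local origin, the light shines from the
  // camera position toward the light's target (the world origin by
  // default), i.e. a simple "headlight".
  camera.add(dirLight);
  // The camera must itself be in the scene graph for its children to render.
  scene.add(camera);
  return dirLight;
}
```

With this in place, no per-frame update is needed; the scene graph keeps the light camera-relative automatically.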
I have a mesh object above a floor, and based on a click I want to rotate that mesh so the clicked face lies flat on the floor. I'm able to get the click and select the mesh, but how can I get the face in order to figure out how to rotate it?
It's hard to say without a code example, but you're probably looking for the following:
1. Use Raycaster.intersectObject() to determine both the mesh and the face you're clicking on.
2. Determine the face's orientation from its normal vector. Note that you must call Geometry.computeFaceNormals() before using it if you build your mesh manually.
3. Rotate the mesh around the face center by an angle calculated from the face normal vector. The simplest approach is to rotate around a Vector3.
Hope this helps.
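Step 3 boils down to finding the rotation that takes the clicked face's normal to the downward direction. A sketch of that math in plain vector form (the function name is mine; it mirrors what THREE.Quaternion.setFromUnitVectors() computes):

```javascript
// Given the clicked face's unit normal, compute the quaternion that rotates
// the mesh so that face points straight down onto the floor.
// q = normalize( (cross(n, down), 1 + dot(n, down)) )
function faceToFloorQuaternion(normal, down = { x: 0, y: -1, z: 0 }) {
  const d = normal.x * down.x + normal.y * down.y + normal.z * down.z;
  // Cross product normal x down gives the rotation axis.
  const ax = normal.y * down.z - normal.z * down.y;
  const ay = normal.z * down.x - normal.x * down.z;
  const az = normal.x * down.y - normal.y * down.x;
  const w = 1 + d;
  // Caveat: if the normal is exactly opposite to `down`, w and the axis are
  // both zero; three.js handles that case by picking a perpendicular axis.
  const len = Math.hypot(ax, ay, az, w) || 1;
  return { x: ax / len, y: ay / len, z: az / len, w: w / len };
}
```

In practice you would just call mesh.quaternion.setFromUnitVectors(faceNormal, downVector) and translate the mesh so the face center sits on the floor.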
I am using threejs in a project where I need to keep a basic object in my scene just in front of the camera, whichever way the camera is pointed. I am using the Orbit Controls plugin for movement and want to move the object around the scene so it is always in the middle of the camera view (same distance away all the time).
I am a relative newcomer to threejs, so I'm not sure how to approach this - thoughts would be appreciated!
Before rendering your scene, you need to update the position of your object by placing it in front of the camera. For that you can use camera.position and camera.getWorldDirection(), then do the math to compute what the position of your object should be.
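The math itself is just "camera position plus view direction times distance". A sketch using the two values mentioned above (camera.position, and the already-normalized vector returned by camera.getWorldDirection()):

```javascript
// Compute where to place an object a fixed distance in front of the camera.
// cameraPos: camera.position; viewDir: camera.getWorldDirection() result.
function positionInFrontOfCamera(cameraPos, viewDir, distance) {
  return {
    x: cameraPos.x + viewDir.x * distance,
    y: cameraPos.y + viewDir.y * distance,
    z: cameraPos.z + viewDir.z * distance,
  };
}
```

In a three.js render loop the idiomatic form is obj.position.copy(camera.position).addScaledVector(camera.getWorldDirection(tmpVec), distance), run once per frame before renderer.render().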
If any THREE.js pros can work out why I can't get the WebGLRenderTarget to be used as a material for a plane in another scene, I'd be pretty happy.
How it works right now is I create a scene with a perspective camera which renders a simple plane. This happens in the Application object.
I also have a WaveMap object that uses another scene and an orthographic camera and using a fragment shader draws the cos(x) * sin(y) function on another quad that takes up the entire screen. I render this to a texture and then I create a material that uses this texture.
I then pass that Material to be used in the Application object to draw the texture on the first plane I mentioned.
The problem is that, for some reason, I can get this to work in the scene with the orthographic camera inside the WaveMap object, but not in the scene with the perspective camera in the Application object after passing the material over. :(
I've tried passing a simple material with a solid color, and that works, but when I pass over a material that uses a WebGLRenderTarget as a texture, nothing shows up anymore.
https://github.com/ArminTaheri/rendertotexture-threejs
You need to clone any material/texture that you want to render with two different renderers.
var materialOnOther = originalMaterial.clone();
Prior to r72 you needed to force the image buffers for the textures to be updated like so:
materialOnOther.uniforms.exampleOfATexture.value.needsUpdate = true;
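Putting both steps together, a small sketch (the helper name is mine, and exampleOfATexture stands in for whatever your shader's sampler uniform is actually called):

```javascript
// Clone the material so each renderer gets its own copy; GL resources such
// as render-target textures are not shared between WebGL contexts.
function materialForOtherRenderer(originalMaterial) {
  const materialOnOther = originalMaterial.clone();
  // Pre-r72 only: force the texture's image buffer to be re-uploaded.
  if (materialOnOther.uniforms && materialOnOther.uniforms.exampleOfATexture) {
    materialOnOther.uniforms.exampleOfATexture.value.needsUpdate = true;
  }
  return materialOnOther;
}
```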
I'm trying to make a sort of "lottery ball" simulator.
I'm having difficulty adding an object inside another object with Physijs.
Physics work when I add the sphere to the scene, but when I add the child spheres to the main "lottery" sphere, it loses the physics.
If anyone could help, that would be great.
Thanks
How do you make a mini-cam or mini-map using three.js?
Two camera views. One mini-map "mini-cam" - top right corner (see image below), and the main camera view should span the rest of the scene (the rest, without the border in image below).
N.B. This actually can't quite be done via the viewport method, as the scene shouldn't be cropped off in the x-dimension, but should stretch out.
I have created a working example of a scene together with a minimap at:
http://stemkoski.github.io/Three.js/Viewports-Minimap.html
I don't understand your N.B. about this not being possible using viewports. As you'll see in the code at the website posted above, if you use, for example, an orthographic camera positioned above the scene, you can set the left/right/top/bottom values so that the entire scene is included in your render. You just have to be careful about the width-to-height ratio so that your scene doesn't appear stretched.
On the other hand, if you're looking to go the render-to-texture route, I have a relevant example at http://stemkoski.github.io/Three.js/Camera-Texture.html. But as yaku mentioned in his response, you will have to update the position of the plane mesh so that it follows the camera position (offset by some amount), and also its rotation so that it always faces the camera. Due to the more complex nature of this approach, I really recommend the viewport method.
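A sketch of the viewport approach's render loop (the function name and the quarter-size corner are my own choices; setViewport/setScissor/setScissorTest are standard WebGLRenderer methods):

```javascript
// Render the main view full-screen, then the minimap in the top-right
// corner. The scissor test clips the second render (and its clear) to the
// corner rectangle so it doesn't wipe the main view.
function renderWithMinimap(renderer, scene, mainCamera, mapCamera, w, h) {
  const mapW = Math.floor(w / 4);
  const mapH = Math.floor(h / 4);
  // Main view spans the whole canvas.
  renderer.setScissorTest(false);
  renderer.setViewport(0, 0, w, h);
  renderer.render(scene, mainCamera);
  // Minimap in the top-right corner.
  renderer.setScissorTest(true);
  renderer.setScissor(w - mapW, h - mapH, mapW, mapH);
  renderer.setViewport(w - mapW, h - mapH, mapW, mapH);
  renderer.render(scene, mapCamera);
  renderer.setScissorTest(false);
}
```

For the minimap camera, an OrthographicCamera looking straight down works well; match its left/right/top/bottom to the minimap's aspect ratio to avoid stretching.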
You can make a plane geometry (a normal 3D object - it actually doesn't need to be a plane; any shape goes) representing the minimap, and add it to the camera so it follows the camera movement and is always visible.
Then use "Render to Texture" method to render the minimap and slap the texture to the plane "container".
There are render-to-texture examples around the net and on SO, as well as in the Three.js examples folder; those should help get you started.
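A sketch of that container setup (names and sizes are illustrative; the target.texture / setRenderTarget API matches recent three.js releases, while older versions passed the render target itself as the map):

```javascript
// Build a small plane parented to the main camera and textured with a
// render target, so a top-down render shows up as an always-visible minimap.
function makeMinimapPlane(THREE, camera, size = 256) {
  const target = new THREE.WebGLRenderTarget(size, size);
  const plane = new THREE.Mesh(
    new THREE.PlaneGeometry(0.3, 0.3),
    new THREE.MeshBasicMaterial({ map: target.texture })
  );
  // Up and to the right in camera space, just in front of the near plane.
  plane.position.set(0.5, 0.35, -1.5);
  camera.add(plane); // follows the camera, always faces it
  return { target, plane };
}
// Each frame: renderer.setRenderTarget(target);
//             renderer.render(scene, topDownCamera);
//             renderer.setRenderTarget(null);
//             renderer.render(scene, camera);
```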