THREE.js Render target texture will not draw in a different scene - javascript

So if any THREE.js pros can explain why I can't get a WebGLRenderTarget to be used as the texture for a plane in another scene, I'd be pretty happy.
How it works right now is I create a scene with a perspective camera which renders a simple plane. This happens in the Application object.
I also have a WaveMap object that uses another scene and an orthographic camera and using a fragment shader draws the cos(x) * sin(y) function on another quad that takes up the entire screen. I render this to a texture and then I create a material that uses this texture.
I then pass that Material to be used in the Application object to draw the texture on the first plane I mentioned.
The problem is that I can get this to work in the scene with the orthographic camera inside the WaveMap object, but not in the scene with the perspective camera in the Application object after passing the material over. :(
I've tried passing a simple material with a solid color, and that works, but when I pass a material that uses the WebGLRenderTarget as its texture, nothing shows up anymore.
https://github.com/ArminTaheri/rendertotexture-threejs

You need to clone any material or texture that you want to render with two different renderers.
var materialOnOther = originalMaterial.clone();
Prior to r72 you needed to force the image buffers for the textures to be updated like so:
materialOnOther.uniforms.exampleOfATexture.value.needsUpdate = true;

Related

Using Native WebGL API to assign a video texture to Sphere

I am using native WebGL and want to assign a video texture to a sphere. I have tried many, many times and failed. Is there a good way to do this using the native WebGL API alone, without any library or framework such as three.js (not even gl-matrix)? As I understand it, to map a video texture onto a sphere you must construct the sphere's vertex positions, texture coordinates, and vertex indices, and you must update the texture frame after frame. I did all of the above, but nothing displays on the screen. Can anyone help me, or show me a demo of this? Thank you.
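Since no code was posted, here is a hedged sketch of the geometry half of the problem: generating the positions, texture coordinates, and indices for a latitude/longitude sphere. The per-frame video upload is noted in comments; all names are illustrative.

```javascript
// Build positions, UVs, and triangle indices for a lat/long sphere.
// Rows have (lonBands + 1) vertices so the seam gets its own UV column.
function buildSphere(latBands, lonBands, radius) {
  const positions = [], uvs = [], indices = [];
  for (let lat = 0; lat <= latBands; lat++) {
    const theta = (lat * Math.PI) / latBands;
    const sinT = Math.sin(theta), cosT = Math.cos(theta);
    for (let lon = 0; lon <= lonBands; lon++) {
      const phi = (lon * 2 * Math.PI) / lonBands;
      positions.push(radius * Math.cos(phi) * sinT,
                     radius * cosT,
                     radius * Math.sin(phi) * sinT);
      uvs.push(lon / lonBands, 1 - lat / latBands);
    }
  }
  for (let lat = 0; lat < latBands; lat++) {
    for (let lon = 0; lon < lonBands; lon++) {
      const a = lat * (lonBands + 1) + lon; // two triangles per quad
      const b = a + lonBands + 1;
      indices.push(a, b, a + 1, b, b + 1, a + 1);
    }
  }
  return { positions, uvs, indices };
}

// Per frame, once the video element has data:
//   gl.bindTexture(gl.TEXTURE_2D, tex);
//   gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, video);
// Video frames are rarely power-of-two sized, so in WebGL 1 you must use
// CLAMP_TO_EDGE wrapping and non-mipmap (LINEAR) filtering, or the texture
// is treated as incomplete and samples as black.
```

The non-power-of-two texture rules are the most common cause of "nothing displays" with video textures, so that is the first thing worth checking.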

depth of field with webgl

I would like to simulate the "depth of field" effect in WebGL by
moving the camera on a circle: https://en.wikibooks.org/wiki/OpenGL_Programming/Depth_of_Field
In OpenGL I would use the accumulation buffer, but unfortunately WebGL has no such buffer.
Is it possible to use blending to simulate such an effect?
A simple way of simulating depth of field is:
1. Render the scene to a texture.
2. Blur that texture, rendering the result to another texture.
3. Mix the two textures (the in-focus scene texture and the blurred scene texture) using the depth information.
There's an example here. Click the tiny * and adjust the "dof" slider. Press d a few times to see the different textures.
You can also render the scene to several different framebuffers, then bind those framebuffers as textures and accumulate the color from all of them in one final post-processing gathering pass. So that is more or less the manual way of doing accumulation.
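The mixing step from the three-texture approach above can be sketched as a GLSL gather pass; the uniform names below are made up for illustration, and the same depth-to-blur ramp is written in plain JS alongside it:

```javascript
// Final gather pass: blend sharp and blurred scene textures per fragment,
// driven by how far the fragment's depth is from the focal depth.
const dofFragmentShader = `
  uniform sampler2D sharpTex;
  uniform sampler2D blurTex;
  uniform sampler2D depthTex;
  uniform float focusDepth;   // depth that should be in focus, 0..1
  uniform float focusRange;   // how quickly blur ramps up with distance
  varying vec2 vUv;
  void main() {
    float depth = texture2D(depthTex, vUv).r;
    float amount = clamp(abs(depth - focusDepth) / focusRange, 0.0, 1.0);
    gl_FragColor = mix(texture2D(sharpTex, vUv), texture2D(blurTex, vUv), amount);
  }
`;

// The same ramp in plain JS, e.g. for previewing what a "dof" slider does:
function blurAmount(depth, focusDepth, focusRange) {
  return Math.min(Math.max(Math.abs(depth - focusDepth) / focusRange, 0), 1);
}
```

Fragments at the focal depth get amount 0 (fully sharp) and fragments more than focusRange away get amount 1 (fully blurred), with a linear ramp in between.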

ThreeJS - Keep object in Camera view at all times

I am using threejs in a project where I need to keep a basic object in my scene just in front of the camera, whichever way the camera is pointed. I am using the Orbit Controls plugin for movement and want to move the object around the scene so it is always in the middle of the camera view (same distance away all the time).
I am a relative newcomer to threejs so am not sure how to approach this - thoughts would be appreciated!
Before rendering your scene, update the position of your object by placing it in front of the camera. You can use camera.position and camera.getWorldDirection(), then do the math to compute what the object's position should be.

Threejs draw different lines

We use three.js to draw some maps. On a map there should be some lines (borders, isolines, and so on). We know how to draw plain dashed lines, but I also want to draw lines with these kinds of patterns. What would be the best way to do that? Texturing?
UPD:
The map is going to be 3D (but the lines are not); you can rotate it, zoom it, etc.
It uses the WebGL renderer.
Showing you exactly what needs to be done to achieve all this would cover several tutorials by itself, so I'm going to link you to some.
You could start here: http://www.html5canvastutorials.com/tutorials/html5-canvas-element ... that will give you a basic page and a canvas to start drawing on, plus instructions for drawing various shapes. This is going to be the hardest part for you: it will take many different draw calls for these various shapes, and positioning them correctly on your map will probably be somewhat difficult. I'd suggest you create functions to draw each basic shape you need; you will need plenty of 2D math in those functions, which are perhaps worthy of SO questions by themselves.
To make dashed lines like the ones in your example picture, http://www.rgraph.net/blog/2013/january/html5-canvas-dashed-lines.html is useful. Anything more advanced than that you'll have to draw yourself, combining various shapes on the canvas.
Also note that you'll want a canvas whose width and height are powers of two, such as 512x512 or 1024x1024, before you use it as a texture.
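The linked article predates ctx.setLineDash, which modern browsers now support natively, so dashed or patterned strokes can be a small helper like this (the function is hypothetical, written for illustration):

```javascript
// Stroke one line segment with a dash pattern on a 2D canvas context.
// `pattern` is an array of on/off pixel lengths, e.g. [10, 5].
function strokePatternLine(ctx, pattern, x1, y1, x2, y2) {
  ctx.save();                // don't leak the dash setting to later drawing
  ctx.setLineDash(pattern);  // e.g. [10, 5] = 10px dash, 5px gap
  ctx.beginPath();
  ctx.moveTo(x1, y1);
  ctx.lineTo(x2, y2);
  ctx.stroke();
  ctx.restore();
}
```

More elaborate patterns (dot-dash, crosses along the line) still need to be composed from multiple such calls, but the dash machinery itself no longer has to be hand-rolled.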
Once you have your canvas rendered the way you want it to, applying it to an object in a three.js scene is pretty easy:
var texture = new THREE.Texture( canvas );
texture.needsUpdate = true; // required, or the canvas contents are never uploaded
var material = new THREE.MeshBasicMaterial({ map: texture });
var geometry = new THREE.PlaneGeometry( 10, 10 );
var plane = new THREE.Mesh( geometry, material );
scene.add( plane );
You'll have to use a more advanced material if you're not copying the map you want to draw on onto the canvas first, because only a limited number of textures can be applied to a single surface. But get your 2D drawing sorted first, as that is the hard part.

three.js Is it possible to make a mini-cam / mini-map - or second camera on same scene without viewport cropping

How do you make a mini-cam or mini-map using three.js?
Two camera views. One mini-map "mini-cam" - top right corner (see image below), and the main camera view should span the rest of the scene (the rest, without the border in image below).
N.B. This actually can't quite be done via the viewport method, as the scene shouldn't be cropped off by the x-dimensionality, but should stretch out.
I have created a working example of a scene together with a minimap at:
http://stemkoski.github.io/Three.js/Viewports-Minimap.html
I don't understand your N.B. about this not being possible with viewports. As you'll see in the code at the website posted above, if you use, for example, an orthographic camera positioned above the scene, you can set the left/right/top/bottom values so that the entire scene is included in your render. You just have to be careful about the width-to-height ratio so that your scene doesn't appear stretched.
On the other hand, if you're looking to go the render-to-texture route, I have a relevant example at http://stemkoski.github.io/Three.js/Camera-Texture.html but, as yaku mentioned in his response, you will have to update the position of the plane mesh so that it follows the camera position (offset by some amount), and its rotation so that it always faces the camera. Because this approach is more complex, I really recommend the viewport method.
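For reference, the viewport method boils down to two render calls per frame. A sketch, with the minimap size and its top-right placement as arbitrary choices:

```javascript
// Render the same scene twice: full-window with the main camera, then into a
// scissored square in the top-right corner with the overhead map camera.
function renderWithMinimap(renderer, scene, mainCamera, mapCamera, width, height) {
  renderer.setViewport(0, 0, width, height);
  renderer.setScissorTest(false);
  renderer.render(scene, mainCamera);

  const mapSize = Math.floor(width / 4);
  renderer.setViewport(width - mapSize, height - mapSize, mapSize, mapSize);
  renderer.setScissor(width - mapSize, height - mapSize, mapSize, mapSize);
  renderer.setScissorTest(true); // confine the minimap clear to its corner
  renderer.render(scene, mapCamera);
  renderer.setScissorTest(false);
}
```

With a square viewport like this, give the orthographic mapCamera equal horizontal and vertical extents so the scene isn't stretched, which is exactly the aspect-ratio caveat above.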
You can make a plane geometry (normal 3D object, and it actually doesn't need to be a plane, any shape goes) representing the minimap, and add it to camera so it follows the camera movement and is always visible.
Then use "Render to Texture" method to render the minimap and slap the texture to the plane "container".
There are render-to-texture examples around the net and on SO, as well as in the three.js examples folder; those should help get you started with that.
