Here is my fragment shader code:
uniform sampler2D texture;
precision mediump float;
varying vec2 vUv;

void main() {
    gl_FragColor = texture2D(texture, vUv);
    // gl_FragColor = texture2D(texture, vUv.xy); // also tried this
}
Here is the code for the uniforms:
uniforms = {
    "uTime": { type: "f", value: 0.0 },
    texture: { value: new THREE.TextureLoader().load("lol.png") },
}
Here is the texture I load:
And as a result, it's completely transparent. If I render anything else (a color, a circle, etc.), it works.
I have a plane geometry, and I'm creating a custom ShaderMaterial for it. It will receive some textures as uniforms. I'd like the textures to perfectly cover my plane (like the background-size: cover CSS property).
I managed to do it with a utility function when I used my textures with a MeshBasicMaterial:
cover( texture, aspect ) {
    // Scale the UVs about the texture center so the image covers the plane without stretching.
    var imageAspect = texture.image.width / texture.image.height;
    if ( aspect < imageAspect ) {
        texture.matrix.setUvTransform( 0, 0, aspect / imageAspect, 1, 0, 0.5, 0.5 );
    } else {
        texture.matrix.setUvTransform( 0, 0, 1, imageAspect / aspect, 0, 0.5, 0.5 );
    }
}
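For context, with a MeshBasicMaterial that helper is presumably used along these lines (a sketch with assumed names, assuming cover is callable as shown; the key point is that texture.matrixAutoUpdate has to be disabled so the hand-built matrix isn't overwritten by the renderer):
// Hypothetical usage sketch: planeWidth / planeHeight stand in for the plane's dimensions.
const texture = new THREE.TextureLoader().load( "image.jpg", ( tex ) => {
    tex.matrixAutoUpdate = false;            // keep the matrix set by cover()
    cover( tex, planeWidth / planeHeight );  // plane aspect ratio (width / height)
} );
const material = new THREE.MeshBasicMaterial( { map: texture } );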
But unfortunately, since I'm using a ShaderMaterial, my "cover" function no longer applies. Am I forced to do it inside my fragment shader? If so, how can I reproduce this behavior?
Here's my code:
const vertexShader = `
    precision highp float;
    uniform mat3 uUvTransform;
    varying vec2 vUv;

    void main() {
        vUv = ( uUvTransform * vec3( uv, 1 ) ).xy;
        gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    }`;

const fragmentShader = `
    precision highp float;
    uniform sampler2D uText1;
    varying vec2 vUv;

    void main() {
        vec2 xy = vUv;
        vec4 color = texture2D( uText1, xy );
        gl_FragColor = color;
    }`;
And here's my current result:
Thanks a lot!
You could simply use a custom uniform, e.g.:
uniform sampler2D uText1;
uniform vec2 uUvScale;
varying vec2 vUv;

void main() {
    vec2 uv = (vUv - 0.5) * uUvScale + 0.5;
    gl_FragColor = texture2D(uText1, uv);
}
And:
var imageAspect = texture.image.width / texture.image.height;
if ( aspect < imageAspect ) {
    material.uniforms.uUvScale.value.set( aspect / imageAspect, 1 );
} else {
    material.uniforms.uUvScale.value.set( 1, imageAspect / aspect );
}
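For completeness, the JavaScript side might be wired up roughly like this (a minimal sketch; the plane size, image path, and the identity uUvTransform for the question's vertex shader are assumptions):
const texture = new THREE.TextureLoader().load( 'cover.jpg', () => {
    // Re-run the cover logic from above once the image dimensions are known.
    const aspect = 4 / 3; // assumed plane aspect (width / height)
    const imageAspect = texture.image.width / texture.image.height;
    if ( aspect < imageAspect ) {
        material.uniforms.uUvScale.value.set( aspect / imageAspect, 1 );
    } else {
        material.uniforms.uUvScale.value.set( 1, imageAspect / aspect );
    }
} );

const material = new THREE.ShaderMaterial( {
    uniforms: {
        uText1: { value: texture },
        uUvScale: { value: new THREE.Vector2( 1, 1 ) },
        uUvTransform: { value: new THREE.Matrix3() } // identity, so the question's vertex shader passes uv through unchanged
    },
    vertexShader,                 // the question's vertex shader
    fragmentShader                // replaced by the uUvScale fragment shader shown above
} );

const plane = new THREE.Mesh( new THREE.PlaneGeometry( 4, 3 ), material );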
The way Three.js handles texture transformations like .offset, .repeat, .rotation, .center is via a Matrix3 that gets passed as a uniform into the vertex shader. The vertex shader performs the matrix multiplication, then passes the modified UVs as a varying to the fragment shader.
You can see that uniform being declared in the uv_pars_vertex.glsl.js file
You can see the transform being applied in the uv_vertex.glsl.js file
You could copy those lines of GLSL code to your ShaderMaterial's vertex shader, and I think the texture properties will come through in the Matrix3 automatically. However, if for some reason it doesn't, you could recreate the Matrix3 by copying it from the source and passing it as a uniform manually. I don't know what your utility function looks like, so it's hard to tell how you're achieving the desired scaling.
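In case the automatic route doesn't work, the manual version might look roughly like this (a sketch only, reusing the uUvTransform uniform name from the question's vertex shader; uvTransform is the name the built-in shaders use):
// Feed the texture's own UV matrix into the custom vertex shader's uniform.
texture.matrixAutoUpdate = false;
texture.updateMatrix(); // rebuilds texture.matrix from .offset/.repeat/.rotation/.center
                        // (skip this if the matrix is set directly via setUvTransform, as in cover())
material.uniforms.uUvTransform.value.copy( texture.matrix ); // assumes the uniform was declared at material creation
// The vertex shader then applies it exactly as in the question:
// vUv = ( uUvTransform * vec3( uv, 1 ) ).xy;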
I'm new to three.js and shaders in general.
I need to create a sphere of particles that move across its surface like waves, but that's not the problem. Right now I've got something like this.
And here is the result I need.
So, how do I render each point as a circle, or perhaps render a texture? Right now my fragment shader is:
uniform sampler2D texture;
uniform vec2 repeat;
uniform float uTime;
varying vec2 vOffset;
precision mediump float;
varying vec3 vColor;
varying vec2 vUv;

void main() {
    vec2 uv = vec2( gl_PointCoord.x, 1.0 - gl_PointCoord.y );
    vec4 tex = texture2D( texture, uv * 0.5 );
    gl_FragColor = vec4( vec3( 0.5, 0.8, 0.85 ), 0.8 );
}
Of course I tried to render gl_FragColor = tex, but it doesn't seem to work. My texture is just a particle sprite.
What do you mean gl_FragColor = tex didn't work?
If you want to use a texture, the code should be:
uniform sampler2D texture;

void main() {
    gl_FragColor = texture2D(texture, gl_PointCoord);
}
You should also turn on blending and set it up for premultiplied alpha, make sure your texture actually uses premultiplied alpha, and turn the depth test off.
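In three.js terms, that setup might look roughly like the sketch below. This is an assumption-laden illustration, not the poster's code: particleTexture, pointsVertexShader, and pointsFragmentShader are placeholder names, and newer GLSL versions reserve the word texture, so the uniform may need a different name there.
particleTexture.premultiplyAlpha = true;  // upload the sprite with premultiplied alpha
particleTexture.needsUpdate = true;

const pointsMaterial = new THREE.ShaderMaterial( {
    uniforms: { texture: { value: particleTexture } },
    vertexShader: pointsVertexShader,     // assumed vertex shader that also sets gl_PointSize
    fragmentShader: pointsFragmentShader, // the fragment shader shown above
    transparent: true,
    depthTest: false,                     // depth test off
    blending: THREE.CustomBlending,       // premultiplied alpha: src * 1 + dst * (1 - srcAlpha)
    blendSrc: THREE.OneFactor,
    blendDst: THREE.OneMinusSrcAlphaFactor
} );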
I have a very simple object with a texture drawn onto it (using a shader). Everything works great, except that when I rotate the object, the texture does not rotate with it, but appears to stay in 2D screen space, creating the 'mask' effect below:
Texture not rotating along with the object
When I use a regular material and attach the texture to it, everything works fine, so I'm guessing I'm doing something wrong in the vertex shader.
I load the model the following way:
var loader = new THREE.JSONLoader();
loader.load( "models/cube.json", addModelToScene );

var texture = THREE.ImageUtils.loadTexture( "images/woods.jpg" );
texture.wrapS = THREE.RepeatWrapping;
texture.wrapT = THREE.RepeatWrapping;
texture.repeat.set( 1.0, 1.0 );

uniforms = {
    time: { type: "f", value: 1.0 },
    texture1: { type: "t", value: THREE.ImageUtils.loadTexture( "images/woods.jpg" ) }
};
function addModelToScene( geometry, materials ) {
    var material = new THREE.MeshFaceMaterial( materials );
    var shaderMaterial = new THREE.ShaderMaterial( {
        vertexShader: $('#vertexshader').text(),
        fragmentShader: $('#fragmentshader').text(),
        uniforms: uniforms
    } );
    model = new THREE.Mesh( geometry, shaderMaterial );
    model.scale.set( 2.5, 2.5, 2.5 );
    scene.add( model );
}
Vertex shader:
varying vec2 vUv;
#ifdef GL_ES
precision highp float;
#endif
uniform float time;
uniform sampler2D texture1;

void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
Fragment shader:
varying vec2 vUv;
#ifdef GL_ES
precision highp float;
#endif
uniform sampler2D texture1;
uniform float time;

void main() {
    vec2 u_resolution = vec2( 1700, 1000 );
    vec2 uv = gl_FragCoord.xy / u_resolution.xy;
    gl_FragColor = texture2D( texture1, uv );
}
And finally I rotate the object the following way:
model.rotation.z += 0.00013;
model.rotation.z += 0.004;
Why doesn't the texture move as one with the object, but instead stays in a static position? Thanks!
That's because you should use the vUv varying to address the texture, not gl_FragCoord.xy:
gl_FragColor = texture2D(texture1, vUv);
gl_FragCoord.xy is just the pixel's (or, to be more accurate, the fragment's) coordinates on the screen (i.e., window coordinates). They don't depend on the rotation (or any transformation, for that matter) of your object, or on the object itself. They only depend on where the pixel currently being shaded lies on the screen.
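Putting the fix into the full shader, the corrected fragment shader might look like this (a minimal sketch of the change described above; the unused time uniform is kept as in the original):
varying vec2 vUv;
#ifdef GL_ES
precision highp float;
#endif
uniform sampler2D texture1;
uniform float time;

void main() {
    // Sample with the interpolated UVs so the texture follows the geometry.
    gl_FragColor = texture2D( texture1, vUv );
}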
I'm trying to draw to a texture, so that the renders keep getting compounded on top of each other.
I'm using two textures and two framebuffers.
texture[0] is attached to framebuffer[0], and texture[1] is attached to framebuffer[1].
var gl = webgl.context;
gl.useProgram(this.program);
gl.bindTexture(gl.TEXTURE_2D, this.texture[this.pingpong]);
this.pingpong = (this.pingpong == 0 ? 1 : 0);
var pp = this.pingpong;

gl.bindFramebuffer(gl.FRAMEBUFFER, this.frameBuffer[pp]);
gl.drawArrays(gl.TRIANGLES, 0, 6);               // primitiveType, offset, count

gl.bindTexture(gl.TEXTURE_2D, this.texture[pp]); // bind to the texture we just drew to
gl.bindFramebuffer(gl.FRAMEBUFFER, null);        // render the above texture to the canvas
gl.drawArrays(gl.TRIANGLES, 0, 6);
The issue is that I'm not seeing the previous renders get saved into the textures.
I thought that bindFramebuffer() would make it render to the texture.
My fragment shader:
precision mediump float;   // fragment shaders don't have a default precision, so we need to pick one; mediump is a good default
varying vec2 v_texCoord;   // the texCoords passed in from the vertex shader
uniform vec2 u_resolution;
uniform vec2 u_mouse;
uniform sampler2D u_image; // this isn't set, so it defaults to 0 (the current active texture)

void main() {
    vec4 texColor = texture2D(u_image, v_texCoord); // look up a color from the texture
    vec2 coord = vec2(gl_FragCoord.x, u_resolution.y - gl_FragCoord.y);
    // red disc of radius 100 around the mouse, the previous texture everywhere else
    vec4 color = step(distance(coord, u_mouse), 100.0) * vec4(1, 0, 0, 1)
               + step(100.0, distance(coord, u_mouse)) * texColor;
    gl_FragColor = color; // gl_FragColor is a special variable a fragment shader is responsible for setting
}
u_image is left at its default of 0, the currently active texture.
Is there something I'm overlooking? Why won't the previous renders get compounded on top of each other? It just shows the latest render, as if the textures haven't been altered.
Here is the vertex shader:
precision mediump float;
attribute vec2 a_position; // an attribute will receive data from a buffer
attribute vec2 a_texCoord;
varying vec2 v_texCoord;   // a varying
uniform vec2 u_resolution; // a uniform
uniform vec2 u_mouse;
uniform float u_flip;

// all shaders have a main function
void main() {
    v_texCoord = a_texCoord;                    // pass the texCoord to the fragment shader; the GPU will interpolate this value between points
    vec2 zeroToOne = a_position / u_resolution; // convert the position from pixels to 0.0 to 1.0
    vec2 zeroToTwo = zeroToOne * 2.0;           // convert from 0->1 to 0->2
    vec2 clipSpace = zeroToTwo - 1.0;           // convert from 0->2 to -1->+1 (clip space)
    // gl_Position is a special variable a vertex shader is responsible for setting
    gl_Position = vec4(clipSpace * vec2(1, u_flip), 0, 1);
}
I tried to emulate what you are doing:
loop.flexStep = function(){
    gl.clear(gl.COLOR_BUFFER_BIT);

    pingPong.pingPong().applyPass(  // pingPong() just swaps the FBOs and returns the FBO currently being drawn to
        "outColor += src0 * 0.98;",
        pingPong.otherTexture()     // the src0 texture
    );

    points.drawPoint([ mouse.x, mouse.y, 0 ]);
    points.renderAll(camera);

    screenBuffer.applyPass(
        "outColor = src0;",
        pingPong.resultFBO          // src0
    );
};
Here is what it looks like (gl.POINTS instead of a circle):
Here are the gl commands:
You should first check whether your framebuffers are set up correctly; since that part isn't shown, it's hard to say. A typical texture/framebuffer setup is sketched below.
Also consider separating your shaders: one shader in the ping-pong stage that copies/alters the result of the previous texture into the current ping-pong FBO texture, and another shader that draws the new content into the current ping-pong FBO texture.
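For reference, a minimal texture-plus-framebuffer pair might be created like this (a sketch only, with assumed names and sizes, not the poster's actual setup):
// Sketch: create one texture and attach it to one framebuffer.
// Call this twice to get the this.texture[i] / this.frameBuffer[i] pairs.
function createTextureAndFBO(gl, width, height) {
    var tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

    var fbo = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
    if (gl.checkFramebufferStatus(gl.FRAMEBUFFER) !== gl.FRAMEBUFFER_COMPLETE) {
        console.warn('Framebuffer is incomplete');
    }
    gl.bindFramebuffer(gl.FRAMEBUFFER, null);
    return { texture: tex, frameBuffer: fbo };
}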
The issue turned out to be that I was getting the wrong texture coordinates and also flipping the texture.
It's working as expected now.
Here are the updated vertex/fragment shaders:
this.vertex = `
    precision mediump float;
    attribute vec2 a_position; // an attribute will receive data from a buffer
    attribute vec2 a_texCoord;
    varying vec2 v_texCoord;   // a varying
    uniform vec2 u_resolution; // a uniform
    uniform vec2 u_mouse;
    uniform float u_flip;

    // all shaders have a main function
    void main() {
        v_texCoord = a_texCoord / u_resolution; // pass the texCoord to the fragment shader; the GPU will interpolate this value between points
        vec2 zeroToOne = a_position / u_resolution; // convert the position from pixels to 0.0 to 1.0
        vec2 zeroToTwo = zeroToOne * 2.0;           // convert from 0->1 to 0->2
        vec2 clipSpace = zeroToTwo - 1.0;           // convert from 0->2 to -1->+1 (clipspace)
        // gl_Position is a special variable a vertex shader is responsible for setting
        gl_Position = vec4(clipSpace * vec2(1, u_flip), 0, 1);
    }
`;
this.fragment = `
    precision mediump float;   // fragment shaders don't have a default precision, so we need to pick one; mediump is a good default
    varying vec2 v_texCoord;   // the texCoords passed in from the vertex shader
    uniform vec2 u_resolution; // a uniform
    uniform vec2 u_mouse;
    uniform float u_flip;
    uniform sampler2D u_image; // this isn't set, so it will default to 0 (the current active texture)
    uniform sampler2D u_texture0;
    uniform sampler2D u_texture1;

    void main() {
        vec4 texColor = texture2D(u_image, v_texCoord); // look up a color from the texture
        vec2 coord = vec2(gl_FragCoord.x, step(0.0, u_flip) * gl_FragCoord.y + step(u_flip, 0.0) * (u_resolution.y - gl_FragCoord.y));
        vec4 color = step(distance(coord, u_mouse), 100.0) * vec4(1, 0, 0, 1) + step(100.0, distance(coord, u_mouse)) * texColor;
        gl_FragColor = color; // gl_FragColor is a special variable a fragment shader is responsible for setting
    }
`;
I followed this post hoping to add my own spin on things. I noticed the example located here is using a very old revision of Three.js (r49). When I changed the source file to a more up-to-date version, the texture no longer appears. See the demo.
I've been spending a lot of time trying to figure out what deprecations occurred, and I've narrowed my search down to these lines.
// material
uniforms = {
    sunDirection: { type: "v3", value: new THREE.Vector3( 0, 1, 0 ) },
    dayTexture: { type: "t", value: 0, texture: THREE.ImageUtils.loadTexture( "/images/world2.png" ) },
    nightTexture: { type: "t", value: 1, texture: THREE.ImageUtils.loadTexture( "/images/world5.png" ) }
};
uniforms.dayTexture.texture.wrapS = uniforms.dayTexture.texture.wrapT = THREE.Repeat;
uniforms.nightTexture.texture.wrapS = uniforms.nightTexture.texture.wrapT = THREE.Repeat;

material = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: document.getElementById( 'vertexShader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentShader' ).textContent
} );
Here is more miscellaneous code that probably has something to do with my problem:
<script id="fragmentShader" type="x-shader/x-fragment">
    uniform sampler2D dayTexture;
    uniform sampler2D nightTexture;
    uniform vec3 sunDirection;
    varying vec2 vUv;
    varying vec3 vNormal;

    void main( void ) {
        vec3 dayColor = texture2D( dayTexture, vUv ).rgb;
        vec3 nightColor = texture2D( nightTexture, vUv ).rgb;

        // compute cosine sun to normal so -1 is away from sun and +1 is toward sun
        float cosineAngleSunToNormal = dot( normalize( vNormal ), sunDirection );

        // sharpen the edge of the transition
        cosineAngleSunToNormal = clamp( cosineAngleSunToNormal * 10.0, -1.0, 1.0 );

        // convert to 0 to 1 for mixing
        float mixAmount = cosineAngleSunToNormal * 0.5 + 0.5;

        // select day or night texture based on mix
        vec3 color = mix( nightColor, dayColor, mixAmount );
        gl_FragColor = vec4( color, 1.0 );
        //gl_FragColor = vec4( mixAmount, mixAmount, mixAmount, 1.0 );
    }
</script>
<script id="vertexShader" type="x-shader/x-vertex">
    varying vec2 vUv;
    varying vec3 vNormal;

    void main() {
        vUv = uv;
        vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
        vNormal = normalMatrix * normal;
        gl_Position = projectionMatrix * mvPosition;
    }
</script>
I've checked the migration docs here.
There isn't much on "uniforms" or "shaders", for that matter.
In the Migration Wiki you referenced, it specifies a new pattern for assigning textures to uniforms as of r.51:
{ type: "t", value: 0, texture: map } => { type: "t", value: map }
So in your case, it would be:
dayTexture: { type: "t", value: THREE.ImageUtils.loadTexture( "/images/world2.png" ) },
nightTexture: { type: "t", value: THREE.ImageUtils.loadTexture( "/images/world5.png" ) }
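Putting it together, the whole block would look roughly like this (a sketch for the r5x-era API used in the question; dayMap and nightMap are just placeholder variable names, and note that the wrapping lines then need to target the textures directly, using THREE.RepeatWrapping):
// Load the textures first, set their wrapping, then assign them as uniform values.
var dayMap = THREE.ImageUtils.loadTexture( "/images/world2.png" );
var nightMap = THREE.ImageUtils.loadTexture( "/images/world5.png" );
dayMap.wrapS = dayMap.wrapT = THREE.RepeatWrapping;
nightMap.wrapS = nightMap.wrapT = THREE.RepeatWrapping;

uniforms = {
    sunDirection: { type: "v3", value: new THREE.Vector3( 0, 1, 0 ) },
    dayTexture: { type: "t", value: dayMap },
    nightTexture: { type: "t", value: nightMap }
};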
three.js r.59