Scaling Video texture mixed with image texture - javascript

I've searched for this and it seems like a trivial thing to do, but I can't figure it out; I've looked at many answers, but none of them covered what I need with a full working example.
I am trying to scale a video texture in a fragment shader, with an image texture in the background, so I can't transform the vertices. The video texture needs to be resized to 200 x 200 and positioned in the top or bottom corner.
It will eventually use a MediaStream from the webcam, with a transparent-background effect. It already works with BodyPix on top of an image texture using the mix shader, but I need to scale the video so it doesn't stretch to the viewport.
If I try dividing by a vec2(200, 200), the CLAMP wrap mode stretches the rest of the colours and REPEAT tiles the video. Neither of these is the expected result.
A JSFiddle of the video render, which still needs the resizing:
https://jsfiddle.net/danrossi303/or3dk2q4/16/
The basic fragment shader I am working with:
precision mediump float;
uniform sampler2D background;
uniform sampler2D frame;
uniform float texWidth;
uniform float texHeight;
void main(void) {
vec2 texCoord = gl_FragCoord.xy / vec2(texWidth,texHeight);
vec4 texel0 = texture2D(background, texCoord);
vec4 texel1 = texture2D(frame, texCoord);
gl_FragColor = mix(texel0, texel1, 1.);
}

You need to scale the texture coordinates for the video:
vec2 frameuv = texCoord * vec2(texWidth, texHeight) / vec2(200.0, 200.0);
vec4 texel1 = texture2D(frame, frameuv);
Discard the frame (texel1) if frameuv.x or frameuv.y is greater than 1.0 (see step):
float w = step(frameuv.x, 1.0) * step(frameuv.y, 1.0);
gl_FragColor = mix(texel0, texel1, w);
Complete fragment shader:
precision mediump float;
uniform sampler2D background;
uniform sampler2D frame;
uniform float texWidth;
uniform float texHeight;
void main(void) {
vec2 texCoord = gl_FragCoord.xy / vec2(texWidth,texHeight);
vec4 texel0 = texture2D(background, texCoord);
vec2 frameuv = texCoord * vec2(texWidth, texHeight) / vec2(200.0, 200.0);
vec4 texel1 = texture2D(frame, frameuv);
gl_FragColor = mix(texel0, texel1, step(frameuv.x, 1.0) * step(frameuv.y, 1.0));
}
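The question also asks for the video to sit in a corner, which the shader above doesn't cover. One way is to subtract a pixel offset before scaling; the following is only a sketch, and frameOffset is a hypothetical uniform giving the lower-left corner of the video in pixels:
precision mediump float;
uniform sampler2D background;
uniform sampler2D frame;
uniform float texWidth;
uniform float texHeight;
uniform vec2 frameOffset; // hypothetical: lower-left corner of the 200 x 200 video, in pixels
void main(void) {
    vec2 texCoord = gl_FragCoord.xy / vec2(texWidth, texHeight);
    vec4 texel0 = texture2D(background, texCoord);
    // shift first, then scale the 200 x 200 region into [0, 1] for the video lookup
    vec2 frameuv = (gl_FragCoord.xy - frameOffset) / vec2(200.0, 200.0);
    vec4 texel1 = texture2D(frame, frameuv);
    // show the video only where frameuv lies inside [0, 1] on both axes
    float inside = step(0.0, frameuv.x) * step(0.0, frameuv.y)
                 * step(frameuv.x, 1.0) * step(frameuv.y, 1.0);
    gl_FragColor = mix(texel0, texel1, inside);
}
Setting frameOffset to (0, 0) keeps the video in the bottom-left corner (gl_FragCoord has its origin at the bottom left), while (texWidth - 200, texHeight - 200) puts it in the top right.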
Additionally, there is a bug when you load the background texture. gl.bindTexture binds a texture object to the currently active texture unit, so the texture unit has to be selected with gl.activeTexture before calling gl.bindTexture:
img.onload = () => {
gl.activeTexture(gl.TEXTURE0); // <---
gl.bindTexture(gl.TEXTURE_2D, texture);
initBackgroundTexture();
};
img.src = "https://videos.electroteque.org/textures/virtualbg.jpg";
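The same rule applies to the video texture: select its texture unit before binding and uploading each frame. A minimal sketch, assuming the video goes to unit 1 and the samplers are wired as background = 0 and frame = 1 (program and videoTexture are assumed names from the fiddle's setup):
gl.useProgram(program);
gl.uniform1i(gl.getUniformLocation(program, "background"), 0); // image texture on unit 0
gl.uniform1i(gl.getUniformLocation(program, "frame"), 1);      // video texture on unit 1

function updateVideoFrame(video) {
    gl.activeTexture(gl.TEXTURE1);               // select unit 1 first
    gl.bindTexture(gl.TEXTURE_2D, videoTexture); // then bind the video texture to it
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, video);
}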

Related

Three.js : Modify the UV of my texture inside a custom ShaderMaterial

I have a plane geometry and I'm creating a custom ShaderMaterial for it. It will receive some textures as uniforms. I'd like the textures to perfectly cover my plane (like the background-size: cover CSS property).
I managed to do it with a utility function when I used my textures with a MeshBasicMaterial:
cover( texture, aspect ) {
var imageAspect = texture.image.width / texture.image.height;
if ( aspect < imageAspect ) {
texture.matrix.setUvTransform( 0, 0, aspect / imageAspect, 1, 0, 0.5, 0.5 );
} else {
texture.matrix.setUvTransform( 0, 0, 1, imageAspect / aspect, 0, 0.5, 0.5 );
}
}
But unfortunately, since I'm using a ShaderMaterial, my "cover" function no longer applies. Am I forced to do it inside my fragment shader? If so, how can I reproduce this behavior?
Here's my code:
const vertexShader = `
precision highp float;
uniform mat3 uUvTransform;
varying vec2 vUv;
void main() {
vUv = ( uUvTransform * vec3( uv, 1 ) ).xy;
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}`;
const fragmentShader = `
precision highp float;
uniform sampler2D uText1;
varying vec2 vUv;
void main() {
vec2 xy = vUv;
vec4 color = texture2D(uText1,xy);
gl_FragColor = color;
}`;
And here's my current result:
Thanks a lot.
You could simply use a custom uniform, e.g.:
uniform sampler2D uText1;
uniform vec2 uUvScale;
varying vec2 vUv;
void main() {
vec2 uv = (vUv - 0.5) * uUvScale + 0.5;
gl_FragColor = texture2D(uText1, uv);
}
And:
var imageAspect = texture.image.width / texture.image.height;
if ( aspect < imageAspect ) {
material.uniforms.uUvScale.value.set(aspect / imageAspect, 1)
} else {
material.uniforms.uUvScale.value.set(1, imageAspect / aspect)
}
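For completeness, the JavaScript side might look roughly like this, where vertexShader and fragmentShader are the strings from the question and texture is whatever you load (a sketch, not a definitive implementation):
const material = new THREE.ShaderMaterial({
    uniforms: {
        uText1: { value: texture },
        uUvTransform: { value: new THREE.Matrix3() }, // identity; the scaling now happens in the fragment shader
        uUvScale: { value: new THREE.Vector2(1, 1) },
    },
    vertexShader,
    fragmentShader,
});
// call whenever the plane's aspect or the texture changes
function updateCover(texture, aspect) {
    const imageAspect = texture.image.width / texture.image.height;
    if (aspect < imageAspect) {
        material.uniforms.uUvScale.value.set(aspect / imageAspect, 1);
    } else {
        material.uniforms.uUvScale.value.set(1, imageAspect / aspect);
    }
}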
The way Three.js handles texture transformations like .offset, .repeat, .rotation, .center is via a Matrix3 that gets passed as a uniform into the vertex shader. The vertex shader performs the matrix multiplication, then passes the modified UVs as a varying to the fragment shader.
You can see that uniform being declared in the uv_pars_vertex.glsl.js file
You can see the transform being applied in the uv_vertex.glsl.js file
You could copy those lines of GLSL code into your ShaderMaterial's vertex shader, and the texture properties should come through in that Matrix3 automatically. However, if for some reason they don't, you can rebuild the Matrix3 yourself and pass it as a uniform manually; your cover() utility already produces exactly that matrix via setUvTransform.
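If you do end up passing it manually, the wiring with the question's own uUvTransform uniform might look roughly like this (a sketch, assuming the uniform is declared on the material as a THREE.Matrix3):
// the cover() helper from the question fills texture.matrix via setUvTransform
cover(texture, aspect);
// hand that same matrix to the uUvTransform uniform used in the vertex shader
material.uniforms.uUvTransform.value.copy(texture.matrix);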

Scale video texture with mask over image texture background

Referencing this previous question Scaling Video texture mixed with image texture
I have the webcam video texture partially scaling, with a mask provided by BodyPix data. However, I can't figure out how to remove the white background produced by mixing the two. I'm not sure where the extra alpha is coming from.
Updated fiddle: https://jsfiddle.net/danrossi303/q8gz5cun/10/
The fragment shader for this:
precision mediump float;
uniform sampler2D background;
uniform sampler2D frame;
uniform sampler2D mask;
uniform float texWidth;
uniform float texHeight;
void main(void) {
vec2 texCoord = gl_FragCoord.xy / vec2(texWidth,texHeight);
vec2 frameuv = texCoord * vec2(texWidth, texHeight) / vec2(200.0, 200.0);
vec4 texel0 = texture2D(background, texCoord);
vec4 frameTex = texture2D(frame, frameuv.xy);
vec4 maskTex = texture2D(mask, frameuv.xy);
vec4 texel1 = vec4(frameTex.rgb, maskTex.a * 255.);
gl_FragColor = mix(texture2D(background, texCoord), texel1, step(frameuv.x, 1.0) * step(frameuv.y, 1.0));
}
I need to position the scaled video in any corner of the canvas, and somehow provide a dynamic uniform to change the texture coordinates back to the viewport size. Currently the size is hardcoded to 200, but it should be able to revert.
An unscaled example with the masked BodyPix video over an image:
https://jsfiddle.net/danrossi303/q8gz5cun/6/
The alpha channel is in the range [0.0, 1.0] (like the color channels). Hence the multiplication by 255.0 is obviously wrong:
vec4 texel1 = vec4(frameTex.rgb, maskTex.a * 255.);
It has to be:
vec4 texel1 = vec4(frameTex.rgb, maskTex.a);
Mixing the textures should also depend on the mask:
gl_FragColor = mix(texture2D(background, texCoord), texel1,
step(frameuv.x, 1.0) * step(frameuv.y, 1.0) * maskTex.a);
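The question also mentions wanting a dynamic uniform instead of the hardcoded 200. That is not part of the answer above, but a simple sketch would be to replace the constant with a uniform (frameSize is an assumed name):
uniform vec2 frameSize;
vec2 frameuv = texCoord * vec2(texWidth, texHeight) / frameSize;
On the JavaScript side the uniform can then be switched between the preview size and the viewport size, e.g. gl.uniform2f(frameSizeLocation, 200, 200) for the corner preview and gl.uniform2f(frameSizeLocation, canvas.width, canvas.height) to revert to the full viewport (frameSizeLocation being obtained with gl.getUniformLocation).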

Apply alpha to texture at certain clipspace coordinates

This is not exactly the problem I have, but a simplified version of it. Say I have a single image displayed at full screen size. I want to modify the alpha of this image so that on the left half (horizontally) of the screen the alpha is 0.5 and on the right half it is 1. Just alpha 0.5 or 1, nothing in between.
Here is my (failed) code so far.
This is my JavaScript code to set up WebGL:
this.gl = canvas.getContext('webgl', {
alpha: false,
});
this.gl.enable(this.gl.BLEND);
this.gl.blendFunc(this.gl.SRC_ALPHA, this.gl.ONE_MINUS_SRC_ALPHA);
this.gl.clearColor(1, 0, 0, 0); // red to highlight alpha problem
This is my vertex shader code:
precision mediump float;
attribute vec2 coordinates;
attribute vec2 a_texcoord;
varying vec2 v_texcoord;
varying float alpha;
void main() {
gl_Position = vec4(coordinates.x, coordinates.y, 1.0, 1.0);
v_texcoord = a_texcoord;
if (coordinates.x <= 0) {
alpha = 0.5;
} else {
alpha = 1.0;
}
}
and my fragment shader is standard and simple:
precision mediump float;
varying vec2 v_texcoord;
varying float alpha;
uniform sampler2D u_texture;
void main() {
vec4 color = texture2D(u_texture, v_texcoord).rgba;
gl_FragColor = color;
}
and at draw time:
window.canvas.gl.clear(window.canvas.gl.COLOR_BUFFER_BIT);
window.canvas.gl.useProgram(this.drawProgram);
window.canvas.gl.bindBuffer(window.canvas.gl.ARRAY_BUFFER, this.vertexBuffer);
window.canvas.gl.enableVertexAttribArray(this.positionLocation);
window.canvas.gl.vertexAttribPointer(this.positionLocation, 2, window.canvas.gl.FLOAT, false, 0, 0);
window.canvas.gl.bindBuffer(window.canvas.gl.ARRAY_BUFFER, null);
window.canvas.gl.bindBuffer(window.canvas.gl.ARRAY_BUFFER, this.texcoordBuffer);
window.canvas.gl.enableVertexAttribArray(this.texcoordLocation);
window.canvas.gl.vertexAttribPointer(this.texcoordLocation, 2, window.canvas.gl.FLOAT, false, 0, 0);
window.canvas.gl.bindBuffer(window.canvas.gl.ARRAY_BUFFER, null);
window.canvas.gl.bindTexture(window.canvas.gl.TEXTURE_2D, this.texture);
window.canvas.gl.uniform1i(this.textureLocation, 0);
window.canvas.gl.drawArrays(window.canvas.gl.TRIANGLES, 0, 6);
With this code, I couldn't achieve what I want. First, the transparency does not start at the middle of the screen (clip space x = 0), but at a seemingly random location. Also, there is a gradual decline from alpha 1.0 to alpha 0.5, not just the two values 0.5 and 1.0 I hoped for. And I have no idea where this gradual transition comes from.
Obviously I am learning WebGL, so any pointer would be much appreciated. Any hint on how to solve the problem would be of great help to me. Thanks in advance!
the transparency does not start at the middle of the screen [..]
because alpha is evaluated per vertex and interpolated for the fragments.
You have to do the evaluation per fragment rather than per vertex.
Pass coordinates.x or gl_Position.x/gl_Position.w from the vertex shader to the fragment shader:
precision mediump float;
attribute vec2 coordinates;
attribute vec2 a_texcoord;
varying vec2 v_texcoord;
varying vec2 pos;
void main() {
gl_Position = vec4(coordinates.xy, 1.0, 1.0);
v_texcoord = a_texcoord;
pos = coordinates.xy;
}
Compute the alpha value in the fragment shader:
precision mediump float;
varying vec2 v_texcoord;
varying vec2 pos;
uniform sampler2D u_texture;
void main() {
vec4 color = texture2D(u_texture, v_texcoord).rgba;
float alpha = pos.x < 0.0 ? 0.5 : 1.0; // pos is in clip space, so the screen centre is at x = 0.0
gl_FragColor = vec4(color.rgb, color.a * alpha);
}
Note, the vertex shader is executed once for each vertex coordinate. The coordinates define the corners of the primitives. The fragment shader is executed for each fragment. The output parameters of the vertex shader are interpolated dependent on the position of the fragment on the primitive. The interpolated value is the input to the fragment shader.
If alpha is calculated in the vertex shader, then the alpha value at the vertices on the left is 0.5 and at the vertices on the right it is 1.0. The fragments in between get a smoothly interpolated value in the range [0.5, 1.0].
The same happens to the position when it is passed from the vertex shader to the fragment shader. But since alpha is calculated in the fragment shader, the alpha value of each fragment is either 0.5 or 1.0, depending on the interpolated value of pos.x.
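The same decision can also be written without a branch, using step() as in the other answers here; a small equivalent sketch for a split at clip-space x = 0.0:
// 0.5 on the left half (pos.x < 0.0), 1.0 on the right half
float alpha = mix(0.5, 1.0, step(0.0, pos.x));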

Three.js Points, BufferGeometry : rendering point as circle

I'm new to three.js and to shaders in general.
I need to create a sphere of particles which move on its surface like waves, but that's not the problem. Right now I've got something like this.
And here is the result I need.
So, how do I render each point as a circle, or maybe render a texture? Right now my fragment shader is:
uniform sampler2D texture;
uniform vec2 repeat;
uniform float uTime;
varying vec2 vOffset;
precision mediump float;
varying vec3 vColor;
varying vec2 vUv;
void main()
{
vec2 uv = vec2( gl_PointCoord.x, 1.0 - gl_PointCoord.y );
vec4 tex = texture2D( texture, uv * 0.5);
gl_FragColor = vec4(vec3(0.5, 0.8, 0.85), 0.8);
}
Of course I tried to render gl_FragColor = tex, but that doesn't seem to work. My texture is just a particle.
What do you mean gl_FragColor = tex didn't work?
If you want to use a texture, the code should be:
uniform sampler2D texture;
void main() {
gl_FragColor = texture2D(texture, gl_PointCoord);
}
You should probably also turn on blending, set it up for premultiplied alpha, make sure your texture is using premultiplied alpha, and turn the depth test off.
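If you would rather not use a texture at all, the points can also be shaped directly from gl_PointCoord. This is a common alternative rather than part of the answer above; a minimal sketch:
precision mediump float;
void main() {
    // gl_PointCoord runs from 0.0 to 1.0 across the point sprite
    float dist = length(gl_PointCoord - vec2(0.5));
    // throw away fragments outside a radius of 0.5, turning the square sprite into a disc
    if (dist > 0.5) discard;
    gl_FragColor = vec4(0.5, 0.8, 0.85, 0.8);
}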

How to alternate between two textures?

I'm trying to draw to a texture so that the renderings keep getting compounded on top of each other.
I'm using two textures and two framebuffers.
texture[0] is attached to framebuffer[0], and texture[1] is attached to framebuffer[1].
var gl = webgl.context;
gl.useProgram(this.program);
gl.bindTexture(gl.TEXTURE_2D, this.texture[this.pingpong]);
this.pingpong = (this.pingpong==0?1:0);
var pp = this.pingpong;
gl.bindFramebuffer(gl.FRAMEBUFFER, this.frameBuffer[pp]);
gl.drawArrays(gl.TRIANGLES, 0, 6); //primitiveType, offset, count
gl.bindTexture(gl.TEXTURE_2D, this.texture[pp]); //bind to the texture we just drew to
gl.bindFramebuffer(gl.FRAMEBUFFER, null); //render the above texture to the canvas
gl.drawArrays(gl.TRIANGLES, 0, 6);
The issue is that I'm not seeing the previous renders get saved into the textures.
I thought that bindFramebuffer() would make it render to the texture.
My fragment shader:
precision mediump float; // fragment shaders don't have a default precision so we need to pick one. mediump is a good default
varying vec2 v_texCoord; // the texCoords passed in from the vertex shader.
uniform vec2 u_resolution; // a uniform
uniform vec2 u_mouse;
uniform sampler2D u_image; // this isn't set, so it will default to 0 (the current active texture)
void main() {
vec4 texColor = texture2D(u_image, v_texCoord); // Look up a color from the texture.
vec2 coord = vec2(gl_FragCoord.x,u_resolution.y-gl_FragCoord.y);
vec4 color = step(distance(coord,u_mouse),100.0)*vec4(1,0,0,1) + step(100.0,distance(coord,u_mouse))*texColor;
gl_FragColor = color; // gl_FragColor is a special variable a fragment shader is responsible for setting
}
u_image is left at its default of 0, the currently active texture unit.
Is there something I'm overlooking? Why won't the previous renders get compounded on top of each other? It is just showing the latest render as if the textures haven't been altered.
Here is the vertex shader:
precision mediump float;
attribute vec2 a_position; // an attribute will receive data from a buffer
attribute vec2 a_texCoord;
varying vec2 v_texCoord; // a varying
uniform vec2 u_resolution; // a uniform
uniform vec2 u_mouse;
uniform float u_flip;
// all shaders have a main function
void main() {
v_texCoord = a_texCoord; // pass the texCoord to the fragment shader.The GPU will interpolate this value between points
vec2 zeroToOne = a_position / u_resolution; // convert the position from pixels to 0.0 to 1.0
vec2 zeroToTwo = zeroToOne * 2.0; // convert from 0->1 to 0->2
vec2 clipSpace = zeroToTwo - 1.0; // convert from 0->2 to -1->+1 (clipspace)
// gl_Position is a special variable a vertex shader is responsible for setting
gl_Position = vec4(clipSpace * vec2(1, u_flip), 0, 1);
}
I tried to emulate what you are doing:
loop.flexStep = function(){
gl.clear(gl.COLOR_BUFFER_BIT);
pingPong.pingPong().applyPass( //pingPong() just swaps the FBOs and return the current FBO being draw to
"outColor += src0 * 0.98;",
pingPong.otherTexture() // the src0 texture
);
points.drawPoint([mouse.x, mouse.y, 0 ]);
points.renderAll(camera);
screenBuffer.applyPass(
"outColor = src0;",
pingPong.resultFBO // src0
);
};
Here is what it looks like (gl.POINTS instead of circle):
Here are the gl commands:
You should first check whether your framebuffers are set up correctly. Since that part isn't shown, it's hard to say.
Also consider separating your shaders. You should have one shader in the ping-pong stage to copy/alter the result of the previous texture into the current ping-pong FBO texture, and another shader to draw the new content into the current ping-pong FBO texture.
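For reference, a typical setup for the two texture/framebuffer pairs, following the question's this.texture and this.frameBuffer arrays, looks roughly like this; it is only a sketch, and width and height are assumed to match your canvas:
var gl = webgl.context;
for (var i = 0; i < 2; i++) {
    var tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    // allocate an empty RGBA texture the size of the render target
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
    var fbo = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    // attach the texture as the framebuffer's color target
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
    if (gl.checkFramebufferStatus(gl.FRAMEBUFFER) !== gl.FRAMEBUFFER_COMPLETE) {
        console.warn("framebuffer " + i + " is incomplete");
    }
    this.texture[i] = tex;
    this.frameBuffer[i] = fbo;
}
gl.bindFramebuffer(gl.FRAMEBUFFER, null);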
The issue seemed to be that I was using the wrong texture coordinates and also flipping the texture.
It's working as expected now.
Here are the updated vertex/fragment shaders:
this.vertex = `
precision mediump float;
attribute vec2 a_position; // an attribute will receive data from a buffer
attribute vec2 a_texCoord;
varying vec2 v_texCoord; // a varying
uniform vec2 u_resolution; // a uniform
uniform vec2 u_mouse;
uniform float u_flip;
// all shaders have a main function
void main() {
v_texCoord = a_texCoord / u_resolution; // pass the texCoord to the fragment shader.The GPU will interpolate this value between points
vec2 zeroToOne = a_position / u_resolution; // convert the position from pixels to 0.0 to 1.0
vec2 zeroToTwo = zeroToOne * 2.0; // convert from 0->1 to 0->2
vec2 clipSpace = zeroToTwo - 1.0; // convert from 0->2 to -1->+1 (clipspace)
// gl_Position is a special variable a vertex shader is responsible for setting
gl_Position = vec4(clipSpace * vec2(1, u_flip), 0, 1);
}
`;
this.fragment = `
precision mediump float; // fragment shaders don't have a default precision so we need to pick one. mediump is a good default
varying vec2 v_texCoord; // the texCoords passed in from the vertex shader.
uniform vec2 u_resolution; // a uniform
uniform vec2 u_mouse;
uniform float u_flip;
uniform sampler2D u_image; // this isn't set, so it will default to 0 (the current active texture)
uniform sampler2D u_texture0;
uniform sampler2D u_texture1;
void main() {
vec4 texColor = texture2D(u_image, v_texCoord); // Look up a color from the texture.
vec2 coord = vec2(gl_FragCoord.x, step(0.0,u_flip)*gl_FragCoord.y + step(u_flip,0.0)*(u_resolution.y-gl_FragCoord.y) );
vec4 color = step(distance(coord,u_mouse),100.0)*vec4(1,0,0,1) + step(100.0,distance(coord,u_mouse))*texColor;
gl_FragColor = color; // gl_FragColor is a special variable a fragment shader is responsible for setting
}
`;
