I started learning WebGL a couple of weeks ago, and since I am trying to learn by practice, I stumbled upon a simple example of a shader that I could implement using p5.js.
In this example, I am creating concentric circles starting from the center of the screen, using the fragment shader below, where u_resolution and u_time are uniforms passed down from the p5 sketch as [windowWidth, windowHeight] and frameCount respectively:
void main(void) {
    float maxAxis = max(u_resolution.x, u_resolution.y);
    vec2 uv = gl_FragCoord.xy / maxAxis;
    vec2 center = u_resolution / maxAxis;
    gl_FragColor = vec4(
        vec3(sin(u_time + distance(uv, center) * 255.0)),
        1.0);
}
Using this example, I can achieve what I want, but I cannot understand why I can't compute the center of the canvas with the formula:
vec2 center = vec2(u_resolution.x * 0.5, u_resolution.y * 0.5);
If I do this, then it will mess up the whole rendering.
Is there a coordinate system mismatch that I am missing, or something else?
For a better explanation, I included a snippet of the original experiment that I am doing in CodePen right here.
uv and center are both divided by maxAxis, so they are in the range [0.0, 1.0]. Therefore the center of the viewport is not the pixel coordinate

vec2 center = vec2(u_resolution.x * 0.5, u_resolution.y * 0.5);

but that same point scaled into normalized space:

vec2 center = 0.5 * u_resolution / maxAxis;
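Equivalently, you can keep both points in pixel units and normalize only the distance; a minimal sketch of that variant (not part of the original snippet, same uniforms as above):

void main(void) {
    float maxAxis = max(u_resolution.x, u_resolution.y);
    vec2 center = 0.5 * u_resolution;                      // center in pixels, matching gl_FragCoord.xy
    float d = distance(gl_FragCoord.xy, center) / maxAxis; // normalize the distance instead of the coordinates
    gl_FragColor = vec4(vec3(sin(u_time + d * 255.0)), 1.0);
}

Both versions produce the same image, because distance(a, b) / m equals distance(a / m, b / m) for any positive m; the rendering breaks only when center is left in pixel units while uv is normalized.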
let myShaderIn, myShaderOut;
let isPlaying = true;

const vertexShader = document.getElementById("vert-shader").textContent;
const fragmentShaderStyleIn = document.getElementById("frag-shader-style-in")
  .textContent;

function setup() {
  const canvas = createCanvas(windowWidth, windowHeight, WEBGL);
  // canvas.mousePressed(toggleSound);
  rectMode(CENTER);
  // shaders
  myShaderIn = createShader(vertexShader, fragmentShaderStyleIn);
  // register shaders
  shader(myShaderIn);
  // shapes setup
  noStroke();
}

function draw() {
  background(0);
  drawEllipse();
}

function drawEllipse() {
  myShaderIn.setUniform("u_resolution", [float(width), float(height)]);
  myShaderIn.setUniform("u_time", float(frameCount));
  shader(myShaderIn);
  ellipse(0, 0, width / 2);
}

function windowResized() {
  resizeCanvas(windowWidth, windowHeight);
  clear();
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.3.1/p5.min.js"></script>
<!-- vertex shader -->
<script type="x-shader/x-vertex" id="vert-shader">
#ifdef GL_FRAGMENT_PRECISION_HIGH
precision highp float;
#else
precision mediump float;
#endif
attribute vec3 aPosition;
uniform mat4 uProjectionMatrix;
uniform mat4 uModelViewMatrix;
void main() {
vec4 newPosition = vec4(aPosition, 1.0);
gl_Position = uProjectionMatrix * uModelViewMatrix * newPosition;
}
</script>
<!-- fragment shader -->
<script type="x-shader/x-fragment" id="frag-shader-style-in">
#ifdef GL_FRAGMENT_PRECISION_HIGH
precision highp float;
#else
precision mediump float;
#endif
uniform vec2 u_resolution; // canvas size (width, height)
uniform float u_time; // time in seconds since load
void main(void) {
float maxAxis = max(u_resolution.x, u_resolution.y);
// If you want to map the pixel coordinate values to the range 0 to 1, you divide by resolution.
/*With vec4 gl_FragCoord, we know where a thread is working inside the billboard.
In this case we don't call it uniform because it will be different from thread to thread,
instead gl_FragCoord is called a varying. */
vec2 uv = gl_FragCoord.xy / maxAxis;
vec2 center = 0.5 * u_resolution / maxAxis;
gl_FragColor = vec4(
vec3(sin(u_time * 0.1 + distance(uv, center) * 255.0)),
1.0);
}
</script>
I have a plane geometry, and I'm creating a custom ShaderMaterial for it. It will receive some textures as uniforms. I'd like the textures to perfectly cover my plane (like the background-size: cover CSS property).
I managed to do it with a utility function when I used my textures with a MeshBasicMaterial:
cover( texture, aspect ) {
  var imageAspect = texture.image.width / texture.image.height;
  if ( aspect < imageAspect ) {
    texture.matrix.setUvTransform( 0, 0, aspect / imageAspect, 1, 0, 0.5, 0.5 );
  } else {
    texture.matrix.setUvTransform( 0, 0, 1, imageAspect / aspect, 0, 0.5, 0.5 );
  }
}
But unfortunately, since I'm using a ShaderMaterial, my cover function no longer applies. Am I forced to do it inside my fragment shader? If so, how can I reproduce this behavior?
Here's my code :
const vertexShader = `
  precision highp float;

  uniform mat3 uUvTransform;
  varying vec2 vUv;

  void main() {
    vUv = ( uUvTransform * vec3( uv, 1 ) ).xy;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
  }`;

const fragmentShader = `
  precision highp float;

  uniform sampler2D uText1;
  varying vec2 vUv;

  void main() {
    vec2 xy = vUv;
    vec4 color = texture2D(uText1, xy);
    gl_FragColor = color;
  }`;
And here's my current result:
Thanks a lot
You could simply use a custom uniform, e.g.:
uniform sampler2D uText1;
uniform vec2 uUvScale;
varying vec2 vUv;

void main() {
  vec2 uv = (vUv - 0.5) * uUvScale + 0.5;
  gl_FragColor = texture2D(uText1, uv);
}
And:
var imageAspect = texture.image.width / texture.image.height;
if ( aspect < imageAspect ) {
  material.uniforms.uUvScale.value.set( aspect / imageAspect, 1 );
} else {
  material.uniforms.uUvScale.value.set( 1, imageAspect / aspect );
}
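For completeness, a minimal sketch of wiring that uniform into the material; the uniform names match the shaders above, while planeWidth and planeHeight are assumed dimensions of your plane:

const material = new THREE.ShaderMaterial({
  vertexShader,    // a simple pass-through of uv into vUv
  fragmentShader,  // the fragment shader above
  uniforms: {
    uText1: { value: texture },
    uUvScale: { value: new THREE.Vector2( 1, 1 ) }
  }
});

// aspect is the width/height ratio of the plane the texture should cover
const aspect = planeWidth / planeHeight;
const imageAspect = texture.image.width / texture.image.height;
if ( aspect < imageAspect ) {
  material.uniforms.uUvScale.value.set( aspect / imageAspect, 1 );
} else {
  material.uniforms.uUvScale.value.set( 1, imageAspect / aspect );
}

Scaling around 0.5 in the shader keeps the image centered, mirroring the 0.5, 0.5 center arguments of setUvTransform in the original cover function.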
The way Three.js handles texture transformations like .offset, .repeat, .rotation, .center is via a Matrix3 that gets passed as a uniform into the vertex shader. The vertex shader performs the matrix multiplication, then passes the modified UVs as a varying to the fragment shader.
You can see that uniform being declared in the uv_pars_vertex.glsl.js file
You can see the transform being applied in the uv_vertex.glsl.js file
You could copy those lines of GLSL code into your ShaderMaterial's vertex shader, and I think the texture properties will come through in the Matrix3 automatically. If for some reason they don't, you can recreate the Matrix3 from the source and pass it as a uniform manually.
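Roughly, those two chunks amount to the following vertex-shader lines; a sketch based on recent revisions (check the files for your exact revision, where the uniform is named uvTransform):

uniform mat3 uvTransform; // declared in uv_pars_vertex.glsl.js
varying vec2 vUv;

void main() {
  vUv = ( uvTransform * vec3( uv, 1 ) ).xy; // applied in uv_vertex.glsl.js
  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}

If the transform does not come through automatically, you can update texture.matrix with texture.updateMatrix() and pass it yourself as the uvTransform uniform.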
I've tried searching, and it seems like a trivial thing to do, but I can't figure it out; I looked at many answers, but none addressed what I need with a full working example.
I am trying to scale a video texture in a fragment shader, with an image texture in the background, so I can't transform the vertices. The video texture needs to be resized to 200 x 200 and positioned in the top or bottom corner.
It will eventually use a MediaStream for the webcam, with a transparent-background effect. It works with BodyPix on top of an image texture, using the mix shader, but I need to scale the video so it doesn't stretch to the viewport.
If I try a vec2 of vec2(200, 200), clamping stretches the rest of the colours and repeating tiles it. Neither of these is the expected result.
JSFiddle of the video render, which needs resizing:
https://jsfiddle.net/danrossi303/or3dk2q4/16/
The basic fragment shader I am working with:
precision mediump float;

uniform sampler2D background;
uniform sampler2D frame;
uniform float texWidth;
uniform float texHeight;

void main(void) {
    vec2 texCoord = gl_FragCoord.xy / vec2(texWidth, texHeight);
    vec4 texel0 = texture2D(background, texCoord);
    vec4 texel1 = texture2D(frame, texCoord);
    gl_FragColor = mix(texel0, texel1, 1.);
}
You need to scale the texture coordinates for the video:
vec2 frameuv = texCoord * vec2(texWidth, texHeight) / vec2(200.0, 200.0);
vec4 texel1 = texture2D(frame, frameuv);
Discard the frame (texel1) if frameuv.x or frameuv.y is greater than 1.0 (see step):
float w = step(frameuv.x, 1.0) * step(frameuv.y, 1.0);
gl_FragColor = mix(texel0, texel1, w);
Complete fragment shader:
precision mediump float;

uniform sampler2D background;
uniform sampler2D frame;
uniform float texWidth;
uniform float texHeight;

void main(void) {
    vec2 texCoord = gl_FragCoord.xy / vec2(texWidth, texHeight);
    vec4 texel0 = texture2D(background, texCoord);
    vec2 frameuv = texCoord * vec2(texWidth, texHeight) / vec2(200.0, 200.0);
    vec4 texel1 = texture2D(frame, frameuv);
    gl_FragColor = mix(texel0, texel1, step(frameuv.x, 1.0) * step(frameuv.y, 1.0));
}
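This places the video in the bottom-left corner, because gl_FragCoord has its origin there. To pin it to another corner, as the question asks, subtract the corner offset before scaling and mask both ends of the range; a sketch for the top-left corner, using the same uniforms:

vec2 frameuv = (gl_FragCoord.xy - vec2(0.0, texHeight - 200.0)) / vec2(200.0);
vec4 texel1 = texture2D(frame, frameuv);
float w = step(0.0, frameuv.x) * step(frameuv.x, 1.0)
        * step(0.0, frameuv.y) * step(frameuv.y, 1.0);
gl_FragColor = mix(texel0, texel1, w);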
Additionally, there is a bug when you load the background texture. gl.bindTexture binds a texture object to the current texture unit. Therefore the texture unit has to be set with gl.activeTexture before calling gl.bindTexture:
img.onload = () => {
    gl.activeTexture(gl.TEXTURE0); // <---
    gl.bindTexture(gl.TEXTURE_2D, texture);
    initBackgroundTexture();
};
img.src = "https://videos.electroteque.org/textures/virtualbg.jpg";
img.src = "https://videos.electroteque.org/textures/virtualbg.jpg";
Discarding instances in the vertex shader
I am using instanced geometry to display content using WebGL2. As part of the process, each instance has a color component, which for some instances may have an alpha value of zero.
Rather than pass it on to the fragment shader to discard, I check alpha in the vertex shader. If it is zero, I output each vertex as vec4(-2) to put it outside the clip volume, or at worst have it rendered as a 1-pixel point.
I cannot find information on how this is handled by the rendering pipeline.
Is this the best strategy to remove instances from the pipeline?
The alternative is to remove the instances from the buffer, which in JS is a CPU-intensive operation when dealing with thousands of instances.
The shaders
const vertexSrc = `#version 300 es
#define alphaCut 0.0

in vec2 verts;
in vec2 pos;
in vec2 scale;
in vec2 offset;
in float rotate;
in float zIdx;     // z index for zBuf clip only
in vec4 color;     // RGBA; color.a == 0.0 means remove, do not render
in uint spriteIdx;

uniform vec4 sheetLayout[128];
uniform vec2 sheetSize;
uniform mat2 view;
uniform vec2 origin;

out vec2 spr;
out vec4 col;

void main() {
    if (color.a <= alphaCut) {
        gl_Position = vec4(-2); // put this instance outside clip
        return;
    }
    col = color;
    vec4 sprite = sheetLayout[spriteIdx];
    spr = sprite.xy + verts * sprite.zw / sheetSize;
    vec2 loc = (verts - offset) * scale * sprite.zw;
    float xdx = cos(rotate);
    float xdy = sin(rotate);
    loc = view * (vec2(loc.x * xdx - loc.y * xdy, loc.x * xdy + loc.y * xdx) + pos - origin);
    gl_Position = vec4(loc, zIdx, 1);
}`;
const fragmentSrc = `#version 300 es
precision mediump float;

uniform sampler2D tex;

#define alphaCut 0.0

in vec2 spr;
in vec4 col;
out vec4 pixel;

void main() {
    pixel = texture(tex, spr) * col;
    if (pixel.a <= alphaCut) { discard; }
}`;
As pointed out in the question's comments and the deleted answer, moving the vertex outside clip space requires gl_Position = vec4(vec3(-2), 1).
Setting gl_Position = vec4(-2) puts the vertex at vec3(1) (top right on the far plane), because perspective division computes (-2/-2, -2/-2, -2/-2) = (1, 1, 1). That point is inside the clip volume, so the instance geometry can still reach the fragment shader.
But why?
Perspective division
The vertex leaves the vertex shader as gl_Position, a 4D vector. To clip the vertex, we need the resulting 3D point to be outside the 3D clip space.
The normalization of the vertex from 4D to 3D is called perspective division and is performed automatically by dividing the vector's x, y, and z components by its w component.
Thus, to correctly clip the geometry instance and ensure it does not reach the fragment shader:
#define alphaCut 0.1
//...
void main() {
    if (color.a <= alphaCut) {
        gl_Position = vec4(vec3(2), 1);
        return;
    }
    //... rest of code
This is not exactly the problem I have, but a simplified version of it. Say I have a single image displayed at full screen size. I want to modify the alpha of this image so that on the left half (horizontally) of the screen the alpha is 0.5, and on the right half the alpha is 1. Just alpha 0.5 or 1, nothing in between.
Here is my (failing) code so far.
This is my JavaScript code to set up WebGL:
this.gl = canvas.getContext('webgl', {
    alpha: false,
});
this.gl.enable(this.gl.BLEND);
this.gl.blendFunc(this.gl.SRC_ALPHA, this.gl.ONE_MINUS_SRC_ALPHA);
this.gl.clearColor(1, 0, 0, 0); // red to highlight alpha problem
This is my vertex shader code:
precision mediump float;

attribute vec2 coordinates;
attribute vec2 a_texcoord;
varying vec2 v_texcoord;
varying float alpha;

void main() {
    gl_Position = vec4(coordinates.x, coordinates.y, 1.0, 1.0);
    v_texcoord = a_texcoord;
    if (coordinates.x <= 0.0) {
        alpha = 0.5;
    } else {
        alpha = 1.0;
    }
}
and my fragment shader is simple and standard:
precision mediump float;

varying vec2 v_texcoord;
varying float alpha;
uniform sampler2D u_texture;

void main() {
    vec4 color = texture2D(u_texture, v_texcoord).rgba;
    gl_FragColor = color;
}
and at draw time
window.canvas.gl.clear(window.canvas.gl.COLOR_BUFFER_BIT);
window.canvas.gl.useProgram(this.drawProgram);
window.canvas.gl.bindBuffer(window.canvas.gl.ARRAY_BUFFER, this.vertexBuffer);
window.canvas.gl.enableVertexAttribArray(this.positionLocation);
window.canvas.gl.vertexAttribPointer(this.positionLocation, 2, window.canvas.gl.FLOAT, false, 0, 0);
window.canvas.gl.bindBuffer(window.canvas.gl.ARRAY_BUFFER, null);
window.canvas.gl.bindBuffer(window.canvas.gl.ARRAY_BUFFER, this.texcoordBuffer);
window.canvas.gl.enableVertexAttribArray(this.texcoordLocation);
window.canvas.gl.vertexAttribPointer(this.texcoordLocation, 2, window.canvas.gl.FLOAT, false, 0, 0);
window.canvas.gl.bindBuffer(window.canvas.gl.ARRAY_BUFFER, null);
window.canvas.gl.bindTexture(window.canvas.gl.TEXTURE_2D, this.texture);
window.canvas.gl.uniform1i(this.textureLocation, 0);
window.canvas.gl.drawArrays(window.canvas.gl.TRIANGLES, 0, 6);
With this code, I couldn't achieve what I want. First, the transparency does not start at the middle of the screen (clip-space x = 0), but at a seemingly random location. Also, there is a gradual decline from alpha 1.0 to alpha 0.5, not just the two values 0.5 and 1.0 I hoped for. And I have no idea where this gradual transition comes from.
Obviously I am learning WebGL, so any pointer would be much appreciated. Any hint on how to solve the problem would be of great help to me. Thanks in advance!
the transparency does not start at the middle of the screen [..]
because alpha is evaluated per vertex and interpolated for the fragments.
You have to do the evaluation per fragment rather than per vertex.
Pass coordinates.x or gl_Position.x/gl_Position.w from the vertex shader to the fragment shader:
precision mediump float;

attribute vec2 coordinates;
attribute vec2 a_texcoord;
varying vec2 v_texcoord;
varying vec2 pos;

void main() {
    gl_Position = vec4(coordinates.xy, 1.0, 1.0);
    v_texcoord = a_texcoord;
    pos = coordinates.xy;
}
Compute the alpha value in the fragment shader:
precision mediump float;

varying vec2 v_texcoord;
varying vec2 pos;
uniform sampler2D u_texture;

void main() {
    vec4 color = texture2D(u_texture, v_texcoord).rgba;
    // pos is in clip space, so the middle of the screen is x = 0.0
    float alpha = pos.x < 0.0 ? 0.5 : 1.0;
    gl_FragColor = vec4(color.rgb, color.a * alpha);
}
Note, the vertex shader is executed once for each vertex. The coordinates define the corners of the primitives. The fragment shader is executed for each fragment, and its inputs are the vertex-shader outputs interpolated according to the fragment's position on the primitive.
If alpha is calculated in the vertex shader, then the alpha value for the fragments on the left is 0.5 and on the right it is 1.0, and the fragments in between get a smoothly interpolated value in the range [0.5, 1.0].
The same interpolation happens to the position when it is passed from the vertex shader to the fragment shader. But since alpha is now calculated in the fragment shader, the alpha value of each fragment is either 0.5 or 1.0, depending on the interpolated value of pos.x.
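An equivalent approach avoids the extra varying and tests the window coordinate directly; a minimal sketch, assuming you add a u_resolution uniform holding the canvas size in pixels:

precision mediump float;

varying vec2 v_texcoord;
uniform sampler2D u_texture;
uniform vec2 u_resolution; // assumed uniform: canvas size in pixels

void main() {
    vec4 color = texture2D(u_texture, v_texcoord);
    // gl_FragCoord.xy is in pixels, so the middle of the screen is half the width
    float alpha = gl_FragCoord.x < 0.5 * u_resolution.x ? 0.5 : 1.0;
    gl_FragColor = vec4(color.rgb, color.a * alpha);
}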
In my initial question thread I asked why my lighting wasn't showing, and I got an answer based on the info I provided, but I realize I wasn't asking the right question. I think I know my problem now: the newest revision doesn't seem to support tangents in the same way the original script uses them, as the .computeTangents() method appears to have been removed, and I thought that didn't matter. But I just noticed my shaders rely heavily on it to map the light to the texture as it rotates around my object.
This is what my shader scripts look like; is there any way to update this so it works with the newest revision?
<script id="vertexShader" type="x-shader/x-vertex">
attribute vec4 tangent;
uniform vec2 uvScale;
uniform vec3 lightPosition;
varying vec2 vUv;
varying mat3 tbn;
varying vec3 vLightVector;
void main() {
vUv = uvScale * uv;
/** Create tangent-binormal-normal matrix used to transform
coordinates from object space to tangent space */
vec3 vNormal = normalize(normalMatrix * normal);
vec3 vTangent = normalize( normalMatrix * tangent.xyz );
vec3 vBinormal = normalize(cross( vNormal, vTangent ) * tangent.w);
tbn = mat3(vTangent, vBinormal, vNormal);
/** Calculate the vertex-to-light vector */
vec4 lightVector = viewMatrix * vec4(lightPosition, 1.0);
vec4 modelViewPosition = modelViewMatrix * vec4(position, 1.0);
vLightVector = normalize(lightVector.xyz - modelViewPosition.xyz);
gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
</script>
<!-- GLSL fragment shader for the moon -->
<script id="fragmentShader" type="x-shader/x-fragment">
uniform sampler2D textureMap;
uniform sampler2D normalMap;
varying vec2 vUv;
varying mat3 tbn;
varying vec3 vLightVector;
void main() {
/** Transform texture coordinate of normal map to a range (-1, 1) */
vec3 normalCoordinate = texture2D(normalMap, vUv).xyz * 2.0 - 1.0;
/** Transform the normal vector in the RGB channels to tangent space */
vec3 normal = normalize(tbn * normalCoordinate.rgb);
/** Lighting intensity is calculated as dot of normal vector and
the vertex-to-light vector */
float intensity = max(0.07, dot(normal, vLightVector));
vec4 lighting = vec4(intensity, intensity, intensity, 1.0);
/** Final color is calculated with the lighting applied */
gl_FragColor = texture2D(textureMap, vUv) * lighting;
}
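Since the shaders only consume a vec4 tangent attribute, one way to adapt them to revisions where Geometry.computeTangents() is gone is to generate that attribute yourself; a minimal sketch, assuming a revision where the BufferGeometryUtils helper from the three.js examples still exposes computeTangents() (newer revisions replaced it with computeMikkTSpaceTangents()):

// The geometry must be indexed and have position, normal, and uv attributes.
const geometry = new THREE.SphereGeometry( 1, 64, 64 );
THREE.BufferGeometryUtils.computeTangents( geometry ); // adds a vec4 'tangent' attribute

const material = new THREE.ShaderMaterial( {
  vertexShader: document.getElementById( 'vertexShader' ).textContent,
  fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
  uniforms: {
    uvScale: { value: new THREE.Vector2( 1, 1 ) },
    lightPosition: { value: new THREE.Vector3( 10, 0, 0 ) },
    textureMap: { value: textureMap }, // assumed: the moon color texture
    normalMap: { value: normalMap }    // assumed: the moon normal map
  }
} );
const moon = new THREE.Mesh( geometry, material );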