I'm developing a GLSL shader that draws a circle from a given center position and radius. For some reason I don't understand, the circle's radius does not match what I pass in: when I pass 100 for u_radius, the drawn radius is 56 instead. I tried simply doubling the value in the shader, and while that gets close, it's still slightly inaccurate. Does anyone have any clue what could be causing the discrepancy?
precision mediump float;
varying vec4 v_pos;
uniform mat3 u_matrix;
uniform vec2 u_center; // the center of the circle in world coordinates
uniform float u_aspect; // aspect ratio. 1.7778 for my monitor (1920x1080)
uniform float u_radius; // radius. passing in 100
uniform vec2 u_canvasSize; // [ 1920, 1080 ]
void main() {
    vec4 c = vec4((u_matrix * vec3(u_center, 1)).xy, 0, 1); // center
    vec2 onePix = vec2(1.0, 1.0) / u_canvasSize; // the size of one pixel in clip space
    float onePixLength = sqrt(onePix.x * onePix.x + onePix.y * onePix.y); // magnitude of one pixel
    float r = onePixLength * u_radius; // radius converted to clip space
    vec2 dist = (v_pos.xy - c.xy) * vec2(u_aspect, 1.0); // vector from center to current fragment
    if (dist.x * dist.x + dist.y * dist.y > r * r) {
        discard;
    }
    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}
It is because normalized device coordinates are in the range [-1.0, 1.0], so the viewport spans 2.0 units on each axis. Therefore a factor of 2 is missing when calculating the pixel size:
vec2 onePix = vec2(1.0, 1.0) / u_canvasSize;
vec2 onePix = vec2(2.0) / u_canvasSize;
Additionally, you need the side length of a pixel rather than its diagonal length. Since the width (x-dimension) is scaled by the aspect ratio, you need to calculate the height of a pixel:
float onePixLength = sqrt(onePix.x * onePix.x + onePix.y * onePix.y);
float onePixLength = onePix.y;
Note, onePix.x * u_aspect is equal to onePix.y.
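To see the corrected math outside the shader, here is a small plain-JS sketch (the 1920x1080 canvas and u_radius = 100 are the values from the question):

```javascript
// NDC spans 2.0 units across each canvas axis, so one pixel is 2 / size wide.
const canvas = { width: 1920, height: 1080 };
const onePix = [2.0 / canvas.width, 2.0 / canvas.height];

// Use the pixel *height*, not the diagonal: the x axis is scaled by the
// aspect ratio in the distance test, so y is the common unit.
const onePixLength = onePix[1];
const rClip = onePixLength * 100.0; // u_radius = 100

// As noted above, onePix.x * aspect equals onePix.y.
const aspect = canvas.width / canvas.height;
console.log(rClip.toFixed(4)); // "0.1852" (100px of a 1080px-tall viewport)
console.log(Math.abs(aspect * onePix[0] - onePix[1]) < 1e-12); // true
```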
Related
I have a shader that draws a bunch of instanced circles, and it works great! It works by drawing a rectangle at every given location, and in the fragment shader it discards the pixels outside the radius, leaving a circle.
I'm trying to update the shader now to make it draw "wavy" circles. That is, having a sin curve trace the entire outer edge of the circle. But the issue I'm running into now is that this curve will clip outside the bounds of the rectangle, and as a result, edges will be cut off. I drew a (crude) picture of what I think is happening:
As you can see, making a circle by hollowing out a quad works fine in the easy case. But when you add waves to the circle, portions of it clip outside of the unit space, causing those portions to not be rendered, so the rendered circle gets cut off at those parts. Here is what it looks like in my application (notice it gets cut off on the top, bottom, right, and left edges):
Here is where I believe the clip is occurring:
Here are my current vertex and fragment shaders for drawing these wavy circles. Is there any way I can modify them to prevent this clipping from occurring? Or maybe there is some WebGL setting I could use to fix this?
Vertex Shader:
in vec2 a_unit; // unit quad
in vec4 u_transform; // x, y, r, alpha
uniform mat3 u_projection; // camera
out float v_tint;
out vec2 v_pos;
void main() {
    float r = u_transform.z;
    float x = u_transform.x - r;
    float y = u_transform.y - r;
    float w = r * 2.0;
    float h = r * 2.0;
    mat3 world = mat3(
        w, 0, 0,
        0, h, 0,
        x, y, 1
    );
    gl_Position = vec4(u_projection * world * vec3(a_unit, 1), 1);
    v_tint = u_transform.w;
    v_pos = a_unit;
}
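As a sanity check of the matrix above, here is a plain-JS sketch (worldMatrix and apply are illustrative helpers, not part of the original code): the column-major mat3 maps the unit quad onto a 2r x 2r rectangle centered on the instance position.

```javascript
// Build the same column-major mat3 the vertex shader builds:
// columns are [w,0,0], [0,h,0], [x-r, y-r, 1].
function worldMatrix(x, y, r) {
  const w = r * 2, h = r * 2;
  return [w, 0, 0, 0, h, 0, x - r, y - r, 1];
}

// Multiply a 2D point (implicit z = 1) by the column-major mat3.
function apply(m, p) {
  return [m[0] * p[0] + m[3] * p[1] + m[6],
          m[1] * p[0] + m[4] * p[1] + m[7]];
}

const m = worldMatrix(10, 20, 5);
console.log(apply(m, [0, 0])); // bottom-left corner → [5, 15]
console.log(apply(m, [1, 1])); // top-right corner   → [15, 25]
```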
Fragment Shader:
in vec2 v_pos;
in float v_tint;
uniform vec4 u_color;
uniform mat3 u_projection;
uniform float u_time;
out vec4 outputColor;
void main() {
    vec2 cxy = 2.0 * v_pos - 1.0; // map unit-quad coords to [-1, 1]
    float r = cxy.x * cxy.x + cxy.y * cxy.y; // squared distance from center
    float theta = 3.1415926 - atan(cxy.y, cxy.x) * 10.0; // current angle
    r += 0.3 * sin(theta); // add waves
    float delta = fwidth(r); // anti-aliasing
    float alpha = 1.0 - smoothstep(1.0 - delta, 1.0 + delta, r);
    outputColor = u_color * alpha * vec4(1, 1, 1, v_tint);
}
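For what it's worth, the overflow can be quantified (a plain-JS sketch; the 0.3 comes from the fragment shader's wave amplitude). Since r in the fragment shader is the squared unit distance, the wavy edge sits where d² + 0.3·sin(θ) = 1, so the worst case reaches sqrt(1.3) of the quad's half-size; scaling w and h in the vertex shader by at least that factor would keep the wave inside the quad:

```javascript
// Worst case of d*d + amplitude*sin(theta) = 1 occurs at sin(theta) = -1,
// i.e. d = sqrt(1 + amplitude) in unit-quad space.
const amplitude = 0.3;
const maxEdge = Math.sqrt(1.0 + amplitude);
console.log(maxEdge.toFixed(3)); // "1.140" → the quad is ~14% too small
```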
I'd like to reproduce this effect in my three.js scene: https://www.shadertoy.com/view/3ljfzV
To do so, I'm using a ShaderMaterial(). I first made sure that my textures fit perfectly my scene based on this solution.
Then, I got rid of gl_FragCoord since I have modified UVs. I replaced it with this formula: gl_FragCoord = modifiedUVs * uResolution
You can see here my current result. Here's the related fragment shader ↓
precision highp float;
uniform sampler2D uText1; // texture 1
uniform sampler2D uText2; // texture 2
uniform vec3 uResolution; // width and height of my scene
uniform float uTime;
uniform vec2 uUvScale; // UV Scale calculated with the resolution of my texture and the viewport
varying vec2 vUv; // uvs from my vertex shader
// parameters for the effect
float freq = 3.2, period = 8.0, speed = 2., fade = 4., displacement = 0.2;
void main()
{
    // make my textures fit like the CSS background-size: cover property
    vec2 uv = (vUv - 0.5) * uUvScale + 0.5;

    vec2 R = uResolution.xy,
         U = (2. * (uv * uResolution.xy) - R) / min(R.x, R.y),
         T = (uv * uResolution.xy) / R.y;
    float D = length(U);

    float frame_time = mod(uTime * speed, period);
    float pixel_time = max(0.0, frame_time - D);
    float wave_height = (cos(pixel_time * freq) + 1.0) / 2.0;
    float wave_scale = 1.0 - min(1.0, pixel_time / fade);
    float frac = wave_height * wave_scale;

    if (mod(uTime * speed, period * 2.0) > period)
    {
        frac = 1. - frac;
    }

    vec2 tc = T + ((U / D) * -((sin(pixel_time * freq) / fade) * wave_scale) * displacement);

    gl_FragColor = mix(
        texture2D(uText1, tc),
        texture2D(uText2, tc),
        frac);
}
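For reference, here is what the U line above computes, sketched in plain JS (1920x1080 is just an example resolution): dividing by min(R.x, R.y) makes the shorter axis span [-1, 1], so length(U) is a circular distance from the screen center.

```javascript
const R = [1920, 1080];

// uv * R recovers pixel coordinates; the 2*px - R / min(...) form recenters
// them and normalizes by the shorter screen axis.
function U(uv) {
  const px = [uv[0] * R[0], uv[1] * R[1]];
  const m = Math.min(R[0], R[1]);
  return [(2 * px[0] - R[0]) / m, (2 * px[1] - R[1]) / m];
}

console.log(U([0.5, 0.5])); // screen center → [0, 0]
console.log(U([0.5, 1.0])); // top edge      → [0, 1]
console.log(U([1.0, 0.5])); // right edge x  → 1920/1080 ≈ 1.78
```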
As you can see, the displacement works great, but I'm having trouble making my textures fit the whole scene.
I think I'm pretty close to making it fully work, because when I replace the texture coordinates with the modified UVs, my textures display correctly ↓ In that case only the displacement is missing, as you can see here.
gl_FragColor = mix(
    texture2D(uText1, uv),
    texture2D(uText2, uv),
    frac);
Does anyone know how I can correctly combine my modified UVs with the gl_FragCoord value? Should I replace gl_FragCoord with another formula to keep both the displacement and the positioning of my textures?
Thank you very much
EDIT :
I've been told that I can add this line :
tc.x *= uResolution.y/uResolution.x;
It fixed the texture positions, but now the displacement is no longer perfectly circular, as you can see here.
I'm working with WebGL and I'm trying to clip away what I'm drawing to draw a circle, but currently it's drawing an ellipse instead. Here is my fragment shader:
void main() {
    vec4 c = vec4((u_matrix * vec3(u_center, 1)).xy, 0, 1); // center
    float r = .25; // radius
    bool withinRadius = pow(v_pos.x - c.x, 2.) + pow(v_pos.y - c.y, 2.) < r * r;
    if (!withinRadius) { discard; }
    gl_FragColor = vec4(1, 1, 1, 1);
}
I think the issue is that because my screen size is 1920x1200, the horizontal clip space that goes from -1.0 to +1.0 is wider than the vertical clip space that goes from -1.0 to +1.0. I think the solution might involve somehow normalizing the clip-space such that it is square, but I'm not exactly sure how to do that or what the recommended way to handle that is. How do you normally handle this scenario?
You have to scale either the x or the y component of the vector from the center of the circle to the fragment. Add a uniform variable or constant to the fragment shader that holds the aspect ratio (aspect = width/height) or the resolution of the canvas, and scale the x component of the vector by aspect:
uniform vec2 u_resolution;

void main()
{
    float aspect = u_resolution.x / u_resolution.y;
    vec4 c = vec4((u_matrix * vec3(u_center, 1)).xy, 0, 1); // center
    float r = .25; // radius
    vec2 dist_vec = (v_pos.xy - c.xy) * vec2(aspect, 1.0);
    if (dot(dist_vec, dist_vec) > r * r)
        discard;
    gl_FragColor = vec4(1, 1, 1, 1);
}
Note, I've used the dot product to compute the square of the Euclidean distance:
dot(va, vb) == va.x*vb.x + va.y*vb.y
fragCoord - the window-space coordinate of the current pixel
(https://www.khronos.org/registry/OpenGL-Refpages/es3.0/html/gl_FragCoord.xhtml)
iResolution - the screen resolution, which you have to pass in yourself
center - 0.5
vec2 uv = fragCoord / iResolution.xy;
// fix the aspect ratio
uv.x -= center;
uv.x *= iResolution.x / iResolution.y;
uv.x += center;
//
float color = length(uv - vec2(0.5));
color = smoothstep(0.46, 0.47, color);
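The recenter-scale-restore trick above can be verified in plain JS (1920x1080 is an example resolution): points that are the same number of pixels from the center end up the same distance in uv space.

```javascript
const iResolution = [1920, 1080];

// Mirror the shader: normalize fragCoord, rescale uv.x about the center
// by the aspect ratio, then measure the distance to (0.5, 0.5).
function centeredDist(fragCoord) {
  const uv = [fragCoord[0] / iResolution[0], fragCoord[1] / iResolution[1]];
  uv[0] -= 0.5;
  uv[0] *= iResolution[0] / iResolution[1];
  uv[0] += 0.5;
  return Math.hypot(uv[0] - 0.5, uv[1] - 0.5);
}

// 100 pixels right of center vs 100 pixels above center:
const right = centeredDist([960 + 100, 540]);
const up = centeredDist([960, 540 + 100]);
console.log(Math.abs(right - up) < 1e-12); // true → the mask is circular
```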
Discarding instances in the vertex shader
I am using instanced geometry to display content with WebGL2. Each instance has a color component which, for some instances, may have an alpha value of zero.
Rather than passing it on to the fragment shader to discard, I check alpha in the vertex shader. If it is zero, I output each vertex as vec4(-2) to put it outside the clip space, or at worst have it render as a 1-pixel point.
I can not find information on how this is handled by the rendering pipeline.
Is this the best strategy to remove instances from the pipeline?
The alternative is to remove the instances from the buffer, which in JS is a CPU-intensive operation when dealing with thousands of instances.
The shaders
const vertexSrc = `#version 300 es
#define alphaCut 0.0
in vec2 verts;
in vec2 pos;
in vec2 scale;
in vec2 offset;
in float rotate;
in float zIdx; // z index for zBuf clip only
in vec4 color; // RGBA; color.a == 0.0 means remove, don't render
in uint spriteIdx;
uniform vec4 sheetLayout[128];
uniform vec2 sheetSize;
uniform mat2 view;
uniform vec2 origin;
out vec2 spr;
out vec4 col;
void main() {
    if (color.a <= alphaCut) {
        gl_Position = vec4(-2); // put this instance outside clip
        return;
    }
    col = color;
    vec4 sprite = sheetLayout[spriteIdx];
    spr = sprite.xy + verts * sprite.zw / sheetSize;
    vec2 loc = (verts - offset) * scale * sprite.zw;
    float xdx = cos(rotate);
    float xdy = sin(rotate);
    loc = view * (vec2(loc.x * xdx - loc.y * xdy, loc.x * xdy + loc.y * xdx) + pos - origin);
    gl_Position = vec4(loc, zIdx, 1);
}`;
const fragmentSrc = `#version 300 es
precision mediump float;
uniform sampler2D tex;
#define alphaCut 0.0
in vec2 spr;
in vec4 col;
out vec4 pixel;
void main() {
    pixel = texture(tex, spr) * col;
    if (pixel.a <= alphaCut) { discard; }
}`;
As pointed out in the question's comments and the deleted answer, moving the vertex outside the clip space requires gl_Position = vec4(vec3(-2), 1)
Setting gl_Position = vec4(-2) will put the vertex at vec3(1) (top right on the far plane). That is inside the clip volume, so the instance's geometry still reaches the fragment shader.
But why?
Perspective division
When the vertex leaves the vertex shader as gl_Position, it is a 4D vector. To clip the vertex, its normalized 3D position must lie outside the 3D clip volume.
The conversion from 4D to 3D is called perspective division, and it is performed automatically by dividing the vector's x, y, and z components by its w component.
Thus, to correctly clip the geom instance and ensure it does not reach the fragment shader:
#define alphaCut 0.1
//...
void main() {
    if (color.a <= alphaCut) {
        gl_Position = vec4(vec3(2), 1);
        return;
    }
    //... rest of code
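The perspective division can be checked numerically with a plain-JS sketch (ndc and inside are illustrative helpers):

```javascript
// Perspective division: the divide-by-w the GPU applies to gl_Position
// before clipping against the [-1, 1] cube.
function ndc(clip) { // clip = [x, y, z, w]
  return [clip[0] / clip[3], clip[1] / clip[3], clip[2] / clip[3]];
}
const inside = (v) => v.every((c) => c >= -1 && c <= 1);

console.log(ndc([-2, -2, -2, -2]));         // vec4(-2) → [1, 1, 1]
console.log(inside(ndc([-2, -2, -2, -2]))); // true: NOT clipped!
console.log(inside(ndc([2, 2, 2, 1])));     // false: clipped as intended
```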
I have a problem understanding the logic I am trying to implement with Three.js and the GPUComputationRenderer by yomboprime
(https://github.com/yomboprime/GPGPU-threejs-demos/blob/gh-pages/js/GPUComputationRenderer.js)
I want to make a simple Verlet cloth simulation. Here is the logic I have already been able to implement (short version):
1) Position-Fragment-Shader: This shader takes the old and current position texture and computes the new position like this:
vec3 position = texture2D( texturePosition, uv ).xyz;
vec3 oldPosition = texture2D( textureOldPosition, uv ).xyz;
position = position * 2.0 - oldPosition + acceleration * delta * delta;
vec3 t = checkConstraints( position );
position += t;
gl_FragColor = vec4( position, 1.0 );
2) Old-Position-Shader: This shader just stores the current position so it is available in the next step.
vec3 position = texture2D( texturePosition, uv ).xyz;
gl_FragColor = vec4(position,1);
This works fine, but with that pattern it's not possible to apply the constraints more than once per step, because each texel is processed in isolation and cannot see the position changes the other texels made in the first iteration.
What I am trying to do is to separate the constraints from the verlet. At the moment it looks somehow like this:
1) Position-Shader (texturePosition)
vec3 position = texture2D( textureConstraints, uv ).xyz;
vec3 oldPosition = texture2D( textureOldPosition, uv ).xyz;
position = (position * 2.0 - oldPosition + acceleration * delta *delta );
gl_FragColor = vec4(position, 1 );
2) Constraint-Shader (textureConstraints)
vec3 position = texture2D( texturePosition, uv ).xyz;
vec3 t = checkConstraints( position );
position += t;
gl_FragColor = vec4( position, 1.0 );
3) Old-Position-Shader (textureOldPosition)
vec3 position = texture2D( textureConstraints, uv ).xyz;
gl_FragColor = vec4(position,1);
This logic is not working, even if I don't calculate constraints at all and just pass the values through unchanged. As soon as some acceleration is added in the Position-Shader, the position values explode.
What am I doing wrong?
This example is not Verlet cloth, but I think the basic premise may help you. I have a fiddle that uses the GPUComputationRenderer to do some spring physics on a point cloud. I think you could adapt it to your needs.
What you need is more information. You'll need fixed references to the cloth's original shape (as if it were a flat board), as well as the force currently being exerted on any of those points (by gravity + wind + structural integrity or whatever else), which then gives you that point's current position. Those references to the original shape, in combination with the forces, are what will give your cloth a memory instead of flinging apart as it has been.
Here, for example, is my spring physics shader, which the GPUComputationRenderer uses to compute the point positions in my visualization. The tOffsets in this case are the coordinates that give the cloud a permanent memory of its original shape; they never change. It is a DataTexture I add to the uniforms at the beginning of the program. Various constants like the mass, springConstant, gravity, and damping also remain consistent and live in the shader. tPositions are the vec4 coords that change (two of the components record current position, the other two record current velocity):
<script id="position_fragment_shader" type="x-shader/x-fragment">
// This shader handles only the math to move the various points. Adding the sprites and point opacity comes in the following shader.
uniform sampler2D tOffsets;
uniform float uTime;
varying vec2 vUv;
float hash(float n) { return fract(sin(n) * 1e4); }

float noise(float x) {
    float i = floor(x);
    float f = fract(x);
    float u = f * f * (3.0 - 2.0 * f);
    return mix(hash(i), hash(i + 1.0), u);
}

void main() {
    vec2 uv = gl_FragCoord.xy / resolution.xy;
    float damping = 0.98;

    vec4 nowPos = texture2D( tPositions, uv ).xyzw;
    vec4 offsets = texture2D( tOffsets, uv ).xyzw;
    vec2 velocity = vec2(nowPos.z, nowPos.w);

    float anchorHeight = 100.0;
    float yAnchor = anchorHeight;
    vec2 anchor = vec2( -(uTime * 50.0) + offsets.x, yAnchor + (noise(uTime) * 30.0) );

    // Newton's law: F = M * A
    float mass = 24.0;
    vec2 acceleration = vec2(0.0, 0.0);

    // 1. apply gravity's force:
    vec2 gravity = vec2(0.0, 2.0);
    gravity /= mass;
    acceleration += gravity;

    // 2. apply the spring force
    float restLength = yAnchor - offsets.y;
    float springConstant = 0.2;
    // vector pointing from anchor to point position
    vec2 springForce = vec2(nowPos.x - anchor.x, nowPos.y - anchor.y);
    // length of the vector
    float distance = length( springForce );
    // stretch is the difference between the current distance and restLength
    float stretch = distance - restLength;
    // calculate springForce according to Hooke's law
    springForce = normalize(springForce);
    springForce *= (springConstant * stretch);
    springForce /= mass;
    acceleration += springForce;

    velocity += acceleration;
    velocity *= damping;

    vec2 newPosition = vec2(nowPos.x - velocity.x, nowPos.y - velocity.y);

    // write new position out
    gl_FragColor = vec4(newPosition.x, newPosition.y, velocity.x, velocity.y);
}
</script>
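For anyone who wants to poke at the numbers without a GPU, here is a plain-JS port of the update above (a sketch: the moving anchor and noise are replaced by a fixed anchor, and step and its arguments are illustrative names, not part of the original shader):

```javascript
// One gravity + Hooke-spring + damping step for a single point, mirroring
// the shader's constants. nowPos carries position and velocity, like the
// shader's vec4 (x, y, vx, vy).
function step(nowPos, anchor, restLength) {
  const mass = 24.0, springConstant = 0.2, damping = 0.98;
  let ax = 0.0, ay = 2.0 / mass; // gravity / mass
  const sx = nowPos.x - anchor.x, sy = nowPos.y - anchor.y; // anchor -> point
  const distance = Math.hypot(sx, sy);
  const stretch = distance - restLength;
  // normalize(springForce) * springConstant * stretch / mass, folded into f
  const f = (springConstant * stretch) / mass / (distance || 1.0);
  const vx = (nowPos.vx + ax + sx * f) * damping;
  const vy = (nowPos.vy + ay + sy * f) * damping;
  // the shader writes newPosition = nowPos - velocity
  return { x: nowPos.x - vx, y: nowPos.y - vy, vx, vy };
}

// A point 20 units from an anchor with restLength 10 gets pulled back:
const p = step({ x: 0, y: 120, vx: 0, vy: 0 }, { x: 0, y: 100 }, 10);
console.log(p.vy > 0); // true: velocity responds to gravity + spring pull
```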