Sprite is being stretched inversely proportional to canvas dimensions - javascript

I have a small program I'm working on to render sprites with 2D transformations, link here. My problem is that I'm trying to render a 100px by 100px square, but it's being stretched into a rectangle. I have absolutely zero idea what the offending code is, but here are some relevant pieces.
const position = gl.createBuffer()
gl.bindBuffer(gl.ARRAY_BUFFER, position)
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
  -w/2,  h/2,
   w/2,  h/2,
  -w/2, -h/2,
   w/2, -h/2
]), gl.STATIC_DRAW)

gl.bindBuffer(gl.ARRAY_BUFFER, position)
gl.vertexAttribPointer(attrib.vertexPosition,
  2, gl.FLOAT, false, 0, 0)
gl.enableVertexAttribArray(attrib.vertexPosition)

gl.uniformMatrix2fv(uniform.transformMatrix, false, transform)
gl.uniform2f(uniform.translation, x + w/2, y + h/2)
gl.uniform2f(uniform.screenRes, gl.canvas.width, gl.canvas.height)
Vertex shader:
attribute vec2 aVertexPosition;
attribute vec2 aTextureCoord;

uniform mat2 uTransformMatrix;
uniform vec2 uTranslation;
uniform vec2 uScreenRes;

varying vec2 vTextureCoord;

void main() {
  gl_Position = vec4(2.0 * (uTransformMatrix * aVertexPosition + uTranslation) / uScreenRes - 1.0, 1.0, 1.0);
  vTextureCoord = aTextureCoord;
}
Feel free to toy around with the variables in the pen, especially the canvas dimensions; when you scale a dimension down, that dimension of the sprite scales up, and vice versa.
P.S. I'm not concerned about how the texture is inverted. I'm shelving that for later.

Your code is correct; however, you forgot to specify the viewport.
Add this right before you make any draw calls (in your case, ideally after gl.clear()):
gl.viewport(0, 0, gl.canvas.width, gl.canvas.height)
The WebGLRenderingContext.viewport() method of the WebGL API sets the
viewport, which specifies the affine transformation of x and y from
normalized device coordinates to window coordinates.
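A minimal sketch of where the call fits in a typical render function (the resize handling is an assumption about how the pen is structured, not code from the question):

function render() {
  // keep the drawing buffer in sync with the element's displayed size
  if (gl.canvas.width !== gl.canvas.clientWidth ||
      gl.canvas.height !== gl.canvas.clientHeight) {
    gl.canvas.width = gl.canvas.clientWidth;
    gl.canvas.height = gl.canvas.clientHeight;
  }
  gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);
  gl.clear(gl.COLOR_BUFFER_BIT);
  // ... draw calls as before ...
}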

Related

How to prevent WebGL from clipping outside bounds when drawing a wavy circle?

I have a shader that draws a bunch of instanced circles, and it works great! It works by drawing a rectangle at each given location; the fragment shader then effectively discards the pixels outside the radius, which draws a circle.
I'm now trying to update the shader to draw "wavy" circles, that is, with a sine curve tracing the entire outer edge of the circle. But the issue I'm running into is that this curve clips outside the bounds of the rectangle, so the edges get cut off. I drew a (crude) picture of what I think is happening:
As you can see, making a circle by hollowing out a quad works fine in the easy case. But when you add waves to the circle, portions of it clip outside of the unit space, causing those portions not to be rendered, so the rendered circle gets cut off at those parts. Here is what it looks like in my application (notice it gets cut off on the top, bottom, right, and left edges):
Here is where I believe the clip is occurring:
Here are my current vertex and fragment shaders for drawing these wavy circles. Is there any way I can modify them to prevent this clipping from occurring? Or maybe there is some WebGL setting I could use to fix this?
Vertex Shader:
in vec2 a_unit;            // unit quad
in vec4 u_transform;       // x, y, r, alpha
uniform mat3 u_projection; // camera

out float v_tint;
out vec2 v_pos;

void main() {
  float r = u_transform.z;
  float x = u_transform.x - r;
  float y = u_transform.y - r;
  float w = r * 2.0;
  float h = r * 2.0;
  mat3 world = mat3(
    w, 0, 0,
    0, h, 0,
    x, y, 1
  );
  gl_Position = vec4(u_projection * world * vec3(a_unit, 1), 1);
  v_tint = u_transform.w;
  v_pos = a_unit;
}
Fragment Shader:
in vec2 v_pos;
in float v_tint;

uniform vec4 u_color;
uniform mat3 u_projection;
uniform float u_time;

out vec4 outputColor;

void main() {
  vec2 cxy = 2.0 * v_pos - 1.0; // convert to clip space
  float r = cxy.x * cxy.x + cxy.y * cxy.y;
  float theta = 3.1415926 - atan(cxy.y, cxy.x) * 10.0; // current angle
  r += 0.3 * sin(theta); // add waves
  float delta = fwidth(r); // anti-aliasing
  float alpha = 1.0 - smoothstep(1.0 - delta, 1.0 + delta, r);
  outputColor = u_color * alpha * vec4(1, 1, 1, v_tint);
}

Why is this shader circle so much smaller than the passed radius?

I'm developing a GLSL shader that draws out a circle based on a given center position and radius. For some reason I don't understand, the circle's radius does not match what I am passing. That is, when I pass in 100 for u_radius, the radius is instead 56. I tried just doubling the value in the shader, and while it's close with that, it's still slightly inaccurate. Does anyone have any clue what could be causing the discrepancy?
precision mediump float;

varying vec4 v_pos;

uniform mat3 u_matrix;
uniform vec2 u_center;     // the center of the circle in world coordinates
uniform float u_aspect;    // aspect ratio. 1.7778 for my monitor (1920x1080)
uniform float u_radius;    // radius. passing in 100
uniform vec2 u_canvasSize; // [ 1920, 1080 ]

void main() {
  vec4 c = vec4((u_matrix * vec3(u_center, 1)).xy, 0, 1); // center
  vec2 onePix = vec2(1.0, 1.0) / u_canvasSize; // the size of one pixel in clip-space
  float onePixLength = sqrt(onePix.x * onePix.x + onePix.y * onePix.y); // magnitude of one pixel
  float r = onePixLength * u_radius; // radius converted to clip-space
  vec2 dist = (v_pos.xy - c.xy) * vec2(u_aspect, 1.0); // distance from center to current shader point
  if (dist.x * dist.x + dist.y * dist.y > r * r) {
    discard;
  }
  gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}
It is because normalized device space coordinates are in range [-1.0, 1.0]. Therefore a factor of 2 is missing when calculating the pixel size:
// before:
vec2 onePix = vec2(1.0, 1.0) / u_canvasSize;

// after:
vec2 onePix = vec2(2.0) / u_canvasSize;
Additionally, you need to calculate the side length of a pixel instead of the diagonal length. The width (x-dimension) is scaled by the aspect ratio. Therefore, you need to calculate the height of a pixel:
// before:
float onePixLength = sqrt(onePix.x * onePix.x + onePix.y * onePix.y);

// after:
float onePixLength = onePix.y;
Note, onePix.x * u_aspect is equal to onePix.y.
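A quick sanity check with the question's numbers (1920x1080 canvas, u_radius = 100) shows where the observed radius comes from; this snippet is only illustrative arithmetic:

// what the buggy shader computes:
const onePix = [1 / 1920, 1 / 1080];           // missing the factor of 2
const diag = Math.hypot(onePix[0], onePix[1]); // ≈ 0.00106 clip units
const r = diag * 100;                          // ≈ 0.106 clip units
// a clip-space length along y covers r * (1080 / 2) pixels:
console.log(r * 540);                          // ≈ 57 px, close to the observed 56
// with both fixes (factor of 2, height of a pixel):
console.log((2 / 1080) * 100 * 540);           // = 100 px, as intended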

How to draw a circle instead of an ellipse when your monitor screen resolution isn't square?

I'm working with WebGL and I'm trying to clip away what I'm drawing to draw a circle, but currently it's drawing an ellipse instead. Here is my fragment shader:
void main() {
  vec4 c = vec4((u_matrix * vec3(u_center, 1)).xy, 0, 1); // center
  float r = .25; // radius
  bool withinRadius = pow(v_pos.x - c.x, 2.) + pow(v_pos.y - c.y, 2.) < r * r;
  if (!withinRadius) { discard; }
  gl_FragColor = vec4(1, 1, 1, 1);
}
I think the issue is that because my screen size is 1920x1200, the horizontal clip space that goes from -1.0 to +1.0 is wider than the vertical clip space that goes from -1.0 to +1.0. I think the solution might involve somehow normalizing the clip-space such that it is square, but I'm not exactly sure how to do that or what the recommended way to handle that is. How do you normally handle this scenario?
You have to scale either the x or the y component of the vector from the center of the circle to the fragment. Add a uniform variable or constant to the fragment shader which holds the aspect ratio (aspect = width/height) or the resolution of the canvas, and scale the x component of the vector by aspect:
uniform vec2 u_resolution;

void main()
{
  float aspect = u_resolution.x / u_resolution.y;
  vec4 c = vec4((u_matrix * vec3(u_center, 1)).xy, 0, 1); // center
  float r = .25; // radius
  vec2 dist_vec = (v_pos.xy - c.xy) * vec2(aspect, 1.0);
  if (dot(dist_vec, dist_vec) > r*r)
    discard;
  gl_FragColor = vec4(1, 1, 1, 1);
}
Note, I've used the dot product to compute the square of the Euclidean distance:
dot(va, vb) == va.x*vb.x + va.y*vb.y
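On the JavaScript side, the resolution uniform can be filled in like this (a sketch; the lookup is illustrative, not code from the question):

const resolutionLocation = gl.getUniformLocation(program, "u_resolution");
gl.uniform2f(resolutionLocation, gl.canvas.width, gl.canvas.height);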
A Shadertoy-style variant of the same idea, where:
fragCoord - the pixel's window coordinate, provided by the system (https://www.khronos.org/registry/OpenGL-Refpages/es3.0/html/gl_FragCoord.xhtml)
iResolution - the screen resolution, which you have to pass in yourself
center - the constant 0.5

vec2 uv = fragCoord / iResolution.xy;
// fix the aspect ratio around the center
uv.x -= center;
uv.x *= iResolution.x / iResolution.y;
uv.x += center;

float color = length(uv - vec2(0.5));
color = smoothstep(0.46, 0.47, color);

Draw particles in a single quad so they can interact with each other, how to pass particle positions to shaders?

I can render a single 2d particle using an SDF like this:
void renderShape(out vec3 col, vec2 p) {
  vec2 translation = vec2(0.0, 0.5);
  // some math
  col = // some color.
}
The translation variable controls where this particle is rendered.
Now I calculate particle positions on the CPU, and I need to pass these positions to the shaders and render each particle at a different position. How do I do that?
Currently the only data I pass is the vertices of a quad, like this:
let positions = [
  -1,  1,
  -1, -1,
   1, -1,
  -1,  1,
   1, -1,
   1,  1
];
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(positions), gl.STATIC_DRAW);

// on render
gl.bindVertexArray(vao);
gl.drawArrays(gl.TRIANGLES, 0, 6);
vertex shader:
#version 300 es
precision mediump float;

in vec2 a_position;
uniform vec2 u_resolution;

void main() {
  gl_Position = vec4(a_position, 0, 1);
}
I want the particles to combine with each other (using a union operation), so I can't render them individually as with gl.POINTS.
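One common approach (a sketch, not an answer from this thread) is to upload the positions as a uniform array when the particle count is small. This assumes the vertex shader forwards a_position to the fragment shader as v_pos; the MAX_PARTICLES cap and 0.05 radius are made-up values:

#version 300 es
precision mediump float;

#define MAX_PARTICLES 64                 // hypothetical cap
uniform vec2 u_positions[MAX_PARTICLES]; // particle centers in clip space
uniform int u_count;                     // number of valid entries

in vec2 v_pos;
out vec4 outColor;

float sdCircle(vec2 p, float r) { return length(p) - r; }

void main() {
  float d = 1e9;
  for (int i = 0; i < MAX_PARTICLES; i++) {
    if (i >= u_count) break;
    d = min(d, sdCircle(v_pos - u_positions[i], 0.05)); // union = min of SDFs
  }
  outColor = vec4(vec3(1.0), 1.0 - smoothstep(0.0, 0.01, d));
}

The positions then go up as a flat x0,y0,x1,y1,... array each frame:

gl.uniform2fv(gl.getUniformLocation(program, "u_positions"),
              new Float32Array(flatPositions));
gl.uniform1i(gl.getUniformLocation(program, "u_count"), particleCount);

Uniform arrays have fairly small size limits, so for thousands of particles the usual upgrade is to pack the positions into a floating-point texture and read them in the fragment shader with texelFetch.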

2D Image Processing With WebGL

I intend to create a simple photo editor in JS. My main question is, is it possible to create filters that render in real-time? For example, adjusting brightness and saturation. All I need is a 2D image where I can apply filters using the GPU.
All the tutorials I've read are very complex and don't really explain what the APIs mean. Please point me in the right direction. Thanks.
I was going to write a tutorial and post it on my blog, but I don't know when I'll have time to finish, so here's what I have. Here's a more detailed set of posts on my blog.
WebGL is actually a rasterization library. It takes in attributes (streams of data) and uniforms (variables), and expects you to provide "clip space" coordinates in 2d and color data for pixels.
Here's a simple example of 2d in WebGL (some details left out)
// Get A WebGL context
var gl = canvas.getContext("experimental-webgl");

// setup GLSL program
vertexShader = createShaderFromScriptElement(gl, "2d-vertex-shader");
fragmentShader = createShaderFromScriptElement(gl, "2d-fragment-shader");
program = createProgram(gl, vertexShader, fragmentShader);
gl.useProgram(program);

// look up where the vertex data needs to go.
var positionLocation = gl.getAttribLocation(program, "a_position");

// Create a buffer and put a single clipspace rectangle in
// it (2 triangles)
var buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
  -1.0, -1.0,
   1.0, -1.0,
  -1.0,  1.0,
  -1.0,  1.0,
   1.0, -1.0,
   1.0,  1.0]), gl.STATIC_DRAW);
gl.enableVertexAttribArray(positionLocation);
gl.vertexAttribPointer(positionLocation, 2, gl.FLOAT, false, 0, 0);

// draw
gl.drawArrays(gl.TRIANGLES, 0, 6);
Here are the 2 shaders:
<script id="2d-vertex-shader" type="x-shader/x-vertex">
attribute vec2 a_position;

void main() {
  gl_Position = vec4(a_position, 0, 1);
}
</script>

<script id="2d-fragment-shader" type="x-shader/x-fragment">
void main() {
  gl_FragColor = vec4(0, 1, 0, 1); // green
}
</script>
This will draw a green rectangle the entire size of the canvas.
In WebGL it's your responsibility to provide a vertex shader that provides clipspace coordinates. Clipspace coordinates always go from -1 to +1 regardless of the size of the canvas. If you want 3d, it's up to you to supply shaders that convert from 3d to 2d, because WebGL is only a rasterization API.
As a simple example, if you want to work in pixels, you can pass in a rectangle that uses pixels instead of clip space coordinates and convert to clip space in the shader.
For example:
<script id="2d-vertex-shader" type="x-shader/x-vertex">
attribute vec2 a_position;
uniform vec2 u_resolution;

void main() {
  // convert the rectangle from pixels to 0.0 to 1.0
  vec2 zeroToOne = a_position / u_resolution;

  // convert from 0->1 to 0->2
  vec2 zeroToTwo = zeroToOne * 2.0;

  // convert from 0->2 to -1->+1 (clipspace)
  vec2 clipSpace = zeroToTwo - 1.0;

  gl_Position = vec4(clipSpace, 0, 1);
}
</script>
Now we can draw rectangles by changing the data we supply
// set the resolution
var resolutionLocation = gl.getUniformLocation(program, "u_resolution");
gl.uniform2f(resolutionLocation, canvas.width, canvas.height);

// setup a rectangle from 10,20 to 80,30 in pixels
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
  10, 20,
  80, 20,
  10, 30,
  10, 30,
  80, 20,
  80, 30]), gl.STATIC_DRAW);
You'll notice WebGL considers the bottom left corner to be 0,0. To get the more traditional top left corner used for 2d graphics, we just flip the y coordinate.
gl_Position = vec4(clipSpace * vec2(1, -1), 0, 1);
If you want to manipulate images, you need to pass in textures. In the same way that the size of the canvas is represented by clipspace coordinates, textures are referenced by texture coordinates that go from 0 to 1.
<script id="2d-vertex-shader" type="x-shader/x-vertex">
attribute vec2 a_position;
attribute vec2 a_texCoord;
uniform vec2 u_resolution;
varying vec2 v_texCoord;

void main() {
  // convert the rectangle from pixels to 0.0 to 1.0
  vec2 zeroToOne = a_position / u_resolution;

  // convert from 0->1 to 0->2
  vec2 zeroToTwo = zeroToOne * 2.0;

  // convert from 0->2 to -1->+1 (clipspace)
  vec2 clipSpace = zeroToTwo - 1.0;

  gl_Position = vec4(clipSpace, 0, 1);

  // pass the texCoord to the fragment shader
  // The GPU will interpolate this value between points.
  v_texCoord = a_texCoord;
}
</script>

<script id="2d-fragment-shader" type="x-shader/x-fragment">
precision mediump float;

// our texture
uniform sampler2D u_image;

// the texCoords passed in from the vertex shader.
varying vec2 v_texCoord;

void main() {
  gl_FragColor = texture2D(u_image, v_texCoord);
}
</script>
To draw an image requires loading the image, and since that happens asynchronously we need to change our code a little. Take all the code we had and put it in a function called "render":
var image = new Image();
image.src = "http://someimage/on/our/server"; // MUST BE SAME DOMAIN!!!
image.onload = function() {
  render();
}

function render() {
  ...
  // all the code we had before except gl.draw

  // look up where the vertex data needs to go.
  var texCoordLocation = gl.getAttribLocation(program, "a_texCoord");

  // provide texture coordinates for the rectangle.
  var texCoordBuffer = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, texCoordBuffer);
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
    1.0, 1.0,
    0.0, 1.0,
    0.0, 0.0,
    1.0, 1.0,
    0.0, 0.0,
    1.0, 0.0]), gl.STATIC_DRAW);
  gl.enableVertexAttribArray(texCoordLocation);
  gl.vertexAttribPointer(texCoordLocation, 2, gl.FLOAT, false, 0, 0);

  var texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

  gl.draw(...)
}
If you want to do image processing, you just change your shader. For example, swap red and blue:
void main() {
  gl_FragColor = texture2D(u_image, v_texCoord).bgra;
}
Or blend with the pixels next to it.
uniform vec2 u_textureSize;

void main() {
  vec2 onePixel = vec2(1.0, 1.0) / u_textureSize;
  gl_FragColor = (texture2D(u_image, v_texCoord) +
                  texture2D(u_image, v_texCoord + vec2(onePixel.x, 0.0)) +
                  texture2D(u_image, v_texCoord + vec2(-onePixel.x, 0.0))) / 3.0;
}
And we have to pass in the size of the texture
var textureSizeLocation = gl.getUniformLocation(program, "u_textureSize");
...
gl.uniform2f(textureSizeLocation, image.width, image.height);
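Since the question specifically asks about brightness and saturation, here is a sketch of such a filter; the u_brightness and u_saturation uniforms are made up for illustration and would be set with gl.uniform1f like the other uniforms above:

precision mediump float;

uniform sampler2D u_image;
uniform float u_brightness; // 0.0 leaves the image unchanged
uniform float u_saturation; // 1.0 unchanged, 0.0 grayscale
varying vec2 v_texCoord;

void main() {
  vec4 color = texture2D(u_image, v_texCoord);
  color.rgb += u_brightness;
  // Rec. 709 luma weights for the grayscale mix
  float luma = dot(color.rgb, vec3(0.2126, 0.7152, 0.0722));
  color.rgb = mix(vec3(luma), color.rgb, u_saturation);
  gl_FragColor = color;
}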
Etc... Click the last link below for a convolution sample.
Here are working versions with a slightly different progression
Draw Rect in Clip Space
Draw Rect in Pixels
Draw Rect with origin at top left
Draw a bunch of rects in different colors
Draw an image
Draw an image red and blue swapped
Draw an image with left and right pixels averaged
Draw an image with a 3x3 convolution
Draw an image with multiple effects
You can make a custom pixel shader for each operation you intend to use. Just learn some GLSL and follow the "Learning WebGL" tutorials to get a grasp of basic WebGL.
You can render your image with the shader, exposing parameters to control the different visual styles, and then when the user clicks "ok" you can read back the pixels to store the result as your current image.
Just remember to avoid cross-domain images, because those disable reading back of pixels.
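If the image lives on another origin, one workaround (a sketch; it assumes the server sends the appropriate Access-Control-Allow-Origin header) is to request it with CORS before uploading it as a texture:

var image = new Image();
image.crossOrigin = "anonymous"; // ask for a CORS-enabled response
image.src = "https://example.com/photo.jpg";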
Also, check the quick reference card (PDF) for quick info on shader operations.
Just try glfx (http://evanw.github.com/glfx.js/).
I think it is exactly what you need.
You can use its set of predefined shaders or easily add your own ;)
Enjoy! It is very easy with glfx!
<script src="glfx.js"></script>
<script>
window.onload = function() {
  // try to create a WebGL canvas (will fail if WebGL isn't supported)
  try {
    var canvas = fx.canvas();
  } catch (e) {
    alert(e);
    return;
  }

  // convert the image to a texture
  var image = document.getElementById('image');
  var texture = canvas.texture(image);

  // apply the ink filter
  canvas.draw(texture).ink(0.25).update();

  // replace the image with the canvas
  image.parentNode.insertBefore(canvas, image);
  image.parentNode.removeChild(image);
};
</script>

<img id="image" src="image.jpg">
