I want to make a 2D WebGL game with many sprites, but I only have two triangles to make a square, and I do not need additional vertices. The following tutorials will help you understand my approach:
https://webgl2fundamentals.org/webgl/lessons/webgl-drawing-without-data.html
https://webgl2fundamentals.org/webgl/lessons/webgl-instanced-drawing.html
Only one square, and lots of matrices to move it.
My shader:
#version 300 es
uniform mat3 u_matrix;
uniform mat3 u_view;
uniform mat3 u_uv;
out vec2 v_texcoord;
const float xp[6] = float[](0.0, 0.0, 1.0, 1.0, 1.0, 0.0);
const float yp[6] = float[](0.0, 1.0, 0.0, 1.0, 0.0, 1.0);
void main() {
    float x = xp[gl_VertexID];
    float y = yp[gl_VertexID];
    gl_Position = vec4(u_view * u_matrix * vec3(x, y, 1), 1);
    v_texcoord = (u_uv * vec3(x, y, 0)).xy;
}
My code in each frame:
gameobjects.forEach(gameobject => {
    gl.uniformMatrix3fv(u_uv, false, gameobject.uv);
    gl.uniformMatrix3fv(u_matrix, false, gameobject.matrix);
    gl.drawArrays(gl.TRIANGLES, 0, 6);
});
With this shader I can only change u_matrix once per draw call. Help me fix my shader so that u_matrix comes from a buffer and all objects are drawn in one call.
Forgive me for my bad English
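A sketch of how this usually looks with WebGL2 instancing (the buffer layout and names below are my assumptions, not tested against your code): replace the u_matrix uniform with a per-instance mat3 attribute, upload all object matrices into one buffer with a divisor of 1, and issue a single gl.drawArraysInstanced call.

#version 300 es
in mat3 a_matrix;   // per-instance model matrix, replaces u_matrix
uniform mat3 u_view;
uniform mat3 u_uv;  // still shared by all instances; per-object UVs would need the same treatment
out vec2 v_texcoord;
const float xp[6] = float[](0.0, 0.0, 1.0, 1.0, 1.0, 0.0);
const float yp[6] = float[](0.0, 1.0, 0.0, 1.0, 0.0, 1.0);
void main() {
    float x = xp[gl_VertexID];
    float y = yp[gl_VertexID];
    gl_Position = vec4(u_view * a_matrix * vec3(x, y, 1), 1);
    v_texcoord = (u_uv * vec3(x, y, 0)).xy;
}

And on the JavaScript side, each frame:

// Pack every object's 3x3 matrix (9 floats, column-major) into one buffer.
const matrixData = new Float32Array(gameobjects.length * 9);
gameobjects.forEach((gameobject, i) => {
    matrixData.set(gameobject.matrix, i * 9);
});
gl.bindBuffer(gl.ARRAY_BUFFER, matrixBuffer); // created once at init
gl.bufferData(gl.ARRAY_BUFFER, matrixData, gl.DYNAMIC_DRAW);

// A mat3 attribute occupies 3 consecutive locations, one vec3 column each.
const loc = gl.getAttribLocation(program, 'a_matrix');
for (let col = 0; col < 3; ++col) {
    gl.enableVertexAttribArray(loc + col);
    gl.vertexAttribPointer(loc + col, 3, gl.FLOAT, false, 36, col * 12);
    gl.vertexAttribDivisor(loc + col, 1); // advance once per instance, not per vertex
}

// One call draws all objects: 6 vertices per quad, N instances.
gl.drawArraysInstanced(gl.TRIANGLES, 0, 6, gameobjects.length);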
Related
I have a plane geometry and I'm creating a custom ShaderMaterial for it. It will receive some textures as uniforms. I'd like the textures to perfectly cover my plane (like the background-size: cover CSS property).
I managed to do it with a utility function when I used my textures with a MeshBasicMaterial:
cover( texture, aspect ) {
    var imageAspect = texture.image.width / texture.image.height;
    if ( aspect < imageAspect ) {
        texture.matrix.setUvTransform( 0, 0, aspect / imageAspect, 1, 0, 0.5, 0.5 );
    } else {
        texture.matrix.setUvTransform( 0, 0, 1, imageAspect / aspect, 0, 0.5, 0.5 );
    }
}
But unfortunately, since I'm using the ShaderMaterial, my cover function no longer applies. Am I forced to do it inside my fragment shader? If so, how can I reproduce this behavior?
Here's my code :
const vertexShader = `
precision highp float;
uniform mat3 uUvTransform;
varying vec2 vUv;
void main() {
    vUv = ( uUvTransform * vec3( uv, 1 ) ).xy;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}`;
const fragmentShader = `
precision highp float;
uniform sampler2D uText1;
varying vec2 vUv;
void main() {
    vec2 xy = vUv;
    vec4 color = texture2D(uText1, xy);
    gl_FragColor = color;
}`;
And here's my current result (screenshot omitted).
Thanks a lot
You could simply use a custom uniform, e.g.:
uniform sampler2D uText1;
uniform vec2 uUvScale;
varying vec2 vUv;
void main() {
    vec2 uv = (vUv - 0.5) * uUvScale + 0.5;
    gl_FragColor = texture2D(uText1, uv);
}
And:
var imageAspect = texture.image.width / texture.image.height;
if ( aspect < imageAspect ) {
    material.uniforms.uUvScale.value.set( aspect / imageAspect, 1 );
} else {
    material.uniforms.uUvScale.value.set( 1, imageAspect / aspect );
}
The way Three.js handles texture transformations like .offset, .repeat, .rotation, .center is via a Matrix3 that gets passed as a uniform into the vertex shader. The vertex shader performs the matrix multiplication, then passes the modified UVs as a varying to the fragment shader.
You can see that uniform being declared in the uv_pars_vertex.glsl.js file
You can see the transform being applied in the uv_vertex.glsl.js file
You could copy those lines of GLSL code to your ShaderMaterial's vertex shader, and I think the texture properties will come through in the Matrix3 automatically. However, if for some reason it doesn't, you could recreate the Matrix3 by copying it from the source and passing it as a uniform manually. I don't know what your utility function looks like, so it's hard to tell how you're achieving the desired scaling.
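If it doesn't come through automatically, a minimal sketch of the manual route (this reuses the question's uUvTransform uniform; everything else is an assumption about the surrounding code):

// Hypothetical: recreate the cover() logic with a standalone Matrix3
// and feed it to the ShaderMaterial by hand.
const uvTransform = new THREE.Matrix3();

function cover(texture, aspect) {
    const imageAspect = texture.image.width / texture.image.height;
    if (aspect < imageAspect) {
        // arguments: offsetX, offsetY, repeatX, repeatY, rotation, centerX, centerY
        uvTransform.setUvTransform(0, 0, aspect / imageAspect, 1, 0, 0.5, 0.5);
    } else {
        uvTransform.setUvTransform(0, 0, 1, imageAspect / aspect, 0, 0.5, 0.5);
    }
    material.uniforms.uUvTransform.value = uvTransform;
}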
I'm trying to use a custom shader in three.js to make a single texture animate and expand. The problem is that when I multiply vUv by a certain number to make it expand, the bigger the number, the smaller the result appears, which is contrary to my expectation. For example, when I multiply by 0.1 the result becomes 10 times bigger, and when I multiply by 10.0 it becomes 10 times smaller.
Here is my shader code (simplified to make the problem clear):
//vertex shader
varying vec2 vUv;
uniform float uFixAspect;
void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
//fragment shader
precision mediump float;
uniform float time;
uniform vec2 resolution;
varying vec2 vUv;
uniform sampler2D uTex;
void main() {
    // I want the result to be 10 times smaller than the original,
    // but it draws 10 times bigger than the original
    vec2 newvUv = vUv * mat2(0.1, 0.0, 0.0, 0.1);
    gl_FragColor = texture2D(uTex, newvUv);
}
And this is my three.js code:
const loader = new THREE.TextureLoader();
loader.load(
    "./assets/textures/tex.png",
    tex => {
        const geo = new THREE.PlaneGeometry(2, 2);
        const mat = new THREE.ShaderMaterial({
            uniforms: {
                uTex: { value: tex },
                time: { type: "f", value: 0.1 },
                resolution: { type: "vec2", value: new THREE.Vector2(512, 512) }
            },
            vertexShader: vert,
            fragmentShader: frag,
        });
        const shaderObj = new THREE.Mesh(geo, mat);
        marker.add(shaderObj);
    }
);
Is there any problem in my code, or is it a problem with three.js?
Thank you.
[...] when I multiply vUv with certain number to make it expand, the bigger the number is, the smaller the result paint appears [...]
Of course, because you scale the texture coordinates for the lookup, but not the texture.
You want to "scale down" the texture. The texture keeps the same size, so you have to take the texels from an "up-scaled" position.
Use the reciprocal of the scale factor:
float scale = 1.0 / 0.1; // reciprocal scale
vec2 newvUv = vUv * mat2(scale, 0.0, 0.0, scale);
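If the shrink should happen around the texture's center rather than its corner at (0, 0), the reciprocal combines with the centering trick from the uUvScale answer earlier; a small sketch:

float scale = 1.0 / 0.1;                 // reciprocal, as above
vec2 newvUv = (vUv - 0.5) * scale + 0.5; // scale around the center instead of the corner
gl_FragColor = texture2D(uTex, newvUv);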
This is not exactly the problem I have, but a simplified version of it. Say I have a single image displayed at full screen size. I want to modify the alpha of this image so that on the left half (horizontally) of the screen the alpha is 0.5, and on the right half the alpha is 1. Just alpha 0.5 or 1, nothing in between.
Here is my (failed) code so far.
This is my JavaScript code to set up WebGL:
this.gl = canvas.getContext('webgl', {
    alpha: false,
});
this.gl.enable(this.gl.BLEND);
this.gl.blendFunc(this.gl.SRC_ALPHA, this.gl.ONE_MINUS_SRC_ALPHA);
this.gl.clearColor(1, 0, 0, 0); // red to highlight alpha problem
This is my vertex shader code:
precision mediump float;
attribute vec2 coordinates;
attribute vec2 a_texcoord;
varying vec2 v_texcoord;
varying float alpha;
void main() {
    gl_Position = vec4(coordinates.x, coordinates.y, 1.0, 1.0);
    v_texcoord = a_texcoord;
    if (coordinates.x <= 0.0) {
        alpha = 0.5;
    } else {
        alpha = 1.0;
    }
}
And my fragment shader is standard and simple:
precision mediump float;
varying vec2 v_texcoord;
varying float alpha;
uniform sampler2D u_texture;
void main() {
    vec4 color = texture2D(u_texture, v_texcoord);
    gl_FragColor = vec4(color.rgb, color.a * alpha);
}
And at draw time:
window.canvas.gl.clear(window.canvas.gl.COLOR_BUFFER_BIT);
window.canvas.gl.useProgram(this.drawProgram);
window.canvas.gl.bindBuffer(window.canvas.gl.ARRAY_BUFFER, this.vertexBuffer);
window.canvas.gl.enableVertexAttribArray(this.positionLocation);
window.canvas.gl.vertexAttribPointer(this.positionLocation, 2, window.canvas.gl.FLOAT, false, 0, 0);
window.canvas.gl.bindBuffer(window.canvas.gl.ARRAY_BUFFER, null);
window.canvas.gl.bindBuffer(window.canvas.gl.ARRAY_BUFFER, this.texcoordBuffer);
window.canvas.gl.enableVertexAttribArray(this.texcoordLocation);
window.canvas.gl.vertexAttribPointer(this.texcoordLocation, 2, window.canvas.gl.FLOAT, false, 0, 0);
window.canvas.gl.bindBuffer(window.canvas.gl.ARRAY_BUFFER, null);
window.canvas.gl.bindTexture(window.canvas.gl.TEXTURE_2D, this.texture);
window.canvas.gl.uniform1i(this.textureLocation, 0);
window.canvas.gl.drawArrays(window.canvas.gl.TRIANGLES, 0, 6);
With this code I couldn't achieve what I want. First, the transparency does not start at the middle of the screen (clip-space x = 0), but at a seemingly random location. Also, there is a gradual decline from alpha 1.0 to alpha 0.5, not just the two values 0.5 and 1.0 I hoped for. And I have no idea where this gradual transition comes from.
Obviously I am learning WebGL, so any pointer would be much appreciated. Any hint on how to solve the problem would be of great help to me. Thanks in advance!
the transparency does not start at the middle of the screen [..]
because alpha is evaluated per vertex and interpolated for the fragments.
You have to do the evaluation per fragment rather than per vertex.
Pass coordinates.x or gl_Position.x/gl_Position.w from the vertex shader to the fragment shader:
precision mediump float;
attribute vec2 coordinates;
attribute vec2 a_texcoord;
varying vec2 v_texcoord;
varying vec2 pos;
void main() {
    gl_Position = vec4(coordinates.xy, 1.0, 1.0);
    v_texcoord = a_texcoord;
    pos = coordinates.xy;
}
Compute the alpha value in the fragment shader:
precision mediump float;
varying vec2 v_texcoord;
varying vec2 pos;
uniform sampler2D u_texture;
void main() {
    vec4 color = texture2D(u_texture, v_texcoord);
    // pos is in clip space (-1 to +1), so the middle of the screen is 0.0
    float alpha = pos.x < 0.0 ? 0.5 : 1.0;
    gl_FragColor = vec4(color.rgb, color.a * alpha);
}
Note, the vertex shader is executed once for each vertex. The vertices define the corners of the primitives. The fragment shader is executed for each fragment, and the output parameters of the vertex shader are interpolated depending on the position of the fragment within the primitive. The interpolated value is the input to the fragment shader.
If alpha is calculated in the vertex shader, then the alpha value for the fragments on the left is 0.5 and on the right it is 1.0. The fragments in between get a smoothly interpolated value in the range [0.5, 1.0].
The same happens to the position when it is passed from the vertex shader to the fragment shader. But since alpha is calculated in the fragment shader, the alpha value of each fragment is either 0.5 or 1.0, depending on the interpolated value of pos.x.
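As an aside, the same per-fragment split can be computed without a varying at all, using the built-in gl_FragCoord, which holds the fragment's window position in pixels (u_resolution here is a hypothetical uniform holding the canvas size):

precision mediump float;
varying vec2 v_texcoord;
uniform sampler2D u_texture;
uniform vec2 u_resolution; // canvas size in pixels, set from JavaScript

void main() {
    vec4 color = texture2D(u_texture, v_texcoord);
    // gl_FragCoord.x is in pixels, so the middle of the screen is half the width
    float alpha = gl_FragCoord.x < 0.5 * u_resolution.x ? 0.5 : 1.0;
    gl_FragColor = vec4(color.rgb, color.a * alpha);
}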
I want to be able to apply procedural textures to faces. The first task where I faced such a demand was creating a billboard on which a nuclear blast in open space is drawn. I hoped to make it an animated radial gradient, and I have partly succeeded.
The main thing is for the fragment shader to have access to the UV coordinates.
It seems the key to rendering sprites is having access to the camera projection matrix in the vertex shader.
Here's an example: http://goo.gl/A7pY01
Now I want to draw this onto a billboard sprite. I planned to use THREE.Sprite with THREE.ShaderMaterial, but had no luck: it seems THREE.SpriteMaterial is the only material suitable for sprites, and after inspecting some source code I saw why sprites are drawn in one special way using plugins.
So, before I find myself reinventing the wheel, I need to ask: how can I put my own custom shader on my own custom sprite without hacking THREE.js?
So, after a little research and work, I concluded that THREE.ShaderMaterial is the best option for this little task. Thanks to /extras/renderers/plugins/SpritePlugin, I worked out how to form and position sprites using vertex shaders. I still have some questions, but I found one good solution.
To accomplish my task, I first create a simple plane geometry:
var geometry = new THREE.PlaneGeometry( 1, 1 );
And use it in a mesh with a ShaderMaterial:
uniforms = {
    cur_time: { type: "f", value: 1.0 },
    beg_time: { type: "f", value: 1.0 },
    scale: { type: "v3", value: new THREE.Vector3() }
};
var material = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: document.getElementById( 'vertexShader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
    transparent: true,
    blending: THREE.AdditiveBlending // It looks like a real blast with additive blending!!!
} );
var mesh = new THREE.Mesh( geometry, material );
Here are my shaders.
Vertex shader:
varying vec2 vUv;
uniform vec3 scale;
void main() {
    vUv = uv;
    float rotation = 0.0;
    vec3 alignedPosition = vec3( position.x * scale.x, position.y * scale.y, position.z * scale.z );
    vec2 rotatedPosition;
    rotatedPosition.x = cos( rotation ) * alignedPosition.x - sin( rotation ) * alignedPosition.y;
    rotatedPosition.y = sin( rotation ) * alignedPosition.x + cos( rotation ) * alignedPosition.y;
    vec4 finalPosition;
    finalPosition = modelViewMatrix * vec4( 0.0, 0.0, 0.0, 1.0 );
    finalPosition.xy += rotatedPosition;
    finalPosition = projectionMatrix * finalPosition;
    gl_Position = finalPosition;
}
I got the vertex shader from the original SpritePlugin source code and changed it slightly.
BTW, changing += to = makes the sprite screen-sticky. This wasted a lot of my time.
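For reference, the two variants from that note side by side:

finalPosition.xy += rotatedPosition; // billboard: the quad follows the object's position
finalPosition.xy  = rotatedPosition; // "screen-sticky": the quad ignores the object's position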
And this is my fragment shader:
uniform float cur_time;
uniform float beg_time;
varying vec2 vUv;
void main() {
    float full_time = 5000.;
    float time_left = cur_time - beg_time;
    float expl_step0 = 0.;
    float expl_step1 = 0.3;
    float expl_max = 1.;
    float as0 = 0.;
    float as1 = 1.;
    float as2 = 0.;
    float time_perc = clamp( (time_left / full_time), 0., 1. );
    float alphap;
    alphap = mix( as0, as1, smoothstep( expl_step0, expl_step1, time_perc ) );
    alphap = mix( alphap, as2, smoothstep( expl_step1, expl_max, time_perc ) );
    vec2 p = vUv;
    vec2 c = vec2( 0.5, 0.5 );
    float max_g = 1.;
    float dist = length( p - c ) * 2.;
    float step1 = 0.;
    float step2 = 0.2;
    float step3 = 0.3;
    vec4 color;
    float a0 = 1.;
    float a1 = 1.;
    float a2 = 0.7;
    float a3 = 0.0;
    vec4 c0 = vec4( 1., 1., 1., a0 * alphap );
    vec4 c1 = vec4( 0.9, 0.9, 1., a1 * alphap );
    vec4 c2 = vec4( 0.7, 0.7, 1., a2 * alphap );
    vec4 c3 = vec4( 0., 0., 0., 0. );
    color = mix( c0, c1, smoothstep( step1, step2, dist ) );
    color = mix( color, c2, smoothstep( step2, step3, dist ) );
    color = mix( color, c3, smoothstep( step3, max_g, dist ) );
    gl_FragColor = color;
}
Here's an example of how to make a multi-point gradient, animated by time. There's a lot to optimize, and I have several thoughts on how to make this even more beautiful.
But this one is almost what I wanted.
I intend to create a simple photo editor in JS. My main question is: is it possible to create filters that render in real time, for example adjusting brightness and saturation? All I need is a 2D image to which I can apply filters using the GPU.
All the tutorials I've read are very complex and don't really explain what the API means. Please point me in the right direction. Thanks.
I was going to write a tutorial and post it on my blog, but I don't know when I'll have time to finish it, so here's what I have. (There's a more detailed set of posts on my blog.)
WebGL is actually a rasterization library. It takes in attributes (streams of data) and uniforms (variables) and expects you to provide "clip space" coordinates in 2D and color data for pixels.
Here's a simple example of 2D in WebGL (some details left out; createShaderFromScriptElement and createProgram are helper functions):
// Get A WebGL context
var gl = canvas.getContext("experimental-webgl");
// setup GLSL program
vertexShader = createShaderFromScriptElement(gl, "2d-vertex-shader");
fragmentShader = createShaderFromScriptElement(gl, "2d-fragment-shader");
program = createProgram(gl, vertexShader, fragmentShader);
gl.useProgram(program);
// look up where the vertex data needs to go.
var positionLocation = gl.getAttribLocation(program, "a_position");
// Create a buffer and put a single clipspace rectangle in
// it (2 triangles)
var buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
    -1.0, -1.0,
     1.0, -1.0,
    -1.0,  1.0,
    -1.0,  1.0,
     1.0, -1.0,
     1.0,  1.0]), gl.STATIC_DRAW);
gl.enableVertexAttribArray(positionLocation);
gl.vertexAttribPointer(positionLocation, 2, gl.FLOAT, false, 0, 0);
// draw
gl.drawArrays(gl.TRIANGLES, 0, 6);
Here's the 2 shaders
<script id="2d-vertex-shader" type="x-shader/x-vertex">
attribute vec2 a_position;
void main() {
    gl_Position = vec4(a_position, 0, 1);
}
</script>
<script id="2d-fragment-shader" type="x-shader/x-fragment">
void main() {
    gl_FragColor = vec4(0, 1, 0, 1); // green
}
</script>
This will draw a green rectangle the entire size of the canvas.
In WebGL it's your responsibility to provide a vertex shader that outputs clip-space coordinates. Clip-space coordinates always go from -1 to +1 regardless of the size of the canvas. If you want 3D, it's up to you to supply shaders that convert from 3D to 2D, because WebGL is only a rasterization API.
As one simple example, if you want to work in pixels you can pass in a rectangle that uses pixels instead of clip-space coordinates and convert to clip space in the shader.
For example:
<script id="2d-vertex-shader" type="x-shader/x-vertex">
attribute vec2 a_position;
uniform vec2 u_resolution;
void main() {
    // convert the rectangle from pixels to 0.0 to 1.0
    vec2 zeroToOne = a_position / u_resolution;
    // convert from 0->1 to 0->2
    vec2 zeroToTwo = zeroToOne * 2.0;
    // convert from 0->2 to -1->+1 (clipspace)
    vec2 clipSpace = zeroToTwo - 1.0;
    gl_Position = vec4(clipSpace, 0, 1);
}
</script>
Now we can draw rectangles by changing the data we supply.
// set the resolution
var resolutionLocation = gl.getUniformLocation(program, "u_resolution");
gl.uniform2f(resolutionLocation, canvas.width, canvas.height);
// setup a rectangle from 10,20 to 80,30 in pixels
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
    10, 20,
    80, 20,
    10, 30,
    10, 30,
    80, 20,
    80, 30]), gl.STATIC_DRAW);
You'll notice WebGL considers the bottom left corner to be 0,0. To get the more traditional top left origin used for 2D graphics, we just flip the y coordinate.
gl_Position = vec4(clipSpace * vec2(1, -1), 0, 1);
If you want to manipulate images you need to pass in textures. In the same way the size of the canvas is represented by clip-space coordinates, textures are referenced by texture coordinates that go from 0 to 1.
<script id="2d-vertex-shader" type="x-shader/x-vertex">
attribute vec2 a_position;
attribute vec2 a_texCoord;
uniform vec2 u_resolution;
varying vec2 v_texCoord;
void main() {
    // convert the rectangle from pixels to 0.0 to 1.0
    vec2 zeroToOne = a_position / u_resolution;
    // convert from 0->1 to 0->2
    vec2 zeroToTwo = zeroToOne * 2.0;
    // convert from 0->2 to -1->+1 (clipspace)
    vec2 clipSpace = zeroToTwo - 1.0;
    gl_Position = vec4(clipSpace, 0, 1);
    // pass the texCoord to the fragment shader
    // The GPU will interpolate this value between points.
    v_texCoord = a_texCoord;
}
</script>
<script id="2d-fragment-shader" type="x-shader/x-fragment">
precision mediump float;
// our texture
uniform sampler2D u_image;
// the texCoords passed in from the vertex shader.
varying vec2 v_texCoord;
void main() {
    gl_FragColor = texture2D(u_image, v_texCoord);
}
</script>
Drawing an image requires loading the image, and since that happens asynchronously we need to change our code a little. Take all the code we had and put it in a function called "render":
var image = new Image();
image.src = "http://someimage/on/our/server"; // MUST BE SAME DOMAIN!!!
image.onload = function() {
    render();
};
function render() {
    ...
    // all the code we had before except gl.draw

    // look up where the texture coordinate data needs to go.
    var texCoordLocation = gl.getAttribLocation(program, "a_texCoord");

    // provide texture coordinates for the rectangle.
    var texCoordBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, texCoordBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
        1.0, 1.0,
        0.0, 1.0,
        0.0, 0.0,
        1.0, 1.0,
        0.0, 0.0,
        1.0, 0.0]), gl.STATIC_DRAW);
    gl.enableVertexAttribArray(texCoordLocation);
    gl.vertexAttribPointer(texCoordLocation, 2, gl.FLOAT, false, 0, 0);

    var texture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

    gl.drawArrays(gl.TRIANGLES, 0, 6);
}
If you want to do image processing you just change your shader. For example, to swap red and blue:
void main() {
    gl_FragColor = texture2D(u_image, v_texCoord).bgra;
}
Or blend with the pixels next to it:
uniform vec2 u_textureSize;
void main() {
    vec2 onePixel = vec2(1.0, 1.0) / u_textureSize;
    gl_FragColor = (texture2D(u_image, v_texCoord) +
                    texture2D(u_image, v_texCoord + vec2(onePixel.x, 0.0)) +
                    texture2D(u_image, v_texCoord + vec2(-onePixel.x, 0.0))) / 3.0;
}
And we have to pass in the size of the texture:
var textureSizeLocation = gl.getUniformLocation(program, "u_textureSize");
...
gl.uniform2f(textureSizeLocation, image.width, image.height);
Etc... Click the last link below for a convolution sample.
Here are working versions with a slightly different progression:
Draw Rect in Clip Space
Draw Rect in Pixels
Draw Rect with origin at top left
Draw a bunch of rects in different colors
Draw an image
Draw an image red and blue swapped
Draw an image with left and right pixels averaged
Draw an image with a 3x3 convolution
Draw an image with multiple effects
You can make a custom pixel shader for each operation you intend to use. Just learn some GLSL and follow the "Learning WebGL" tutorials to get a grasp of basic WebGL.
You can render your image with a shader, modify the parameters you expose to control the different visual styles, and then, when the user clicks "ok", read back the pixels to store the result as your current image.
Just remember to avoid cross-domain images, because they disable reading back the pixels.
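The read-back step might look like this (a sketch; it assumes the filtered image has just been rendered to the canvas and is read in the same frame, or that the context was created with preserveDrawingBuffer):

// Allocate RGBA bytes for the whole drawing buffer and read it back.
var pixels = new Uint8Array(gl.drawingBufferWidth * gl.drawingBufferHeight * 4);
gl.readPixels(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight,
              gl.RGBA, gl.UNSIGNED_BYTE, pixels);
// Note: rows come back bottom-up, so flip vertically before saving as an image.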
Also, check the quick reference card (PDF) for quick info on shader operations.
Just try glfx ( http://evanw.github.com/glfx.js/ ). I think it is exactly what you need. You can use a set of predefined shaders or easily add your own ;)
Enjoy! It is very easy with glfx:
<script src="glfx.js"></script>
<script>
window.onload = function() {
    // try to create a WebGL canvas (will fail if WebGL isn't supported)
    try {
        var canvas = fx.canvas();
    } catch (e) {
        alert(e);
        return;
    }
    // convert the image to a texture
    var image = document.getElementById('image');
    var texture = canvas.texture(image);
    // apply the ink filter
    canvas.draw(texture).ink(0.25).update();
    // replace the image with the canvas
    image.parentNode.insertBefore(canvas, image);
    image.parentNode.removeChild(image);
};
</script>
<img id="image" src="image.jpg">