I have created a primitive sphere using THREE.SphereGeometry and am applying a displacement shader to give it a bumpy effect. I am trying to animate the scale of the bumps with the volume from the microphone, but I am unable to pass my volume variable into the shader to affect the scale. When I log the volume variable, I can see that it is updating correctly from the microphone.
The dynamic variable is:
var volume = meter.volume * 1000.0;
function drawLoop(time) {
    rafID = window.requestAnimationFrame(drawLoop);
    var volume = meter.volume * 1000.0;
    //var volume = THREE.UniformsUtils.clone(meter.volume);
    console.log(typeof volume);
    THREE.DisplacementShader = {
        uniforms: {
            texture1: {
                type: "t",
                value: null
            },
            scale: {
                type: "f",
                value: 100 + volume
            },
            volume: {
                type: "f",
                value: meter.volume
            },
        },
        vertexShader: [
            "varying vec2 vUv;",
            "varying float noise;",
            "varying vec3 fNormal;",
            "uniform sampler2D texture1;",
            "uniform float scale;",
            "uniform float time;",
            "varying float volume;",
            "void main() {",
            "vUv = uv;",
            "fNormal = normal;",
            "vec4 noiseTex = texture2D( texture1, vUv );",
            "noise = noiseTex.r + time;",
            // adding the normal scales it outward
            // (normal scale equals sphere diameter)
            "vec3 newPosition = position + normal * noise * scale * (volume*100.0);",
            "gl_Position = projectionMatrix * modelViewMatrix * vec4( newPosition, 1.0 );",
            "}"
        ].join("\n"),
        fragmentShader: [
            "varying vec2 vUv;",
            "varying float noise;",
            "varying vec3 fNormal;",
            "void main( void ) {",
            // compose the colour using the normals, so
            // whatever is heightened by the noise is lighter
            "gl_FragColor = vec4( fNormal * noise, 1. );",
            "}"
        ].join("\n")
    };
}
You defined
varying float volume;
in your vertex shader. It should instead read
uniform float volume;
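Note also that the shader definition is being rebuilt inside drawLoop, which never reaches the material already applied to the sphere. Once volume is a uniform, a common pattern is to create the ShaderMaterial once and update the uniform's value in the loop. A minimal sketch, assuming the material is kept in a variable named sphereMaterial (a name not in the original code):
// created once, outside the loop
var sphereMaterial = new THREE.ShaderMaterial( THREE.DisplacementShader );
function drawLoop( time ) {
    rafID = window.requestAnimationFrame( drawLoop );
    // push the current microphone level into the shader every frame
    sphereMaterial.uniforms.volume.value = meter.volume;
    sphereMaterial.uniforms.scale.value = 100.0 + meter.volume * 1000.0;
}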
I have a plane geometry, and I'm creating a custom ShaderMaterial for it. It will receive some textures as uniforms. I'd like the textures to perfectly cover my plane (like the CSS background-size: cover property).
I managed to do it with a utility function when I used my textures with a MeshBasicMaterial:
cover( texture, aspect ) {
    var imageAspect = texture.image.width / texture.image.height;
    if ( aspect < imageAspect ) {
        texture.matrix.setUvTransform( 0, 0, aspect / imageAspect, 1, 0, 0.5, 0.5 );
    } else {
        texture.matrix.setUvTransform( 0, 0, 1, imageAspect / aspect, 0, 0.5, 0.5 );
    }
}
But unfortunately, since I'm using a ShaderMaterial, my "cover" function no longer applies. Am I forced to do it inside my fragment shader? If so, how can I reproduce this behavior?
Here's my code:
const vertexShader = `
precision highp float;
uniform mat3 uUvTransform;
varying vec2 vUv;
void main() {
    vUv = ( uUvTransform * vec3( uv, 1 ) ).xy;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}`;
const fragmentShader = `
precision highp float;
uniform sampler2D uText1;
varying vec2 vUv;
void main() {
    vec2 xy = vUv;
    vec4 color = texture2D(uText1, xy);
    gl_FragColor = color;
}`;
And here's my current result:
Thanks a lot
You could simply use a custom uniform, e.g.:
uniform sampler2D uText1;
uniform vec2 uUvScale;
varying vec2 vUv;
void main() {
    vec2 uv = (vUv - 0.5) * uUvScale + 0.5;
    gl_FragColor = texture2D(uText1, uv);
}
And:
var imageAspect = texture.image.width / texture.image.height;
if ( aspect < imageAspect ) {
    material.uniforms.uUvScale.value.set( aspect / imageAspect, 1 );
} else {
    material.uniforms.uUvScale.value.set( 1, imageAspect / aspect );
}
The way Three.js handles texture transformations like .offset, .repeat, .rotation, .center is via a Matrix3 that gets passed as a uniform into the vertex shader. The vertex shader performs the matrix multiplication, then passes the modified UVs as a varying to the fragment shader.
You can see that uniform being declared in the uv_pars_vertex.glsl.js file
You can see the transform being applied in the uv_vertex.glsl.js file
You could copy those lines of GLSL code to your ShaderMaterial's vertex shader, and I think the texture properties will come through in the Matrix3 automatically. However, if for some reason it doesn't, you could recreate the Matrix3 by copying it from the source and passing it as a uniform manually. I don't know what your utility function looks like, so it's hard to tell how you're achieving the desired scaling.
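If you go the manual route, a hedged sketch of passing the texture's Matrix3 yourself, reusing the uUvTransform uniform already declared in the question's vertex shader (and assuming it is also declared in the material's uniforms; only the aspect < imageAspect branch of cover() is shown for brevity):
// disable automatic updates so our manual transform is not overwritten
texture.matrixAutoUpdate = false;
texture.matrix.setUvTransform( 0, 0, aspect / imageAspect, 1, 0, 0.5, 0.5 );
// hand the Matrix3 to the shader as the mat3 uniform
material.uniforms.uUvTransform.value = texture.matrix;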
I'm trying to use a custom shader in three.js to make a single texture animate and expand. The problem is that when I multiply vUv by a certain number to make it expand, the bigger the number is, the smaller the rendered result appears, which is the opposite of what I expect. For example, when I multiply by 0.1, the result becomes 10 times bigger, and when I multiply by 10.0, it becomes 10 times smaller.
Here is my shader code (simplified to make the problem clear):
//vertex shader
varying vec2 vUv;
uniform float uFixAspect;
void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}

//fragment shader
precision mediump float;
uniform float time;
uniform vec2 resolution;
varying vec2 vUv;
uniform sampler2D uTex;
void main() {
    // I want the result to be 10 times smaller than the original, but it draws 10 times bigger than the original
    vec2 newvUv = vUv * mat2(0.1, 0.0, 0.0, 0.1);
    gl_FragColor = texture2D(uTex, newvUv);
}
And this is my three.js code:
const loader = new THREE.TextureLoader();
loader.load(
    "./assets/textures/tex.png",
    tex => {
        const geo = new THREE.PlaneGeometry(2, 2);
        const mat = new THREE.ShaderMaterial({
            uniforms: {
                uTex: {
                    value: tex
                },
                time: {
                    type: "f",
                    value: 0.1
                },
                resolution: {
                    type: "vec2",
                    value: new THREE.Vector2(512, 512)
                }
            },
            vertexShader: vert,
            fragmentShader: frag,
        });
        const shaderObj = new THREE.Mesh(
            geo,
            mat
        );
        marker.add(shaderObj);
    }
);
Is there a problem with my code, or is it an issue with three.js?
Thank you.
[...] when I multiply vUv by a certain number to make it expand, the bigger the number is, the smaller the rendered result appears [...]
Of course, because you scale the texture coordinates for the lookup, but not the texture.
You want to "scale down" the texture. The texture keeps the same size, so you have to take the texels from an "up-scaled" position.
Use the reciprocal of the scale factor:
float scale = 1.0 / 0.1; // reciprocal scale
vec2 newvUv = vUv * mat2( scale, 0.0, 0.0, scale );
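If the goal is to animate that factor over time, the scale could come from a uniform instead of a literal. A small sketch (the uniform name uScale is an assumption, not part of the original code):
// fragment shader
precision mediump float;
varying vec2 vUv;
uniform sampler2D uTex;
uniform float uScale;
void main() {
    // divide by the scale factor: larger uScale -> larger-looking texture
    vec2 newvUv = vUv * mat2( 1.0 / uScale, 0.0, 0.0, 1.0 / uScale );
    gl_FragColor = texture2D( uTex, newvUv );
}
// JavaScript: declare uScale: { value: 0.1 } alongside uTex, then per frame
// (wherever mat is accessible) update it, e.g.:
mat.uniforms.uScale.value = 0.1 + 0.05 * Math.sin( performance.now() / 1000.0 );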
I have a very simple object with a texture drawn onto it (using a shader). Everything works great, except that when rotating the object, the texture does not rotate along with it; it appears to stay in 2D screen space, creating the 'mask' effect below:
Texture not rotating along the object
When I use a regular material and attach the texture to it, all works fine, so I'm guessing I'm doing something wrong in the vertex shader.
I load the model the following way:
var loader = new THREE.JSONLoader();
loader.load( "models/cube.json", addModelToScene );
var texture = THREE.ImageUtils.loadTexture( "images/woods.jpg" );
texture.wrapS = THREE.RepeatWrapping;
texture.wrapT = THREE.RepeatWrapping;
texture.repeat.set( 1.0, 1.0 );
uniforms =
{
    time:
    {
        type: "f",
        value: 1.0
    },
    texture1: { type: "t", value: THREE.ImageUtils.loadTexture( "images/woods.jpg" ) }
};
function addModelToScene( geometry, materials ) {
    var material = new THREE.MeshFaceMaterial( materials );
    var shaderMaterial = new THREE.ShaderMaterial
    (
        {
            vertexShader: $('#vertexshader').text(),
            fragmentShader: $('#fragmentshader').text(),
            uniforms: uniforms
        }
    );
    model = new THREE.Mesh( geometry, shaderMaterial );
    model.scale.set( 2.5, 2.5, 2.5 );
    scene.add( model );
}
Vertex shader:
varying vec2 vUv;
#ifdef GL_ES
precision highp float;
#endif
uniform float time;
uniform sampler2D texture1;
void main()
{
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
Fragment shader:
varying vec2 vUv;
#ifdef GL_ES
precision highp float;
#endif
uniform sampler2D texture1;
uniform float time;
void main()
{
    vec2 u_resolution = vec2( 1700, 1000 );
    vec2 uv = gl_FragCoord.xy / u_resolution.xy;
    gl_FragColor = texture2D( texture1, uv );
}
And finally I rotate the object the following way:
model.rotation.z += 0.00013;
model.rotation.z += 0.004;
Why is the texture not moving with the object, but instead staying in a static position? Thanks!
That's because you should use the vUv varying to address the texture, not gl_FragCoord.xy:
gl_FragColor = texture2D(texture1, vUv);
gl_FragCoord.xy is just the pixel's (or, to be more accurate, the fragment's) coordinates on the screen (i.e., window coordinates). They don't depend on the rotation (or any transformation, for that matter) of your object, or on the object itself. They depend only on where the pixel currently being shaded lies on the screen.
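Put together, the corrected fragment shader could simply read:
varying vec2 vUv;
uniform sampler2D texture1;
uniform float time;
void main()
{
    gl_FragColor = texture2D( texture1, vUv );
}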
I am running into an issue where I am trying to apply a custom shader material to a series of nested objects to simulate a glow effect around each node. The effect works well on my home laptop (Windows 8.1, latest Chrome), but my work computer does not render the front face of the glowing object, only the back face. I have checked it against a few systems and it seems to be mostly a Chrome rendering issue on Windows devices.
http://i.imgur.com/uYLtoxm.gif
I have included a CodePen example where I shifted the glow off to the side, and you can see that in some versions it is not rendering the front group of normals. The red dots should each have a glow applied to them that shows up on the front and back (left and right in the example). Any help would be appreciated; I am stumped as to what is going on.
Here are the shader material settings:
local.glowNodeMat = new THREE.ShaderMaterial(
{
    uniforms:
    {
        "c": { type: "f", value: 0 },
        "p": { type: "f", value: 5.5 },
        glowColor: { type: "c", value: new THREE.Color(0xaaccff) },
        viewVector: { type: "v3", value: local.camera.position }
    },
    vertexShader: document.getElementById( 'vertexShader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentShader' ).textContent,
    side: THREE.FrontSide,
    blending: THREE.AdditiveBlending,
    transparent: true
});
http://codepen.io/sniejadlik/pen/oDarE
///////////////////
FIXED thanks to Volune. Thanks for the help!
Fixed Vertex shader
<script id="vertexShader" type="x-shader/x-vertex">
uniform vec3 viewVector;
uniform float c;
uniform float p;
varying float intensity;
void main()
{
vec3 vNormal = normalize( normalMatrix * normal );
vec3 vNormel = normalize( normalMatrix * viewVector );
// incorrect intensity = pow( c - dot(vNormal, vNormel), p );
intensity = pow( abs(c - dot(vNormal, vNormel) ), p );
gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
</script>
<!-- fragment shader a.k.a. pixel shader -->
<script id="fragmentShader" type="x-shader/x-vertex">
uniform vec3 glowColor;
varying float intensity;
void main()
{
vec3 glow = glowColor * intensity;
gl_FragColor = vec4( glow, 1.0 );
}
</script>
The error is in your vertex shader:
intensity = pow( c - dot(vNormal, vNormel), p );
You have c = 0 and p = 5.5. c - dot(vNormal, vNormel) may be negative (whenever the dot product is positive), and pow() is undefined in GLSL for a negative base with a non-integer exponent.
For some unknown reason, the fallback in Firefox behaves like pow( abs(...), p ), while the fallback in Chrome seems to be 0.0.
Try to fix your shader like this:
intensity = pow( abs( c - dot(vNormal, vNormel) ), p );
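Any variant that keeps the base of pow() non-negative avoids the undefined behaviour. For example (a variation not from the original answer), clamping instead of taking the absolute value also works, though it changes the look, since fragments where the dot product is positive then get zero intensity:
// alternative: clamp the base to zero instead of mirroring it
intensity = pow( max( c - dot(vNormal, vNormel), 0.0 ), p );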
I followed this post hoping to add my own spin on things. I noticed the example located here is using a very old revision of three.js (r49). When I changed the source file to a more up-to-date version, the texture no longer appears. See Demo
I've been spending a lot of time trying to figure out what deprecations occurred, and I've narrowed down my search to these lines.
// material
uniforms = {
sunDirection: { type: "v3", value: new THREE.Vector3(0,1,0) },
dayTexture: { type: "t", value: 0, texture: THREE.ImageUtils.loadTexture( "/images/world2.png" ) },
nightTexture: { type: "t", value: 1, texture: THREE.ImageUtils.loadTexture( "/images/world5.png" ) }
};
uniforms.dayTexture.texture.wrapS = uniforms.dayTexture.texture.wrapT = THREE.Repeat;
uniforms.nightTexture.texture.wrapS = uniforms.nightTexture.texture.wrapT = THREE.Repeat;
material = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: document.getElementById( 'vertexShader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentShader' ).textContent
} );
More miscellaneous code that probably has something to do with my problem:
<script id="fragmentShader" type="x-shader/x-fragment">
uniform sampler2D dayTexture;
uniform sampler2D nightTexture;
uniform vec3 sunDirection;
varying vec2 vUv;
varying vec3 vNormal;
void main( void ) {
    vec3 dayColor = texture2D( dayTexture, vUv ).rgb;
    vec3 nightColor = texture2D( nightTexture, vUv ).rgb;
    // compute cosine of sun direction to normal, so -1 is away from the sun and +1 is toward the sun
    float cosineAngleSunToNormal = dot(normalize(vNormal), sunDirection);
    // sharpen the edge between the transition
    cosineAngleSunToNormal = clamp( cosineAngleSunToNormal * 10.0, -1.0, 1.0);
    // convert to 0 to 1 for mixing
    float mixAmount = cosineAngleSunToNormal * 0.5 + 0.5;
    // select day or night texture based on mix
    vec3 color = mix( nightColor, dayColor, mixAmount );
    gl_FragColor = vec4( color, 1.0 );
    //gl_FragColor = vec4( mixAmount, mixAmount, mixAmount, 1.0 );
}
</script>
<script id="vertexShader" type="x-shader/x-vertex">
varying vec2 vUv;
varying vec3 vNormal;
void main()
{
    vUv = uv;
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    vNormal = normalMatrix * normal;
    gl_Position = projectionMatrix * mvPosition;
}
</script>
I've checked the migration docs here.
There isn't much on "uniforms" or "shaders", for that matter.
The Migration Wiki you referenced specifies a new pattern for assigning textures to uniforms as of r.51:
{ type: "t", value: 0, texture: map } => { type: "t", value: map }
So in your case, it would be:
dayTexture: { type: "t", value: THREE.ImageUtils.loadTexture( "/images/world2.png" ) },
nightTexture: { type: "t", value: THREE.ImageUtils.loadTexture( "/images/world5.png" ) }
three.js r.59
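As a side note (not part of the original answer), the wrap-mode assignments also need to change, since the old .texture property is gone. A sketch of the whole block after the change, using THREE.RepeatWrapping for the wrap constant:
uniforms = {
    sunDirection: { type: "v3", value: new THREE.Vector3( 0, 1, 0 ) },
    dayTexture: { type: "t", value: THREE.ImageUtils.loadTexture( "/images/world2.png" ) },
    nightTexture: { type: "t", value: THREE.ImageUtils.loadTexture( "/images/world5.png" ) }
};
// wrap settings now go through .value, which holds the texture itself
uniforms.dayTexture.value.wrapS = uniforms.dayTexture.value.wrapT = THREE.RepeatWrapping;
uniforms.nightTexture.value.wrapS = uniforms.nightTexture.value.wrapT = THREE.RepeatWrapping;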