Converting RGBA values into one integer in JavaScript - javascript

I can already convert 32-bit integers into their RGBA values like this:
pixelData[i] = {
    red: pixelValue >> 24 & 0xFF,
    green: pixelValue >> 16 & 0xFF,
    blue: pixelValue >> 8 & 0xFF,
    alpha: pixelValue & 0xFF
};
But I don't really know how to reverse it.

To reverse it, you combine the four bytes back into a single integer: shift each channel into position and add them together.
var rgb = (red << 24) + (green << 16) + (blue << 8) + (alpha);
Alternatively, to make it safer, you could first AND each of them with 0xFF:
var r = red & 0xFF;
var g = green & 0xFF;
var b = blue & 0xFF;
var a = alpha & 0xFF;
var rgb = (r << 24) + (g << 16) + (b << 8) + (a);
(You may use bitwise OR | instead of + here; the outcome is the same, since the shifted values never have overlapping bits.)
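Putting the round trip together as runnable code (a sketch; `packRGBA` is an illustrative name, and `>>> 0` is added because a red value of 128 or more sets the sign bit of the 32-bit result):

```javascript
// Pack four 0-255 channels into one unsigned 32-bit integer.
function packRGBA(red, green, blue, alpha) {
  // >>> 0 reinterprets the signed 32-bit result as unsigned,
  // so e.g. red = 0xAA does not yield a negative number.
  return (((red & 0xFF) << 24) |
          ((green & 0xFF) << 16) |
          ((blue & 0xFF) << 8) |
          (alpha & 0xFF)) >>> 0;
}

var pixelValue = packRGBA(0xAA, 0xBB, 0xCC, 0xDD);
console.log(pixelValue.toString(16)); // "aabbccdd"
```

Unpacking with the shifts from the question then recovers the original channels.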

Related

How to convert RGB24 to RGB for web?

I have some colors in RGB24 format, for example: 1581613
How do I convert it to RGB for web usage?
If I understand correctly, the format is as follows: RRRGGGBB.
The last code I tried to run is:
const color = 1581613;
const red = (color >> 5) * 255 / 7;
const green = ((color >> 2) & 0x07) * 255 / 7;
const blue = (color & 0x03) * 255 / 3;
But my attempts did not lead to success.
I'm guessing a lot here, but here goes:
const color = 1581613;
const red = (color >> 16) & 255;
const green = (color >> 8) & 255;
const blue = color & 255;
const hex = red.toString(16) + green.toString(16) + blue.toString(16);
document.write(JSON.stringify({rgb: [red, green, blue], hex }, null, 2));
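One caveat with the `hex` line above: `toString(16)` does not zero-pad, so any channel below 16 produces a single digit (red = 5 would give "5", not "05"). A padded variant might look like this (`toHex` is an illustrative helper; assumes an environment with `String.prototype.padStart`):

```javascript
// Zero-pad each channel to two hex digits.
const toHex = (channel) => channel.toString(16).padStart(2, '0');

const color = 1581613; // 0x18222d
const red = (color >> 16) & 255;
const green = (color >> 8) & 255;
const blue = color & 255;
const hex = '#' + toHex(red) + toHex(green) + toHex(blue);
console.log(hex); // "#18222d"
```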

JavaScript conversion to Float32Array and back for WebGL readPixels

I am storing an id (a value within 24-bit range) in a Float32Array(3) for later retrieval in WebGL:
var r = 0,
    g = 0,
    b = id;
if (b >= 65536) {
    r = ~~(b / 65536);
    b -= r * 65536;
}
if (b >= 256) {
    g = ~~(b / 256);
    b -= g * 256;
}
var fa = new Float32Array([r/255, g/255, b/255]);
For the sake of completeness, here is how I am using that value:
gl.uniform3fv(uniforms['u_id'], fa);
...and this is how I get my id back from WebGLRenderingContext.readPixels():
var result = new Uint8Array(4);
gl.readPixels(x, y, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, result);
var id = result[0] << 16 | result[1] << 8 | result[2];
What is the correct way to split that value into my Float32Array? I strongly believe this could be done more efficiently and elegantly (what I have works, but it really hurts my eyes).
id has a form like this:
0000 0000 rrrr rrrr gggg gggg bbbb bbbb
A part (r, g or b) can be extracted by putting it in the lowest byte and masking the rest away. Sometimes one of those steps is unnecessary. So:
b = id & 255 // b is already in the low byte, so no shift
g = (id >> 8) & 255
r = id >> 16 // no mask because there is nothing "above" r
Combined with the division by 255 and collected into an array, this becomes:
[(id >> 16) / 255, ((id >> 8) & 255) / 255, (id & 255) / 255]
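Wrapped up as a function (a sketch; `idToFloat32` is an illustrative name, not part of the WebGL API):

```javascript
// Split a 24-bit id into three normalized floats for a vec3 uniform.
function idToFloat32(id) {
  return new Float32Array([
    (id >> 16) / 255,        // r: top byte, nothing above it to mask away
    ((id >> 8) & 255) / 255, // g: middle byte
    (id & 255) / 255         // b: already in the low byte, no shift
  ]);
}

// Reassemble the id the same way the readPixels code does.
var fa = idToFloat32(0x12ABCD);
var id = Math.round(fa[0] * 255) << 16 |
         Math.round(fa[1] * 255) << 8 |
         Math.round(fa[2] * 255);
console.log(id.toString(16)); // "12abcd"
```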

JSXGraph: How to label polygon borders?

Does anyone know how to place labels on arbitrary borders of a polygon, when using JSXGraph?
I'm looking to implement something like this:
And I am creating a polygon like so (the script is interpreted via board.jc.parse):
A = point(-5,-5) << withLabel:false, visible:false>>;
B = point(-5,5) << withLabel:false, visible:false>>;
C = point(5,5) << withLabel:false, visible:false>>;
D = point(5, -5) << withLabel:false, visible:false>>;
polygon(A,B,C,D);
I'm thinking I could do something like this (put a label on the point and then nudge it over by a few pixels), but... blech, that is ugly. I'd rather attach the label to the side of the polygon or to the lines themselves.
// Don't want to do it this way
text(A.X(), A.Y(), 'label') << id: 'TT1' >>;
In JessieCode / JSXGraph, labels for the borders of a polygon can be set via the 'borders' attribute sub-object:
A = point(-5, -5) << withLabel:false, visible:false>>;
B = point(-5, 5) << withLabel:false, visible:false>>;
C = point(5, 5) << withLabel:false, visible:false>>;
D = point(5, -5) << withLabel:false, visible:false>>;
polygon(A,B,C,D) <<
    borders: <<
        names: ['a', 'b', 'c', 'd'],
        withLabel: true
    >>
>>;

Changing opacity in hexadecimal doesn't work (heat map stop gradient)

I have various hexadecimal RRGGBBAA colors as stop values in a heat map gradient, but I have noticed that setting different alpha values for some of the stops doesn't change the opacity in my code; I always get the same view (although setting the last two hex digits to 00 does work as 0.0 opacity, for some reason). The RRGGBBAA values are written like this:
0xaa00007f (the last two hex digits, 7f, should be 0.5 opacity)
0xaa0000ff (ff is 1.0 opacity)
The setGradientStops function that takes the stop values looks like this (it is from a heat map library, not my code):
setGradientStops: function(stops) {
    var ctx = document.createElement('canvas').getContext('2d');
    var grd = ctx.createLinearGradient(0, 0, 256, 0);
    for (var i in stops) {
        grd.addColorStop(i, 'rgba(' +
            ((stops[i] >> 24) & 0xFF) + ',' +
            ((stops[i] >> 16) & 0xFF) + ',' +
            ((stops[i] >> 8) & 0x7F) + ',' +
            ((stops[i] >> 0) & 0x7F) + ')');
    }
    ctx.fillStyle = grd;
    ctx.fillRect(0, 0, 256, 1);
    this.gradient = ctx.getImageData(0, 0, 256, 1).data;
}
The problem is that the alpha component of rgba() expects a value in the range 0 - 1, and there you are passing a value in the range 0 - 127. I would try...
grd.addColorStop(i, 'rgba(' +
    ((stops[i] >> 24) & 0xFF) + ',' +
    ((stops[i] >> 16) & 0xFF) + ',' +
    ((stops[i] >> 8) & 0xFF) + ',' +
    (((stops[i] >> 0) & 0xFF) / 255) + ')');
This takes all of the bits of the alpha part (rather than almost all of them) by ANDing with 0xFF instead of 0x7F. So...
0xFF (11111111) & 0xFF (11111111) = 0xFF (11111111) = 255
Rather than...
0xFF (11111111) & 0x7F (01111111) = 0x7F (01111111) = 127
Now that the value is in the range 0 - 255, dividing by 255 brings it into the required range:
0xFF / 255 = 1, 0x7F / 255 = 0.498, 0x00 / 255 = 0
So then for 0xaa00007f, grd.addColorStop would be given the string 'rgba(170,0,0,0.498)'
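The same unpacking can be written as a standalone helper (a sketch; `hexToRgbaString` is an illustrative name). The unsigned shift `>>>` matters here because values like 0xaa00007f are negative when reinterpreted as signed 32-bit integers:

```javascript
// Convert a 0xRRGGBBAA integer into a CSS rgba() string.
function hexToRgbaString(value) {
  var r = (value >>> 24) & 0xFF;
  var g = (value >>> 16) & 0xFF;
  var b = (value >>> 8) & 0xFF;
  var a = (value & 0xFF) / 255; // scale alpha into the 0-1 range
  return 'rgba(' + r + ',' + g + ',' + b + ',' + a.toFixed(3) + ')';
}

console.log(hexToRgbaString(0xaa00007f)); // "rgba(170,0,0,0.498)"
```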

javascript shifting issue (rgb and rgba to hex)

I found an RGB-to-hex converter and I'm trying to turn it into an RGBA-to-hex converter. The original rgb2hex function works, but the new rgba2hex function does not. What am I doing wrong? The rgba2hex function returns only gba, with no r.
// convert RGB color data to hex
function rgb2hex(r, g, b) {
    if (r > 255 || g > 255 || b > 255)
        throw "Invalid color component";
    return ((r << 16) | (g << 8) | b).toString(16);
}
// convert RGBA color data to hex
function rgba2hex(r, g, b, a) {
    if (r > 255 || g > 255 || b > 255 || a > 255)
        throw "Invalid color component";
    return ((r << 32) | (g << 16) | (b << 8) | a).toString(16);
}
Example:
alert(rgb2hex(255, 155, 055));
alert(rgba2hex(255, 155, 055, 255));
Current output: ff9b2d and 9b2dff
Expected output: ff9b2d and ff9b2dff
Your issue is that JavaScript's bitwise operators work on 32-bit integers and shift counts are taken modulo 32, so r << 32 is the same as r << 0, i.e. just r — that's why the red byte vanishes. You can't build the full 32-bit RGBA value in a single shift expression; handle the leading byte separately:
// convert RGBA color data to hex
function rgba2hex(r, g, b, a) {
    if (r > 255 || g > 255 || b > 255 || a > 255)
        throw "Invalid color component";
    return (256 + r).toString(16).substr(1) +
        (((1 << 24) + (g << 16)) | (b << 8) | a).toString(16).substr(1);
}
This also fixes an issue with the original algorithm: when a leading component is below 0x10, its hex form is a single digit and the output doesn't have enough digits.
That said, the result may not work anyway: #ff9b2dff isn't a valid color in older CSS (8-digit hex notation with alpha only arrived in CSS Color Level 4), but you may not care?
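If the 32-bit shift limit is the only obstacle, an alternative sketch sidesteps it with an unsigned right shift and pads with `padStart` (assumes `String.prototype.padStart`, i.e. ES2017 or newer):

```javascript
function rgba2hex(r, g, b, a) {
  if (r > 255 || g > 255 || b > 255 || a > 255)
    throw "Invalid color component";
  // >>> 0 reinterprets the signed 32-bit result as unsigned;
  // padStart guarantees all eight hex digits are present.
  return (((r << 24) | (g << 16) | (b << 8) | a) >>> 0)
    .toString(16).padStart(8, '0');
}

console.log(rgba2hex(255, 155, 45, 255)); // "ff9b2dff"
```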
Another answer pointed to a CodePen demo (code not reproduced here) that accepted these input formats:

RGB to HEX
rgb(0 255 94)
rgb(0, 255, 94)

RGBA to HEX
rgba(255,25,2,0.5)
rgba(255 25 2 / 0.5)
rgba(50%,30%,10%,0.5)
rgba(50%,30%,10%,50%)
rgba(50% 30% 10% / 0.5)
rgba(50% 30% 10% / 50%)
