JavaScript - Incorrect division

perc = 15/30;
//result=Math.round(perc*100)/100 //returns 28.45
$('#counter').text(perc);
$('#total').text(count);
This returns 0.5, which is supposed to be 50.00%... how do I fix this? :S

You do realize that the word percent quite literally translates to "per cent" or "per 100", since cent is the Latin root used everywhere to mean "100" or "one-hundredth":
Century (100 years)
US Cent (100th of a dollar)
Centurion (Those who commanded 100 soldiers)
Centipede (creature with 100 legs)
So 50% becomes 50 per cent becomes 50 per 100
And since, in mathematical terms, the word per means divide (miles per hour == mph == m/h), we can distill 50% down to:
50/100
Which, surprisingly enough, is represented as the decimal number .5

15/30 = 0.5
If you want a percentage, you have to multiply the result by 100.
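For the snippet in the question, a minimal sketch of that fix could look like this (reusing the #counter element and jQuery call from the question):
var perc = 15 / 30;                    // 0.5
var percent = (perc * 100).toFixed(2); // multiply by 100, keep two decimals: "50.00"
$('#counter').text(percent + '%');     // displays 50.00%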

I am a low rep user so here goes. http://en.wikipedia.org/wiki/Percentage
Treat the % sign as a constant equal to 0.01. Thus, when working with a number like 50%, treat it as 50 * 0.01 or 0.5.
0.5 = n % // I want to know what 0.5 is as a percent
0.5 / % = n * % / % // Divide both sides by the constant
0.5 / % = n // Remove the excess
0.5 / 0.01 = n // Replace the constant
50 = n // You have your answer

Just multiply by 100.


Truncate to the nearest 50 in JavaScript

What JavaScript formula can I use to truncate a number to the nearest 50?
Example: I want 498 to become 450.
I have tried
Math.round(498, 50)
and
Math.ceil(498, 50)
but am not getting the result I want. Please help.
This may be a mixup of terminology, mixing terms like "nearest" and "truncate", neither of which quite describes what the example demonstrates.
The example you give always rounds down, never up, to the nearest custom value (in this case 50). To do that you can just subtract the result of % 50. For example:
const val = 498;
console.log(val - val % 50);
Even make it a re-usable function:
const Nearest = (val, num) => val - val % num;
console.log(Nearest(498, 50));
Divide by 50, do the operation, multiply by 50.
console.log(Math.floor(498 / 50) * 50);
console.log(Math.ceil(498 / 50) * 50);
console.log(Math.round(498 / 50) * 50);
console.log(Math.trunc(498 / 50) * 50);
You divide your number by 50, take the ceiling of that number and then multiply it by 50.
Math.ceil(value / 50) * 50;
A quick sidenote: truncate has a whole other meaning for numbers in JavaScript: Math.trunc on MDN
Edit:
If you want other rounding semantics than ceil you can of course use floor (always goes to lowest multiple of 50):
Math.floor(451 / 50) * 50; // => 450
You divide by the multiple and round and then multiply by the multiple. If you want the lower bound, you use floor instead of round. If you want the upper bound, you use ceil instead of round. Look at these examples:
let x = 498;
let y = Math.round(x / 50) * 50;
console.log(y); // 500
y = Math.floor(x / 50) * 50;
console.log(y); // 450
y = Math.ceil(x / 50) * 50;
console.log(y); // 500
To do what you want, the Remainder operator is your best friend. This will give you whatever is left over after dividing the number by the nearest number.
If your goal is to always round down, the following function would work. Just take your original number, find the remainder, and remove the remainder:
function roundDownToNearest(num, nearest) {
  return num - (num % nearest);
}
console.log(roundDownToNearest(498, 50)); // 450
If you always want to round up, you round down, then add the nearest amount:
function roundUpToNearest(num, nearest) {
  return num - (num % nearest) + nearest;
}
console.log(roundUpToNearest(498, 50)); // 500
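One caveat worth noting (my own addition, not part of the original answer): with this version, a value that is already an exact multiple gets bumped up to the next one, e.g. roundUpToNearest(450, 50) returns 500. If exact multiples should stay put, a small Math.ceil-based variation would be:
function roundUpToNearestCeil(num, nearest) {
  // Math.ceil leaves exact multiples unchanged: 450 -> 450, 498 -> 500
  return Math.ceil(num / nearest) * nearest;
}
console.log(roundUpToNearestCeil(450, 50)); // 450
console.log(roundUpToNearestCeil(498, 50)); // 500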
If you want to get to the closest of the two, you could do the following. Find your remainder, then see if it's greater or less than half of your nearest value. If it's greater, round up. If less, round down.
function roundToNearest(num, nearest) {
  if (num % nearest > nearest / 2) {
    return roundUpToNearest(num, nearest);
  } else {
    return roundDownToNearest(num, nearest);
  }
}
console.log(roundToNearest(498, 50)); // 500
console.log(roundToNearest(458, 50)); // 450

Generating a random number in JavaScript [duplicate]

This question already has answers here:
Javascript Random Number?
(3 answers)
Closed 3 years ago.
I am using this function to generate a random number between 100 and 1000.
But here, according to me, in (max - min) + min, max - min = 900 and min = 100, so shouldn't it only generate numbers between 100 and 900? Yet it also returns numbers greater than 900. How? I am confused. And do tell how to check the range of numbers the random function is generating. Any help with this?
x = Math.floor(Math.random() * (1000 - 100) + 100);
console.log(x);
The formula for random numbers Math.random() * (max - min) + min is the correct one to get a uniformly distributed number between min and max.
max - min will give you the range in which you want to generate the random numbers. So in this case 1000 - 100 results in a range of 900.
Multiplying by Math.random() gives you a random number within that range. So, with Math.random() producing 0.5, after multiplying you get 450.
Finally, adding min back to the random pick ensures the number you get is within the bounds of min and max.
For example, if Math.random() produces 0.01, substituting into the formula gives 0.01 * (1000 - 100) = 9, which is below min. Conversely, if Math.random() produces a value close to 1, then 1 * (1000 - 100) = 900, which is the highest number possible from the range and still below max. In both cases, adding min to the result ensures the random number you get is within min and max.
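To answer the side question about how to check the range: one quick sanity check (just a sketch) is to generate a large number of samples and track the smallest and largest values produced:
let lowest = Infinity, highest = -Infinity;
for (let i = 0; i < 100000; i++) {
  const x = Math.floor(Math.random() * (1000 - 100) + 100);
  lowest = Math.min(lowest, x);
  highest = Math.max(highest, x);
}
console.log(lowest, highest); // expect roughly 100 and 999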
The function Math.random() returns a number between 0 and 1.
When use "Math.random() * (1000 - 100)", this part of the code generates a number between 0 and 1 then multiplies it by 900, which will give you a number between 0 and 900.
Now in the last block you do add 100 to the previously generated number which results in a number between 0 and 900 + 100, which gives a result between 100 and 1000.
function random(min, max) {
  console.log("Multiplying by: " + (max - min));
  console.log("And adding: " + min);
  return Math.floor(Math.random() * (max - min) + min);
}
console.log(random(100, 1000));
Multiply by (1000 - 200) instead, as you already have the +100.
Because in case the random number generated is anything greater than 800, you end up exceeding the range, since you're adding 100 to it every time.
x = Math.floor(Math.random() * (1000 - 200) + 100);
console.log(x);
Rule of thumb:
Math.floor(Math.random() * (max - (2 * min)) + min)
As Math.random() generates floats, the result needs to be converted to an integer.
We can use parseInt(), but there is a shorthand: the double bitwise NOT operator ~~, which truncates towards zero. Its performance is known to be excellent.
console.log(
  100 + ~~(Math.random() * 800)
);
One possible alternative is the Web Crypto API; it might be a bit slower, but with the best randomness available. A single Uint8Array element filled by crypto.getRandomValues() holds an integer between 0 and 255.
console.log(
  100 + ~~(crypto.getRandomValues(new Uint8Array(1))[0] * 3.13)
);

Math: Formula to scale a number to another number

I asked a similar question earlier today, and it turns out that I just suck at math, because I can't figure this one out, either.
I'm calculating the screen ratio via width/height. I need a function to convert that resulting number to a new scale.
e.g.
function convertNum(ratio) {
return //formula here
}
Examples:
Given a resolution of 3000x1000 = ratio of 3 (i.e. 3000/1000).
I want it converted to 133.3 via the function, e.g. convertNum(3) spits out 133.33
2500x1000 = 2.5 (desired result: 100)
2000x1000 = 2 (desired result: 66.6)
1500x1000 = 1.5 (desired result: 33.3)
1000x1000 = 1 (desired result: 0)
It should keep scaling this way for all screen ratios above 1.0.
You need to add an additional 33.3% for every 0.5 in the ratio.
First figure out how many "padding pieces" you need to add:
// Subtracting 1 since 1 should result in a 0
(ratio - 1) / 0.5
Then multiply the number of padding pieces by the padding amount:
((ratio - 1) / 0.5) * 0.333
But dividing by 0.5 is the same thing as multiplying by 2, so it can be further reduced down to:
(ratio - 1) * 2 * 0.333
But that's obviously the same as:
(ratio - 1) * 0.666
Although, you could get more precision by changing that to:
(ratio - 1) * (2 / 3)
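Putting that together as the convertNum function from the question (a sketch; since the desired outputs such as 133.3 and 66.6 are 100 times the fractional result derived above, the factor is scaled up to 200/3 here):
function convertNum(ratio) {
  return (ratio - 1) * (200 / 3);
}
console.log(convertNum(3));   // 133.33...
console.log(convertNum(2.5)); // 100
console.log(convertNum(2));   // 66.66...
console.log(convertNum(1.5)); // 33.33...
console.log(convertNum(1));   // 0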

Calculate the percentage of a value in a logarithmic scale

I have a logarithmic scale going from 0 to 100:
0.00
0.10
1.00
10.00
100.00
I need to make a pie chart which has 4 quarters.
the first goes from 0 to 0.10,
the second from 0.10 to 1, and so on.
So if I have the value 25, I need to calculate what percentage this is on the logarithmic scale. Considering the scale, it should end up somewhere in the last quarter of the chart.
Unfortunately my understanding of maths does not reach this far ;)
Could you help out and tell me where to start?
I thought of looking at each quarter as a 100% piece, and then calculating where the value falls in that quarter.
For example:
32 > 10, so it should be in the last quarter (percentage-wise, above 75%).
So in this last quarter 32 will be in:
((32-10) x 100) / (100 - 10) = 24.44% in this quarter
Making this 24.44 / 4 = 6.11% over 4 quarters and thus 75 + 6.11 = 81.11% of the whole chart.
Now this would work, but I am looking for a shorter and simpler way of calculating this.
Can you please help out?
This is surely a maths question about plotting values on a logarithmic scale, and not really anything to do with JavaScript in particular or programming in general. Anyhow ...
You need to decide on a minimum value, since the logarithm of zero is undefined. Once you have your maximum and minimum logarithms, you can scale your values as you wish. Slightly on topic: JavaScript has Math.log10 in more up-to-date engines (and it can be readily defined if not, e.g. as in @NinaScholz's answer or using the polyfill here).
var minval = 0.01,
    maxval = 100,
    minlog = Math.log10(minval),
    maxlog = Math.log10(maxval),
    range = maxlog - minlog,
    lineartolog = function (n) {
      return (Math.log10(n) - minlog) / range;
    },
    logplots = [0.01, 0.1, 1, 3.2, 10, 32, 75, 100].map(lineartolog);

document.body.innerHTML = '<pre>' + logplots + '</pre>';
Adjust as required for percentages, radians, etc.
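As a rough usage sketch (my own numbers, based on the code above): with this 0.01 to 100 scale, the value 32 from the question lands at roughly 87.6% of the way around the chart:
console.log(lineartolog(32) * 100); // about 87.63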
Consideration:

     q0        q1        q2        q3
|         |         |         |         |
0.01      0.1       1         10        100
               0.32      3.2       32

because log10(32)   =  1.505 =  1 + 0.505
because log10(3.2)  =  0.505 =  0 + 0.505
because log10(0.32) = -0.495 = -1 + 0.505

The integer part (after adding 2) gives the quadrant, and the fractional part is the amount to fill. Together, fill is the amount in %:
function log10(f) {
  return Math.log(f) / Math.log(10);
}

function getValue(v) {
  var l = log10(v),
      quadrant = Math.floor(l) + 2,
      fill = (l - Math.floor(l)) * 100;
  return { quadrant: quadrant, fill: fill };
}

console.log('0.32', getValue(0.32));
console.log('3.2', getValue(3.2));
console.log('32', getValue(32));
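To turn that into the single overall percentage the question asks for (my own small addition, assuming each quadrant is one quarter of the pie), each quadrant contributes 25% and the fill gives the position within that quadrant's share:
function chartPercent(v) {
  var r = getValue(v);
  return r.quadrant * 25 + r.fill / 4; // quadrant share plus position inside it
}
console.log(chartPercent(32)); // about 87.6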

How to determine the number that when multiplied will get you the closest to a target number?

Let's say I have the number 2062 and the multiplier is 0.75
What is the JavaScript formula to find which number that, when multiplied by 0.75, will come the closest to 2062?
PS: "Closest" here means either equal (==) or very close but below the target number, not very close but above it.
You are looking for x in x * 0.75 = 2062. So solving for x that should be x = 2062 / 0.75. To ensure that the number is the closest whole number less than or equal to x, you can use Math.floor:
Math.floor(2062 / 0.75) = 2749
function findFactor(a, b) {
  return Math.floor(a / parseFloat(b));
}
findFactor(2062, 0.75); // -> 2749
https://jsfiddle.net/0s8cr5gd/
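As a quick sanity check (a sketch of my own), multiplying the result back confirms it lands just below the target:
console.log(2749 * 0.75);       // 2061.75, just below 2062
console.log((2749 + 1) * 0.75); // 2062.5, already above the target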
