Addition and Subtraction with Negative Numbers - javascript

I tried adding and subtracting negative numbers with this code
var num1 = parseInt(document.form1.num1.value);
var num2 = parseInt(document.form1.num2.value);
if (operand == "plus") {
    var sum = parseInt(num1 + num2);
    // add alerts to check
    alert(num1);
    alert(num2);
    alert(sum);
} else {
    var sum = parseInt(num1 - num2);
}
but when I print the result (sum), the program ignores the negative number and just counts it as if it were positive. I tried deleting the parseInt but nothing changed.
For those who are confused: my inputs are num1 and num2. Using the code above, if I input (4) and (-2) and choose the plus sign, sum = 6. The negative number is not counted as negative, but as positive.
Update: apparently even if I input (-2), it is saved as (2).

Assuming num1 and num2 are strings, what you should do is parseInt(num1) + parseInt(num2).
It seems your problem is that you're applying a double negative, which makes a positive:
4 - -2 == 4 + 2
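For reference, parseInt preserves a leading minus sign, so a minimal sketch of the working plus/minus branches (assuming operand and the form fields are exactly as in the question) could look like this:
// parseInt keeps the sign: parseInt("-2", 10) === -2
var num1 = parseInt(document.form1.num1.value, 10);
var num2 = parseInt(document.form1.num2.value, 10);
var sum;
if (operand == "plus") {
    sum = num1 + num2; // 4 + (-2) === 2
} else {
    sum = num1 - num2; // 4 - (-2) === 6
}
alert(sum);
If the minus sign still disappears, it is probably being stripped before this code runs (for example, by the input field itself), rather than by parseInt.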


Extending a Variable's Length With Random Numbers Doesn't Work

I am making an AI that tries to solve a user-inputted "numberle" in JavaScript. I don't want the user to do extra work just to see an AI do its thing, so on the input field, if the user inputs a number that has fewer than 5 digits, the JavaScript should add random numbers at the end of the variable until it has a total of five digits.
I used every kind of loop I have experience with, inside an if statement, so that if the length of the input is less than 5 (say 3), the loop adds 5 minus the number of digits of the input (here, 2) random digits, using Math.random.
Here is the code I currently have:
if (input.length < 5) {
    do {
        input = (input * 10) + Math.floor(Math.random() * 9) + 1;
    } while (input.length < 5);
}
console.log(input);
I have also used for and while loops with basically the same condition (obviously modified for each kind of loop; I made a variable for input.length so that it keeps the same value).
Here is what I get in the console:
5 // Inputted number (1 digit)
52 // Inputted digit + random number
As you can see, the loop only runs once, although it should've run 3 more times. I am also using strict mode. My code editor is github.dev, and I am using the CodeSwing console.
If input is a number, it will not have a length property, since it is not a string.
You can achieve the desired result like this:
let input = 5;
let digits = 2;
while (input < 10**(digits-1)) input = ~~((input+Math.random())*10);
console.log(input);
Note that ~~ is a compact way of doing Math.floor() for non-negative numbers (it truncates toward zero).
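For example:
console.log(~~4.7, Math.floor(4.7));   // 4 4
console.log(~~-4.7, Math.floor(-4.7)); // -4 -5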
Alternatively, without a while loop:
let input = 5, digits = 2, M = Math, L = x => ~~(M.log(x) / M.log(10)) + 1;
input = ~~((input+M.random())*10**(digits - L(input)));
console.log(input);
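Since the value comes from an input field, another option is to keep it as a string and pad it with random digits; a minimal sketch (the target length of 5 matches the question):
let input = String(5); // e.g. the raw value of the input field
while (input.length < 5) {
    input += Math.floor(Math.random() * 10); // appends a random digit 0-9
}
console.log(input); // e.g. "53817"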

Adding Two Decimal Places using JavaScript

Good day Everyone!
I want to know how to return the output with two decimal places. Instead of 10,000 I want it to return 10,000.00. I already put .toFixed(2), but it's not working.
When the amount has a decimal part other than zero, the values appear on the printout, but when the decimal part is zero, the zeros won't appear on the printout.
Also, I have added a Wtax value that was pulled from a "Bill Credit" transaction.
Numeral.js is a library that you can use for number formatting.
With that you can format your number as follows:
numeral(10000).format('$0,0.00');
Hope this will help you.
You can try this:
var x = 1000; // Raw input
x.toFixed(2).replace(/(\d)(?=(\d{3})+\.)/g, '$1,') //returns you 1,000.00
Alternately, you can use NetSuite's currency function too:
nlapiFormatCurrency('1000'); // returns you 1,000.00
nlapiFormatCurrency('1000.98'); // returns you 1,000.98
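Outside of NetSuite, the built-in toLocaleString method can produce the same grouped, two-decimal output without any library; a minimal sketch:
var x = 10000;
// Forces exactly two decimal places plus thousands separators: "10,000.00"
console.log(x.toLocaleString('en-US', { minimumFractionDigits: 2, maximumFractionDigits: 2 }));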
You might consider the code below. It can round off decimal values to a given number of decimal places.
It also addresses an issue when rounding off negative values, by taking the absolute value before rounding. Without that step, a value like -0.995 rounds incorrectly.
function roundDecimal(decimalNumber, decimalPlace) {
    // Make sure the rounding is correct even if the value is something like -0.995
    var bIsNegative = false;
    if (decimalNumber < 0) {
        decimalNumber = Math.abs(decimalNumber);
        bIsNegative = true;
    }
    var fReturn = 0.00;
    // Default to 0 decimal places when none is supplied
    decimalPlace = (decimalPlace == null || decimalPlace == '') ? 0 : decimalPlace;
    var multiplierDivisor = Math.pow(10, decimalPlace);
    fReturn = Math.round((parseFloat(decimalNumber) * multiplierDivisor).toFixed(decimalPlace)) / multiplierDivisor;
    fReturn = bIsNegative ? (fReturn * -1) : fReturn;
    fReturn = fReturn.toFixed(decimalPlace);
    return fReturn;
}
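For example, with the function above (the return value is a string, without thousands separators):
console.log(roundDecimal(10000, 2));    // "10000.00"
console.log(roundDecimal(1234.567, 2)); // "1234.57"
console.log(roundDecimal(-0.995, 2));   // "-1.00" (correct thanks to the Math.abs step)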

Addition operator not working in JavaScript

The addition operator isn't working for me in JavaScript. If I do 5+5, it gives me 55 instead of 10. How can I fix this?
var numberOne = prompt (Enter first number.);
if (numberOne > 0.00001) {
    var numberTwo = prompt(Enter the second number.);
    if (numberTwo > 0.00001) {
        var alertAnswer = alert (numberOne + numberTwo);
    }
}
You're reading in strings, and concatenating them. You need to convert them to integers with parseInt.
I.e.:
var numberOne = parseInt(prompt("Enter first number."), 10);
There are two main changes that need to take place. First, the prompts must use Strings. Second, you must parse the user's String input to a number.
var numberOne = prompt("Enter first number.");
if (numberOne > 0.00001) {
    var numberTwo = prompt("Enter the second number.");
    if (numberTwo > 0.00001) {
        var alertAnswer = alert(parseInt(numberOne, 10) + parseInt(numberTwo, 10));
    }
}
You need to use parseInt, as in:
var a = parseInt(prompt("Please enter a number"));
Just for completeness: a potential problem with parseInt() (in some situations) is that it accepts garbage at the end of a numeric string. That is, if I enter "123abc", parseInt() will happily return 123 as the result. Also, of course, it just handles integers — if you need floating-point numbers (numbers with fractional parts), you'll want parseFloat().
An alternative is to apply the unary + operator to a string:
var numeric = + someString;
That will interpret the string as a floating-point number, and it will pay attention to trailing garbage and generate a NaN result if it's there. Another similar approach is to use the bitwise "or" operator | with 0:
var numeric = someString | 0;
That gives you an integer (32 bits). Finally, there's the Number constructor:
var numeric = Number( someString );
Also allows fractions, and dislikes garbage.
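A quick side-by-side of these conversions (using throwaway example strings):
parseInt("123abc", 10); // 123 (ignores the trailing garbage)
parseFloat("3.14xyz");  // 3.14
+"123abc";              // NaN (unary + rejects garbage)
+"123.45";              // 123.45
"123.45" | 0;           // 123 (32-bit integer; a garbage string becomes 0)
Number("123.45");       // 123.45
Number("123abc");       // NaN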

Javascript Greater than or less than

I did a rewrite of the code I submitted yesterday based on suggestions from others. I now have this, but still can't seem to get it to work with greater than / less than. I can add/subtract the two numbers and get valid answers. I can't get > or < to work, however. Hoping someone can offer some additional help, keeping it within this format of if statements.
if ((input.search("what is greater")!= -1) && (input.search(/\d{1,10}/)!=-1) && (input.search(/\d{1,10}/)!=-1))
{var numbersInString = input.match(/\d+/g);
var num1 = parseInt( numbersInString[0], 10 );
var num2 = parseInt( numbersInString[1], 10 );
if (num1 < num2) document.result.result.value = ""+num1+" is less than "+num2+"";
if (num1 > num2) document.result.result.value = ""+num1+" is greater than "+num2+"";
if (num1 = num2) document.result.result.value = "Both numbers are equal";
return true;}
It sounds like you want to manipulate a number in two ways:
1) You want to refer to the individual characters.
2) You want to compare the number to another number and see if one is greater than another.
If you have a string called input, then you can use the function parseInt(input, 10) to convert it from a string to the number represented by that string.
If you want to get just the first two characters, you can use the substring function.
The important thing to keep in mind is that to the computer, the string '12345' and the number 12345 are completely different. The computer has a completely different set of operations that it will perform on each.
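For example:
console.log('12345' + 1); // "123451" (string concatenation)
console.log(12345 + 1);   // 12346 (numeric addition)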
Also, #albin is correct to point out that putting semicolons after your if statements is wrong.
The output of the match method is an array of strings, so I think you are NOT comparing numbers but strings. Try doing this before comparing your numbers.
var num1 = parseInt( numbersInString[0], 10 );
var num2 = parseInt( numbersInString[1], 10 );
And then compare num1 and num2.
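Putting that together, a minimal sketch of the comparison block based on the question's code (note that the equality branch needs == or ===, not the single = used in the original):
var numbersInString = input.match(/\d+/g);
var num1 = parseInt(numbersInString[0], 10);
var num2 = parseInt(numbersInString[1], 10);
if (num1 < num2) {
    document.result.result.value = num1 + " is less than " + num2;
} else if (num1 > num2) {
    document.result.result.value = num1 + " is greater than " + num2;
} else {
    document.result.result.value = "Both numbers are equal";
}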
http://jsfiddle.net/qt3RW/
Simple input box:
<input id="input1" value="Is 6 greater than 5"></input>
The parser finds 'Is # greater than #', where # is a digit, and alerts those digits:
var IsStringValid = $("#input1").val().match(/Is \d greater than \d/g);
alert(IsStringValid);
if (IsStringValid) {
    var values = $("#input1").val().match(/\d/g);
    for (var i = 0; i < values.length; i++) {
        alert(values[i]);
    }
}

How can I change an HTML input value's data type to integer?

I'm using jQuery to retrieve a value submitted by an input button. The value is supposed to be an integer. I want to increment it by one and display it.
// Getting immediate Voting Count down button id
var countUp = $(this).closest('li').find('div > input.green').attr('id');
var count = $("#"+countUp).val() + 1;
alert (count);
The above code gives me a concatenated string. Say for instance the value is 3. I want to get 4 as the output, but the code produces 31.
How can I change an HTML input value's data type to integer?
To convert strValue into an integer, either use:
parseInt(strValue, 10);
or the unary + operator.
+strValue
Note the radix parameter to parseInt: a leading 0 would cause parseInt to assume the input was octal, so an input of 010 would give the value 8 instead of 10.
parseInt( $("#"+countUp).val() , 10 )
Use parseInt as in: var count = parseInt($("#"+countUp).val(), 10) + 1; or the + operator as in var count = +$("#"+countUp).val() + 1;
There is the parseInt function for that, or the Number constructor.
var count = parseInt($("#" + countUp).val(), 10) + 1;
See w3schools webpage for parseInt.
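A minimal sketch of the increment using the Number constructor mentioned above (countUp is the id variable from the question):
var count = Number($("#" + countUp).val()) + 1;
alert(count); // e.g. an input value of "3" gives 4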
