Optimizing javascript code to use integer arithmetic - javascript

There are some algorithms which solve problems "very well" under the assumption that "very well" means minimizing the amount of floating point arithmetic operations in favor of integer arithmetic. Take for example Bresenham's line algorithm for figuring out which pixels to fill in order to draw a line on a canvas: the guy made practically the entire process doable with only some simple integer arithmetic.
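For reference, a minimal sketch of that integer-only inner loop (my own illustration, covering the first octant only, i.e. x0 <= x1 with a shallow positive slope; plot is a hypothetical per-pixel callback):
function bresenham(x0, y0, x1, y1, plot) {
    var dx = x1 - x0;
    var dy = y1 - y0;
    var err = 2 * dy - dx;    // decision variable -- integers only
    var y = y0;
    for (var x = x0; x <= x1; x++) {
        plot(x, y);
        if (err > 0) {        // the error crossed the midpoint: step up one pixel
            y++;
            err -= 2 * dx;
        }
        err += 2 * dy;
    }
}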
This sort of thing is obviously good in many situations. But is it worth fretting about operations that require a lot of floating-point math in javascript? I understand that every number is a double-precision float as far as the language specification goes. I'm wondering if it is practically worth it to try to keep things as integer-like as possible--do browsers make optimizations that could make it worth it?

You can use Int8, Uint8, Int16, etc. in javascript, but it requires a bit more effort than normal - see TypedArrays.
var n = 1024; // length of the arrays (value chosen here just for illustration)
var A = new Uint32Array(new ArrayBuffer(4 * n));
var B = new Uint32Array(new ArrayBuffer(4 * n));
// assign some example values to A
for (var i = 0; i < n; i++)
    A[i] = i; // note RHS is implicitly converted to uint32
// assign some example values to B
for (var i = 0; i < n; i++)
    B[i] = 4 * i + 3; // again, note RHS is implicitly converted to uint32
// this is true integer arithmetic
for (var i = 0; i < n; i++)
    A[i] += B[i];
Recently, the asm.js project has made it possible to compile C/C++ code to strange looking javascript that uses these TypedArrays in a rather extreme fashion, the benefit being that you can use your existing C/C++ code and it should run pretty fast in the browser (especially if the browser vendors implement special optimizations for this kind of code, which is supposed to happen soon).
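To give a rough idea, here is a hand-written sketch in the asm.js style (not real compiler output); the |0 annotations are what tell an optimizing engine to treat these values as 32-bit integers, and engines without asm.js support simply run it as ordinary javascript:
function IntMath() {
    "use asm";
    function add(a, b) {
        a = a | 0;            // parameter annotation: treat a as int
        b = b | 0;            // parameter annotation: treat b as int
        return (a + b) | 0;   // coerce the result back to int
    }
    return { add: add };
}
var m = IntMath();
m.add(2, 3); // 5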
On a side note*, if your program can use SIMD parallelism (see Wikipedia), i.e. if your code can use the SSEx instruction set, your arithmetic will be much faster, and in fact using int8s will be twice as fast as using int16s, etc.
*I don't think this is relevant to browsers yet, as it is too difficult for them to take advantage of on the fly. Edit: It turns out that Firefox is experimenting with this kind of optimization. Also Dart (true Dart, not Dart compiled to js) will be able to do this in Chrome.

Long ago, computers lacked dedicated FPUs and did floating point math entirely via software emulation.
Modern computers all have dedicated FPUs which handle floating point math just as well as integer. You should not need to worry about it unless you have a very specific circumstance.

Actually, it makes no difference. JavaScript has no concept of "integer". JS only uses double-precision floating-point numbers, which may or may not represent integer values.
Therefore, there is absolutely nothing to gain in terms of performance by limiting yourself to integers.
However, keep in mind that integers are exact up to 2^53, whereas non-integers can very easily suffer from precision loss (example: 0.1), so you might gain something in terms of precision, if not performance.
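A quick illustration of both points:
Math.pow(2, 53);        // 9007199254740992
Math.pow(2, 53) + 1;    // 9007199254740992 -- integer precision is lost beyond 2^53
0.1 + 0.2;              // 0.30000000000000004 -- non-integers lose precision much sooner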

Related

In Javascript, which operator is faster, the '*' multiply or '/' divide?

In javascript, is there a speed advantage between the multiplication operator and the division operator? As an example...
var foo = bar * 0.01;
var foo = bar / 100;
foo is the same for both, but which statement returns the value of foo the fastest? I know this may be an incredibly small difference; however, when processing large amounts of data in a loop it could add up to a bigger difference than realized, which would then make a difference in how I construct equations to facilitate the processing.
I would say it depends on the implementation. I would just run a quick test of my own or try to google it.
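For example, a rough micro-benchmark sketch (the loop count is arbitrary and the results will vary a lot between engines and runs):
var bar = 12345, foo = 0, i;
console.time('multiply');
for (i = 0; i < 10000000; i++) foo = bar * 0.01;
console.timeEnd('multiply');
console.time('divide');
for (i = 0; i < 10000000; i++) foo = bar / 100;
console.timeEnd('divide');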
For most machines, multiplication is faster, but raw CPU speed is not decisive when it comes to scripting languages. Even when the implementation is the same, the execution time for one or the other will not differ much, since the overhead of scripting languages is normally much bigger.
Normally the difference between operations is so small that it is not worth thinking about. When you must, you are probably using the wrong language anyhow.
In computer systems, the only basic arithmetic units are adders (+) and multipliers (*). You either add (negative or positive) numbers or shift numbers (left or right, for multiplication and division by powers of two respectively). You should be able to work out whether multiplication or division takes longer...
*btw... unless I am wrong, your question has nothing to do with javascript. Javascript is an interpreted language with engines such as SpiderMonkey or Rhino...
FYI, you should read this - directly from the main people... - to get better insight into what might be happening.

JavaScript and Dealing with Floating Point Determinism

I'm looking to build a browser multiplayer game using rollback netcode that runs a deterministic simulation on the clients. I prototyped the netcode in Flash already before I ran into the floating point roadblock.
Basically, from what I understand, integer math in Flash is done by casting ints to Numbers, doing the math, then casting back to int. It's faster apparently, but it means that there's no chance of deterministic math across different computer architectures.
Before I dump all my eggs into the JavaScript basket then, I'd like to ask a few questions.
Is there true integer arithmetic on all major browsers in JavaScript? Or do some browsers do the Flash thing and cast to floats/doubles to do the math before casting back to int?
Does something like BigDecimal or BigNum work for deterministic math across different computer architectures? I don't mind some performance loss as long as it's within reason. If not, is there some JavaScript fixed point library out there that solves my problem?
This is a long shot, but is there a HTML5 2D game engine that has deterministic math for stuff like x/y positions and collisions? The list of game engines is overwhelming to say the least. I'm uneasy about building a deterministic cross browser compatible engine from scratch, but that might be what I have to do.
NOTE: Edited from HTML5 to JS as per responses. Apologies for my lack of knowledge.
This is a Javascript issue - not an HTML5 one.
All Javascript math is done using IEEE754 floating point double values - there are no "ints".
Although IEEE754 requires (AFAIK) a specific answer for each operation for any given input, you should be aware that JS interpreters are potentially free to optimise expressions, loops, etc, such that the floating point operations don't actually execute in the order you expect.
Over the course of a program this may result in different answers being produced on different browsers.
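One way around this (not from the answer above, just a sketch) is to keep the whole simulation in integer fixed-point arithmetic, since integer results stay exact -- and therefore identical across machines -- as long as every intermediate value fits comfortably inside the 53-bit safe range:
var FP = 256;                                        // positions stored in 1/256ths of a unit
function toFixed(x) { return Math.round(x * FP); }   // only convert at the edges of the simulation
var x = toFixed(1.5);                                // 384
var v = toFixed(0.25);                               // 64
x = (x + v) | 0;                                     // 448 -- integer add, exact on every engine
var scaled = ((x * 3) / FP) | 0;                     // multiply then truncate back; keep operands small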

Bitwise operations' significance in javascript [duplicate]

Possible Duplicate:
Where would I use a bitwise operator in JavaScript?
In C/C++, bitwise operations are faster than normal (arithmetic) operations (significantly so, at least on low-performance processors). Does the same apply in JS? I don't think so, since the reason they're faster in C is that bitwise operations are hardwired and usually complete in one processor cycle, but JS runs inside a browser, which doesn't give you that kind of hardware (register-level) access. I am not sure (around 70% sure :) ). What are typical (or some smarter) uses of bitwise operators (especially in JS, but I would like to know others too)? Please correct me if I am wrong anywhere.
Some bitwise operators are faster than arithmetic operators in some cases. It's hard to optimise Javascript, because the performance varies greatly between browsers, platforms and computer models.
Modern browsers compile the Javascript code into native code, so some things that are said about compiled languages are also relevant for Javascript. However, some things that are said about compiled languages are getting more and more inaccurate with newer processors. It's for example not relevant to look at the performance of a single processor operation any more, as operations are run in parallel. You don't look at how many cycles an operation takes any more, but how many operations you can do in a cycle.
To determine if a specific operation is faster or slower than another in Javascript, you would have to test it on a wide variety of computers and browsers, and look at the average, best case and worst case. Even then, any specific result that you get would get more and more out of date with each new browser version that is released.
Bitwise operators in JS are slow. Really slow compared to C. The reason is that in JS, all numbers are represented as double-precision floating point numbers, so to perform a bitwise operation, the runtime has to convert them to 32-bit integers and back.
That's not to say they aren't useful. e.g., Node#compareDocumentPosition returns a bitmask, and something.length >>> 0 is a common way of getting the length property of something or zero if length isn't a number or is NaN. Also, a / b | 0 is a fast way to do Math.floor(a / b), assuming a and b are >= 0.
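For example:
({ length: 'oops' }).length >>> 0;   // 0 -- NaN and non-numbers become 0
[1, 2, 3].length >>> 0;              // 3
var a = 7, b = 2;
a / b | 0;                           // 3, same as Math.floor(a / b) for non-negative operands
Math.floor(a / b);                   // 3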

Performance of bitwise operators in javascript

One of the main ideas behind using bitwise operators in languages like C++/java/C# is that they're extremely fast. But I've heard that in javascript they're very slow (admittedly a few milliseconds probably doesn't matter much today). Why is this so?
(this question discusses when bitwise operators are used, so I'm changing the focus of this question to performance.)
This is quite an old question, but no one seemed to answer the updated version.
The performance hit that you get with JavaScript that doesn't exist in C/C++ is the cast from the floating point representation (how JavaScript stores all of its numbers) to a 32 bit integer to perform the bit manipulation, and back.
Nobody uses hex anymore?
function hextoRgb(c) {
    c = '0x' + c.substring(1);   // '#191970' -> '0x191970'; the shifts below coerce the string to a number
    return [(c >> 16) & 255, (c >> 8) & 255, c & 255];
}
var c1 = hextoRgb('#191970');
alert('rgb(' + c1.join(',') + ')');  // rgb(25,25,112)
I use bitwise shift of zero in JS to perform quick integer truncation:
var i = 3.141532;
var iTrunc = i >> 0; // 3
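One caveat worth adding: because the shift goes through a 32-bit conversion, this trick only works for values that fit in 32 bits:
var big = 4294967296.5;   // 2^32 + 0.5
big >> 0;                 // 0 -- wrapped around, not truncated
Math.floor(big);          // 4294967296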
When would you want to use them? You would want to use them when you want to do bitwise operations. Just like you'd use boolean operators to do boolean operations, and mathematical operators to do mathematical operations.
If you are comfortable with bitwise operators it is very natural to use them for some applications. They can be used for many purposes other than an over-optimized boolean array. Of course, these circumstances don't come up very often in Javascript programming, but that's no reason why the operators shouldn't be available.
I found some good info at
http://dreaminginjavascript.wordpress.com/2009/02/09/bitwise-byte-foolish/
Apparently they perform very well these days. Why would you use them? Same reason you would anywhere else.
I'd think it's up to the implementer to make an operator efficient or inefficient. For example, there's nothing that prevents a JavaScript implementer from making a JITting VM, which turns a bitwise op into 1 machine instruction. So there's nothing inherently slow about "the bitwise operators in JavaScript".
There is an NES emulator written in JavaScript - it seems to make plenty of use of bitwise operations.
I am doubtful that bitwise operation are particularly slow in javascript. Since such operations can map directly to CPU operations, which are themselves quite efficient, there doesn't appear to be any inherent characteristic of bitwise operations that would force them to be irremediably slow in javascript.
Edit December 2015: I stand corrected! The performance hit that Javascript suffers in regards to bitwise operations comes from the need of converting from float to int and back (as all numeric variables in Javascript are stored as floating point values). Thank you to Chad Schouggins for pointing that out.
Nevertheless, as indicated in several responses, there exist various applications of javascript which rely on bitwise operations (e.g. cryptography and graphics) and which are not particularly slow... (see silky and Snarfblam on this page). This suggests that while slower than in C/C++ and other languages which translate bitwise ops directly to single native CPU instructions, bitwise operations are not all that sluggish.
Let's nevertheless entertain the possibility that, for some particular reasons, the various implementers of javascript hosts implemented bitwise ops in a fashion that makes them extremely slow, and see if this even matters...
Although javascript has been used for other purposes, the most common use of this language is in providing user interface type of services.
BTW, I do not mean this in any pejorative way at all; performing these smart UI functions, and considering various constraints imposed on the language and also the loose adherence to standards, has required -and keeps requiring- talented javascript hackers.
The point is that in the context of UI-type requirements, the need for a quantity of bitwise operations big enough to expose javascript's slowness in handling them is uncommon at best. Consequently, for typical uses, programmers should use bitwise operations where and if this approach seems to flow well with the overall program/data, and they should do so with little concern for performance issues. In the unlikely case of a performance bottleneck arising from bitwise use, one can always refactor things, but one is better off staying clear of premature optimization.
The notable exception to the above comes with the introduction of canvas: on modern browsers we can expect that more primitive graphic functions will be required of javascript hosts, and such operations can in some cases require heavy doses of bitwise operations (as well as healthy doses of math functions). It is likely that these services will eventually be supported by way of javascript libraries (and may even end up as language additions). For such libraries the common smarts of the industry will have been put to use to figure out the most efficient approaches. Furthermore, if there is indeed a weakness in javascript performance with bitwise ops, we'll get some help, for I predict that the javascript implementations on various hosts (browsers) will be modified to improve this particular area. (This would follow the typical pattern of evolution of javascript that we've seen over the years.)
When speed is paramount, you can use them for bit-masking: http://snook.ca/archives/javascript/storing_values/
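The general bit-masking idea looks like this (a sketch; the flag names are made up for illustration):
var BOLD = 1, ITALIC = 2, UNDERLINE = 4;   // one bit per flag
var style = BOLD | UNDERLINE;              // pack several flags into a single number
if (style & UNDERLINE) {
    // underline is set
}
style &= ~BOLD;                            // clear a single flag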
Also, if you need to support Netscape 4, you'd use them to deal with Document.captureEvents(). Not that any respectable company would have you write JS for NS4...
People do interesting things in JavaScript.
For example there are a lot of cryptography algorithms implemented in it (for various reasons); so of course bitwise operators are used.
Using JavaScript in its Windows Scripting Host JScript incarnation, you might have cause to use bitwise operators to pick out flags in values returned from WMI or Active Directory calls. For example, the User Access value of a user's record in AD contains several flags packed into one long integer.
var ADS_UF_ACCOUNTDISABLE = 0x00000002;
if ((uac & ADS_UF_ACCOUNTDISABLE) == ADS_UF_ACCOUNTDISABLE) { // note the parentheses: == binds tighter than &
    // user account has been disabled
}
Or someone's arbitrary table structure may contain such a field, accessible through ADO with JScript.
Or you may want to convert some retrieved data into a binary representation on any platform, just because:
BinaryData = "L";
BinaryString = BinToStr(BinaryData, ".", "x");
// BinaryString => '.x..xx..'
So there are numerous reasons why one might want to do bit manipulation in JavaScript. As for performance, the only way to know is to write it and test it. I suspect in most cases it would be perfectly acceptable, not significantly worse than any other of the multitude of inefficiencies these systems contain.
A lot of bitwise operations are being benchmarked here: http://jsperf.com/rounding-numbers-down/3
However, feel free to create your own performance testcase on jsPerf!

JavaScript Endian Encoding?

A response on SO got me thinking, does JavaScript guarantee a certain endian encoding across OSs and browsers?
Or put another way are bitwise shifts on integers "safe" in JavaScript?
Shifting is safe, but your question is flawed because endianness doesn't affect bit-shift operations anyway. Shifting left is the same on big-endian and little-endian systems in all languages. (Shifting right can differ, but only due to interpretation of the sign bit, not the relative positions of any bits.)
Endianness only comes into play when you have the option of interpreting some block of memory as bytes or as larger integer values. In general, Javascript doesn't give you that option since you don't get access to arbitrary blocks of memory, especially not the blocks of memory occupied by variables. Typed arrays offer views of data in an endian-sensitive way, but the ordering depends on the host system; it's not necessarily the same for all possible Javascript host environments.
Endianness describes physical storage order, not logical storage order. Logically, the rightmost bit is always the least significant bit. Whether that bit's byte is the one that resides at the lowest memory address is a completely separate issue, and it only matters when your language exposes such a concept as "lowest memory address," which Javascript does not. Typed arrays do, but then only within the context of typed arrays; they still don't offer access to the storage of arbitrary data.
Some of these answers are dated, because endianness can be relevant when using typed arrays! Consider:
var arr32 = new Uint32Array(1);
var arr8 = new Uint8Array(arr32.buffer);
arr32[0] = 255;
console.log(arr8[0], arr8[1], arr8[2], arr8[3]);
When I run this in Chrome's console, it yields 255 0 0 0, indicating that my machine is little-endian. However, typed arrays use the system endianness by default, so you might see 0 0 0 255 instead if your machine is big-endian.
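If you need a byte order that does not depend on the host machine, a DataView lets you pick the endianness explicitly for each read and write (the final boolean argument, true here, means little-endian):
var buf = new ArrayBuffer(4);
var view = new DataView(buf);
view.setUint32(0, 255, true);     // write little-endian regardless of the host
var bytes = new Uint8Array(buf);
// bytes is [255, 0, 0, 0] on every machine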
Yes, they are safe. Although you're not getting the speed benefits you might hope for since JS bit operations are "a hack".
ECMAScript does actually have a concept of an integer type, but it is implicitly coerced to or from a double-precision floating-point value as necessary (if the number represented is too large or if it has a fractional component).
Many mainstream Javascript interpreters (SpiderMonkey is an example) take a shortcut in implementation and interpret all numeric values as doubles to avoid checking the actual native type of the value for each instruction. As a result of the implementation hack, bit operations are implemented as a cast to an integral type followed by a cast back to a double representation. It is therefore not a good idea to use bit-level operations in Javascript and you won't get a performance boost anyway.
are bitwise shifts on integers "safe" in JavaScript?
Only for integers that fit within 32 bits (31+sign). Unlike, say, Python, you can't get 1<<40.
This is how the bitwise operators are defined to work by ECMA-262, even though JavaScript Numbers are actually floats. (Technically, double-precision floats, giving you 52 bits of mantissa, easily enough to cover the range of a 32-bit int.)
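A quick illustration (note that the shift count itself is also taken modulo 32):
1 << 40;          // 256, because 40 & 31 === 8
Math.pow(2, 40);  // 1099511627776 -- what Python's 1 << 40 gives
1 << 31;          // -2147483648 -- the result is a signed 32-bit value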
There is no issue of 'endianness' involved in bitwise arithmetic, and no byte-storage format where endianness could be involved is built into JavaScript.
JavaScript doesn't have an integer type, only a floating point type. You can never get close enough to the implementation details to worry about this.
