I'm iterating through an array of arrays, pulling out the sub-arrays I need and discarding the rest.
var newArray = [];
for (var i = 0; i < oldArray.length; i++) {
    if (oldArray[i][property] == value) {
        newArray.push(oldArray[i]);
    }
}
oldArray = newArray;
Is this the most memory-friendly way to do this?
Will garbage collection safely take care of the sub-arrays I did not push onto
newArray?
Will newArray be scattered across memory in a way that could prevent this method from scaling efficiently?
JavaScript's prototypal nature means that almost everything is an object, and objects behave like maps of property names to values. Array slots and variables hold references to those objects, not the objects themselves, which means that when you are "pulling", as you say, sub-arrays from one array into another, you are only copying their references.
So no, I would say this won't cause you much in the way of memory problems, as long as you drop the old references (at least in modern browsers). But garbage collectors are implemented in different ways depending on the browser you are working with.
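A quick sketch of what that reference copying means; the data here is made up, and plain objects stand in for your sub-arrays for brevity:
// Filtering copies references, not the underlying data.
var oldArray = [
    { type: "keep", data: [1, 2, 3] },
    { type: "drop", data: [4, 5, 6] }
];
var newArray = [];
for (var i = 0; i < oldArray.length; i++) {
    if (oldArray[i].type === "keep") {
        newArray.push(oldArray[i]);
    }
}
console.log(newArray[0] === oldArray[0]); // true: same object, only the reference was copied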
1 - Is this the most memory-friendly way to do this?
If you drop the references, yes, it is a memory-friendly way to do it. There is no free like in C/C++; for testing purposes in Chrome you can call window.gc() to trigger the garbage collector (if the browser has been launched with the flag that exposes it). There is also a delete operator, but it removes properties from objects rather than freeing memory directly.
2 - Will garbage collection safely take care of the sub-arrays I did not append to newArray?
Yes, as long as there aren't any other references pointing to them. Circular-reference memory leaks were common in older browsers, but with modern ones it's safe to say yes.
3 - Will newArray be scattered across memory in a way that could prevent this method from scaling efficiently?
The array's storage may be scattered across memory, because JavaScript engines are free to back arrays with hash-map-like structures rather than one contiguous block (if I'm mistaken here, someone please correct me), but the garbage collector handles that layout routinely. And again, you are only working with references: the objects stay where they are, and the array only stores references to them.
1. No. fmsf is mostly right, but there are some micro-optimizations you could make, both for page-load time and for the time spent checking the loop condition. If you leave oldArray.length in the loop condition, length is looked up on the oldArray object on every iteration, which can add time in large loops. Declaring all your variables at once can also save a little time if the containing function is called many times.
For download size, it also helps to give variables the shortest names possible, so that the least data is transferred from server to client:
var nA = [],
    i = 0,
    t = oA.length,
    s;
for (; i < t; i++) {
    s = oA[i];
    if (s[p] == v) {
        nA.push(s);
    }
}
oA = nA;
If you want to get really serious, you would use a code minifier for the renaming of variables and white space removal.
2. Yes. JavaScript is pretty good about these things; the main thing to look out for is closure-related leaks in IE, which would not be caused by your code here. I have still found such leaks in rare cases even in IE9, but they are usually caused by references into the DOM, which is irrelevant in this case.
3. No. The only things that change with this code are references to objects, not how the objects are stored in memory.
Related
The project I'm creating involves searching through an array many times, and I realize that if I don't do this optimally I might see server performance issues.
I was wondering what the least server-intensive way to find a value in an array is; your help would be appreciated.
I've seen people answer this on this website, but the answers are mixed: some people say a basic for loop is best, while others say indexOf or findIndex would perform better, so I'm not sure which is best or whether there's a different option entirely.
The time complexity of searching an array of length n is O(n), whereas using a Map gives you O(1) lookups, because you don't need to iterate over a Map to know whether a particular element exists in it; you can get the element directly by its key.
If the element exists, it is returned in O(1) time; otherwise you get undefined, meaning the element you searched for doesn't exist in the Map.
So it's better to use a Map instead of an array in your case.
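A minimal sketch of the difference, assuming your values can be keyed by some property; the id field and the sample data here are made up for illustration:
// Array search scans until it finds a match: O(n) per lookup.
var items = [
    { id: "a", label: "first" },
    { id: "b", label: "second" }
];
var foundByScan = items.find(function (item) { return item.id === "b"; });

// Building the Map is O(n) once; each subsequent lookup by key is O(1).
var byId = new Map();
items.forEach(function (item) { byId.set(item.id, item); });
var foundByKey = byId.get("b"); // undefined if the key is absent

console.log(foundByScan === foundByKey); // true: both point at the same object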
Even the most optimal search through a list would have a runtime complexity of O(n). A basic for loop would be the fastest since you could make it terminate at the first occurrence. Things get slightly more interesting when you're searching for objects.
function arrayHasObj(obj, list) {
    var i = list.length;
    while (i--) {
        // Compare by serialized value rather than by reference.
        if (JSON.stringify(list[i]) === JSON.stringify(obj)) {
            return true;
        }
    }
    // Only report failure after every element has been checked.
    return false;
}
Here the while loop is obviously O(n), but we also need to account for serializing each object to JSON, which is itself proportional to the object's size, so the overall cost is closer to O(n · m) for objects of size m. This falls outside any standard array-search analysis; if anyone can weigh in on this, please do.
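For what it's worth, a quick usage sketch with made-up objects (note that JSON comparison is sensitive to property order):
var list = [{ x: 1 }, { x: 2 }];
console.log(arrayHasObj({ x: 2 }, list)); // true: equal by value, not by reference
console.log(arrayHasObj({ x: 3 }, list)); // false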
Generally, optimizing a search through a single array isn't necessary. Either some larger optimization needs to happen in your algorithm, or, if you're truly working with a large dataset, you should be querying a database.
EDIT:
Obviously a dictionary/set has its merits. The OP is specifically asking about arrays.
For example:
array1 = new Array(5); array2 = new Array(10);
Both console.log(array1) and console.log(array2) print what looks like an empty array, [].
So what is the role of arrayLength here?
JavaScript hides a lot of the details that you'd typically have to deal with when working with arrays in many other languages. For example, an array in JavaScript can grow automatically as you push values to it.
However, under the covers, the runtime is still dealing with the same sort of memory allocation issues that languages like C or Java make visible to the developer. The runtime may set aside a little extra memory for the array to grow into, and then once it runs out of contiguous memory space, it'll reallocate a larger piece of memory somewhere else and copy all of the values from the first set of memory to another location.
(This is vastly oversimplifying things, but hopefully you get the general idea.)
If you know ahead of time exactly how many items you can expect to put into the array, using new Array(number) will give the runtime a hint that it can begin by allocating that much memory, and avoid the need for the memory to be reallocated and copied around as you grow it.
It's in this light, for example, that this page suggests the following practices to achieve maximum performance in the V8 javascript engine:
Don't pre-allocate large Arrays (e.g. > 64K elements) to their maximum size, instead grow as you go
Initialize using array literals for small fixed-sized arrays
Preallocate small arrays (<64k) to correct size before using them
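As a rough sketch of the two approaches contrasted above (the size here is arbitrary, and whether the hint actually matters depends on the engine):
// Growing as you go: the engine may reallocate and copy as the array expands.
var grown = [];
for (var i = 0; i < 1000; i++) {
    grown.push(i * 2);
}

// Preallocating a small array to its known final size hints at how much
// storage to set aside up front.
var preallocated = new Array(1000);
for (var j = 0; j < 1000; j++) {
    preallocated[j] = j * 2;
}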
Passing a number to the Array constructor sets the length property of the array without setting the indices of the items (which is why your console.log isn't showing anything).
To quote JavaScript Garden:
Being able to set the length of the array in advance is only useful in a few cases, like repeating a string, in which it avoids the use of a loop.
Here's an example of doing just that:
new Array(count + 1).join(stringToRepeat);
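For instance, to repeat a string three times (an array of length 4 has three separators between its slots):
var stringToRepeat = "ab-";
var repeated = new Array(3 + 1).join(stringToRepeat);
console.log(repeated); // "ab-ab-ab-"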
I'm working on a page that, eventually, could have more than 100 arrays, only a few of which would be used at any given time.
At the moment, I'm populating all arrays as global variables, but I suspect this is inefficient in terms of memory use.
Should I change my code to clear the arrays when they are not being used? If so, what is the best way to do this? I'd guess var myArray = new Array(), but perhaps there's a better option.
Unless you have many thousands of objects in your arrays, you don't need to worry. Don't prematurely optimize your code; the browser is quite good at handling lots of small objects.
If memory does become an issue or you notice performance issues, you can simply reassign a new array to those variables:
myArray = [];
The garbage collector will clean up the objects that you dereferenced.
In a broader case, if there's no need to keep references to those objects, you don't even need the arrays to begin with. I.e., if you never access the elements that you put in the arrays a second time, just remove the arrays and don't bother assigning the data.
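A minimal sketch of the reassignment approach; the array names here are hypothetical:
// Hypothetical globals; only a few of the ~100 arrays are active at a time.
var pricesByRegion = [ /* ...large dataset... */ ];
var namesByRegion = [ /* ...large dataset... */ ];

// When a dataset is no longer needed, drop it by reassigning. The previous
// array (and anything only it referenced) becomes eligible for collection.
pricesByRegion = [];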
Consider this javascript code:
var s = "Some string";
s = "More string";
Will the garbage collector (GC) have work to do after this sort of operation?
(I'm wondering whether I should worry about assigning string literals when trying to minimize GC pauses.)
Edit: I'm slightly amused that, although I stated explicitly in my question that I need to minimize GC, everyone assumed I'm wrong about that. If one really must know the particular details: I've got a game in JavaScript. It runs fine in Chrome, but in Firefox it has semi-frequent pauses that seem to be due to GC. (I've even checked with the MemChaser extension for Firefox, and the pauses coincide exactly with garbage collection.)
Yes, strings need to be garbage-collected, just like any other type of dynamically allocated object. And yes, this is a valid concern as careless allocation of objects inside busy loops can definitely cause performance issues.
However, string values are immutable (non-changeable), and most modern JavaScript implementations use "string interning", that is, they store only one instance of each unique string value. This means that if you have something like this...
var s1 = "abc",
s2 = "abc";
...only one instance of "abc" will be allocated. This only applies to string values, not String objects.
A couple of things to keep in mind:
Functions like substring, slice, etc. will allocate a new object for each function call (if called with different parameters).
Even though both variables point to the same data in memory, there are still two variables to process when the GC cycle runs. Having too many local variables can also hurt you, as each of them will need to be processed by the GC, adding overhead (see the sketch below).
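A small sketch of the first point, assuming a hypothetical hot loop that repeatedly needs the same prefix of a string; engines differ in how aggressively they optimize this:
var message = "LEVEL1:player scored";

// Allocates a fresh substring on every iteration of a hot loop.
for (var i = 0; i < 10000; i++) {
    if (message.slice(0, 6) === "LEVEL1") { /* ... */ }
}

// Hoisting the slice out of the loop allocates it only once.
var prefix = message.slice(0, 6);
for (var j = 0; j < 10000; j++) {
    if (prefix === "LEVEL1") { /* ... */ }
}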
Some further reading on writing high-performance JavaScript:
https://developer.mozilla.org/en-US/docs/JavaScript/Memory_Management
https://www.scirra.com/blog/76/how-to-write-low-garbage-real-time-javascript
http://jonraasch.com/blog/10-javascript-performance-boosting-tips-from-nicholas-zakas
Yes, but unless you are doing this in a loop millions of times it won't likely be a factor for you to worry about.
As you have already noticed, "JavaScript" is not a single implementation: it runs on different platforms and will therefore have different performance characteristics.
So the definite answer to the question "Will the GC have work to do after this sort of operation?" is: maybe. If the script is as short as you've shown, a JIT compiler might well drop the first string completely. But there's no rule in the language definition that says it has to be one way or the other. So in the end it's as it is all too often in JavaScript: you have to try it.
The more interesting question might be: how can you avoid garbage collection in the first place? The answer is to minimize the allocation of new objects. Games typically have a fairly constant set of objects, and new ones often aren't created until an old one becomes unused. For strings this is harder, as they are immutable in JS, so try to replace strings with other (mutable) representations where possible, or avoid rebuilding them needlessly; a rough sketch follows.
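A hedged sketch of that idea for a game loop; the names (hud, updateHud) and the DOM element are made up for illustration:
// Rebuilding the HUD label string every frame creates garbage; caching the
// last value and rebuilding only on change avoids most of it.
var hudElement = document.getElementById("hud"); // assumed to exist in the page
var lastScore = -1;

function updateHud(score) {
    if (score !== lastScore) {          // allocate a new string only when needed
        hudElement.textContent = "Score: " + score;
        lastScore = score;
    }
}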
Yes, the garbage collector will have a string object containing "Some string" to get rid of. And, in answer to your question, that string assignment will make work for the GC.
Because strings are immutable and are used a lot, the JS engine has a pretty efficient way of dealing with them. You should not notice any pauses from garbage collecting a few strings. The garbage collector has work to do all the time in the normal course of javascript programming. That's how it's supposed to work.
If you are observing pauses from GC, I rather doubt it's from a few strings. There is more likely a much bigger issue going on. Either you have thousands of objects needing GC or some very complicated task for the GC. We couldn't really speculate on that without study of the overall code.
This should not be a concern unless you were doing some enormous loop and dealing with tens of thousands of objects. In that case, one might want to program a little more carefully to minimize the number of intermediate objects that are created. But, absent that level of object churn, you should first write clear, reliable code and then optimize for performance only when something has shown you that there is a performance issue to worry about.
To answer your question "I'm wondering whether I should worry about assigning string literals when trying to minimize GC pauses": No.
You really don't need to worry about this sort of thing with regard to garbage collection.
GC is only a concern when creating & destroying huge numbers of Javascript objects, or large numbers of DOM elements.
By using the Chrome Developer Tools, I found out that arrays and objects were being allocated. I went through my code looking for the obvious [], {}, and new, but there aren't any. I have checked functions that create a new [], {}, or new instance, looked at where those functions are used, and learnt not to use them. So how else can memory be allocated?
This is a problem for me, because every time GC kicks in, it blocks the main loop and the animation becomes inconsistent.
It is fruitless to worry overmuch about memory allocation. Memory will be allocated for everything, variables, arrays, objects, etc. There isn't much you could do with javascript without using a variable or an object, but again, the allocation of memory is not really the domain of a javascript script. Any and all javascript will use some degree of memory no matter what. Indeed, I would say that if you have "learned to avoid using" objects and arrays, you have been misinformed or are learning the wrong lesson.
It is far more important to avoid circular references, to avoid excessive memory consumption per scope, and to generally avoid locking up the browser thread with tight loops and other bad practices. For instance, in a for loop, avoid recalculating the limit in the for declaration: for (var x = 1; x < myString.length; x++) should be var max = myString.length; for(var x = 1; x < max; x++). Even such optimizations (micro-optimizations in most cases) are not critical to a javascript developer, for the browser is handling the overall memory allocation/consumption as well as the garbage collection of out-of-scope references.
For more information about practical practices to avoid leaks, check out this article: http://www.javascriptkit.com/javatutors/closuresleak/index.shtml (or other articles like this). Otherwise, as long as you aren't leaking memory, it is expected that any script will allocate/use some degree of memory; it is unavoidable. Considering that the modern PC has gigabytes of memory available, your script's paltry kilobytes or even megabytes of memory use are not of much consequence - that's what the memory is there for, to use it.