Is the below setTimeout JavaScript code leaking?

I've got a method which swaps the favicon of a website back and forth in a loop:
function LoopFavIcon(isRed) {
    if (!tabInFocus) {
        isRed = GetBoolean(isRed, false);
        if (isRed) {
            $($('head link')[0]).attr("href", "favicon.ico");
        } else {
            $($('head link')[0]).attr("href", "favicon_red.ico");
        }
        setTimeout(function f83() {
            LoopFavIcon(!isRed);
        }, 700);
    }
}
The memory timeline seems to show memory steadily increasing, without a GC ever occurring:
It's possible that the measured time interval wasn't enough for the GC to kick in, but I have my doubts. Thanks in advance!

You haven't shown what's in GetBoolean, but assuming it's not doing something it shouldn't, no, there's no memory leak in that code. You just didn't run it long enough for a GC to occur. (Chrome's tools also let you force a GC.)
Remember that one of the key aspects of a garbage-collected environment is that the environment will do garbage collection as and when needed, and not typically before.
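If you want to watch the heap yourself while the loop runs, Chrome exposes a non-standard performance.memory object; a quick sketch (Chrome-only, and the numbers are coarse):
setInterval(function () {
    // Non-standard, Chrome-only API: logs the used JS heap size every 5 seconds.
    // Left running long enough, you should see the number drop when a GC runs.
    if (performance.memory) {
        console.log("Used JS heap: " +
            (performance.memory.usedJSHeapSize / 1024 / 1024).toFixed(1) + " MB");
    }
}, 5000);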

Related

JavaScript - Persistent memory leaks

I have been getting persistent memory leaks with my TypeScript application (3PG). The memory management appears to be flawed.
Two Applications:
2PG -> https://github.com/theADAMJR/2pg [does not have memory leaks]
3PG -> the application in question, [extension of 2PG, has set intervals etc.]
Here is a class of 3PG that uses lots of intervals and could be a cause: https://pastebin.com/Z6K8a2vK
private schedule(uuid: string, savedGuild: GuildDocument, interval: number) {
    const task = this.findTask(uuid, savedGuild.id);
    if (!task.timer) return;

    task.status = 'ACTIVE';
    task.timeout = setInterval(
        async () => await this.sendTimer(task, savedGuild), interval);
}
I was wondering if this code could be the issue, and if so, what I am doing or should avoid doing in JavaScript that would build up memory usage. Thanks.
Update: The problem was due to discord.js calling the ready event multiple times, which built up memory over time. It was also my mistake not to provide enough info to allow a precise answer to the question.
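For anyone hitting the same thing: registering the handler with once instead of on prevents the setup from running again when discord.js re-emits the event (a minimal sketch; client is your discord.js Client and scheduleAllTasks is a hypothetical stand-in for whatever creates the intervals):
// Runs the setup exactly once, even if 'ready' fires again after a reconnect.
client.once('ready', () => {
    scheduleAllTasks();
});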
The problem is probably the fact that each of those setInterval calls sets up a new repeating execution. There are two things that you need to do:
1. Clear the interval for anything that has already done its job and is no longer needed.
2. Hold a reference to the interval handles and clear them when the component is no longer in use.
Without knowing exactly the business behind your code, also consider whether setTimeout could be more appropriate for this task. Both points are sketched below.
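A minimal sketch of both points (stopTask and the surrounding names are illustrative, not from the code above):
// Keep the handle that setInterval returns so it can be cleared later.
task.timeout = setInterval(() => sendTimer(task, savedGuild), interval);

// Clear it as soon as the task has done its job or its owner is torn down.
function stopTask(task) {
    if (task.timeout) {
        clearInterval(task.timeout);
        task.timeout = null;
    }
}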

JavaScript: changing timeout for infinite loops?

Sometimes I make mistakes and get infinite loops in JavaScript (example: while(true){}). Firefox helpfully reports "Error: Script terminated by timeout" or that the memory allocation limit is exceeded. How can I change the length of this timeout? I want to shorten it during development.
I could also solve this problem by writing a function that I call inside the loop that counts iterations and throws an exception on too many iterations. But I'm also interested in how to change the timeout value.
I have not seen this timeout documented, and have not found it by searching.
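A guard like the one described above might look like this (a minimal sketch; the names are made up):
// Throws once a loop has run too many iterations, so a mistake like
// while(true){} fails fast instead of freezing the tab.
function makeLoopGuard(maxIterations) {
    var count = 0;
    return function guard() {
        if (++count > maxIterations) {
            throw new Error("Loop exceeded " + maxIterations + " iterations");
        }
    };
}

var guard = makeLoopGuard(1000000);
while (true) {
    guard(); // development-only safety net
    // ... loop body ...
}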
Unfortunately, the maximum recursion limit is not user-configurable from within a page running JavaScript. The limits also vary across browsers.
This Browserscope test showcases the results of user testing from another StackOverflow question on the subject: What are the js recursion limits for Firefox, Chrome, Safari, IE, etc?
Aside from writing your own timeout or utilising a promise chain to process data, you won't be able to change the timeout yourself.
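As an illustration of the promise-chain idea, work can be split into chunks so control returns to the browser between pieces and the slow-script watchdog never fires (a sketch; processInChunks is a made-up helper):
// Processes a large array in slices, yielding to the event loop between slices.
function processInChunks(items, chunkSize, processItem) {
    var i = 0;
    function step() {
        var end = Math.min(i + chunkSize, items.length);
        for (; i < end; i++) {
            processItem(items[i]);
        }
        if (i < items.length) {
            // Yield, then continue with the next chunk.
            return new Promise(function (r) { setTimeout(r, 0); }).then(step);
        }
    }
    return Promise.resolve().then(step);
}

// Usage: processInChunks(bigArray, 1000, doWork).then(function () { console.log('done'); });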
There are several ways to restructure an infinite loop so it never triggers the timeout. One of the shorter methods is setInterval:
setInterval(function() {
    document.body.innerHTML += "Hello!<br>";
}, 1000); // The second argument is the interval, in milliseconds, at which the function is executed.
Another, often better, method is window.requestAnimationFrame. This gives you a higher-quality animation (see Why is requestAnimationFrame better than setInterval or setTimeout for more details); here's an example of it in motion:
function run() {
    setTimeout(function() {
        window.requestAnimationFrame(run);
        document.body.innerHTML += "Hello!<br>";
    }, 1000); // The setTimeout throttles the loop; take it out and the function will loop as quickly as possible without breaking the browser.
}
run();
D. Joe is correct. In Firefox, browse to about:config and change the number-of-seconds value in dom.max_script_run_time.
See http://kb.mozillazine.org/Dom.max_script_run_time for details; you can also eliminate the timeout entirely.
This will partially answer your question, but here is how I handle such cases. It is more a workaround than a fix.
IMPORTANT: Do not put this code in production environment. It should only be used in local dev while debugging.
As I tend to be debugging when I stumble upon this kind of case, I am most likely using console.log to output to the console. As such, I override the console.log function as follows, anywhere near the entry point of my app:
const log = console.log
const maxLogCount = 200
let currentLogCount = 0

console.log = function (...msg) {
    if (currentLogCount >= maxLogCount) {
        throw new Error('Maximum console log count reached')
    }
    currentLogCount++
    log(...msg)
}
Then when I accidentally do this
while (true) {
    console.log("what is going on?")
}
It will error out after 200 outputs, which stops the tab from locking up for half a minute and saves me from having to kill it and open a new one.
It is usually just the browser's way of saying the script is out of memory. You could solve this by using for loops, or by creating an index variable and incrementing it each time, like so:
var index = 0;
while (true) {
    if (index > certainAmount) {
        break;
    }
    index++;
}
If you really want something to go on forever, read about setInterval() or setTimeout().

Try-catch errors across many script components - JavaScript

I have a page that contains many script components (50+), and at some random point when using IE I get an error (it doesn't happen in Chrome or Firefox):
"Out of Memory at line: 1"
I've done some Google searching too, and that reveals issues with IE handling things differently from Chrome and FF. I would like to catch this error and know exactly what the cause of that script error is.
What would be the best way to use a global try-catch block on that many script components? All these script components are on the same page. Looking forward to your suggestions.
You might want to try window.onerror as a starting point. It will need to be added before the <script> tags that load the components.
https://developer.mozilla.org/en-US/docs/Web/API/GlobalEventHandlers/onerror
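A minimal sketch of that starting point (placed in an inline script before the component <script> tags):
// Global handler: logs any uncaught error with its source location.
// Note: older IE versions only pass the first three arguments.
window.onerror = function (message, source, lineno, colno, error) {
    console.log("Uncaught error: " + message + " at " + source + ":" + lineno);
    return true; // suppresses the browser's default error reporting
};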
If that fails, you might try reducing the components loaded by half until the error no longer occurs. Then, profile the page (you may have to reduce further due to the demand of profiling). Look for a memory leak as #Bergi suggested. If there is in fact a leak, it will likely occur in all browsers, so you can troubleshoot in Chrome as well.
If that still fails to yield anything interesting, the issue may be in one particular component that was not in the set of components you were loading. Ideally, anytime that component is included you see the issue. You could repeatedly bisect the loaded components until you isolate the culprit.
Finally, I forgot to mention: your home base for all of this should be the browser's developer tools, e.g. Chrome dev tools or, if the issue is unique to Edge, the Edge debugger.
And FYI, Edge is the browser that crashes, but that does not mean the issue is not present in Chrome or FF.
One important thing that is missing from your question is whether the error happens during page loading or initialization, or only after some time while you browse the page.
If it's during loading or initialization, it's probably caused by the fact that your page contains too many components and uses much more memory than the browser is willing to accept (and IE is simply the first one to give up).
In such a case there is no help for it but to reduce the page size. One possible approach is to create only the objects (components) that are currently visible (in the viewport) and, as soon as they get out of the viewport, remove them from JS and the DOM again (replacing them with empty DIVs sized to match the components).
In case the error happens while browsing the page, it may be caused by a memory leak. You may use Process Explorer to watch the memory used by your browser and check whether it constantly increases, which would indicate a memory leak.
Memory leaks in Internet Explorer can happen because it contains two separate garbage collectors (GCs): one for DOM objects and another for JS properties. Other browsers (FF, WebKit, Chromium, etc.; not sure about Edge) contain only one GC for both DOM and JS.
So when you create a circular reference between a DOM object and a JS object, IE's GCs cannot correctly release the memory, and you get a leak.
var myGlobalObject;

function SetupLeak() {
    myGlobalObject = document.getElementById("LeakDiv");
    document.getElementById("LeakDiv").expandoProperty = myGlobalObject;

    // When the reference is not required anymore, make sure to release it:
    myGlobalObject = null;
}
After this code it seems the LeakDiv reference was freed, but LeakDiv still references myGlobalObject through its expandoProperty, which in turn references LeakDiv. Other browsers' GCs can recognize such a situation and release both myGlobalObject and LeakDiv, but IE's GCs cannot, because neither of them knows whether the object referenced from the other side is still in use.
Even less visible is a circular reference created by a closure:
function SetupLeak() {
    // The leak happens all at once
    AttachEvents(document.getElementById("LeakedDiv"));
}

function AttachEvents(element) {
    // Attach an event to the element (attachEvent is the legacy IE API)
    element.attachEvent("onclick", function() {
        element.style.display = 'none';
    });
}
In this case the LeakedDiv's onclick property references the handler function, whose closed-over element variable references the LeakedDiv.
To fix these situations you need to properly remove all references between DOM objects and JS variables:
function FreeLeak() {
    myGlobalObject = null;
    document.getElementById("LeakDiv").expandoProperty = null;
}
And you may want to reduce (or remove completely) closures created on DOM elements:
function SetupLeak() {
    // There is no leak anymore
    AttachEvents("LeakedDiv");
}

function AttachEvents(element) {
    // Attach the event to the element, looking it up by id each time,
    // so the closure only captures a string, not a DOM node
    document.getElementById(element).attachEvent("onclick", function() {
        document.getElementById(element).style.display = 'none';
    });
}
In both cases using try-catch is not an option, because the out-of-memory error may happen at random places in the code, and even if you find one line where it happened, the next time it may be elsewhere. Process Explorer is your best chance to find the situations where the memory increases and to work out what may be causing them.
For example: if the memory increases every time you open and close the menu (if you have one), then you should look at how it's being opened and closed and check for the situations described above.
You could check your localStorage size before and after any component is loaded or called.
Something like:
function getLocalStorage() {
    return JSON.stringify(localStorage).length;
}

function addScript(src, log) {
    if (log) {
        console.log("Adding " + src + ", local storage size: " + getLocalStorage());
    }
    var s = document.createElement('script');
    s.setAttribute('src', src);
    document.body.appendChild(s);
}

function callFunction(func, log) {
    if (log) {
        console.log("Calling " + func.name + ", local storage size: " + getLocalStorage());
    }
    func();
}

try {
    addScript(src1, true);
    addScript(src2, true);
    callFunction(func1, true);
    callFunction(func2, true);
}
catch (err) {
    console.log(err.message);
}
I hope it helps you. Bye.

javascript / nodejs: synchronously calling set of similar asynchronous functions on large, chunked data set causes memory leak

I will try to simplify and explain my issue:
I am attempting to take a large database (mysql) of entries (1,000,000+) and reindex everything in another database (elasticsearch). In order to do this successfully I have to break the entries up into chunks to send reasonably sized requests to the new db and not load too much into memory at once (I don't have limitless memory on my server).
Here are three different versions of attempted solutions that all lead to similar memory leaks (usage just keeps growing past 1 GB until the script is finished):
1.
function sync(start_id) {
    get_messages_from_db(start_id, function(messages) {
        if (messages.length == 0) return undefined;
        index_in_elastic_search(messages, function() {
            var next_start_id = messages[messages.length - 1].id;
            setTimeout(function() {
                sync(next_start_id);
            }, 0);
        });
    });
}
sync(0);
2.
start_ids = [0, 100, 200... 1000000]
requests_to_sync = []
start_ids.forEach(function(id) {
    requests_to_sync.push(get_messages_from_db.bind(undefined, id))
});

function sync_requests(requests) {
    if (requests.length == 0) return undefined;
    requests.shift()(function(messages) {
        index_in_elastic_search(messages, function() {
            setTimeout(function() {
                sync_requests(requests);
            }, 0);
        });
    });
}
sync_requests(requests_to_sync);
Finally, I tried the node async library because I thought it might have solved this problem:
3.
start_ids = [0, 100, 200... 1000000]
async.eachSeries(start_ids, function(id, next) {
    get_messages(id, function(messages) {
        index_in_elastic_search(messages, function() {
            next();
        });
    });
});
So yeah, each of these solutions is similar. In practice they all take about the same amount of time to execute, and all lead to a memory leak.
I'm guessing the leak has to do with references to messages not being cleared properly at each iteration and probably something to do with recursion.
If anyone can shed light on why the heap keeps growing, or how I can fix this... that would be great.
Dayum, 1 million records. So, I'm guessing the garbage collector is practically never getting a chance to do any work. I'm not like a guru on how js garbage collection works, but I know a lil bit. For something like this, I would try to avoid making functions. I always consider function creation a costly process, both in terms of memory and cpu usage. So... maybe try something a bit more spread out like:
/* globals get_messages, index_in_elastic_search */
var state = {
    curr_id: 0,
    messages: null,
    next_func: null
};

get_more_messages();

function get_more_messages() {
    get_messages(state.curr_id, on_messages);
}

function on_messages(messages) {
    if (messages.length) {
        delete state.messages; // this might help, maybe not
        state.messages = messages; // this is probably sufficient to start garbage collection of the old messages
        state.next_func = process_messages;
        gc_love();
    } else {
        // maybe call state.done() here as a final callback
    }
}

function process_messages() {
    state.curr_id = state.messages[state.messages.length - 1].id;
    state.next_func = get_more_messages;
    index_in_elastic_search(state.messages, gc_love); // wait 20 ms and call get_more_messages
}

function gc_love() {
    setTimeout(state.next_func, 20);
}
So we're only ever creating four functions and one object here. If you use all those closures, you're creating quite a few more functions and a lot more work for JS scoping, which is never a good thing.
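For comparison, on a modern Node version the same loop can be written with async/await, which also avoids building a new closure chain per batch (a sketch, assuming hypothetical promisified versions of the two helpers):
// getMessages(id) and indexInElasticSearch(messages) are assumed to return promises.
async function syncAll() {
    let startId = 0;
    for (;;) {
        const messages = await getMessages(startId);
        if (messages.length === 0) break;
        await indexInElasticSearch(messages);
        startId = messages[messages.length - 1].id;
        // Nothing from this iteration is retained, so each batch can be collected.
    }
}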

Performance heavy algorithms on Node.js

I'm creating some algorithms that are very performance heavy, e.g. evolutionary and artificial intelligence. What matters to me is that my update function gets called often (precision), and I just can't get setInterval to update faster than once per millisecond.
Initially I wanted to just use a while loop, but I'm not sure that those kinds of blocking loops are a viable solution in the Node.js environment. Will Socket.io's socket.on("id", cb) work if I run into an "infinite" loop? Does my code somehow need to return to Node.js to let it check for all the events, or is that done automatically?
And last (but not least), if while loops will indeed block my code, what is another solution to getting really low delta-times between my update functions? I think threads could help, but I doubt that they're possible, my Socket.io server and other classes need to somehow communicate, and by "other classes" I mean the main World class, which has an update method that needs to get called and does the heavy lifting, and a getInfo method that is used by my server. I feel like most of the time the program is just sitting there, waiting for the interval to fire, wasting time instead of doing calculations...
Also, I'd like to know if Node.js is even suited for these sorts of tasks.
You can execute heavy algorithms in a separate process using child_process.fork and wait for the results in the main process via child.on('message', function (message) { });
app.js
var child_process = require('child_process');
var child = child_process.fork('./heavy.js', ['some', 'argv', 'params']);

child.on('message', function(message) {
    // heavy results here
});
heavy.js
while (true) {
    if (Math.random() < 0.001) {
        process.send({ result: 'wow!' });
    }
}
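One caveat: the busy loop above blocks heavy.js's own event loop, so the child cannot receive messages while it spins. If the parent also needs to send work to the child, the child has to listen and yield; a hedged sketch (the field names and doHeavyComputation are made up):
heavy.js
// Receive work from the parent, compute, and send the result back.
process.on('message', function (work) {
    var result = doHeavyComputation(work); // hypothetical heavy function
    process.send({ result: result });
});
app.js
child.send({ task: 'evolve', generations: 1000 });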
