Open this page: http://sunnah.com/abudawud/2
And run this simple XPath search query in the console. Then the browser tab crashes:
for (var k = 0, kl = 2000; k < kl; k++) {
    console.log(k);
    var xpathResult = document.evaluate("//div[@class='hello']", document, null, XPathResult.ANY_TYPE, null);
}
On Chrome Version 46.0.2490.80 (64-bit), on a MacBook Pro running OS X 10.10.5.
Unfortunately I have to run XPath on this page a couple of thousand times to search for different elements, so I can't avoid making that many calls to evaluate.
The crash depends on the XPath expression: for some expressions it crashes, and for others it does not.
It fails consistently at the same count, which makes me think it is not a timing or garbage-collection issue.
I am not getting any error codes, so I am not sure where else to look.
Update
After further investigation we believe this is a legitimate Chrome bug, or at least a poor way of releasing memory. What happens is that if your XPath starts with / or //, the search context is expanded to the whole DOM, and for some reason Chrome keeps the DOM (or some other intermediary object) in memory. If the XPath starts with a relative path like div/p and the search scope (the second argument) is set to a portion of the DOM, memory consumption is much more reasonable and there is no crash. Thanks to @JLRishe for several hints that were very helpful in reaching this conclusion.
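For illustration, a minimal sketch of the second approach (the container selector and element names are placeholders, not the page's actual markup):
var container = document.querySelector('#main');   // hypothetical container that limits the search scope
for (var k = 0; k < 2000; k++) {
    // A relative path ("div/p" rather than "//div/p") evaluated against the container
    // keeps the search context small and memory usage reasonable.
    var result = document.evaluate("div/p", container, null, XPathResult.ANY_TYPE, null);
}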
Update2
I filed a bug on Chromium, but after a few months they closed it as WontFix. I have managed to work around it for the time being.
If I run your code on that page and watch Task Manager, I can see Chrome's working set increase to about 3.3 GB before it eventually crashes after about 1300 iterations.
Each XPath query is causing Chrome to allocate memory for the results and any operation involved in obtaining them, but it seems like it is not releasing any of the allocated memory because you are not releasing control of the thread.
I have found that the working set levels out at 1.65 GB and the operation finishes without crashing if I do this:
var k = 0;
var intv = setInterval(function () {
    console.log(k);
    var xpathResult = document.evaluate("//div[@class='hello']", document, null, XPathResult.ANY_TYPE, null);
    k += 1;
    if (k >= 2000) {
        clearInterval(intv);
    }
}, 0);
So something like that might be a possible solution.
This is still using considerable system resources, and that isn't even counting any values you might be storing in the course of your operation. I encourage you to seek out a smarter approach that doesn't require running quite so many XPath queries.
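One such approach, sketched under the assumption that you are looking elements up by class name (the class names here are placeholders): run a single snapshot query up front, index the results, and reuse that index instead of calling document.evaluate thousands of times.
var snapshot = document.evaluate("//div[@class]", document, null, XPathResult.ORDERED_NODE_SNAPSHOT_TYPE, null);
var byClass = {};
for (var i = 0; i < snapshot.snapshotLength; i++) {
    var el = snapshot.snapshotItem(i);
    (byClass[el.className] = byClass[el.className] || []).push(el);
}
// Later lookups are plain object reads instead of fresh XPath evaluations:
var hellos = byClass['hello'] || [];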
Related
I have a timer function that creates a large amount of data, declared with var. Why does the object not get garbage-collected? The number shown by usedJSHeapSize keeps growing. Task Manager in Chrome also shows memory increasing.
I'm testing this in Windows 10, Chrome, using VS 2017.
If I copy and paste the code into a separate file called test.html and open that in Chrome, it also shows the leak.
I've tested this code in Edge and IE (using Developer Tools instead of usedJSHeapSize) and I see no memory leak.
Is this an issue with Chrome?
<script type="text/javascript">
    function refreshTimer() {
        try {
            var longStr = new Array(1000000).join('*');
            document.getElementById('div1').textContent = 'usedJSHeapSize: ' + window.performance.memory.usedJSHeapSize;
        }
        catch (err) {
            document.getElementById('div1').textContent = 'refreshTimer: ' + err;
        }
    }
    window.setInterval("refreshTimer()", 3 * 1000);
</script>
<div id="div1" style="font-family:Calibri"></div>
I expect that there would be no memory leak because the large data object is declared with var and should be garbage-collected when it leaves scope.
Edit to my original post:
I have run Chrome with and without "--enable-precise-memory-info" and it makes no difference. I have observed memory growing in Chrome -> More Tools -> Task Manager and in the Windows Task Manager, with just one instance of Chrome running my test.html file.
The only links I can find that mention this as a possible bug in Chrome are these:
Javascript garbage collection of typed arrays in Chrome
https://bugs.chromium.org/p/chromium/issues/detail?id=232415
These are old posts though and I can't believe the bug would live this long.
So - I'm still perplexed.
1/2/2019 - adding a comment to move this question to the top of SO. If anyone knows, please add your thoughts.
You've run into a security measure in Chrome. Chrome does not expose true memory usage via window.performance.memory: unless you opt in to precise values, the numbers it reports are quantized, because attackers could use precise memory information to attack the web browser.
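If you want to check whether the memory is genuinely being retained rather than just not yet collected, one option (an addition on my part, not something in your snippet) is to start Chrome with --js-flags="--expose-gc", which exposes window.gc(), and force a collection before reading the heap size:
// Only available when Chrome is launched with --js-flags="--expose-gc"
if (window.gc) {
    window.gc();   // force a major garbage collection
}
document.getElementById('div1').textContent = 'usedJSHeapSize after gc: ' + window.performance.memory.usedJSHeapSize;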
While profiling my webapp I noticed that my server is lighting fast and Chrome seems to be the bottleneck. I fired up Chrome's "timeline" developer tool and got the following numbers:
Total time: 523ms
Scripting: 369ms (70%)
I also ran a few console.log(performance.now()) from the main Javascript file and the load time is actually closer to 700ms. This is pretty shocking for what I am rendering (an empty table and 2 buttons).
I continued my investigation by drilling into "Scripting":
Evaluating jQuery-min.js: 33ms
Evaluating jQuery-UI-min.js: 50ms
Evaluating raphael-min.js: 29ms
Evaluating content.js: 41ms
Evaluating jQuery.js: 12ms
Evaluating content.js: 19ms
GC Event: 63 ms
(I didn't list the smaller scripts, but they accounted for the remaining time.) I don't know what to make of this.
Are these numbers normal?
Where do I go from here? Are there other tools I should be running?
How do I optimize Parse HTML events?
For all the cynicism this question received, I am amused to discover they were all wrong.
I found Chrome's profiler output hard to interpret, so I turned to console.log(performance.now()). This led me to discover that the page was taking 1400 ms to load the JavaScript files, before I even invoked a single line of code!
This didn't make much sense, so I revisited Chrome's JavaScript profiler tool. The default sorting order, Heavy (Bottom Up), didn't reveal anything meaningful, so I switched over to Chart mode. This revealed that many browser plugins were being loaded, and they were taking much longer to run than I had anticipated. So I disabled all plugins and reloaded the page. Guess what? The load time went down to 147ms.
That's right: browser plugins were responsible for 90% of the load time!
So to conclude:
jQuery is substantially slower than native APIs, but this might be irrelevant in the grand scheme of things. This is why good programmers use profilers to find bottlenecks instead of optimizing blindly. Don't trust people's subjective bias or a "gut feeling". Had I followed people's advice to optimize away jQuery, it wouldn't have made a noticeable difference (I would have saved 100ms).
The timeline tool doesn't report the correct total time. Skip the pretty graphs and use the following tools...
Start simple. Use console.log(performance.now()) to verify basic assumptions.
Chrome's JavaScript profiler
Chart will give you a chronological overview of the Javascript execution.
Tree (Top Down) will allow you to drill into methods, one level at a time.
Turn off all browser plugins, restart the browser, and try again. You'd be surprised how much overhead some plugins contribute!
I hope this helps others.
PS: There is a nice article at http://www.sitepoint.com/jquery-vs-raw-javascript-1-dom-forms/ which helps if you want to replace jQuery with the native APIs.
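As a trivial, made-up illustration of what such a replacement looks like (the selectors here are placeholders):
var rows = $('#results .row');                                    // jQuery
var rowsNative = document.querySelectorAll('#results .row');     // native DOM, no library overhead

$('#results').html('<tr><td>done</td></tr>');                     // jQuery
document.getElementById('results').innerHTML = '<tr><td>done</td></tr>';   // native DOM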
I think Parse HTML events happen every time you modify the inner HTML of an element, e.g.
$("#someiD").html(text);
A common style is to repeatedly append elements:
$.each(something, function() {
    $("#someTable").append("<tr>...</tr>");
});
This will parse the HTML for each row that's added. You can optimize this with:
var tablebody = '';
$.each(something, function() {
    tablebody += "<tr>...</tr>";
});
$("#someTable").html(tablebody);
Now it parses the entire thing at once, instead of repeatedly parsing it.
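Another option (a sketch of my own, with placeholder row contents) is to avoid HTML parsing entirely by building DOM nodes into a DocumentFragment and appending it in one operation:
var fragment = document.createDocumentFragment();
$.each(something, function() {
    var row = document.createElement('tr');
    var cell = row.appendChild(document.createElement('td'));
    cell.textContent = '...';                 // placeholder cell contents
    fragment.appendChild(row);
});
document.getElementById('someTable').appendChild(fragment);   // single DOM insertion, no HTML parsing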
Is it possible to free the browser's memory of JavaScript objects?
I am designing the asynchronous navigation process of a web app, in which page parts are loaded through AJAX requests together with their necessary components (CSS, JS, images, etc.). I suspect that, without careful scripting, long usage of the app would load many different objects and cause severe memory growth.
As I understand it, removing a script tag from the DOM only removes the DOM node, not the objects and functions it defines, which remain loaded in the browser's memory.
(A similar test: Should a "script" tag be allowed to remove itself?).
I also tried to test the browser's behaviour by setting a variable to a big string and then overwriting it:
<html>
<head>
    <title>JS memory test</title>
    <script type="text/javascript">
        function allocate() {
            window.x = '';
            var s = 'abcdefghijklmnopqrstuvwxyz0123456789-';
            for (var i = 0; i < 1000000; ++i) {
                window.x += 'hello' + i + s;
            }
            alert('allocated');
        }
        function free() {
            window.x = '';
            alert('freed');
        }
    </script>
</head>
<body>
    <input type="button" value="Allocate" onclick="javascript:allocate();" />
    <input type="button" value="Free" onclick="javascript:free();" />
</body>
</html>
Data from the Chrome task manager:
page weight when loaded: 5736 KB
page weight after having set the variable: 81720 KB
page weight after having reset the variable: 81740 KB
This surprises me, since I thought the JS garbage collector would throw away the old variable value once it is overwritten by the newly assigned value.
Firefox does not have per-tab processes, so there is no process monitor comparable to Chrome's, but its global memory usage seems to behave much like Chrome's.
Am I making a wrong assumption somewhere?
Can somebody provide suggestions about effective programming practices or useful resources to read?
Thanks a lot.
You have no direct access to memory management features. The best you can do is remove references to objects so that they are available for garbage collection, which will run from time to time as the browser sees fit.
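For example, a minimal sketch (the names are made up): drop every reference to the data and let the collector reclaim it whenever it decides to run.
var cache = { big: new Array(1000000).join('*') };   // large data that is no longer needed
cache.big = null;     // or: delete cache.big; the string becomes unreachable
cache = null;         // no references remain, so the object is eligible for collection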
From time to time certain browsers exhibit memory leaks. There are strategies to reduce them, but usually for specific cases (e.g. the infamous IE circular-reference leak, which has been fixed, even in updated versions of IE 6, as far as I know).
It's probably not worth worrying about unless you find you have issues. Deal with it then.
I have recently started to tinker with Project Euler problems, and I try to solve them in JavaScript. Doing this I tend to produce many endless loops, and now I'm wondering: is there any better way to terminate the script than killing the tab in Firefox or Chrome?
Also, is Firebug still considered the "best" debugger? (Myself, I can't see much difference between Firebug and the web developer tools in Safari/Chrome.)
Anyhow, have a nice Sunday!
Firebug is still my personal tool of choice.
As for a way of killing your endless loops: some browsers will prevent this from happening altogether. However, I still prefer just hitting Ctrl + W, though this still closes the tab.
Some of the other alternatives you can look into:
Opera : Dragonfly
Safari / Chrome : Web Inspector
Also, Opera has a nice set of developer tools which I have found pretty useful (Tools -> Advanced -> Developer Tools).
If you don't want to put in code to explicitly exit, try using a conditional breakpoint. If you open Firebug's script console and right-click in the gutter next to the code, it will insert a breakpoint and offer you the option to trigger the breakpoint only when some condition is met. For example, if your code were this:
var intMaxIterations = 10000;
var go = function() {
    while (intMaxIterations > 0) {
        /* DO SOMETHING */
        intMaxIterations--;
    }
};
... you could either wait for all 10,000 iterations of the loop to finish, or you could put a conditional breakpoint somewhere inside the loop and specify the condition intMaxIterations < 9000. This will allow the code inside the loop to run 1000 times (well, actually 1001 times). At that point, if you wish, you can refresh the page.
But once the script goes into an endless loop (either by mistake or by design), there's not a lot you can do that I know of to stop it from continuing if you haven't prepared for this. That's usually why, when I'm doing anything heavily recursive, I place a limit on the number of times a specific block of code can run. There are lots of ways to do this. If you consider the behaviour to be an actual error, consider throwing it. E.g.
var intMaxIterations = 10000;
var go = function() {
    while (true) {
        /* DO SOMETHING */
        intMaxIterations--;
        if (intMaxIterations < 0) {
            throw "Too many iterations. Halting";
        }
    }
};
Edit:
It just occurred to me that because you are the only person using this script, web workers are the ideal solution.
The basic problem you're seeing is that when JS goes into an endless loop, it blocks the browser, leaving it unresponsive to any events that you would normally use to stop the execution. Web workers are still just as fast, but they leave your browser unburdened and events fire normally. The idea is that you pass off your high-demand tasks (in this case, your Euler problem algorithm) to a web worker JS file, which executes in its own thread and consumes CPU resources only when they are not needed by the main browser. The net result is that your CPU still spikes like it does now, but your browser stays fast and responsive.
It is a bit of a pest setting up a web worker the first time, but in this case you only have to do it once. If your algorithm never returns, just hit a button and kill the worker thread. See Using Web Workers on MDC for more info.
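A minimal sketch of that setup (the file name and message payload are made up): the worker runs the heavy loop in its own thread, and the page can kill it at any time with terminate().
// euler-worker.js (runs in its own thread)
onmessage = function (e) {
    var total = 0;
    for (var i = 0; i < e.data; i++) {   // stand-in for the real Euler algorithm
        total += i;
    }
    postMessage(total);
};

// main page (assumes euler-worker.js is served next to it)
var worker = new Worker('euler-worker.js');
worker.onmessage = function (e) { console.log('answer: ' + e.data); };
worker.postMessage(1000000000);
// If it runs away, the page stays responsive and you can simply call:
// worker.terminate();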
While having Firebug or the webkit debuggers is nice, a browser otherwise seems like overhead for Project Euler stuff. Why not use a runtime like Rhino or V8?
I am creating a really big JavaScript object on page load. I am getting no error on Firefox, but on Internet Explorer I am getting an error saying:
Stop running this script ?
A script on this page is causing your web browser to run slowly. If it continues to run, your computer might become unresponsive.
Is there any size limit for JavaScript objects in Internet Explorer? Are there any other solutions, short of dividing up the object?
The key to the message you're receiving is the "run slowly" part, which relates to time. So your issue is not object size, but the time taken to construct the object.
To refine things even further, the issue is not the time taken to construct the object either. Rather, IE counts the number of JavaScript statements it executes, resetting this count when it executes an event handler or a setTimeout function.
So, you can prevent this problem by splitting your code into multiple pieces that run inside calls to setTimeout(...);
Here's an example that may push you in the right direction:
var finish = function(data) {
    // Do something with the data after it's been created
};
var data = [];
var index = 0;
var loop;
loop = function() {
    if (++index < 1000000) {
        data[index] = index;
        setTimeout(loop, 0);
    } else {
        setTimeout(function() { finish(data); }, 0);
    }
};
setTimeout(loop, 0);
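In practice, one timeout per item is very slow, so a common variant (my own sketch, reusing finish, data, and index from the snippet above) does a chunk of work per timeout; the statement count per slice stays low, but you schedule a few hundred timeouts instead of a million.
var CHUNK = 10000;                        // items processed per time slice
function loopChunk() {
    var end = Math.min(index + CHUNK, 1000000);
    for (; index < end; index++) {
        data[index] = index;              // the expensive per-item work goes here
    }
    if (index < 1000000) {
        setTimeout(loopChunk, 0);         // yield to the browser between chunks
    } else {
        finish(data);
    }
}
setTimeout(loopChunk, 0);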
The resources available to JavaScript are limited by the resources on the client computer.
It seems that your script is using too much processing time while creating that object, and the 'stop script' mechanism is kicking in to save your browser from hanging.
The reason why this happens on Internet Explorer and not on Firefox is probably because the JavaScript engine in Firefox is more efficient, so it does not exceed the threshold for the 'stop script' to get triggered.
That is not because of the size but because of the large number of loops you are executing and the long execution time. If you cut it into smaller parts, it should work fine.
Try lowering the complexity of the functions you're running. Can you post the code so that we can look at the loops and try to help?
Edit:
I suppose you want to do all of that on the client side for some reason, but the code seems to require too much execution time to run on the client side.
Can't you do the calculations on the server side? If this is all to initialize the object, you can cache it to avoid reprocessing and just send the generated JSON to the JavaScript side.
It does seem cacheable.
You must be using big loops or some recursive logic in the code. It really doesn't depend on the size of the object; it depends on the resources it uses (memory, CPU time, etc.).