I have a web portal, cricandcric.com, which I built using PHP, JavaScript, and MySQL.
I don't see the JavaScript error in Firefox, but I do see it in IE.
I also observe that the site loads faster in Firefox than in IE.
So my question: can JavaScript errors slow down the site's loading time, even when the JavaScript is placed at the end of the page (per the YSlow strategies)?
It depends. If the error happens early on and a lot of your script code is bypassed, it could actually make things faster. But every time an error occurs there is some overhead (the exception object has to be built and sent up the call stack to look for any catch blocks), so if it happens at the end, the script will run slower.
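If you want a rough feel for that overhead, here is a quick micro-benchmark sketch (the loop count is arbitrary, and the numbers will vary a lot between browsers and machines):
<script>
var t0 = (new Date()).getTime();
for (var i = 0; i < 100000; i++) {
  // plain loop body, no error thrown
}
var t1 = (new Date()).getTime();
for (var j = 0; j < 100000; j++) {
  // same loop, but an exception is built and caught on every iteration
  try { throw new Error("boom"); } catch (e) { /* swallowed */ }
}
var t2 = (new Date()).getTime();
alert("plain: " + (t1 - t0) + "ms, throwing: " + (t2 - t1) + "ms");
</script>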
But I doubt your change in load time is noticeably impacted by script errors. How long the script takes to execute in the browser's JS engine, or a host of other factors, will have more impact.
IE's JavaScript engine has always been notably behind the other common browsers in performance, so it really might come down to that more than anything else. One of the many improvements in IE9 is JS execution speed that's actually competitive.
That said, the JS error probably is worth looking into, since it occurs every time the image slideshow advances, which happens once every couple of seconds.
If you're concerned about performance in general, there are a couple of tools, like YSlow and the recently open-sourced DOM Monster bookmarklet, that give suggestions on general ways to speed up websites. You might also want to check out some of the writings of Steve Souders.
Related
The project I'm working on involves a "planning" screen, which is entirely built with Backbone.js (the other pages of the app are not).
My issue is that from time to time, Chrome freezes and the web view stops responding to any interaction. Sometimes I can manage to quit Chrome itself, but usually its controls don't respond either.
I'm pretty convinced this is related to the JS code. It seems to me that when a script takes too much time, or loops indefinitely, Chrome can detect this and interrupt the script. Since that isn't happening here, I'm thinking that too many JS objects are staying in memory.
Whatever the cause is, I would like to know which Chrome dev tools can help me here. While I'm not a beginner in JS, aside from setting breakpoints and calling console.log I have no idea how to debug JS apps. I'm not opposed to using another browser if its dev tools are better suited.
Thanks a lot for your time!
FTR: this is a Rails 3.2.8 app using MongoDB and Backbone.js 0.9.2. The JS code is written in CoffeeScript. The issue happened on my 2012 MacBook Air running Mountain Lion, as well as on the client's machine running Windows 7. It appeared on at least Chrome 22 and 23.
Using the JavaScript CPU profiler, I was able to find the group of functions that seems to be responsible for the freezing.
I'm still open to any advice/resources on debugging JavaScript code.
Add a console.log in the loop and check whether it's the same point that freezes on all Chrome versions. There is a limit; see Browser JavaScript stack size limit.
Maybe add some code? There could be memory leaks, especially with event handlers!
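As a purely hypothetical illustration of the kind of event-handler leak meant here (none of these names come from the actual app), the classic Backbone version looks like this:
var ItemView = Backbone.View.extend({
  initialize: function () {
    // The model now holds a reference to this view through the handler.
    // If the view is thrown away without a matching model.off(...),
    // the view (and everything it references) can never be collected.
    this.model.on("change", this.render, this);
  },
  render: function () {
    this.$el.html(this.model.get("title")); // made-up attribute
    return this;
  }
});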
What I would do in this case is take the long and weary approach: getting your hands dirty with console.log.
In your case (and thanks for the hint, by the way; finding the offender with the CPU profiler is something I'll try next time round), I'd guess the offenders are some functions "calling themselves back", probably via some combination of event handling, bubbling, and callbacks.
What usually happens then is that the browser just doesn't recognize the endless loop, because the callback stack is kind of "broken", so it will never throw a browser error.
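A tiny hypothetical sketch of what I mean (nothing here is from the actual app):
function ping() { setTimeout(pong, 0); } // each step schedules the next...
function pong() { setTimeout(ping, 0); } // ...in a brand-new task
ping();
// The call stack stays shallow on every tick, so the browser's
// "long running script" detection never fires. If each step also does
// heavy work or leaks objects, the tab gradually becomes unresponsive.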
If you have any luck, it doesn't kill the browser quickly enough to kill the console. I have had that at times: the console killed (remaining silent), the CoffeeScript files not even loaded in the debugger (I use a JIT coffee-to-JS translator, as in JS-MVC)... the page frozen or not doing anything...
So, if indeed you are lucky and the debugger spits out your console.logs, you can guess where your unwanted loop is hiding just by looking at the repeated order of the output statements.
Of course you know you can set breakpoints, right? Even conditional breakpoints.
I'm currently struggling a bit untangling a similar-sounding loop. I guess in case of emergency some alert() calls in your code would at least slow the loop down.
Or maybe some wait calls (bad idea).
My loop is running so fast I lose my console log!
I have a script that has a few timeouts, draws a menu using Raphael, and uses all sorts of dynamic properties to accomplish this. You can take a look at it over at jsfiddle: JSFiddle version of my code.
My problem is that Firefox, just occasionally, throws the "a script is running slowly" error when this page is open, sometimes after half a minute or more. Typically I'll have hovered over one element or so, so one sub-menu is open at the time. The error usually doesn't point to any of my JS files; sometimes it even points to Firefox's own files.
How do I debug this, and is it possible at all? Any tips are appreciated. (I'm using chronos for the timers now; it didn't seem to help.)
Profile your code
You probably want to do Profiling, i.e. performance analysis of your code. As others have pointed out, Firebug is a good tool in Firefox. More specifically: In Firebug, click Profile in the Console tab. Then play with your navigation a bit and click Profile again to finish the analysis and get the results. They will tell you which functions were called how often and how long their execution took. This will help you identify performance bottlenecks in your code.
Optimize
At a quick glance I would say you could probably still optimize the DOM queries. Save references to the results of your expensive queries, and use contexts where possible to make queries more efficient.
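For instance (a rough sketch; the selectors are made up, not taken from your fiddle):
// Run an expensive query once and keep the reference around,
// instead of re-querying the document on every call:
var $menu = $("#menu");
$menu.find(".item").hide();

// Passing a context restricts the search to $menu's subtree
// rather than scanning the whole document:
$(".label", $menu).show();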
Also, requestAnimationFrame() now seems to be the way to go for JavaScript animations, instead of setTimeout().
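A minimal sketch of what that looks like (drawFrame is a placeholder for your own drawing code, and older browsers may need a vendor-prefixed or polyfilled requestAnimationFrame):
function step() {
  drawFrame();                        // placeholder: your drawing code
  window.requestAnimationFrame(step); // schedule the next frame
}
window.requestAnimationFrame(step);   // start the loop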
Most browsers have a mechanism of some sort to allow users to stop "long running scripts".
How a script is deemed to be "long running" varies slightly between browsers, and is either the number of instructions executed within an 'execution context' or its elapsed duration.
JavaScript, when running in a browser, is predominantly event-driven (except for the immediate execution of JavaScript as the page is being parsed). The browser will wait until an execution context has completed before doing anything, and any display refreshing etc can be blocked whilst waiting for JavaScript to execute.
You basically need to break up your instructions into responses to events. If you have a massive loop anywhere (e.g. to do an animation), you really need to use an interval to break up the execution cycles. E.g. var interval = window.setInterval(refreshDisplay, 50); - this will call a refreshDisplay function (with no arguments) every 50 milliseconds (a crude 20 calls-per-second).
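For example, here is a sketch of slicing one big loop into interval-sized pieces (workItems and processItem are hypothetical names):
var i = 0;
var interval = window.setInterval(function () {
  // Do a small slice of the work on each tick, then yield to the browser
  for (var n = 0; n < 100 && i < workItems.length; n++, i++) {
    processItem(workItems[i]);
  }
  if (i >= workItems.length) {
    window.clearInterval(interval); // all done, stop the timer
  }
}, 50);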
Try Firebug, a Firefox add-on for debugging JavaScript. You can download it here:
http://getfirebug.com/downloads/
Use F12 to show Firebug.
With Firebug you can debug JavaScript with breakpoints and see all AJAX calls. This will help you!
There's probably some operation that's running slowly, or one of your loops has bad complexity. You can use binary splitting: remove half of your code, see if there's a dramatic improvement in timing, and repeat within the half of the code that's running the slowest.
I am in the middle of creating a website that requires a lot of script work. Because of the extensive data load (we are loading a listing from YouTube), the browser hangs up with an unresponsive script error.
The data is being loaded using AJAX.
Can anyone suggest how to tackle this issue?
Your suggestions will be highly appreciated.
Thanks in advance,
J
You can change the dom.max_script_run_time and dom.max_chrome_script_run_time values, but that only makes the warning come up less often; it doesn't fix the underlying problem.
The best thing you can do is find the root of the problem. I think you use Firefox, so when the script runs too slowly for too long, just click the Stop Script button and go to the Error Console.
More info is here.
Without knowing details, I can offer only these generalizations. Maybe one or more will help:
The jQuery .ajax call takes an option that makes the call synchronous. That, combined with an unresponsive server, would cause the problem you describe. So make sure you haven't mistakenly set that option.
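For reference, this is the option in question (the URL is a placeholder); with async set to false, the whole UI freezes until the server responds:
$.ajax({
  url: "/youtube/listing", // placeholder URL
  async: false,            // blocks the browser; leave this at the
                           // default (true) unless you really must
  success: function (data) { /* ... */ }
});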
On some browsers, FF at least, running out of memory causes an unresponsive script error. Chrome has a nice feature where you can open a new tab, enter "about:memory" into the address bar, and get a quick overview of how much memory other Chrome tabs/windows are using. Do you see anything outrageous there?
On FF and IE, I have seen an infinite loop cause an unresponsive script error. Use a profiler to detect this condition.
A profiler, like the one built in to Firebug for example, can help you pin down where in your script the problem is occurring. Knowing that, you can look for opportunities to break up the code into smaller operations. E.g., if you are processing a large amount of data, perhaps you could make a recursive call in a setTimeout handler to process chunks of the data at a time.
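A sketch of that chunked approach (handleItem and youtubeListing are hypothetical names):
function processChunk(items, start) {
  var end = Math.min(start + 50, items.length); // 50 items per slice
  for (var i = start; i < end; i++) {
    handleItem(items[i]);
  }
  if (end < items.length) {
    // Yield to the browser before the next slice, so no single
    // execution block runs long enough to trip the watchdog.
    setTimeout(function () { processChunk(items, end); }, 0);
  }
}
processChunk(youtubeListing, 0);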
I'm writing a project which will use some fairly large JS libraries, including jQuery UI. The project will be run on an intranet, though, so download time is not really an issue for me, and most people should only have to download the libraries once, since I assume they will remain in the browser's cache.
My question is about how modern browsers (IE9, FF5, etc.) handle the processing of the JavaScript code. I imagine that at some point it is compiled, but is this done on each page load, or is the compiled code cached too? If so, is it cached even after the browser is closed?
This web app may run on some low powered portable devices so I wanted to be reasonably efficient. I wanted to combine all the javascript files into one large one that is linked to on every page of the app.
But depending on how much work the browser must do to process all the JS, I'm wondering if I should split the files up so that not every page must load all the JS. Obviously that's more work, though.
Any thoughts or info would be appreciated. Thank you.
Yes, JavaScript size is still a performance concern if it is cached for the following reasons:
Most browsers don't cache the byte code that they execute. So the script must still be re-parsed on every page load. Modern browsers are doing this faster, but it still may be a concern for very large JavaScript files.
The code in your JavaScript files is actually executed on every page load. Even if browsers start caching byte code, they still have to re-execute the JavaScript every time the page is loaded.
You may get a lower cache rate than expected, for reasons beyond your control. Users may disable their cache, or visit so many sites that your files get expired from the cache quickly. You should check your logs to make sure that the ratio of page loads to JavaScript file loads is high.
Loading a file from cache is usually pretty fast, but it's not always trivial. It can take upwards of 200ms in some cases.
You can do a pretty quick test to get a rough idea of how long your scripts take to parse and execute like this:
<script>
var startTime = (new Date()).getTime();
</script>
<script src="cachedFile1.js"></script>
<script src="cachedFile2.js"></script>
<!--all your scripts included this way-->
<script>
var endTime = (new Date()).getTime();
alert("Took " + (endTime - startTime) + " milliseconds to parse and execute");
</script>
Make sure to test on all the target browsers you support; JavaScript parse and execution time can vary wildly between different browsers. Also make sure that you test on a computer that is as slow as the ones your users will have. If you do find performance problems, you probably will need to solve them in a profiler. Minification won't help much for improving parse and execution time.
Minify your JavaScript files. This makes them smaller to download.
Note, though, that JavaScript is not compiled ahead of time; engines parse it (and modern ones JIT-compile it) at runtime, so the source still has to be processed on every page load.
http://en.wikipedia.org/wiki/Minification_(programming)
It doesn't really matter how much code there is, but how heavy it is. Nowadays browsers can run JS quite fast. Just try opening Gmail, for example (which is almost all JavaScript), with IE7 and then with IE9 or Chrome.
You need to look into the JavaScript engine each browser uses to get a better understanding; for instance, Google Chrome uses V8: http://code.google.com/p/v8/
Also this may help you understand a great deal better:
How do browsers execute javascript
How do browsers handle JavaScript?
If your use case is only in an intranet environment, then compared to other optimization techniques, code size is not really a problem for you. Presumably the target browsers are fairly new and have JS engines that can handle modern full-blown JavaScript sites, in which case the size of the code won't really affect the speed, since parsing takes so little time compared to execution. In older browsers, however, there might be a small speed difference compared to more optimized code, since they were never meant to run JavaScript that's thousands and thousands of lines long.
Compared to code execution optimization, code length optimization probably won't even be noticeable to the end user. You might be able to knock off a few ms by compressing the code, but most modern engines create a "map" when they parse the code, so that at execution time the location of a function, variable, etc. doesn't really matter. So worry more about overall code optimization than about library sizes.
You can listen to a talk about V8 internals here.
I never thought about this from that point of view, but I can't really see why bundling everything up into one big file wouldn't help*; the JavaScript build tools I know of focus on doing precisely this.
*unless you have a big module that is only rarely used. In that case, keep it separate so that not everyone has to load it.
I'm trying to profile the performance of a web site that I'm fairly confident is being slowed down by the loading of JavaScript files on the page.
The same JavaScript files are included several times on the page, and <script /> tags are scattered throughout the page instead of being included at the bottom.
As I suspected, when looking at Firebug's "Net" tab, most of the time (though not all) when JavaScript is being loaded, no other files are requested. The browser waits for the JavaScript to finish loading.
There are a few exceptions however. There are a few occasions where JavaScript is loaded, but then at the same time, other resources appear to get loaded, such as other JavaScript files and images.
I always thought that JavaScript blocks the loading of other resources on the page. Am I incorrect in thinking this, or does this behavior vary depending on the browser or browser version?
UPDATE:
To those who have explained how loading a script blocks the loading of other resources, I'm already aware of this. My question is why a script wouldn't block the loading of other resources. Firebug is showing that some JavaScript files do not block loading other resources. I want to know why this would happen.
JavaScript resource requests are indeed blocking, but there are ways around this (to wit: DOM-injected script tags in the head, and AJAX requests), which, without seeing the page myself, is likely what's happening here.
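For reference, the DOM-injection trick looks roughly like this (the URL is a placeholder):
var s = document.createElement("script");
s.type = "text/javascript";
s.src = "/js/widgets.js"; // placeholder URL
// Appending the element starts the download without blocking the
// parser the way an ordinary inline <script src> tag would.
document.getElementsByTagName("head")[0].appendChild(s);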
Including multiple copies of the same JS resource is extremely bad, but not necessarily fatal; it's typical of larger sites that have accreted from the work of separate teams, or of just plain old bad coding, planning, or maintenance.
As for Yahoo's recommendation to place scripts at the bottom of the body: this improves perceived response times, and can improve actual loading times to a degree (because all the preceding resources are allowed to load first), but it will never be as effective as non-blocking requests (though those come with a higher barrier of technical capability).
Pretty decent discussion of non-blocking JS here.
I'm not entirely sure that Firebug offers a true reflection of what is going on within the browser. Its timing for resource loading seems to be good, but I am not sure that it reflects exactly what is going on. I've had better luck using HTTP sniffers/proxy applications to monitor the actual HTTP requests coming from the browser. I use Fiddler, but I know there are other tools out there as well.
In short, this may be an issue with the tool and not with how resources are actually being loaded... at least it's worth ruling out.
I suppose you're using Firefox 3.0.10 and Firebug 1.3.3 since those are the latest releases.
The Firebug 1.4 beta has made many improvements to the Net tab, but it requires Firefox 3.5. If you want to test it in Firefox 3.0, use one of the earlier 1.4 alpha versions. But even with the improvements, I still struggle to understand the results. I wish the Firebug developers would document more precisely what each part of a download means. It doesn't make sense to me why queuing comes after connecting.
My conclusion was not to trust the results in Firebug, and I ended up using WebPageTest instead. Which was surprisingly good, coming from AOL ;-)
Also, what kind of resources are being loaded at the same time as the JavaScript? Try tracking down the resources that are loaded at the same time, and see whether they're referenced from CSS, an iframe, or HTML fetched via AJAX. I'm guessing the reason nothing else is loaded is that the browser stops parsing the current HTML when it sees a script tag (without defer). Since it can't continue parsing the HTML, it has nothing more to request.
If you could provide a link to the page you're talking about, it would help everyone give a more precise answer.
I believe that the content is downloaded, but not rendered until the JavaScript has finished loading.
From the server's point of view this is not much of a deal, but to the user it can make a huge difference in speed.
If you think about it, a script tag has to finish processing before the browser can continue to render content. What if the tag used document.write or some other wonderfully dumb thing? Until everything within the script tag has finished running, the page can't be sure what it's going to display.
Browsers usually have a set number of connections open to a single domain.
So, if you load all your scripts from the same domain, you will usually load them one after the other.
But if those scripts are loaded from several domains, they will be loaded in parallel.
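For example (the hostnames are placeholders):
<!-- these two can download in parallel, since they come from different domains -->
<script src="http://static1.example.com/a.js" type="text/javascript"></script>
<script src="http://static2.example.com/b.js" type="text/javascript"></script>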
The reason the browser blocks during JavaScript downloads is that it suspects there will be DOM nodes created inside the script.
For example, there might be document.write() calls inside the script.
A way to hint to the browser that the script does not contain any DOM generation is the "defer" attribute.
So,
<script src="script.js" type="text/javascript" defer="defer"></script>
should allow the browser to continue parallelizing the requests.
References:
http://www.w3.org/TR/REC-html40/interact/scripts.html#adef-defer
http://www.websiteoptimization.com/speed/tweak/defer/
As others have stated, the script is probably loading other resources through DOM injection.
Script.aculo.us actually loads its child components/scripts itself by doing this -- injecting other <script> tags into the DOM for them.
If you want to see whether or not this is the case, use Firebug's profiler and take a look at what the scripts are doing.
Like others said, one non-blocking way is to inject <script> tags in the page head.
But Firefox can also execute loaded <script>s in parallel:
Copy the two lines below:
http://streetpc.free.fr/tmp/test.js
http://streetpc.free.fr/tmp/test2.js
Then go to this page, paste in the input textarea, click "JavaScript", then "Load Scripts" (which builds and adds a <script> child element to the head).
Try that in FF: you'll see "test2 ok"; move the dialog box to see "test ok".
In other browsers, you should see "test ok" (with no other dialog behind it) and then "test2 ok" (except in Safari 4, which showed me test2 before test).
Firefox 3 introduced a connection parallelism feature to improve performance while loading a web page; I bet this is the source of your problem ;)
When you open a web page that has many different objects on it, like images, Javascript files, frames, data feeds, and so forth, the browser tries to download several of them at once to get better performance.
Here's the ZDNET blogpost about it.