Dealing with pages with many script files - javascript

Our project contains many pages, each with up to 20 tabs, and each tab works with different scripts. All the script files are referenced in <head> and load on the first page load. Now we have a performance issue because too many scripts load when the page is opened (about 2k lines of JavaScript per tab). The thing is, in most cases the user only needs to work with 2-3 tabs, so more than 60% of the code goes unused. So we need some script lazy-loading solution to lighten the pages. As the HTML for every tab is loaded on demand, we could put <script> references in every tab, which would give a working solution. But I'm pretty sure including references outside <head> is bad style.
So I wonder, is there another solution? How are such problems solved in big projects like ours? Any advice will be helpful.
Thanks in advance!

jQuery has a great function for this:
$.getScript("my_lovely_script.js", function() {
    alert("Script loaded and executed.");
    // here you can use anything you defined in the loaded script
});
By default this is not cached. Looking for a solution on the jQuery website, I found this recipe for a cached script include:
jQuery.cachedScript = function(url, options) {
    // Allow user to set any option except for dataType, cache, and url
    options = $.extend(options || {}, {
        dataType: "script",  // Note this
        cache: true,         // Enable caching
        url: url
    });
    // Use $.ajax() since it is more flexible than $.getScript
    // Return the jqXHR object so we can chain callbacks
    return jQuery.ajax(options);
};

// Usage
$.cachedScript("URL HERE").done(function(script, textStatus) {
    console.log(textStatus);
});
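If jQuery is not a hard requirement, the same lazy loading can be done by injecting a plain <script> element; a minimal sketch, with a hypothetical file name and a once-only guard:
// Load each tab's script at most once, on demand
var loadedScripts = {};

function loadTabScript(src, callback) {
    if (loadedScripts[src]) {   // already requested: just run the callback
        callback();
        return;
    }
    loadedScripts[src] = true;
    var script = document.createElement("script");
    script.src = src;           // injected scripts hit the browser cache like any other
    script.onload = callback;   // fires once the file has loaded and executed
    document.head.appendChild(script);
}

// Usage: call on tab activation instead of referencing the file in <head>
loadTabScript("scripts/reports-tab.js", function() {
    // hypothetical: initialize the tab now that its script is available
});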

Related

Javascript code isn't getting into my document ready listener. (forge iOS)

This is my entire javascript file for the home page of my app. Any ideas as to why it never gets into the document ready listener?
var photos;
forge.request.ajax({
    url: "http://photos-url.com/pics.json",
    dataType: "json",
    success: function(data) {
        photos = data;
    },
    error: function(error) {
        forge.logging.info("Couldn't fetch pics!");
    }
});
// logging output works here
$(function() {
    // logging output doesn't work here
    // I'm trying to append to the html here, but it never gets into this code
});
Cross-domain requests are prohibited for security reasons (same as in desktop browsers). You must configure the environment to allow requests to your domain. See https://trigger.io/docs/current/api/modules/request.html for details.
JSON files are usually allowed to be read cross-domain, and even if this one weren't, I still doubt it could affect the ready event. I'm not using the document ready function on my page as I was having similar issues (it fires a few minutes after the page is loaded, or doesn't fire at all). You could try the window.onload or document.onload events. I'd also try to find out how document.readyState behaves, and possibly check it manually with an interval or try to bind an event listener to it.
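For the document.readyState idea, a minimal polling sketch (initHomePage is a hypothetical stand-in for whatever your ready handler was doing):
// Poll document.readyState instead of relying on jQuery's ready event
var readyPoll = setInterval(function() {
    if (document.readyState === "complete") {
        clearInterval(readyPoll);
        initHomePage();   // hypothetical: safe to touch the DOM here
    }
}, 50);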

Inline cache <script> of ajax content

I need to cache some <script src> tags that I receive via AJAX. Currently each call tries to load the src via AJAX, which is the default. The problem is that this script never changes during a session, and I only need to re-eval it on the document.
To be clearer, take this example of an AJAX content result:
<strong>Hello World!</strong>
<script src="hello-world.js"></script>
If I call this AJAX three times, hello-world.js is requested three times too, but I only need to re-execute it, not download it again. The browser cache helps a lot, but I really shouldn't be downloading it again every time.
I'd like to set some data attribute on the script, so jQuery knows I only want to re-execute it instead of downloading it again. Like:
<script src="hello-world.js" data-cache="true"></script>
Any solution?
I think I found a good solution for my case... I just replaced src with data-src, so jQuery will not fetch the content automatically, which gives me time to work with my content, find the data-src, and build my own cache system. Works fine for me.
You can check my code here:
// Cache system (outside of AJAX engine)
var script_cache = {};

// Inside of the [jQuery.ajax] success method,
// where "data_html" is my jQuery(data_html) AJAX response.
// Find all scripts with data-src:
jQuery('script[data-src]', data_html).each(function() {
    var self = jQuery(this),
        self_src = self.data('src');

    // If the data was loaded before, just execute it again
    if (typeof script_cache[self_src] !== "undefined") {
        jQuery.globalEval(script_cache[self_src]);
    }
    // Else, load, cache and execute it now.
    // Note that we download with dataType "text".
    else {
        jQuery.ajax(self_src, {
            dataType: "text"
        }).success(function(data) {
            script_cache[self_src] = data;
            jQuery.globalEval(data);
        });
    }

    // Finally we remove the node, to avoid problems
    self.remove();
});
Alternative solutions are welcome.
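One alternative sketch: cache the in-flight jqXHR promise instead of the downloaded text, so two fragments that request the same src while the first download is still running share a single request:
// Promise-based variant: one download per src, even for concurrent calls
var scriptPromises = {};

function execCachedScript(src) {
    if (!scriptPromises[src]) {
        // First request for this src: start (and memoize) the download
        scriptPromises[src] = jQuery.ajax(src, { dataType: "text" });
    }
    // Every call re-executes the script once the text is available
    scriptPromises[src].done(function(code) {
        jQuery.globalEval(code);
    });
}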

How to index dynamic pages in Google using the HTML5 pushState method?

I am building a fully jQuery-powered website, so I am loading all pages dynamically using AJAX to achieve fancy transitions between pages and maximize the user experience.
Here is my javascript code:
$(function() {
    var path = _.compact(location.pathname.split("/"));
    if (path.length < 2) {
        path = 'home';
    } else {
        path = path[path.length - 1];
    }
    activepage = path;

    $('.nav a').click(function(e) {
        href = $(this).attr("href");
        loadContent(href);
        // HISTORY.PUSHSTATE
        window.history.pushState('', 'New URL: ' + href, href);
        e.preventDefault();
    });

    // THIS EVENT MAKES SURE THAT THE BACK/FORWARD BUTTONS WORK AS WELL
    window.onpopstate = function(event) {
        var path = _.compact(location.pathname.split("/"));
        if (path.length < 2) {
            path = 'home';
        } else {
            path = path[path.length - 1];
        }
        loadContent(path);
    };
});
function loadContent(url) {
    // USES JQUERY TO LOAD THE CONTENT
    var adresa = absurl + "ajax/get_content";
    $.ajax({
        url: adresa,
        contentType: 'application/json',
        data: { url: url },
        success: function(data) {
            switch (url) {
                case "projects":
                    $.projects({ data: data });
                    $.projects.init();
                    break;
                case "home":
                    $.homePage({ data: data });
                    $.homePage.init();
                    break;
                default:
                    console.log("nista");
            }
        }
    });
}
The AJAX function returns all the data needed to build the pages in JSON format; then I initialize my custom plugin, which builds the page using that JSON data.
All of that works perfectly fine as you can see on this LIVE EXAMPLE, including the browser history (back and forward).
But here is my problem... When the page is fully loaded, the main container remains empty when I look at the source of the page. It's also empty when I try to fetch the page as Googlebot, and I am pretty sure these two things are related.
As you can see in this example, with pretty much the same code as mine, the source changes when you click the links and the page content changes as well, but without reloading the page.
My question is: what am I missing here, and how do I achieve the same effect? Am I missing some PHP code or what? I have been struggling with this for the past few days; I've tried everything and couldn't make it work.
Note: only the home and project links are working for now...
Thanks a lot for all replies!
pushState lets you change the local part of the URI when you update the page contents with Ajax.
For every URI you create that way, allow the server to build the same page without any dependency on JavaScript.
This will:
Give better performance when visitors enter the site by following a deep link
Allow the site to work without JavaScript (including for search engine robots)
Complementing Quentin's answer: you need to identify in the PHP whether the content is being loaded via AJAX or not.
If it isn't, you have to serve the full content of the requested page, including the header, the footer and the page content.
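One common way to detect that: jQuery marks its same-origin AJAX requests with an X-Requested-With: XMLHttpRequest header (readable in PHP via $_SERVER['HTTP_X_REQUESTED_WITH']), so the server can return just the JSON fragment for AJAX and a full rendered page otherwise. A sketch of making that contract explicit in the question's loadContent call:
$.ajax({
    url: adresa,
    data: { url: url },
    // jQuery adds this header automatically for same-origin requests;
    // setting it explicitly documents the contract with the server
    headers: { "X-Requested-With": "XMLHttpRequest" },
    success: function(data) {
        // build the page from the JSON fragment as before
    }
});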

How to remove caching with javascript code?

I have a small problem with my recent project, built in HTML and JavaScript + jQuery only. I would like to prevent page caching, as I need to refresh some areas of the page at a set interval.
If I were reloading the page, I could set the "no-cache" META tag in the header. But I am not reloading the page, and though jQuery fetches XML files with AJAX, those files are getting cached and a memory overhead occurs. Because of this my Firefox crashes and memory usage climbs up to 2 GB.
Can anyone suggest something fruitful so that I can solve the memory overhead problem and run my application in the browser smoothly?
function refresh() {
    $('#table_info').remove();
    $('#table').hide();
    if (refreshTimer) {
        clearTimeout(refreshTimer);
        refreshTimer = null;
    }
    document.getElementById('refresh_topology').disabled = true;
    $('<div id="preload_xml"></div>')
        .html('<img src="pic/dataload.gif" alt="loading data" /><h3>Loading Data...</h3>')
        .prependTo($("#td_123"));
    $("#topo").hide();
    $('#root').remove();
    show_topology();
}
This is the code; show_topology() is called frequently to render the current status of the topology each time.
To disable the jQuery AJAX cache globally:
$.ajaxSetup({ cache: false });
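If only some requests should skip the cache, cache: false can also be passed per call; jQuery then appends a _=<timestamp> parameter to the URL so each poll gets a fresh response. A sketch with a hypothetical status.xml:
$.ajax({
    url: "status.xml",      // hypothetical polled resource
    cache: false,           // jQuery appends _=<timestamp> to bust the cache
    dataType: "xml",
    success: function(xml) {
        // update only the area of the page that changed
    }
});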

Running scripts in an ajax-loaded page fragment

My web app dynamically loads sections of its UI with jQuery.ajax. The new UI sections come with script, though. I'm loading them like this:
$.ajax({
    url: url,
    dataType: 'html',
    success: function(data, textStatus, XMLHttpRequest) {
        $(target_selector).html(data);
        update_ui_after_load();
    }
});
This almost works. The problem is that the scripts included in the dynamic part of the page run before the new page fragment is inserted into the DOM. But often these scripts want to modify the HTML they're being delivered with. My best hacky solution so far is just to delay the scripts some reasonable amount of time to let the DOM insertion happen, by wrapping them in a setTimeout:
window.setTimeout(function() {
    // process downloaded fragment
}, 300);
Obviously this is unreliable and hideous. What's a better way?
Using
$(function() { ... });
will make the function you pass to jQuery run after the fragment is inline on the page.
I found it in
ASP.NET Ajax partial postback and jQuery problem
after looking at your question.
Are you familiar with the live() function? It might be what you're looking for here.
http://api.jquery.com/live/
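Note that live() was deprecated in jQuery 1.7 and removed in 1.9; the delegated form of on() is the modern equivalent. A sketch with a hypothetical .tab-link selector:
// Delegated handler: survives fragments being replaced via $.ajax,
// because the listener lives on document, not on the replaced nodes
$(document).on("click", ".tab-link", function(e) {
    e.preventDefault();
    // re-run per-fragment initialization here
});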
The problem is that the scripts included in the dynamic part of the page run before the new page fragment is inserted into the DOM. But often these scripts want to modify the HTML they're being delivered with.
I'm fairly sure that in that case, the only sensible thing is to place the script after the HTML element.
Everything else would become kludgy quickly - I guess you could implement your own "ready" handler that gets executed after your HTML has been inserted, but that would be a lot of work to implement for no real gain.
I solved it by making a new simple ready handler system as follows...
var ajaxOnLoad = (function() {
    var ajaxOnLoad = {};
    var onLoadQueue = [];

    ajaxOnLoad.onLoad = function(fn) {
        onLoadQueue.push(fn);
    };

    ajaxOnLoad.fireOnLoad = function() {
        while (onLoadQueue.length > 0) {
            var fn = onLoadQueue.shift();
            fn();
        }
    };

    window.ajaxOnLoad = ajaxOnLoad;
    return ajaxOnLoad;
})();
So in the pages that get loaded via .ajax(), the scripts are queued to run with
ajaxOnLoad.onLoad(function() {
    // Stuff to do after the page fragment is inserted in the main DOM
});
and in the code which does the insertion, before the update_ui_after_load() call, run
ajaxOnLoad.fireOnLoad();
A more complete solution could parse the pages, find script tags, and queue them up automatically. But since I have complete control of the fragments being inserted, it's easier for me to switch to using ajaxOnLoad.onLoad.
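A sketch of that more complete variant, which strips inline scripts out of the fragment before insertion and evaluates them afterwards (external src scripts would still need separate handling):
function insertFragmentThenRunScripts(target_selector, html) {
    // Parse the fragment without executing anything (DOMParser scripts are inert)
    var doc = new DOMParser().parseFromString(html, "text/html");
    var scripts = [];
    jQuery(doc).find("script").each(function() {
        scripts.push(this.textContent);
        this.parentNode.removeChild(this);
    });
    // Insert the markup first...
    jQuery(target_selector).html(doc.body.innerHTML);
    // ...then evaluate the collected inline scripts in document order
    jQuery.each(scripts, function(i, code) {
        jQuery.globalEval(code);
    });
}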
