Best practice for script caching with JavaScript

Using:
C# MVC5 and jQuery
I have a filter screen that potentially uses several different filters. Based on what the user selects, I make a call to the server and load a partial view into a Bootstrap modal as follows:
$.ajax({
    url: filterUrl,
    contentType: 'application/html',
    success: function (filterContent) {
        $("#divReportFilterModalBody").html(filterContent);
        LoadFilterScript(SCOPESTRINGS[currentReport.Scope]);
    },
    ....
The next step is to load the JavaScript needed for that filter page, because you can't have scripts on a partial view. For this I also request the script from the server as follows:
$.getScript(scopeString + "FilterJavaScript",
    function () {
        // ...
    });
The MVC controller:
[OutputCache(NoStore = true, Duration = 0, VaryByParam = "*")]
public ActionResult ScopeFilterJavaScript()
{
    return File(
        System.IO.File.ReadAllBytes(Server.MapPath("~/Scripts/.../filterPartial.js")),
        "text/javascript");
}
Because the user can only use one filter at a time, and may or may not use multiple filters, my questions are:
The scripts aren't big. Is it better practice to load them all up front rather than fetch them as required? The reason I load them as required is that they might never get called, and I didn't want to load a bunch of scripts that won't get used.
Is not caching them a good idea, given that the user can use the same filter multiple times and, in my current setup, the script gets loaded each time? Or should I rather cache the script and figure out a way not to load it again?
I'm also not 100% clear on script caching. What happens to the script in this case after it was loaded? If I make another call to the server I can see that it gets loaded again; were the previous scripts removed? When I look at the script tab in Firebug they are all still listed there. Will this cause conflicts on the page?
What would best practice be in this scenario?
Thanks
Edit: I've been researching the topic a bit further and found an article on this (old, but still very relevant in my opinion).

It's always a good idea to only load stuff if you actually need it. Since the files aren't that huge, maybe you can combine them and include them up front instead.
OR should I rather cache the script and figure out a way not to load it again?
yup.
When you load a script (without any query string), the browser caches it. But that has nothing to do with what happens when you load the script again: either the server delivers it again or the browser uses the cached copy, and in both cases the script executes again. Even if you remove it from the DOM, once a script has been loaded it is just there.
Maybe you can wrap your scripts like so:
if (!window.foobarLoaded) {
    // your script content
    window.foobarLoaded = true;
}
Then you can load the script as many times as you like - it only "executes" once.
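Building on that, here is a minimal caller-side sketch that fetches each filter script at most once with $.getScript; the loadedFilterScripts map and the loadFilterScriptOnce name are just illustrative, not part of your existing code:
var loadedFilterScripts = {};

function loadFilterScriptOnce(scopeString, callback) {
    var url = scopeString + "FilterJavaScript";
    if (loadedFilterScripts[url]) {
        // Already fetched and executed once; just run the callback.
        if (callback) callback();
        return;
    }
    $.getScript(url, function () {
        loadedFilterScripts[url] = true;
        if (callback) callback();
    });
}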

Related

Force a route cleanly with javascript

I'm working with a legacy app's UI, and the path that links to this app is a default:
something/fldr
Whenever that page loads, it forces fldr/landing.asp. We want it to go to other.asp instead of landing.asp.
My approach for this is to use:
if (document.readyState === "interactive") {
    if (location.href == 'https://www.something.com/fldr') {
        location.href = "https://www.something.com/other.asp";
    }
}
Doing this causes a page stutter: landing.asp loads, shows for about two seconds, and then refreshes to the correct page.
Is there a standard method for doing something like this in JS or jQuery? I feel like there is a way to make the page hold off until the if statement executes rather than try to load the wrong page, but I can't for the life of me remember what it is. I've handled this on the back end by forcing the correct page to return in the API, but I still feel like this is something that can be resolved with only JS.
Note: The route names are made up since this is a stripped down problem of a legacy app.
JavaScript (when running in a browser) is a client-side technology.
That means it cannot run until the page has at least partially loaded after being served and sent to the user's browser (client). The browser begins loading resources and parsing scripts, and your script executes in the order it is parsed. This is, in fact, the delay you're experiencing.
While you may be able to tweak this so the location.href change executes somewhat earlier in that process, there is no way to avoid a partial page load prior to the client-side redirect you have implemented.
That said, there is a better way to do this, one which will reduce the redirect delay to the point of being imperceptible to the user.
Making this change at the web-server level is the ideal solution; but first, consider whether a redirect is even needed.
First, before implementing a redirect, I would suggest looking in the IIS settings to see whether there is a default document set to fldr/landing.asp; if so, you can simply change that setting to the document you need.
Here's an example of how to do this in IIS.
If there is no default document, or if some other code or application logic is forcing landing.asp to load, then set up a 301 Permanent Redirect for that URL on the web server.
Here are IIS docs on setting this up.
If for some reason the above options are unavailable to you (you don't have access to the web server, etc.), then the best you can do is ensure that the redirect script is the very first thing in the page, before any other scripts, stylesheets, etc., are loaded.
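As a rough sketch of that fallback, using the made-up URLs from the question: an inline script placed at the very top of the <head> in landing.asp, with location.replace so the abandoned page doesn't stay in the browser history.
<script>
    // Runs before any other resource is requested, so the wrong page
    // never gets a chance to render.
    if (location.pathname === "/fldr" || location.pathname === "/fldr/") {
        location.replace("https://www.something.com/other.asp");
    }
</script>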
Another hacky thing that might work is just replacing the entire content of landing.asp with other.asp and call it a day :)
That is a last resort of course, and hopefully you can just change the default document and that will handle it.

HTML content loaded via AJAX loads external JavaScript out of order

Here's the scenario, not sure what I'm missing.
Page A.htm makes an ajax request for page B.htm, and inserts the response into the page.
Page B.htm contains links to several other JS files, many of which contain a document.ready() function to initialize them.
This works fine when A.htm and B.htm are on the same server but not when they are on different servers.
What I think I'm seeing is that when pages A and B are on different servers (cross-domain AJAX), the external resources are returned asynchronously, or at least out of order, so scripts that expect jQuery UI to already be loaded execute before it is.
Appreciate any pointers or advice. Apologies for the poor explanation.
You are injecting HTML + script tags via jQuery. In this case *:
The HTML content, except for scripts, is injected into the document
Then all scripts are executed one by one
If a script is external, it is downloaded and executed asynchronously
Therefore an external or inline script that depends on jQuery UI might execute before jQuery UI has loaded.
One possible solution is to change the way your pages work:
Get rid of external scripts in pageb.html but keep inline scripts
Load the required scripts in pagea.html
Load pageb.html
Another solution is to roll your own jQuery helper (a rough sketch follows below) that will:
Strip all <script src> elements from HTML
Download and execute those scripts in order
Inject the remaining HTML
* The exact behavior is not documented. I had to look into the source code to infer the details.
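Here is a rough sketch of that second approach, assuming jQuery is already present on page A; the helper name injectHtmlWithOrderedScripts is made up for illustration:
function injectHtmlWithOrderedScripts(container, html) {
    // Parse the response without executing anything.
    var temp = document.createElement('div');
    temp.innerHTML = html;

    // Strip the external <script src> elements and remember their URLs.
    var sources = [];
    $(temp).find('script[src]').each(function () {
        sources.push(this.src);
        $(this).remove();
    });

    // Download and execute them strictly in order, then inject the rest
    // (inline scripts in the remaining HTML run at injection time).
    (function loadNext() {
        if (sources.length === 0) {
            $(container).html(temp.innerHTML);
            return;
        }
        $.getScript(sources.shift(), loadNext);
    })();
}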
You are correct in your impression that the issue is a difference in how the requests are handled cross-domain.
Here is a link to get you on the right track : How to make synchronous JSONP crossdomain call
However, you will have to re-architect your solution somewhat to check whether the resource has been loaded before moving on. There are many solutions (see the link).
You can set a timer interval and check for something in the DOM, or another reasonable solution (despite its lack of efficiency) is to create a "proxy" server-side file (e.g. PHP) on your server and have that file do the cross-domain request, then spit out the result.
Note that since jQuery UI is a rather large file, it's conceivable that the cross-domain request finishes first and executes immediately, even though jQuery UI is not loaded yet. In any case, you're going to have to start thinking about having your app react rather than follow a sequence.
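As a small sketch of the polling idea mentioned above (the 50 ms interval and the initPage callback are just placeholders for whatever page B's scripts need to do):
var poll = setInterval(function () {
    // Wait until jQuery UI has actually arrived before initializing widgets.
    if (window.jQuery && jQuery.ui) {
        clearInterval(poll);
        initPage(); // placeholder for page B's initialization code
    }
}, 50);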

Insert JavaScript into WebView before pageload

I have a little problem inserting JavaScript into my WebView before the page is loaded.
The reason I want to insert JavaScript is that I can't pass JSON to Java via a JavascriptInterface. I don't want to use strings or JSON.stringify while developing the web apps, so my approach was to add some JavaScript inside the WebViewClient.onPageStarted() method.
simplified example:
var JSWrapper = {
    callJavascriptInterface: function (str, obj) {
        if (typeof obj === 'object') {
            JavascriptInterface.function(str, JSON.stringify(obj));
        } else {
            JavascriptInterface.function(str);
        }
    }
};
Inside onPageStarted I also unregister sensors and remove old callbacks I registered via JavascriptInterfaces, because they shouldn't stay enabled once the site has changed.
That works when I change the page via a link or something similar, but if I use window.location.reload() I end up with errors (JSWrapper not defined).
Does someone have an idea how to solve this problem or maybe even a better approach for turning off the sensors, removing callbacks and the JSON.stringify?
onPageStarted is not a good place for doing anything with the page. It is called right after the moment when the first reply has been obtained from the server, and may happen before the page has been actually processed by the rendering engine. Thus, depending on the speed of page loading, you might end up interacting with the previously loaded page, which will be discarded (together with your changes) soon.
Another point to keep in mind is that changes to Java interfaces injected / removed via add/removeJavascriptInterface only affect the next loaded page. That is, after you, say, added a new Java interface, you must do a page (re)load in order to make this interface actually available to the page. If you try to change injected interfaces configuration from onPageStarted, this again may or may not affect the page you are loading.
So if you want to do a cleanup, or prepare for navigating to a new page, it's better to use shouldOverrideUrlLoading, because it is called after the page has decided to navigate, but before any loading has started (just remember to return false from shouldOverrideUrlLoading so loading proceeds as usual).
If you want to modify freshly loaded page, e.g. insert your JavaScript code, use onPageFinished -- it is called after the resources of the new page have been loaded and parsed. But remember that it's too late to inject Java interfaces at this point, as they will only affect next loaded page.
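On the page side, a defensive sketch like the following can at least avoid the "JSWrapper not defined" errors while the wrapper has not been injected yet; safeCallInterface is a made-up name, and JavascriptInterface.function is the placeholder from the question:
function safeCallInterface(str, obj) {
    if (window.JSWrapper) {
        JSWrapper.callJavascriptInterface(str, obj);
    } else if (window.JavascriptInterface) {
        // Wrapper not injected yet (e.g. right after window.location.reload()):
        // fall back to stringifying manually.
        JavascriptInterface.function(str, typeof obj === 'object' ? JSON.stringify(obj) : obj);
    }
}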

Reload file.js every minute

I read (somewhere else on this site) that you can't reload (or inject) JavaScript on a page that has already been rendered.
Is there any other way of doing this. For instance an iFrame?
I have a recent comment widget.js and I need to constantly get it to reload without reloading the whole page.
Any ideas?
Edit: The site has recent comments on it and they are displayed via a recentcomment.js file.
Once the page is loaded it doesn't update itself unless you reload the page. I want it to update itself; a way to do this is to just reload the JS file on the page, correct?
Why do you need to do this? It seems to me that there's probably a more appropriate solution to your problem.
But to answer it:
var elm = document.createElement("script");
elm.src = "Widget.js";
document.getElementsByTagName("head")[0].appendChild(elm);
Hope I didn't write any mistakes...
Rather than reloading the file, you can put all of the file's logic inside a function and then call that function every minute using setInterval() (or a repeated setTimeout()).
Alternatively, if you want to reload it because the content of the file might have changed, it would probably be better to move that part of the code out to an external file and then use a function (running every minute) to load the new content you need (see the sketch below).
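A minimal sketch of that approach, assuming the widget file is the recentcomment.js mentioned in the question; the query-string parameter is only there to bypass the browser cache:
setInterval(function () {
    var elm = document.createElement("script");
    // Cache-busting query string so the browser fetches a fresh copy.
    elm.src = "recentcomment.js?ts=" + new Date().getTime();
    document.getElementsByTagName("head")[0].appendChild(elm);
}, 60 * 1000);
Keep in mind that the script re-executes on every reload, so it needs to tolerate running more than once.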
You can make cross-domain AJAX requests using certain methods, so you could make an AJAX request for the script file and parse it yourself. Parsing it yourself probably isn't an optimal solution, but it looks like you're dealing with a brain-dead service provider anyways.
Look at this guy's jQuery mod for an example:
http://james.padolsey.com/javascript/cross-domain-requests-with-jquery/
Once you get the data from the 3rd party, you could probably use some combination of regex and JSON parser to extract the comments.

Best practices managing JavaScript on a single-page app

With a single-page app, where I change the hash and load and swap only the content area of the page, I'm trying to decide how to manage the JavaScript that each "page" might need.
I've already got a History module monitoring the location hash which could look like domain.com/#/company/about, and a Page class that will use XHR to get the content and insert it into the content area.
function onHashChange(hash) {
    var skipCache = false;
    if (hash in noCacheList) {
        skipCache = true;
    }
    new Page(hash, skipCache).insert();
}

// Page.js
var _pageCache = {};

function Page(url, skipCache) {
    if (!skipCache && (url in _pageCache)) {
        return _pageCache[url];
    }
    this.url = url;
    this.load();
}
The cache should let pages that have already been loaded skip the XHR. I also store the content in a documentFragment and pull the current content out of the document when I insert the new Page, so the browser only has to build the DOM for the fragment once.
Skipping the cache could be desired if the page has time sensitive data.
Here's what I need help deciding on: It's very likely that any of the pages that get loaded will have some of their own JavaScript to control the page. Like if the page will use Tabs, needs a slide show, has some sort of animation, has an ajax form, or what-have-you.
What exactly is the best way to go about loading that JavaScript into the page? Include the script tags in the documentFragment I get back from the XHR? What if I need to skip the cache and re-download the fragment? I feel the exact same JavaScript being evaluated a second time might cause conflicts, like redeclaring the same variables.
Would the better way be to attach the scripts to the head when grabbing the new Page? That would require the original page to know all the assets that every other page might need.
And besides knowing the best way to include everything, won't I need to worry about memory management and possible leaks from loading so many different bits of JavaScript into a single page instance?
If I understand the case correctly, you are trying to take a site that currently has pages already made for normal navigation, and you want to pull them down via ajax, to save yourself the page-reload?
Then, when this happens, you need to not reload the script tags for those pages, unless they're not loaded onto the page already?
If that is the case, you could try to grab all the script tags from the response before inserting the new HTML into the DOM:
//first set up a cache of urls you already have loaded.
var loadedScripts = [];

//after user has triggered the ajax call, and you've received the text-response
function clearLoadedScripts(response) {
    var womb = document.createElement('div');
    womb.innerHTML = response;

    var scripts = womb.getElementsByTagName('script');
    var script, i = scripts.length;
    while (i--) {
        script = scripts[i];
        if (loadedScripts.indexOf(script.src) !== -1) {
            script.parentNode.removeChild(script);
        } else {
            loadedScripts.push(script.src);
        }
    }

    //then do whatever you want with the contents.. something like:
    document.body.innerHTML = womb.getElementsByTagName('body')[0].innerHTML;
}
Oh boy are you in luck. I just did all of this research for my own project.
1: The hash event / manager you should be using is Ben Alman's BBQ:
http://benalman.com/projects/jquery-bbq-plugin/
2: To make search engines love you, you need to follow this very clear set of rules:
http://code.google.com/web/ajaxcrawling/docs/specification.html
I found this late in the game and had to scrap a lot of my code. It sounds like you're going to have to scrap some too, but you'll get a lot more out of it as a consequence.
Good luck!
I have never built such a site, so I don't know if this is best practice, but I would put some sort of control information (like a comment or an HTTP header) in the response, and let the loader script handle redundancy/dependency checking and adding the script tags to the header.
Do you have control over those pages being loaded? If not, I would recommend inserting the loaded page in an IFrame.
Taking the page scripts out of their context and inserting them in the head or adding them to another HTML element may cause problems unless you know exactly how the page is built.
If you have full control of the pages being loaded, I would recommend that you convert all your HTML to JS. It may sound strange, but an HTML-to-JS converter is actually not that far away. You could start off with Pure JavaScript HTML Parser and then have the parser output JS code that builds the DOM, using jQuery for example.
I was actually about to go down that road a while ago on a web app I started working on, but then I handed it over to a contractor who converted all my pure-JS pages into HTML + jQuery; whatever makes his daily work productive, I don't care, but I was really into that pure-JS web app approach and will definitely try it.
To me it sounds like you are creating a single-page app from the start (i.e. not re-factoring an existing site).
Several options I can think of:
Let the server control which script tags are included: pass a list of already-loaded scripts with the XHR request and have the server sort out which additional scripts need to be loaded.
Load all scripts beforehand (perhaps add them to the DOM after the page has loaded, to save time) and then forget about it. For scripts that need to initialize UI, just have each requested page include a script tag that calls a global init function with the page name.
Have each requested page call a JS function that deals with loading/caching scripts. This function would be accessible from the global scope and would look like this: require_scripts('page_1_init', 'form_code', 'login_code'). The function would keep a list of loaded scripts and only append DOM script tags for scripts that haven't been loaded yet (see the sketch after this list).
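A rough sketch of that third option; the global cache, the /scripts/ URL layout and the function body are assumptions, only the require_scripts call shape comes from the answer:
var _loadedScripts = {};

function require_scripts() {
    var head = document.getElementsByTagName('head')[0];
    for (var i = 0; i < arguments.length; i++) {
        var name = arguments[i];
        if (_loadedScripts[name]) {
            continue; // already appended on an earlier page request
        }
        var s = document.createElement('script');
        s.src = '/scripts/' + name + '.js'; // assumed URL layout
        head.appendChild(s);
        _loadedScripts[name] = true;
    }
}

// As in the answer:
require_scripts('page_1_init', 'form_code', 'login_code');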
You could use a script loader like YUI Loader, LAB.js or others like Jaf.
Jaf provides you with a mechanism to load views (HTML snippets) and their respective JS and CSS files to create single-page apps. Check out the sample todo-list app. Although it's not complete, there are still a lot of useful libraries you can use.
Personally, I would transmit JSON instead of raw HTML:
{
    "title": "About",
    "requires": ["navigation", "maps"],
    "content": "<div id=…"
}
This lets you send metadata, like an array of required scripts, along with the content. You'd then use a script loader, like one of the ones mentioned above, or your own, to check which ones are already loaded and pull down the ones that aren't (inserting them into the <head>) before rendering the page.
Instead of including scripts inline for page-specific logic, I'd use pre-determined classes, ids, and attributes on elements that need special handling. You can fire an "onrender" event or let each piece of logic register an on-render callback that your page loader will call after a page is rendered or loaded for the first time.
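A sketch of how that flow could look; renderPage, loadScript, the /scripts/ URL layout and the #content container are all illustrative assumptions built around the JSON shape shown above:
var loadedPageScripts = {};
var onRenderCallbacks = [];

function loadScript(name, done) {
    if (loadedPageScripts[name]) { return done(); }
    $.getScript('/scripts/' + name + '.js', function () { // assumed URL layout
        loadedPageScripts[name] = true;
        done();
    });
}

function renderPage(page) { // page = the JSON payload shown above
    var pending = page.requires.length;
    if (pending === 0) { return insert(); }
    page.requires.forEach(function (name) {
        loadScript(name, function () {
            if (--pending === 0) { insert(); }
        });
    });

    function insert() {
        document.title = page.title;
        $('#content').html(page.content);
        // Let page-specific logic hook in after the content is rendered.
        onRenderCallbacks.forEach(function (cb) { cb(page); });
    }
}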
