I have 5 HTML pages and a JavaScript function DoInitialConfiguration() in a JavaScript file. A user can open any of the five pages, and irrespective of which page is opened first, I want to call this function on that first page access. I also want to remember that the function has already been called once, and not call it again when the other pages load. I only have these 5 HTML pages and the JavaScript file that contains the function. I own the JavaScript file, but I can only make limited changes to the HTML pages (which I don't own), such as loading the JavaScript file and calling DoInitialConfiguration().
Since the JavaScript file will remain in the browser cache, is there a way to remember that the function has been called once, using a variable in the JS file? It is OK for DoInitialConfiguration() to run again if the page is reloaded after clearing the browser cache.
How can this functionality be achieved?
If your 5 pages are hosted under the same site (which will probably be the case), you can use localStorage to set a key that records whether your script has already run.
if (localStorage.getItem("firstRun") != null) {
    // second run+ code goes here
} else {
    localStorage.setItem("firstRun", "ohyes");
    // first run code goes here
}
You can possibly use localStorage for this. Once your code executes, set a localStorage entry, i.e. localStorage.setItem(<key>, <value>), and at the start of the function check whether that entry has been set, i.e. localStorage.getItem(<key>). If it is set, do not execute the code.
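Applied to the original question, a minimal sketch of how the guard could live inside the shared JavaScript file itself (assuming all five pages are served from the same origin; the key name is arbitrary):

function DoInitialConfiguration() {
    // "doInitialConfigDone" is just an illustrative key name
    if (localStorage.getItem("doInitialConfigDone") !== null) {
        return; // already ran once for this origin
    }
    localStorage.setItem("doInitialConfigDone", "yes");
    // ... one-time configuration goes here ...
}

One caveat: localStorage survives a plain cache clear, so the function would not run again unless site data is cleared as well.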
It would be good to understand your setup and use case better.
If I understand you correctly, you have 5 separate HTML pages (so you are not running a Single Page Application [SPA]). In that case, what you want to do is impossible through the browser and cache memory alone. If you want to remember settings, you need to save them using localStorage or cookies (as other answers have suggested). Since they are 5 different HTML pages, what does the JS do that makes you not want to re-run it on a second page load?
Here's the scenario, not sure what I'm missing.
Page A.htm makes an ajax request for page B.htm, and inserts the response into the page.
Page B.htm contains links to several other JS files, many of which contain a document.ready() function to initialize them.
This works fine when A.htm and B.htm are on the same server but not when they are on different servers.
What I think I'm seeing here is that when pages A and B are on different servers (cross-domain ajax), the external resources are being returned asynchronously, or at least out of order, so scripts execute expecting jQuery UI to be loaded already when it is not.
Appreciate any pointers or advice. Apologies for the poor explanation.
You are injecting HTML + script tags via jQuery. In this case*:
HTML content except scripts is injected into the document
Then all scripts are executed one by one
If a script is external, it is downloaded and executed asynchronously
Therefore an external or inline script that depends on jQuery UI might execute before jQuery UI.
One possible solution is to change the way your pages work:
Get rid of external scripts in pageb.html but keep inline scripts
Load the required scripts in pagea.html
Load pageb.html
Another solution is to roll your own jQuery helper (a rough sketch follows the footnote below) that will:
Strip all <script src> elements from HTML
Download and execute those scripts in order
Inject the remaining HTML
* The exact behavior is not documented. I had to look into the source code to infer the details.
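A possible sketch of that second approach, assuming jQuery is available (the helper name and signature are illustrative, not part of any real plugin):

// Illustrative: load external scripts in order, then inject the remaining HTML.
function loadFragment(html, container, done) {
    var wrapper = $('<div>').append($.parseHTML(html, document, true));
    var urls = [];
    wrapper.find('script[src]').each(function () {
        urls.push(this.src);   // remember external scripts...
        $(this).remove();      // ...and strip them from the fragment
    });
    (function next() {
        if (urls.length === 0) {
            container.html(wrapper.html()); // inline scripts run here, after their dependencies
            if (done) done();
            return;
        }
        $.getScript(urls.shift(), next);    // download and execute one at a time
    })();
}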
You are correct in your impression that the issue is a difference in how the requests are handled cross-domain.
Here is a link to get you on the right track : How to make synchronous JSONP crossdomain call
However, you will have to re-architect your solution somewhat so that it checks whether the resource has been loaded before moving on. There are many solutions (see the link).
You can set a timer interval and check for something in the DOM, or another reasonable solution (despite its lack of efficiency) is to create a server-side "proxy" file (e.g. PHP) on your server and have that file make the cross-domain request, then spit out the result.
Note that since jQuery UI is a rather large file, it's conceivable that the cross-domain request finishes first and executes immediately, even though jQuery UI is not loaded yet. In any case, you're going to have to start thinking about having your app react rather than follow a sequence.
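For instance, a minimal polling sketch along the lines of the timer approach above (the interval and the dependency check are illustrative and would need adjusting):

// Illustrative: defer dependent code until jQuery UI has actually loaded.
function whenUiReady(callback) {
    var timer = setInterval(function () {
        if (window.jQuery && jQuery.ui) {   // the dependency we are waiting for
            clearInterval(timer);
            callback();
        }
    }, 50);
}

whenUiReady(function () {
    // code that needs jQuery UI, e.g. $('#tabs').tabs();
});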
Currently I have a website where each page on the site has a corresponding JS file that contains a class (well, function) defined in it. When the user navigates to another page (which is just ajax content), I remove the old script tag from the page and add a new script tag for the new page. This works, though I was wondering if there could be any problems with it, or better ways to do it. I've been thinking of using an AJAX call (XMLHttpRequest) to get the new JS file and then using eval to initiate the new page.
What you are doing makes sense, because the browser can intelligently cache the JS.
Fetching the JS yourself might prevent the browser from caching it.
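For reference, a small sketch of the swap-the-script-tag approach described in the question (the element id and path convention are made up for illustration); since each page keeps a stable src URL, repeat visits can be served from the browser cache:

// Illustrative: replace the per-page script when navigating.
function loadPageScript(pageName) {
    var old = document.getElementById('page-script');
    if (old) {
        old.parentNode.removeChild(old);
    }
    var script = document.createElement('script');
    script.id = 'page-script';
    script.src = '/js/pages/' + pageName + '.js'; // hypothetical path convention
    document.head.appendChild(script);
}

Note that removing the old script element does not undo anything it has already executed; any globals it defined stay around until they are overwritten.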
I have a little problem while inserting JavaScript into my WebView before the page is loaded.
The reason why I want to insert JavaScript is that I can't pass JSON to Java via a JavascriptInterface. I don't want to use strings or JSON.stringify while developing the web apps, so my approach was to add some JavaScript inside the WebViewClient.onPageStarted() method.
A simplified example:
var JSWrapper = {
    callJavascriptInterface: function (str, obj) {
        if (typeof obj === 'object') {
            JavascriptInterface.function(str, JSON.stringify(obj));
        } else {
            JavascriptInterface.function(str);
        }
    }
};
Inside onPageStarted I also unregister sensors and remove old callbacks that I registered via JavascriptInterfaces, because they should no longer be enabled once the site has changed.
That works when I change the site via a link or something similar, but if I use window.location.reload() I end up with errors (JSWrapper is not defined).
Does someone have an idea how to solve this problem, or maybe even a better approach for turning off the sensors, removing the callbacks, and avoiding JSON.stringify?
onPageStarted is not a good place for doing anything with the page. It is called right after the moment when the first reply has been obtained from the server, and may happen before the page has been actually processed by the rendering engine. Thus, depending on the speed of page loading, you might end up interacting with the previously loaded page, which will be discarded (together with your changes) soon.
Another point to keep in mind is that changes to Java interfaces injected / removed via add/removeJavascriptInterface only affect the next loaded page. That is, after you, say, added a new Java interface, you must do a page (re)load in order to make this interface actually available to the page. If you try to change injected interfaces configuration from onPageStarted, this again may or may not affect the page you are loading.
So if you want to do a cleanup, or prepare for navigating to a new page, it's better to use shouldOverrideUrlLoading, because it is called after the page has decided to navigate, but before any loading has started (just remember to return false from shouldOverrideUrlLoading so loading proceeds as usual).
If you want to modify freshly loaded page, e.g. insert your JavaScript code, use onPageFinished -- it is called after the resources of the new page have been loaded and parsed. But remember that it's too late to inject Java interfaces at this point, as they will only affect next loaded page.
With a single page app, where I change the hash and load and change only the content of the page, I'm trying to decide on how to manage the JavaScript that each "page" might need.
I've already got a History module monitoring the location hash which could look like domain.com/#/company/about, and a Page class that will use XHR to get the content and insert it into the content area.
function onHashChange(hash) {
    var skipCache = false;
    if (hash in noCacheList) {
        skipCache = true;
    }
    new Page(hash, skipCache).insert();
}

// Page.js
var _pageCache = {};

function Page(url, skipCache) {
    if (!skipCache && (url in _pageCache)) {
        return _pageCache[url];
    }
    this.url = url;
    this.load();
}
The cache should let pages that have already been loaded skip the XHR. I am also storing the content in a documentFragment, and then pulling the current content out of the document when I insert the new Page, so the browser will only have to build the DOM for the fragment once.
Skipping the cache could be desired if the page has time sensitive data.
Here's what I need help deciding on: It's very likely that any of the pages that get loaded will have some of their own JavaScript to control the page. Like if the page will use Tabs, needs a slide show, has some sort of animation, has an ajax form, or what-have-you.
What exactly is the best way to go about loading that JavaScript into the page? Include the script tags in the documentFragment I get back from the XHR? What if I need to skip the cache and re-download the fragment? I feel that the exact same JavaScript being executed a second time might cause conflicts, like redeclaring the same variables.
Would the better way be to attach the scripts to the head when grabbing the new Page? That would require the original page to know all the assets that every other page might need.
And besides knowing the best way to include everything, won't I need to worry about memory management, and possible leaks from loading so many different JavaScript bits into a single page instance?
If I understand the case correctly, you are trying to take a site that currently has pages already built for normal navigation, and you want to pull them down via ajax to save yourself the page reload?
Then, when this happens, you need to avoid reloading the script tags for those pages, unless they are not already loaded onto the page?
If that is the case, you could try to grab all the script tags from the response before inserting the new HTML into the DOM:
//first set up a cache of urls you already have loaded.
var loadedScripts = [];

//after the user has triggered the ajax call, and you've received the text response
function clearLoadedScripts(response) {
    var womb = document.createElement('div');
    womb.innerHTML = response;

    var scripts = womb.getElementsByTagName('script');
    var script, i = scripts.length;
    while (i--) {
        script = scripts[i];
        if (loadedScripts.indexOf(script.src) !== -1) {
            // already loaded once, drop it so it won't run again
            script.parentNode.removeChild(script);
        } else {
            loadedScripts.push(script.src);
        }
    }

    //then do whatever you want with the contents.. something like:
    document.body.innerHTML = womb.innerHTML;
}
Oh boy are you in luck. I just did all of this research for my own project.
1: The hash event / manager you should be using is Ben Alman's BBQ:
http://benalman.com/projects/jquery-bbq-plugin/
2: To make search engines love you, you need to follow this very clear set of rules:
http://code.google.com/web/ajaxcrawling/docs/specification.html
I found this late in the game and had to scrap a lot of my code. It sounds like you're going to have to scrap some too, but you'll get a lot more out of it as a consequence.
Good luck!
I have never built such a site, so I don't know if this is best practice, but I would put some sort of control information (like a comment or an HTTP header) in the response, and let the loader script handle redundancy/dependency checking and add the script tags to the head.
Do you have control over those pages being loaded? If not, I would recommend inserting the loaded page in an IFrame.
Taking the page scripts out of their context and inserting them into the head, or adding them to another HTML element, may cause problems unless you know exactly how the page is built.
If you have full control of the pages being loaded, I would recommend that you convert all your HTML to JS. It may sound strange, but an HTML-to-JS converter is actually not that far away. You could start off with Pure JavaScript HTML Parser and then let the parser output JS code that builds the DOM, using jQuery for example.
I was actually about to go down that road a while ago on a web app that I started working on, but then I handed it over to a contractor who converted all my pure JS pages into HTML + jQuery. Whatever makes his daily work productive, I don't care, but I was really into that pure-JS web app approach and will definitely try it.
To me it sounds like you are creating a single-page app from the start (i.e. not re-factoring an existing site).
Several options I can think of:
Let the server control which script tags are included. Pass a list of already-loaded script tags with the XHR request and have the server sort out which additional scripts need to be loaded.
Load all scripts beforehand (perhaps add them to the DOM after the page has loaded to save time) and then forget about them. For scripts that need to initialize UI, just have each requested page include a script tag that calls a global init function with the page name.
Have each requested page call a JS function that deals with loading/caching scripts. This function would be accessible from the global scope and would look like this: require_scripts('page_1_init', 'form_code', 'login_code'). Then just have the function keep a list of loaded scripts and only append DOM script tags for scripts that haven't been loaded yet (see the sketch below).
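A sketch of what such a require_scripts helper might look like (the function name and argument style are taken from the example above; the URL convention is a made-up assumption):

// Illustrative loader that appends each script tag only once.
var loadedScripts = {};

function require_scripts(/* script names */) {
    for (var i = 0; i < arguments.length; i++) {
        var name = arguments[i];
        if (loadedScripts[name]) {
            continue; // already on the page, skip it
        }
        loadedScripts[name] = true;
        var script = document.createElement('script');
        script.src = '/js/' + name + '.js'; // hypothetical URL convention
        document.head.appendChild(script);
    }
}

// Usage, as in the example above:
require_scripts('page_1_init', 'form_code', 'login_code');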
You could use a script loader like YUI Loader, LAB.js, or others like Jaf.
Jaf provides you with a mechanism for loading views (HTML snippets) and their respective JS and CSS files to create single-page apps. Check out the sample todo list app. Although it's not complete, there are still a lot of useful libraries you can use.
Personally, I would transmit JSON instead of raw HTML:
{
    "title": "About",
    "requires": ["navigation", "maps"],
    "content": "<div id=…"
}
This lets you send metadata, like an array of required scripts, along with the content. You'd then use a script loader, like one of the ones mentioned above, or your own, to check which ones are already loaded and pull down the ones that aren't (inserting them into the <head>) before rendering the page.
Instead of including scripts inline for page-specific logic, I'd use pre-determined classes, ids, and attributes on elements that need special handling. You can fire an "onrender" event or let each piece of logic register an on-render callback that your page loader will call after a page is rendered or loaded for the first time.
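A rough sketch of how the client side might consume a response of that shape (loadScriptOnce, the '/js/' path convention, and the #content element are placeholders, not part of any particular library):

// Illustrative: render a JSON page payload of the shape shown above.
var loadedModules = {};

function loadScriptOnce(name, done) {
    if (loadedModules[name]) { done(); return; }
    var script = document.createElement('script');
    script.src = '/js/' + name + '.js'; // hypothetical path convention
    script.onload = function () {
        loadedModules[name] = true;
        done();
    };
    document.head.appendChild(script);
}

function renderPage(page) {
    var pending = page.requires.length;
    function maybeRender() {
        if (pending > 0) { return; }
        document.title = page.title;
        document.getElementById('content').innerHTML = page.content;
        // let page-specific logic hook in once the content is in the DOM
        document.dispatchEvent(new CustomEvent('onrender', { detail: page }));
    }
    if (pending === 0) { maybeRender(); return; }
    page.requires.forEach(function (name) {
        loadScriptOnce(name, function () {
            pending--;
            maybeRender();
        });
    });
}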