Reload file.js every minute - javascript

I read (somewhere else on this site) you can't reload (or inject javascript) onto a page that is already rendered.
Is there any other way of doing this, for instance an iFrame?
I have a recent comment widget.js and I need to constantly get it to reload without reloading the whole page.
Any ideas?
edit: The site has recent comments on it and they are displayed via a recentcomment.js.
Once the page is loaded it doesn't update itself unless you reload the whole page. I want it to update itself; a way to do this would be to just reload the js file on the page, correct?

Why do you need to do this? It seems to me that there's probably a more appropriate solution to your problem.
But to answer it:
var elm = document.createElement("script");
elm.src = "Widget.js";
document.getElementsByTagName("head")[0].appendChild(elm);
Hope I didn't write any mistakes...
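To actually refresh it every minute, here is a minimal sketch building on that snippet; the one-minute interval and the query-string cache-buster are assumptions, not part of the question:
// Re-append the script every 60 seconds; the timestamp query string is
// assumed to be enough to bypass the browser cache for Widget.js.
setInterval(function () {
    var elm = document.createElement("script");
    elm.src = "Widget.js?_=" + new Date().getTime();
    document.getElementsByTagName("head")[0].appendChild(elm);
}, 60 * 1000);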

Rather than reloading the file, you can put the file's implementation inside a function and then call that function every minute using setInterval() (or a recursive setTimeout()).
Alternatively, if you want to reload it because the content of the file might have changed, it would probably be better to move that part of the code out to some external file and then use a function (running every minute on the same timer) to load the new content you need.
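For instance, a rough sketch of that approach, where renderRecentComments is a hypothetical name for whatever the widget currently does on load:
// Wrap the widget's rendering in a function and re-run it every minute.
function refreshComments() {
    renderRecentComments();                  // whatever the widget does on load
    setTimeout(refreshComments, 60 * 1000);  // schedule the next run
}
refreshComments();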

You can make cross-domain AJAX requests using certain methods, so you could make an AJAX request for the script file and parse it yourself. Parsing it yourself probably isn't an optimal solution, but it looks like you're dealing with a brain-dead service provider anyways.
Look at this guy's jQuery mod for an example:
http://james.padolsey.com/javascript/cross-domain-requests-with-jquery/
Once you get the data from the 3rd party, you could probably use some combination of regex and JSON parser to extract the comments.
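A hedged sketch of that fetch-and-parse idea; the URL, the proxy assumption, and the regex are all placeholders:
// Assumes the widget script can be fetched as plain text (e.g. through a
// CORS-enabled proxy) and embeds its comments as a JSON array.
$.get('https://thirdparty.example/recentcomment.js', function (source) {
    var match = source.match(/\[\{[\s\S]*?\}\]/); // naive: first JSON-looking array
    if (match) {
        var comments = JSON.parse(match[0]);
        // ...render the comments into the page yourself...
    }
}, 'text');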

Related

How to handle the content part in AJAX page switching in PWA?

I have zero experience in native apps, which might help with this question.
Since the service worker caches everything so nicely, I don't see any reason why I should render the entire webpage again when the page gets switched (a link gets clicked). So I will switch only the content, use history pushState to change the URL, and change the title. I have that part figured out.
Problem is, I cannot find any resources that would support either of the two content load ideas I have:
Load center content via AJAX with HTML.
Load center content as data only and render the HTML on-the-fly in JS.
The first method would be fairly straightforward, but would mean that the payload would be bigger.
The second seems much more advanced, but would mean that HTML templates have to be in the JS somehow already? I also have a feeling that there is a method somewhere in here that would allow opening the heavily cached page (let's say the article page) and replacing the (text) contents. But as I said, I cannot find any resources to weigh the pros and cons or give any reliable information on PWA AJAX page switching.
Any credible information on this matter would be much appreciated.
EDIT
I have kept reading and researching on this matter, but sadly there is no clear indication of how to handle dynamic content over AJAX: whether I should parse the JSON data from AJAX into HTML in JS, or send it already as HTML from the backend.
To add in favour of the second option: I have figured out that my theory had some weight to it, if I use pure.js to pull an HTML template from a hidden template tag and generate the HTML on the fly from JSON over AJAX.
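A rough illustration of that hidden-template idea; the element IDs, the fetch URL, and the field names are assumptions, and a plain <template> element stands in for pure.js:
// Fetch the article as JSON and stamp it into a hidden <template id="article-tpl">.
fetch('/api/article/42')
    .then(function (res) { return res.json(); })
    .then(function (data) {
        var tpl = document.getElementById('article-tpl').content.cloneNode(true);
        tpl.querySelector('.title').textContent = data.title;
        tpl.querySelector('.body').innerHTML = data.body;
        var main = document.getElementById('content');
        main.innerHTML = '';
        main.appendChild(tpl);
        history.pushState(null, data.title, data.url);  // switch URL without reload
        document.title = data.title;
    });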
You make it so complicated. Can we take a look at your code, please?
If you mean retrieving data from the database via AJAX, then all you need is jQuery:
$(document).ready(function() {
    // Send the value of #contentData1 to the server (assuming it is an input
    // field) and put the response into #container.
    var contentData1 = document.getElementById('contentData1').value;
    $.post("pathToPHP.php", { contentData1: contentData1 }, function(data) {
        $("#container").html(data);
    });
});
and the pathToPHP.php file should retrieve the data you want
echo "";

Force a route cleanly with javascript

I'm working with a legacy app's UI and the path that links to this app is a default:
something/fldr
Whenever that page loads it forces a fldr/landing.asp page. We want to get it to go to other.asp instead of landing.
My approach for this is to use:
if (document.readyState === "interactive") {
    if (location.href === 'https://www.something.com/fldr') {
        location.href = "https://www.something.com/other.asp";
    }
}
Doing this causes a page stutter, where the landing.asp loads, shows for like 2 seconds and then refreshes to the correct page.
Is there a standard method for doing something like this in JS or jQuery? I feel like there is a way to make the page hold off until the if statement executes rather than trying to load the wrong page, but I can't for the life of me remember what it is. I've handled this on the back end by forcing the correct page to return in the API, but I still feel like this is something that can be resolved with only JS.
Note: The route names are made up since this is a stripped down problem of a legacy app.
JavaScript (when running in a browser) is a client-side technology.
That means it cannot run until the page has been served to the user's browser (client) and has at least partially loaded. The browser begins loading resources and parsing scripts, and your script executes in the order it is parsed. This is, in fact, the delay you're experiencing.
While you may possibly tweak this to make the location.href change execute earlier in that process, there is no way to avoid a partial page load prior to the client-side redirect you have implemented.
That said, there is a better way to do this, one which will reduce the redirect delay so that it is imperceptible to a user.
Making this change at the web-server level is the ideal solution; however, first consider, is that even needed?
First, before implementing a redirect, I would suggest looking in the IIS settings to see if there is a default document set to fldr/landing.asp; you can then just change that setting to point the default document at the page you need.
Here's an example of how to do this in IIS.
If there is not a default document or if there is some other code or application logic that is forcing landing.asp to load, then you would set up a 301 Permanent Redirect for that URL on the web server.
Here are IIS docs on setting this up.
If for some reason the above options are unavailable to you (you don't have access to the web server, etc.), then the best you can do is ensure that the script is the first thing in the page, before any other scripts, stylesheets, etc. are loaded.
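For example, a minimal version of that first-in-the-page script could look like the sketch below; the URLs are taken from the question, and it is assumed to be the very first <script> in landing.asp's <head>:
// Placed before any stylesheets or other scripts; location.replace() also
// keeps landing.asp out of the back-button history.
if (location.href === 'https://www.something.com/fldr') {
    location.replace('https://www.something.com/other.asp');
}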
Another hacky thing that might work is just replacing the entire content of landing.asp with that of other.asp and calling it a day :)
That is a last resort of course, and hopefully you can just change the default document and that will handle it.

Run/inject javascript on page to get the html and post it to a URL

Before, I used to just go to "View source" in the browser, grab all the HTML, and paste it into a form on my page. But after delayed loading of some of the content via ajax was implemented, I can't do this anymore.
It was not a problem doing it the old way... but that does not work any more, since I'm missing important information.
Is it possible to somehow run JavaScript in the browser, like from a bookmark shortcut or something like that, so I can grab all the HTML (or better yet, filter some of the data) and then post it back to my site?
I have no idea what this is called or if it's even possible.
I guess a browser extension could do this, but making one for every browser would be a pain, so it would be nice if this could be done with JavaScript.
All ideas are welcome.
If you are using jquery, you could just use ajax and send the html of the body (or whatever area of the page you want) to your server.
$.post('url-to-send.ext', { data: $('body').html() });
So, after a lot of searching... I finally found the answer to my own question.
Bookmarklets: http://en.wikipedia.org/wiki/Bookmarklet
Which, as described here: http://www.learningjquery.com/2006/12/jquerify-bookmarklet, lets you inject jQuery on the site:
Create the following as a bookmark:
var s=document.createElement('script');
s.setAttribute('src','https://ajax.googleapis.com/ajax/libs/jquery/1.6.4/jquery.min.js');
document.getElementsByTagName('body')[0].appendChild(s);
Now it's just a matter of extending it to fetch the information I need. Neat little trick, I would say.
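A sketch of that next step: once jQuery is on the page, grab the rendered HTML (including the ajax-loaded parts) and post it back to your own site. The receiving URL is an assumption, the receiving server would need to allow the cross-origin request, and the snippet would be saved as a javascript: bookmarklet:
(function () {
    var html = document.documentElement.outerHTML; // the DOM as currently rendered
    $.post('https://www.example.com/receive-html', { html: html });
})();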

How to capture a complete webpage using javascript

I inject javascript code into a page user is currently viewing, on users command this script make DOM changes. At the end of this interaction user might want to save the page so that s/he can view/edit it later. I could remember the DOM changes that user made, But if the original page(at its source) is changed, I will not be able to restore this page for user. That is why I want to send the changed page to my server. I should be able to restore it completely and the page should behave exactly the way it did(including scripts and media).
Additionally, I can not store media of the user's page at my end (resource limitation), so I guess I have to parse and modify all addresses/references/links of media to global URLs/URIs in the various scripts (HTML/CSS/JavaScript).
Now the question is, Is there a library/framework/jquery extension that can help me achieve this objective ?
else, What is the right/professional way to do it ?
Since you are using jQuery you could try $("html").html(); just make sure to add the appropriate <html> tags when you output it again.
$('body').html()
$('head').html()
$('html').html()
Download firebug, and try it in the console window on this page. I am getting what looks like the correct data back.
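Building on that, a rough sketch of serializing the rendered page, rewriting relative URLs to absolute ones (as the question asks), and posting it; the /save endpoint is an assumption:
function capturePage() {
    // Resolve relative image/script/stylesheet URLs so the saved copy keeps
    // working outside its original location.
    $('img[src], script[src]').each(function () {
        this.setAttribute('src', this.src);   // this.src is always the absolute URL
    });
    $('link[href], a[href]').each(function () {
        this.setAttribute('href', this.href);
    });
    var html = '<!DOCTYPE html>\n' + document.documentElement.outerHTML;
    $.post('/save', { page: html });
}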
Have I got it right that you are building some kind of CMS that lets the user edit entire pages (not just separate content blocks) in contenteditable mode?
I would definitely advise looking at a solution like CKEditor/TinyMCE etc., because doing it all yourself will be a terrible pain.
The answer from #Sydenam should work fine to save the whole HTML page.
Meanwhile, and this is IMPORTANT, I would recommend you consider a potential SECURITY ISSUE here. Indeed, the user can inject whatever he wants into the DOM and have you save it, like nasty JavaScript functions sending confidential information to a remote server, for example.
So, in my perspective, a professional way of doing this would be to dedicate a PART of the DOM only to that usage, let's say a <div id='editable_div'> that you can fill using $('#editable_div').load('your_url', parameters, etc...), and save afterwards using another AJAX call.
When saving it you can parse this chunk of HTML and make sure nothing nasty is inside with some regexp (like <script> tags).
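A rough sketch of that editable-region approach; the IDs, URLs, and the script-stripping regexp are assumptions:
// Fill the editable part, then save it back later with a second AJAX call.
$('#editable_div').load('/fragments/page-1.html');

function saveEditableRegion() {
    var html = $('#editable_div').html();
    html = html.replace(/<script[\s\S]*?<\/script>/gi, ''); // crude sanitising pass
    $.post('/save-fragment', { id: 'page-1', html: html });
}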
Hope it helps,
Regards,

Best practices managing JavaScript on a single-page app

With a single page app, where I change the hash and load and change only the content of the page, I'm trying to decide on how to manage the JavaScript that each "page" might need.
I've already got a History module monitoring the location hash which could look like domain.com/#/company/about, and a Page class that will use XHR to get the content and insert it into the content area.
function onHashChange(hash) {
    var skipCache = false;
    if (hash in noCacheList) {
        skipCache = true;
    }
    new Page(hash, skipCache).insert();
}

// Page.js
var _pageCache = {};

function Page(url, skipCache) {
    if (!skipCache && (url in _pageCache)) {
        return _pageCache[url];
    }
    this.url = url;
    this.load();
}
The cache should let pages that have already been loaded skip the XHR. I am also storing the content in a documentFragment, and then pulling the current content out of the document when I insert the new Page, so the browser will only have to build the DOM for the fragment once.
Skipping the cache could be desired if the page has time sensitive data.
Here's what I need help deciding on: It's very likely that any of the pages that get loaded will have some of their own JavaScript to control the page. Like if the page will use Tabs, needs a slide show, has some sort of animation, has an ajax form, or what-have-you.
What exactly is the best way to go about loading that JavaScript into the page? Include the script tags in the documentFragment I get back from the XHR? What if I need to skip the cache and re-download the fragment? I feel the exact same JavaScript being loaded a second time might cause conflicts, like redeclaring the same variables.
Would the better way be to attach the scripts to the head when grabbing the new Page? That would require the original page know all the assets that every other page might need.
And besides knowing the best way to include everything, won't I need to worry about memory management, and possible leaks of loading so many different JavaScript bits into a single page instance?
If I understand the case correctly, you are trying to take a site that currently has pages already made for normal navigation, and you want to pull them down via ajax to save yourself the page reload?
Then, when this happens, you need to not reload the script tags for those pages, unless they're not already loaded onto the page?
If that is the case, you could try to grab all the script tags from the response before inserting the new html into the dom:
//first set up a cache of urls you already have loaded.
var loadedScripts = [];

//after the user has triggered the ajax call, and you've received the text response
function clearLoadedScripts(response) {
    var womb = document.createElement('div');
    womb.innerHTML = response;
    var scripts = womb.getElementsByTagName('script');
    var script, i = scripts.length;
    while (i--) {
        script = scripts[i];
        if (loadedScripts.indexOf(script.src) !== -1) {
            // already loaded once: drop it so it doesn't run again
            script.parentNode.removeChild(script);
        } else {
            loadedScripts.push(script.src);
        }
    }
    //then do whatever you want with the contents... something like:
    document.body.innerHTML = womb.getElementsByTagName('body')[0].innerHTML;
}
Oh boy are you in luck. I just did all of this research for my own project.
1: The hash event / manager you should be using is Ben Alman's BBQ:
http://benalman.com/projects/jquery-bbq-plugin/
2: To make search engines love you, you need to follow this very clear set of rules:
http://code.google.com/web/ajaxcrawling/docs/specification.html
I found this late in the game and had to scrap a lot of my code. It sounds like you're going to have to scrap some too, but you'll get a lot more out of it as a consequence.
Good luck!
I have never built such a site, so I don't know if this is best practice, but I would put some sort of control information (like a comment or an HTTP header) in the response, and let the loader script handle redundancy/dependency checking and add the script tags to the header.
Do you have control over those pages being loaded? If not, I would recommend inserting the loaded page in an IFrame.
Taking the page scripts out of their context and inserting them in the head or adding them to another HTML element may cause problems unless you know exactly how the page is built.
If you have full control of the pages being loaded, I would recommend that you convert all your HTML to JS. It may sound strange, but actually an HTML->JS converter is not that far away. You could start off with Pure JavaScript HTML Parser and then let the parser output JS code that builds the DOM, using jQuery for example.
I was actually about to go down that road a while ago on a webapp that I started working on, but then I handed it over to a contractor who converted all my pure-JS pages into HTML+jQuery. Whatever makes his daily work productive, I don't care, but I was really into that pure-JS webapp approach and will definitely try it.
To me it sounds like you are creating a single-page app from the start (i.e. not re-factoring an existing site).
Several options I can think of:
Let the server control which script tags are included. pass a list of already-loaded script tags with the XHR request and have the server sort out which additional scripts need to be loaded.
Load all scripts before-hand (perhaps add them to the DOM after the page has loaded to save time) and then forget about it. For scripts that need to initialize UI, just have each requested page call include a script tag that calls a global init function with the page name.
Have each requested page call a JS function that deals with loading/caching scripts. This function would be accessible from the global scope and would look like this: require_scripts('page_1_init', 'form_code', 'login_code'). Then just have the function keep a list of loaded scripts and only append DOM script tags for scripts that haven't been loaded yet, as in the sketch below.
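A rough sketch of that third option, under the assumption that page scripts live under a /js/ path convention:
// A global loader that only appends script tags it has not seen before.
var loadedScripts = {};

function require_scripts() {
    for (var i = 0; i < arguments.length; i++) {
        var name = arguments[i];
        if (loadedScripts[name]) { continue; }  // already on the page
        loadedScripts[name] = true;
        var s = document.createElement('script');
        s.src = '/js/' + name + '.js';
        document.getElementsByTagName('head')[0].appendChild(s);
    }
}
// A freshly inserted page fragment would then call:
// require_scripts('page_1_init', 'form_code', 'login_code');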
You could use a script loader like YUI Loader, LAB.js, or others like jaf.
Jaf provides you with a mechanism to load views (HTML snippets) and their respective js and css files to create single-page apps. Check out the sample todo list app. Although it's not complete, there are still a lot of useful libraries you can use.
Personally, I would transmit JSON instead of raw HTML:
{
"title": "About",
"requires": ["navigation", "maps"],
"content": "<div id=…"
}
This lets you send metadata, like an array of required scripts, along with the content. You'd then use a script loader, like one of the ones mentioned above, or your own, to check which ones are already loaded and pull down the ones that aren't (inserting them into the <head>) before rendering the page.
Instead of including scripts inline for page-specific logic, I'd use pre-determined classes, ids, and attributes on elements that need special handling. You can fire an "onrender" event or let each piece of logic register an on-render callback that your page loader will call after a page is rendered or loaded for the first time.
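A hedged sketch of consuming that JSON payload; the element id, the /js/ path, and the onRenderCallbacks registry are assumptions, not part of the answer:
var loaded = {};
var onRenderCallbacks = [];

function renderPage(page) {
    // Pull down any required scripts that aren't on the page yet.
    page.requires.forEach(function (name) {
        if (!loaded[name]) {
            loaded[name] = true;
            var s = document.createElement('script');
            s.src = '/js/' + name + '.js';
            document.head.appendChild(s);
        }
    });
    document.getElementById('content').innerHTML = page.content;
    document.title = page.title;
    // Let page-specific logic hook in without inline <script> blocks.
    onRenderCallbacks.forEach(function (cb) { cb(page); });
}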
