I have an internet radio station and I need a script that will display a picture of the current song in a particular div with an id. The image is automatically uploaded via FTP to the server each time the song changes.
HTML:
<div id="auto"></div>
JS:
$(document).ready(function() {
    $('#auto').html('<img src="artwork.png"></img>');
    refresh();
});

function refresh() {
    setTimeout(function() {
        $('#auto').html('<img src="artwork.png"></img>');
        refresh();
    }, 1000);
}
I tried this, but all I get is the initially loaded image; when the song changes, I have to manually refresh the whole page to see the new artwork.
I'll point out multiple things here.
I think your code is just fine if you prefer recursive setTimeout calls over a single setInterval to repeat the action.
File Caching
Your problem is probably the browser's cache, since you are using the same image name and directory all the time. Browsers compare the file name and directory to decide whether to load the image from cache or request it from the server again. There are different tricks you can use to force the image to be reloaded from the server in this particular case:
Use different file names/directories for the songs loaded dynamically
Append a changing GET query string (e.g. image.png?v=<current timestamp>)
Your method for switching
You are replacing the file over FTP; I wouldn't recommend that. Consider uploading all your albums and thumbnails to the server once and switching between them dynamically. That is more efficient, less error prone, and makes method #1 from the previous section easier to achieve, as sketched below.
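For instance, a minimal sketch of that dynamic switching, assuming a small nowplaying.json metadata file that your upload step keeps current (the file name and the artwork field are made up for illustration):

function refreshArtwork() {
    // Poll a tiny metadata file instead of re-downloading the image blindly
    $.getJSON('nowplaying.json', function (info) {
        // info.artwork is assumed to point at a per-song file, e.g. "covers/abbey-road.png"
        $('#auto').html('<img src="' + info.artwork + '">');
    });
    setTimeout(refreshArtwork, 5000);
}
refreshArtwork();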
Loading with constant refresh
I would also like to highlight that if you are using an event-based server such as Node.js (or nginx with a push module), you can achieve the same functionality with much less traffic. You don't need a refresh method at all, since the server can notify the browser when the song changes and tell it which resource to load; no constant polling is required.
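As a rough illustration of that push idea using Server-Sent Events (the /song-events endpoint is an assumption; your server would emit a message there whenever new artwork is uploaded):

// Client side: react to a push from the server instead of polling
var source = new EventSource('/song-events');
source.onmessage = function (e) {
    // e.data is assumed to carry the URL of the new artwork
    $('#auto').html('<img src="' + e.data + '">');
};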
Consider your options; I tried to be as comprehensive as I could.
At the top level, the browser caches the image based on its absolute URL. You can add an extra query parameter to the URL to trick the browser into treating it as a new image. In this case, the new URL of artwork.png will be artwork.png?timestamp=123.
Check this out for the refresh():
function refresh() {
    setTimeout(function() {
        var timestamp = new Date().getTime();
        // reassign the url to be like artwork.png?timestamp=456784512 based on the timestamp
        $('#auto').html('<img src="artwork.png?timestamp=' + timestamp + '"></img>');
        refresh();
    }, 1000);
}
Alternatively, you may assign an id attribute to the image and just change its src URL.
HTML:
<img id="myArtworkId" src="artwork.png"/>
JS, in the refresh method:
$('#myArtworkId').attr('src', 'artwork.png?timestamp=' + new Date().getTime());
You can use window.setInterval() to call a method every x seconds and clearInterval() to stop calling that method. View this answer for more information on this.
// Array containing src for demo
var $srcs = ['https://www.petmd.com/sites/default/files/Acute-Dog-Diarrhea-47066074.jpg',
    'https://www.catster.com/wp-content/uploads/2018/05/Sad-cat-black-and-white-looking-out-the-window.jpg',
    'https://img.buzzfeed.com/buzzfeed-static/static/2017-05/17/13/asset/buzzfeed-prod-fastlane-03/sub-buzz-25320-1495040572-8.jpg?downsize=700:*&output-format=auto&output-quality=auto'
];
var $i = 0;

$(document).ready(function() {
    $('#auto').html('<img src="https://images.pexels.com/photos/617278/pexels-photo-617278.jpeg?auto=compress&cs=tinysrgb&dpr=1&w=500"></img>');
    // call method after every 2 seconds
    window.setInterval(function() {
        refresh();
    }, 2000);
    // To stop the calling of refresh method uncomment the line below
    //clearInterval()
});

function refresh() {
    $('#auto').html('<img src="' + $srcs[$i++] + '"></img>');
    // Handling of index out of bound exception
    if ($srcs.length == $i) {
        $i = 0;
    }
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
<div id="auto"></div>
I'm using an <iframe> (I know, I know, ...) in my app (single-page application with ExtJS 4.2) to do file downloads because they contain lots of data and can take a while to generate the Excel file (we're talking anything from 20 seconds to 20 minutes depending on the parameters).
The current state of things is: when the user clicks the download button, he is "redirected" by JavaScript (window.location.href = xxx) to the page doing the export, but since it's done in PHP and no headers are sent, the browser keeps loading the page until the file is downloaded. That's not very user-friendly, because nothing shows whether it's still loading, done (except the file download itself), or failed (which causes the page to actually redirect, potentially making him lose the work he was doing).
So I created a small non-modal window docked in the bottom right corner that contains the iframe as well as a small message to reassure the user. What I need is to detect when it has loaded and to differentiate 2 cases:
No data : OK => Close window
Text data : Error message => Display message to user + Close window
But I tried all 4 events (W3Schools doc) and none of them ever fires. I could at least understand that if the response isn't HTML it might not fire the event, but even if I force an error that returns text data, it still isn't fired.
If anyone knows of a solution for this, or an alternative approach that may fit here, I'm all ears! Thanks!
EDIT : Added iframe code. The idea is to get a better way to close it than a setTimeout.
var url = 'http://mywebsite.com/my_export_route';
var ifr = $('<iframe class="dl-frame" src="' + url + '" width="0" height="0" frameborder="0"></iframe>');
ifr.appendTo($('body'));
setTimeout(function() {
    $('.dl-frame').remove();
}, 3000);
I wonder if it would require significant changes in both frontend and backend code, but have you considered using AJAX? The workflow would be something like this: the user sends an AJAX request to start generating the file, the frontend regularly polls its status from the server, and when it's done you show a download link to the user. I believe that workflow would be more straightforward.
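A rough sketch of that workflow (the /export/start and /export/status routes and the response fields are made-up names, not your actual API):

// Start the export, then poll its status until the file is ready
$.post('/export/start', { format: 'xlsx' }, function (job) {
    var poll = setInterval(function () {
        $.getJSON('/export/status', { id: job.id }, function (res) {
            if (res.state === 'done') {
                clearInterval(poll);
                // show a download link, or navigate straight to the finished file
                window.location.href = '/export/download?id=' + job.id;
            } else if (res.state === 'error') {
                clearInterval(poll);
                alert(res.message);
            }
        });
    }, 2000);
});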
Well, you could also try this trick. In the parent window, create a callback function for the iframe's load completion, myOnLoadCallback, then call it from the iframe with parent.myOnLoadCallback(). But you would still have to use setTimeout to handle server errors/connection timeouts.
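A minimal sketch of that trick (myOnLoadCallback is just an example name):

// In the parent page
window.myOnLoadCallback = function () {
    // iframe finished loading; close the docked download window here
};

And in the document your export script serves into the iframe:

<script>
    parent.myOnLoadCallback();
</script>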
And one last thing - how did you try to catch the iframe's events? Maybe it's something browser-related. Have you tried setting the event callbacks directly in the HTML attributes? Like
<iframe onload="done()" onerror="fail()"></iframe>
That's a bad practice, I know, but sometimes the job needs to be done fast, eh?
UPDATE
Well, I'm afraid you have to spend a long and painful day with a JS debugger. The load event should work. I still have some suggestions, though:
1) Try to set the event listener before setting the element's src. Maybe the onload event fires so fast that it slips in between creating the element and attaching the callback.
2) At the same time, check whether your server code plays nicely with iframes. I have made a simple test which attempts to download a PDF from Dropbox; try replacing my URL with your backend route's.
<script src="https://code.jquery.com/jquery-1.11.1.min.js"></script>
<iframe id="book"></iframe>
<button id="go">Request downloads!</button>
<script>
    var bookUrl = 'https://www.dropbox.com/s/j4o7tw09lwncqa6/thinkpython.pdf';
    $('#book').on('load', function(){
        console.log('WOOT!', arguments);
    });
    $('#go').on('click', function(){
        $('#book').attr('src', bookUrl);
    });
</script>
UPDATE 2
3) Also, look at the Network tab of your browser's debugger: when you set the src on the iframe, it should show the request and the server's response with headers.
I've tried with jQuery and it worked just fine as you can see in this post.
I made a working example here.
It's basically this:
<iframe src="http://www.example.com" id="myFrame"></iframe>
And the code:
function test() {
    alert('iframe loaded');
}

$('#myFrame').load(test);
Tested on IE11.
I guess I'll give a more hacky alternative to the more proper ways of doing it that the others have posted. If you have control over the PHP download script, perhaps you can simply output JavaScript when the download is complete, or redirect to an HTML page that runs JavaScript. That JavaScript can then try to call something in the parent frame. What will work depends on whether your app runs on the same domain or not.
Same domain
Frames on the same domain can simply use each other's JavaScript objects. So, in your single-page application you could have something like
window.downloadHasFinished = function(str){ // Global pollution. More unique name?
    // code to be run when download has finished
}
And your download PHP script can output this HTML+JavaScript when it's done:
<script>
    if (parent && parent.downloadHasFinished)
        parent.downloadHasFinished("if you want to pass a data. maybe export url?")
</script>
Demo jsfiddle (Must run in fullscreen as the frames have different domain)
Parent jsfiddle
Child jsfiddle
Different Domains
For different domains, we can use postMessage. So in your single-page application it would be something like
$(window).on("message",function(e){
var e=e.originalEvent
if(e.origin=="http://downloadphp.anotherdomain.com"){ //for security
var message=e.data //data passed if any
//code to be run when download has finished
}
});
and in your PHP download script you can have it output this HTML+JavaScript
<script>
    // targetOrigin should be the parent page's origin (placeholder below), not the iframe's own domain
    parent.postMessage("if you want to pass data",
        "http://your-app.example.com");
</script>
Parent Demo
Child jsfiddle
Conclusion
Honestly, if the other answers work, you should probably use those. I just thought this was an interesting alternative so I posted it up.
You can use the following script. It comes from a project of mine.
$("#reportContent").html("<iframe id='reportFrame' sandbox='allow-same-origin allow-scripts' width='100%' height='300' scrolling='yes' onload='onReportFrameLoad();'\></iframe>");
Maybe you should use
$($('.dl-frame')[0].contentWindow.document).ready(function () {...})
Try this (pattern)
$(function () {
    var session = function (url, filename) {
        // `url` : URL of resource
        // `filename` : `filename` for resource (optional)
        var iframe = $("<iframe>", {
            "class": "dl-frame",
            "width": "150px",
            "height": "150px",
            "target": "_top"
        })
        // `iframe` `load` `event`
        .one("load", function (e) {
            $(e.target)
                .contents()
                .find("html")
                .html("<html><body><div>"
                    + $(e.target)[0].nodeName
                    + " loaded" + "</div><br /></body></html>");
            alert($(e.target)[0].nodeName
                + " loaded" + "\nClick link to download file");
            return false
        });
        var _session = $.when($(iframe).appendTo("body"));
        _session.then(function (data) {
            var link = $("<a>", {
                "id": "file",
                "target": "_top",
                "tabindex": "1",
                "href": url,
                "download": url,
                "html": "Click to start {filename} download"
            });
            $(data)
                .contents()
                .find("body")
                .append($(link))
                .addBack()
                .find("#file")
                .attr("download", function (_, o) {
                    return (filename || o)
                })
                .html(function (_, o) {
                    return o.replace(/{filename}/,
                        (filename || $(this).attr("download")))
                })
        });
        _session.always(function (data) {
            $(data)
                .contents()
                .find("a#file")
                .focus()
                // start 6 second `download` `session`,
                // on `link` `click`
                .one("click", function (e) {
                    var timer = 6;
                    var t = setInterval(function () {
                        $(data)
                            .contents()
                            .find("div")
                            // `session` notifications
                            .html("Download session started at "
                                + new Date() + "\n" + --timer);
                    }, 1000);
                    setTimeout(function () {
                        clearInterval(t);
                        $(data).replaceWith("<span class=session-notification>"
                            + "Download session complete at\n"
                            + new Date()
                            + "</span><br class=session-notification />"
                            + "<a class=session-restart href=#>"
                            + "Restart download session</a>");
                        if ($("body *").is(".session-restart")) {
                            // start new `session`,
                            // on `.session-restart` `click`
                            $(".session-restart")
                                .on("click", function () {
                                    $(".session-restart, .session-notification")
                                        .remove()
                                    // restart `session` (optional),
                                    // or, other `session` `complete` `callback`
                                    && session(url, filename ? filename : null)
                                })
                        };
                    }, 6000);
                });
        });
    };
    // usage
    session("http://www.ecma-international.org/publications/files/ECMA-ST/Ecma-262.pdf", "ECMA_JS.pdf")
});
jsfiddle http://jsfiddle.net/guest271314/frc82/
In regards to your comment about wanting a better way to close it than setTimeout: you could use jQuery's fadeOut (or any of the transitions) and remove the element in the 'complete' callback. Below is an example you can drop right into a fiddle; it only needs a reference to jQuery.
I also wrapped it inside a listener for the 'load' event so the fade doesn't start until the iframe has loaded, as the question originally asked.
// plug your URL in here
var url = 'http://jquery.com';

// create the iFrame, set attrs, and append to body
var ifr = $("<iframe>")
    .attr({
        "src": url,
        "width": 300,
        "height": 100,
        "frameborder": 0
    })
    .addClass("dl-frame")
    .appendTo($('body'))
;

// log to show it's part of the DOM
console.log($(".dl-frame").length + " items found");

// create listener for load
ifr.one('load', function() {
    console.log('iframe is loaded');
    // call fadeOut to fade the iframe
    ifr.fadeOut(3000, function() {
        // remove iframe when fadeout is complete
        ifr.remove();
        // log after, should no longer exist in DOM
        console.log($(".dl-frame").length + " items found");
    });
});
If you are doing a file download from an iframe, the load event won't fire :) I was doing this a week ago. The only solution to this problem is to call a download proxy script with a token and have it return that token through a cookie when the file is ready. Meanwhile you need a setInterval on the page which watches for that specific cookie.
// Just to clarify
var token = new Date().getTime(); // ticks
$('<iframe>', { src: "yourproxy?file=somefile.file&token=" + token }).appendTo('body');

// watch for the cookie the proxy script sets once the file has been sent
// (uses the jquery.cookie plugin for $.cookie / $.removeCookie)
var watcher = setInterval(function() {
    var cookie = $.cookie(token);
    if (typeof cookie != "undefined") {
        // File has been downloaded
        $.removeCookie(token);
        clearInterval(watcher);
    }
}, 400);
In your proxy script, add the cookie with its name set to the string sent by the token URL parameter.
If you control the server-side script that generates the Excel file (or whatever you are sending to the iframe), why not set a UID flag in the session with value 0; when the iframe is created and the server script is called, set the flag to 1, and when the script has finished (i.e. the iframe is loaded), set it to 2.
Then you only need a timer and a periodic AJAX call to the server to check the UID flag: if it's 0 the process hasn't started, if it's 1 the file is being created, and if it's 2 the process has finished.
What do you think? If you need more information about this approach just ask.
What you are saying could be done for images and other media formats using $(iframe).load(function() {...});
For PDF files or other rich media, you can use the following Library:
http://johnculviner.com/jquery-file-download-plugin-for-ajax-like-feature-rich-file-downloads/
Note: You will need JQuery UI
You can use this library. The code snippet for your purpose would be something like:
window.onload = function () {
    rajax_obj = new Rajax('', {
        action: 'http://mywebsite.com/my_export_route',
        onComplete: function (response) {
            // This will only be called if you have returned a response
            // instead of a file from your export script
            // In your case 2:
            // Text data : Error message => Display message to user
        }
    });
}
Then you can call rajax_obj.post() on your download link click.
Download
NB: You should add headers to your PHP script so it forces a file download:
header('Content-Disposition: attachment; filename="'.$file.'"');
header('Content-Transfer-Encoding: binary');
There are two solutions I can think of. Either you have PHP post its progress to a MySQL table and the frontend pulls that information with AJAX calls to check up on the progress of the generation. Using some kind of unique key, generated when the page is accessed, would be ideal for multiple people generating Excel files at the same time.
Another solution would be to use Node.js: PHP posts the progress of the Excel file using cURL or a socket to a Node.js service, and when Node.js receives an update from PHP it simply writes the progress to the right socket. This will cut off some browser support, though, unless you use external libraries that bring WebSocket support to pretty much all browsers and versions.
Hope this answer helps. I was having the same issue last year and ended up doing AJAX polling with PHP posting progress on the fly.
Try this:
Note: You should be on the same domain.
var url = 'http://mywebsite.com/my_export_route',
    iFrameElem = $('body')
        .append('<iframe class="dl-frame" src="' + url + '" width="0" height="0" frameborder="0"></iframe>')
        .find('.dl-frame').get(0),
    iDoc = iFrameElem.contentDocument || iFrameElem.contentWindow.document;

$(iDoc).ready(function (event) {
    console.log('iframe ready!');
    // do stuff here
});
I'm loading an AJAX request of another HTML page, which is then inserted into a DOM element of the current page.
The page I'm getting through AJAX includes link references to stylesheets, as well as multiple images, which must be loaded from the server.
I want to execute code after all resources from the AJAX call loads, including referenced stylesheets and images.
Note that these stylesheets and images are not directly loaded from AJAX but are loaded as a result of the insertion of the HTML from the AJAX call.
Thus, I'm not looking for the success: callback, but rather for something like another $(window).load(function () { ... }); after the AJAX call (I've tried listening again to $(window).load without success).
Let me know if you need more code.
Checking whether a stylesheet has been loaded is difficult to do -- especially cross-browser. This article suggests having an element that will be changed by a known rule in the loaded stylesheet and polling to check whether its style has been changed to detect loading.
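Roughly, that polling idea could look like this (a sketch only; it assumes the loaded stylesheet contains a known rule such as .css-probe { display: none; }):

// Insert a probe element and poll until the stylesheet's rule takes effect
var $probe = $('<div class="css-probe">').appendTo('body');
var cssPoll = setInterval(function () {
    if ($probe.css('display') === 'none') { // rule applied => stylesheet has loaded
        clearInterval(cssPoll);
        $probe.remove();
        // stylesheet finished loading; safe to run dependent code here
    }
}, 50);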
Images are easier, and I would expect they take a lot longer to load so you can probably get away with only checking image loading.
success: function (html) {
    var imageloads = [];
    $(html).find("img").each(function () {
        var dfd = $.Deferred();
        $(this).on('load', function () {
            dfd.resolve();
        });
        // Image was cached?
        if (this.complete) {
            $(this).trigger('load');
        }
        imageloads.push(dfd);
    });
    $.when.apply(undefined, imageloads).done(function () {
        // images finished loading
    });
}
My web app dynamically loads sections of its UI with jquery.ajax. The new UI sections come with script though. I'm loading them as such:
Use...
$.ajax({
    url: url,
    dataType: 'html',
    success: function(data, textStatus, XMLHttpRequest) {
        $(target_selector).html( data );
        update_ui_after_load();
    }
});
This almost works. The problem is that the scripts included in the dynamic part of the page run before the new page fragment is inserted into the DOM. But often these scripts want to modify the HTML they're being delivered with. My best hacky solution so far is just to delay the scripts some reasonable amount of time to let the DOM insertion happen, by wrapping them in a setTimeout:
window.setTimeout( function() {
    // process downloaded Fragment
}, 300);
Obviously this is unreliable and hideous. What's a better way?
Using
$(function);
will make the function you pass to jQuery run after the fragment is inline on the page.
I found it in
ASP.NET Ajax partial postback and jQuery problem
after looking at your question.
Are you familiar with the live() function? Might be what you're looking for here.
http://api.jquery.com/live/
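If the fragment's scripts are mostly wiring up event handlers, a delegated binding keeps working for elements inserted later. A sketch (.item is a made-up selector, and live() only exists in jQuery < 1.9):

// Handlers bound with live() also apply to matching elements added to the DOM afterwards
$('.item').live('click', function () {
    $(this).toggleClass('selected');
});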
The problem is that the scripts included in the dynamic part of the page run before the new page fragment is inserted into the DOM. But often these scripts want to modify the HTML they're being delivered with.
I'm fairly sure that in that case, the only sensible thing is to place the script after the HTML element.
Everything else would become kludgy quickly - I guess you could implement your own "ready" handler that gets executed after your HTML has been inserted, but that would be a lot of work to implement for no real gain.
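For illustration, the returned fragment could simply put its script after the markup it touches (#widget is a made-up id):

<!-- fragment returned by the AJAX call: markup first, then the script that modifies it -->
<div id="widget">...</div>
<script>
    $('#widget').addClass('ready');
</script>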
I solved it by making a new simple ready handler system as follows...
var ajaxOnLoad = (function() {
    var ajaxOnLoad = {};
    var onLoadQueue = [];

    ajaxOnLoad.onLoad = function(fn) {
        onLoadQueue.push(fn);
    }

    ajaxOnLoad.fireOnLoad = function() {
        while( onLoadQueue.length > 0 ) {
            var fn = onLoadQueue.shift();
            fn();
        }
    }

    window.ajaxOnLoad = ajaxOnLoad;
    return ajaxOnLoad;
})();
So in the pages which get .ajax() loaded, the scripts are queued to run with
ajaxOnLoad.onLoad( function() {
    // Stuff to do after the page fragment is inserted in the main DOM
});
and in the code which does the insertion, before the update_ui_after_load() call, run
ajaxOnLoad.fireOnLoad();
A more complete solution could parse the pages, find script tags, and queue them up automatically. But since I have complete control of the fragments being inserted, it's easier for me to switch to using ajaxOnLoad.onLoad.
I have a simple Greasemonkey script that does some simple DOM manipulation. The Greasemonkey script is loaded after the DOM is loaded; that's fine so far, and it works for the initial page. But on this site (it's Twitter ;-) ) parts of the page get loaded later via XMLHttpRequest after a click, and those parts don't get manipulated by my Greasemonkey script.
Is there a simple way to run the script again after the XMLHttpRequest has loaded, or after the DOM is changed by the request?
A powerful technique when using Greasemonkey is to 'hijack' existing JavaScript functions. If the page you are altering has a function called processAjaxResponse which is called to process the XmlHttpRequest response, then you can do the following in your Greasemonkey script:
var originalProcessAjaxResponse;

function myNewProcessAjaxResponse() {
    /* Manipulate DOM if you need to, and then... */
    originalProcessAjaxResponse();
}

function hijack() {
    originalProcessAjaxResponse = processAjaxResponse; // keep a reference to the original
    processAjaxResponse = myNewProcessAjaxResponse;    // assign the function itself, don't call it
}
This allows you to inject your own functionality when the AJAX response event occurs.
Thanks Howard. That pointed me in the right direction.
It's quite hard to find the right entry point to hijack a function in a minified script, but I found that there is a hook in the Twitter script: it calls window.onPageChange after the Ajax request. (I wonder if this is a common best practice in JavaScript and whether others do this as well?) I found some code at http://userscripts.org/scripts/review/47998 which uses this possibility to attach events.
if (typeof unsafeWindow.onPageChange === 'function') {
    var _onPageChange = unsafeWindow.onPageChange;
    unsafeWindow.onPageChange = function(){
        _onPageChange();
        filter();
    };
} else {
    unsafeWindow.onPageChange = filter;
}