I have a web app that loads Instagram images and creates a slideshow with background sound. While the images are being loaded, a preloader runs to show the progress. But the MP3 file never fully loads; when I check in the browser's console it gets a 206 Partial Content response. So, after clicking the play button, I have to wait a few seconds for the music to load fully before the slideshow plays.
console.log('loading theme: ' + folder);
$.ajaxSetup({
    cache: false
});
$.when(
    $.getScript(epic.getFrontendURL() + "animations/" + folder + "/js/data.js"),
    $.getScript(epic.getFrontendURL() + "animations/" + folder + "/js/main.js"),
    $.Deferred(function (deferred) {
        $(deferred.resolve);
    })
).done(function () {
    //some methods
    sound_bg.src = epic.getFrontendURL() + "sound/back.mp3";
});
So, is this somehow related to done(), which delays the processing?
This behavior is normal. Browsers use HTTP range requests to load audio and video files to minimize bandwidth usage.
If you want the browser to preload the audio, set the preload attribute to auto on the audio element:
<audio src="..." preload="auto"></audio>
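For the question's setup specifically, a minimal sketch (assuming a play button, here #play, which is a placeholder) would be to keep the button disabled until canplaythrough fires on the background sound from the question:
sound_bg.preload = "auto"; // hint the browser to fetch the file ahead of time
sound_bg.src = epic.getFrontendURL() + "sound/back.mp3";
// canplaythrough fires once the browser estimates it can play to the end without stalling
sound_bg.addEventListener("canplaythrough", function onReady() {
    sound_bg.removeEventListener("canplaythrough", onReady);
    $("#play").prop("disabled", false); // hypothetical play button selector
});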
I found a way to check for errors: if there are sound errors, your site will reload the faulty sound until it loads correctly. This has worked for me:
var sound1 = new Audio('sfx/1.mp3')
var sound2 = new Audio('sfx/2.mp3')
var sound3 = new Audio('sfx/3.mp3')

var soundChecker = setInterval(function() {
    if (sound1.error) {
        sound1 = new Audio('sfx/1.mp3')
    } else if (sound2.error) {
        sound2 = new Audio('sfx/2.mp3')
    } else if (sound3.error) {
        sound3 = new Audio('sfx/3.mp3')
    } else {
        clearInterval(soundChecker)
    }
}, 1500)
This checks for errors in loading the sounds every one and a half seconds (you could increase the interval for slower connections). If there is an error with any of the loaded sounds, soundChecker reloads them until all sounds have loaded successfully (without errors). Once all sounds have loaded successfully, the soundChecker interval clears itself and stops checking.
You will notice that errors such as net::ERR_CACHE_READ_FAILURE 206 (Partial Content) will still be printed in the console, but the sound will work, because it was loaded successfully on a subsequent reload.
Related
I have an HTML/JavaScript client that is listening to an MJPEG video stream:
myImg = document.getElementById('my-image');
myImg.src = 'http://myserver.com/camera.mjpeg';
Works fine, but if the video stream dies for whatever reason, the video feed "freezes" on the last received image and I have no opportunity to display an error to the user. I've seen this post that offers a solution (creating a long-running AJAX request alongside the stream), but it only works some of the time. I was hoping there would be a more supported method, like a disconnect event or something.
Even an event for when data is received would be better than nothing. At least that way I could tell if it's been a while since a frame came through. Using addEventListener('load') only works on the very first frame.
Any ideas?
Update:
Based on comments I have tried the following approaches, none of which has worked:
myImg.addEventListener('error', event => { ... });
myImg.addEventListener('stalled', event => { ... });
myImg.addEventListener('suspend', event => { ... });
This is common with a normal implementation of an MJPEG stream, for example
<video src="http://myserver.com/camera.mjpeg" controls>
Your browser does not support the <code>video</code> element.
</video>
The MJPEG stream is a series of images, and eventually the browser will not get the next one for whatever reason, breaking the connection (this is sometimes because the source is cached, causing the browser to use the last image every time). I don't consider this an error, more something to program around with MJPEG streams.
A simple solution: set a refresh rate and reset the src continuously, refreshing the connection on an interval (tune it to your network connection/resources; the example below uses 5 seconds).
setInterval(function() {
    var myImg = document.getElementById('myImg');
    myImg.src = 'http://myserver.com/camera.mjpeg?rand=' + Math.random();
}, 5000);
The random number is added to prevent browser side caching in the event the server sends those headers.
Or you can create a ReadableStream, and keep reading a blob of bytes directly into the source of the image. There is a robust example in this repo, from this other question.
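A rough sketch of that idea, covering just the disconnect detection (the URL is the question's placeholder; turning the raw chunks into JPEG frames is the part the linked repo implements):
// Read the MJPEG response as a stream so you notice when the camera stops sending data.
async function watchStream(url, onDown) {
    const response = await fetch(url);
    const reader = response.body.getReader();
    let lastChunk = Date.now();
    // Watchdog: if no bytes arrive for 10 seconds, cancel the read so the loop below ends.
    const watchdog = setInterval(() => {
        if (Date.now() - lastChunk > 10000) reader.cancel();
    }, 1000);
    try {
        while (true) {
            const { done, value } = await reader.read();
            if (done) break; // connection closed, or cancelled by the watchdog
            lastChunk = Date.now();
            // `value` is a Uint8Array of raw bytes; to display frames you would split it
            // on the multipart boundary and feed each JPEG to the <img> as a blob URL.
        }
    } finally {
        clearInterval(watchdog);
    }
    onDown(); // stream ended or stalled
}
watchStream('http://myserver.com/camera.mjpeg', () => {
    console.warn('MJPEG feed appears to be down');
    // show an error to the user / attempt a reconnect here
});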
In Safari document.readyState will change from interactive to complete.
For example put this before the image loads:
<script>
console.log('Initial ready state', document.readyState);
document.onreadystatechange = function() {
console.log('Ready state changed to:', document.readyState);
}
</script>
And the output will be:
Initial ready state – "loading"
Ready state changed to: – "interactive"
// When the connection disconnects:
Ready state changed to: – "complete"
In Google Chrome the readyState doesn't stay on interactive, but Chrome seems to be better at reconnecting, so this might not be an issue for you.
Edit: One way to make use of this is to drop the image into an iframe; you'll continually get load events in Safari (this does not work in Chrome).
iframe = document.createElement('iframe')
iframe.onload = console.log
iframe.src = "http://10.0.0.119:8080/stream"
document.body.append(iframe)
Edit2: Another technique -- use image.decode to detect when the connection is down and reload the image:
<img id="stream" src="http://10.0.0.119:8080/stream">
<script>
    let image = document.getElementById('stream');
    async function check() {
        while (true) {
            try {
                await image.decode();
            } catch {
                // decode() rejects when the stream is broken: reset src to reconnect
                let src = image.src;
                image.src = "";
                image.src = src;
            }
            await new Promise((resolve) => setTimeout(resolve, 5000));
        }
    }
    check();
</script>
Would something like this work?
function hasLoaded(myImg) {
return myImg.complete && myImg.naturalHeight !== 0;
}
Following Beau Bouchard's answer:
The setInterval timer works fine, but it tends to max out the number of active client connections (if your MJPEG stream comes directly from an IP camera). You could possibly put a restreaming server in front of the MJPEG source so that more clients can listen to it. Short polling does tend to be very resource-heavy, though.
I tried the ReadableStream approach as well. When loading the image back into the img tag you get a jittery effect, most likely because the chunks come in irregularly rather than smoothed out over time.
In the end, I used the img tag's onload event, which triggers whenever an image is loaded, together with a time interval that checks whether the img tag has stopped loading, to determine whether the MJPEG stream has stopped.
I had the same requirement, tested the Image onload event, and it works!
If the FFmpeg feed to FFserver stops, even though the MJPEG freezes on a still image,
after a couple of error counts, "mjpeg FAIL!" is detected.
<!DOCTYPE HTML>
<HTML>
<HEAD>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<TITLE> mjpeg detect </TITLE>
<script type="text/javascript" language="JavaScript" src="/js/jquery.js"></script>
<script type="text/javascript" language="JavaScript">
//------------------------------------------------------------------------------
// localhost/tool/mjpeg.htm
// document.ready
$(function(){
setTimeout("mjpegRefresh()", 10000);
});
var mTmjpegRefresh, mBmjpegStatus=0, mNmjpegError=0;
var mjpegRefresh = function()
{
clearTimeout(mTmjpegRefresh);
mBmjpegStatus=0;
$('#myMJPEG').attr('src', "http://192.168.1.17:8090/live.mjpeg?rand=" + Math.random());
console.log("mjpeg refresh: ", Math.round( (new Date()).getTime()/1000)) ;
mTmjpegRefresh = setTimeout("mjpegRefresh()", 10000);
mTmjpegStatusCheck = setTimeout("mjpegStatusCheck()", 5000);
}
var mjpegOnload = function()
{
console.log("mjpeg Onload");
mBmjpegStatus=1;
}
var mTmjpegStatusCheck;
var mjpegStatusCheck = function()
{
clearTimeout(mTmjpegStatusCheck);
if(mBmjpegStatus>0)
{
mNmjpegError=0;
}
else
{
mNmjpegError++;
}
if(mNmjpegError>5)
{
console.log("mjpeg FAIL!");
}
mTmjpegStatusCheck = setTimeout("mjpegStatusCheck()", 5000);
}
//------------------------------------------------------------------------------
</script>
</HEAD>
<BODY>
<img src="http://192.168.1.17:8090/live.mjpeg" width="720" height="404" id="myMJPEG" onload="mjpegOnload()">
</BODY>
</HTML>
I am trying to play a beep sound a minute after the user has arrived on a page of my website. I found the solution here: https://stackoverflow.com/a/18628124/912359
Here's my code:
$(document).ready(function(){
    setTimeout(function () {
        try{
            if(!$(".facebook-chat").hasClass("active")){
                $(".facebook-chat").addClass("active");
                var audio = new Audio("/sound/chat.mp3");
                audio.play();
            }
        }catch(e){
        }
    }, 60000);
});
This throws an exception:
Uncaught (in promise) DOMException
Strangely, once I load the sound file separately in my browser and come back to the page, it works perfectly. Any ideas how I can fix it?
[Edit]
The issue is that the user has to interact with the browser before the sound can be played. So I put the same code under a click event on the body and it works, but the same doesn't work on the scroll event. I guess Chrome doesn't consider scroll a user interaction. Can anyone add what other interactions can be used to trigger this?
Also, how is it working if I load the audio file in a separate window and come back to my page?
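For what it's worth, a sketch of the usual workaround (the audio path is the one from the question): click, keydown, and touchend count as user interactions for autoplay purposes, while scroll does not, so you can unlock the audio in whichever of those fires first:
var chatAudio = new Audio("/sound/chat.mp3");
function unlockAudio() {
    chatAudio.play().then(function () {
        // Unlocked: pause immediately and keep the element around for the real beep later.
        chatAudio.pause();
        chatAudio.currentTime = 0;
        ["click", "keydown", "touchend"].forEach(function (evt) {
            document.removeEventListener(evt, unlockAudio);
        });
    }).catch(function () {
        // Still blocked; wait for the next gesture.
    });
}
["click", "keydown", "touchend"].forEach(function (evt) {
    document.addEventListener(evt, unlockAudio);
});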
You can try loading the audio when the document is ready and then playing it later, only if the resource has loaded (for this check you can register a callback on onloadeddata). Otherwise, if the resource is not loaded, you can try loading it again.
$(document).ready(function()
{
    let aud = new Audio('https://dl.dropboxusercontent.com/s/1cdwpm3gca9mlo0/kick.mp3');
    let canPlay = false;
    aud.onloadeddata = () => (console.log("audio loaded"), canPlay = true);
    setInterval(function()
    {
        if (canPlay)
            aud.play();
        else
            aud = new Audio('https://dl.dropboxusercontent.com/s/1cdwpm3gca9mlo0/kick.mp3');
    }, 3000);
});
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
Best solution I could come up with when I tried the same was:
const playPromise = audio.play();
if (playPromise !== null){
playPromise.catch(function() { audio.play(); })
}
But sometimes (one out of ten times) the second audio.play() was also uncaught and the audio did not play either. I suggest you make a loop that stops only when the promise finally resolves.
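As a sketch, that loop could look like this (the retry count and delay are arbitrary; `audio` is the element from the snippet above):
function playWithRetry(audio, attemptsLeft) {
    var p = audio.play();
    if (!p || !p.catch) return; // older browsers: play() does not return a promise
    p.catch(function () {
        if (attemptsLeft > 0) {
            // wait a moment, then try again until the promise finally resolves
            setTimeout(function () { playWithRetry(audio, attemptsLeft - 1); }, 500);
        }
    });
}
playWithRetry(audio, 10);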
To give you some background, many (if not all) websites load their images one by one, so if there are a lot of images, and/or you have a slow computer, most of the images won't show up. This is avoidable for the most part; however, if you're running a script to extract image URLs, then you don't need to see the image, you just want its URL. My question is as follows:
Is it possible to trick a webpage into thinking an image is done loading so that it will start loading the next one?
Typically a browser will not wait for one image to be downloaded before requesting the next image. It will request all images simultaneously, as soon as it gets the srcs of those images.
Are you sure that the images are indeed waiting for the previous image to download, or are they waiting for a specific time interval?
If you are sure that it depends on the download of the previous image, then what you can do is route all your requests through some proxy server / firewall and configure it to return an empty file with HTTP status 200 whenever an image is requested from that site.
That way the browser (or actually the website code) will assume that it has downloaded the image successfully.
how do I do that? – Jack Kasbrack
That's actually a very open-ended / opinion-based question. It will also depend on your OS, browser, system permissions, etc. Assuming you are using Windows and have sufficient permissions, you can try using Fiddler. It has an AutoResponder functionality that you can use.
(I've no affiliation with Fiddler / Telerik as such. I'm suggesting it only as an example and because I've used it in the past and know that it can be used for the aforementioned purpose. There will be many more products that provide similar functionality and you should use the product of your choice.)
Use a plugin called Lazy Load. What it does is load the whole web page and defer loading the images; it only loads an image when the user scrolls to it.
To extract all image URLs to a text file, maybe you could use something like this.
If you execute this script inside any website, it will list the URLs of the images:
document.querySelectorAll('*[src]').forEach((item) => {
    const isImage = item.src.match(/(http(s?):)([/|.|\w|\s|-])*\.(?:jpg|jpeg|gif|png|svg)/g);
    if (isImage) console.log(item.src);
});
You could also use the same idea to read the computed style of elements and get images from a background url or something, like this:
document.querySelectorAll('*').forEach((item) => {
    const computedItem = getComputedStyle(item);
    Object.keys(computedItem).forEach((attr) => {
        const style = computedItem[attr];
        const image = style.match(/(http(s?):)([/|.|\w|\s|-])*\.(?:jpg|jpeg|gif|png|svg)/g);
        if (image) console.log(image[0]);
    });
});
So, at the end of the day, you could write a function like this, which will return an array of all images on the site:
function getImageURLS() {
    let images = [];
    document.querySelectorAll('*').forEach((item) => {
        const computedItem = getComputedStyle(item);
        Object.keys(computedItem).forEach((attr) => {
            const style = computedItem[attr];
            const image = style.match(/(http(s?):)([/|.|\w|\s|-])*\.(?:jpg|jpeg|gif|png|svg)/g);
            if (image) images.push(image[0]);
        });
    });
    document.querySelectorAll('*[src]').forEach((item) => {
        const isImage = item.src.match(/(http(s?):)([/|.|\w|\s|-])*\.(?:jpg|jpeg|gif|png|svg)/g);
        if (isImage) images.push(item.src);
    });
    return images;
}
It can probably be optimized, but, well, you get the idea.
If you just want to extract images once, you can use some tools like:
1) Chrome Extension
2) Software
3) Online website
If you want to run it multiple times, use the code above (https://stackoverflow.com/a/53245330/4674358) wrapped in an if condition:
if(document.readyState === "complete") {
    extractURL();
}
else {
    //Add load or DOMContentLoaded event listeners here: for example,
    window.addEventListener("load", function () {
        extractURL();
    }, false);
    //or
    /*document.addEventListener("DOMContentLoaded", function () {
        extractURL();
    }, false);*/
}

function extractURL() {
    //code mentioned above
}
You want the "DOMContentLoaded" event docs. It fires as soon as the document is fully parsed, but before everything has been loaded.
let addIfImage = (list, image) => image.src.match(/(http(s?):)([/|.|\w|\s|-])*\.(?:jpg|jpeg|gif|png|svg)/g) ?
    [image.src, ...list] :
    list;
let getSrcFromTags = (tag = 'img') => Array.from(document.getElementsByTagName(tag))
    .reduce(addIfImage, []);

if (document.readyState === "loading") {
    document.addEventListener("DOMContentLoaded", doSomething);
} else { // `DOMContentLoaded` already fired
    doSomething();
}
I am using this, works as expected:
var imageLoading = function(n) {
    var image = document.images[n];
    var downloadingImage = new Image();
    downloadingImage.onload = function(){
        image.src = this.src;
        console.log('Image ' + n + ' loaded');
        if (document.images[++n]) {
            imageLoading(n);
        }
    };
    downloadingImage.src = image.getAttribute("data-src");
}

document.addEventListener("DOMContentLoaded", function(event) {
    setTimeout(function() {
        imageLoading(0);
    }, 0);
});
And change every image element's src attribute to data-src.
I'm using an <iframe> (I know, I know, ...) in my app (single-page application with ExtJS 4.2) to do file downloads because they contain lots of data and can take a while to generate the Excel file (we're talking anything from 20 seconds to 20 minutes depending on the parameters).
The current state of things is: when the user clicks the download button, he is "redirected" by JavaScript (window.location.href = xxx) to the page doing the export, but since it's done in PHP and no headers are sent, the browser keeps loading the page until the file is downloaded. It's not very user-friendly, because nothing shows him whether it's still loading, done (except the file download), or failed (which causes the page to actually redirect, potentially making him lose the work he was doing).
So I created a small non-modal window docked in the bottom-right corner that contains the iframe, as well as a small message to reassure the user. What I need is to be able to detect when it's loaded and to differentiate 2 cases:
No data : OK => Close window
Text data : Error message => Display message to user + Close window
But I tried all 4 events (W3Schools doc) and none is ever fired. I could at least understand that if it's not HTML data being returned, the event may not fire, but even if I force an error so that text data is returned, it's not fired.
If anyone knows of a solution for this, or an alternative approach that may fit here, I'm all ears! Thanks!
EDIT : Added iframe code. The idea is to get a better way to close it than a setTimeout.
var url = 'http://mywebsite.com/my_export_route';
var ifr = $('<iframe class="dl-frame" src="'+url+'" width="0" height="0" frameborder="0"></iframe>');
ifr.appendTo($('body'));
setTimeout(function() {
$('.dl-frame').remove();
}, 3000);
I wonder if it would require significant changes in both frontend and backend code, but have you considered using AJAX? The workflow would be something like this: the user sends an AJAX request to start generating the file, the frontend constantly polls its status from the server, and when it's done you show a download link to the user. I believe that workflow would be more straightforward.
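A rough sketch of that workflow with jQuery (the /start_export and /export_status routes, the exportParams variable, the JSON fields, and the #download-area element are all made up for illustration):
// Kick off the export, then poll its status until the server reports it is done.
$.post('/start_export', exportParams, function (job) {
    var poll = setInterval(function () {
        $.getJSON('/export_status', { id: job.id }, function (status) {
            if (status.state === 'done') {
                clearInterval(poll);
                // show a download link instead of leaving the user waiting blindly
                $('#download-area').html('<a href="' + status.url + '">Download the Excel file</a>');
            } else if (status.state === 'failed') {
                clearInterval(poll);
                $('#download-area').text('Export failed: ' + status.message);
            }
        });
    }, 2000);
});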
Well, you could also try this trick. In the parent window, create a callback function for the iframe's complete loading, myOnLoadCallback, then call it from the iframe with parent.myOnLoadCallback(). But you would still have to use setTimeout to handle server errors/connection timeouts.
And one last thing - how did you try to catch the iframe's events? Maybe it's something browser-related. Have you tried setting event callbacks in HTML attributes directly? Like
<iframe onload="done()" onerror="fail()"></iframe>
That's bad practice, I know, but sometimes the job needs to be done fast, eh?
UPDATE
Well, I'm afraid you'll have to spend a long and painful day with a JS debugger. The load event should work. I still have some suggestions, though:
1) Try to set the event listener before setting the element's src. Maybe the onload event fires so fast that it slips in between creating the element and setting the event's callback.
2) At the same time, check whether your server code plays nicely with iframes. I have made a simple test which attempts to download a PDF from Dropbox; try replacing my URL with your backend route's.
<script src="https://code.jquery.com/jquery-1.11.1.min.js"></script>
<iframe id="book"></iframe>
<button id="go">Request downloads!</button>
<script>
    var bookUrl = 'https://www.dropbox.com/s/j4o7tw09lwncqa6/thinkpython.pdf';
    $('#book').on('load', function(){
        console.log('WOOT!', arguments);
    });
    $('#go').on('click', function(){
        $('#book').attr('src', bookUrl);
    });
</script>
UPDATE 2
3) Also, look at the Network tab of your browser's debugger: what happens when you set src on the iframe? It should show the request and the server's response with headers.
I've tried with jQuery and it worked just fine as you can see in this post.
I made a working example here.
It's basically this:
<iframe src="http://www.example.com" id="myFrame"></iframe>
And the code:
function test() {
alert('iframe loaded');
}
$('#myFrame').load(test);
Tested on IE11.
I guess I'll give a more hacky alternative to the more proper ways of doing it that the others have posted. If you have control over the PHP download script, perhaps you can simply output JavaScript when the download is complete, or redirect to an HTML page that runs JavaScript. The JavaScript that runs can then try to call something in the parent frame. What will work depends on whether your app runs on the same domain or not.
Same domain
Frames on the same domain can use each other's JavaScript objects directly. So, in your single-page application, you can have something like
window.downloadHasFinished=function(str){ //Global pollution. More unique name?
//code to be run when download has finished
}
And for your download PHP script, you can have it output this HTML+JavaScript when it's done:
<script>
    if(parent && parent.downloadHasFinished)
        parent.downloadHasFinished("if you want to pass a data. maybe export url?")
</script>
Demo jsfiddle (Must run in fullscreen as the frames have different domain)
Parent jsfiddle
Child jsfiddle
Different Domains
For different domains, we can use postMessage. So in your single-page application it will be something like
$(window).on("message", function(e){
    var e = e.originalEvent;
    if(e.origin == "http://downloadphp.anotherdomain.com"){ //for security
        var message = e.data; //data passed if any
        //code to be run when download has finished
    }
});
and in your PHP download script you can have it output this HTML+JavaScript:
<script>
    // the second argument is the targetOrigin: it should be the parent
    // page's origin, not the iframe's own domain ("*" skips the check)
    parent.postMessage("if you want to pass data", "*");
</script>
Parent Demo
Child jsfiddle
Conclusion
Honestly, if the other answers work, you should probably use those. I just thought this was an interesting alternative so I posted it up.
You can use the following script. It comes from a project of mine.
$("#reportContent").html("<iframe id='reportFrame' sandbox='allow-same-origin allow-scripts' width='100%' height='300' scrolling='yes' onload='onReportFrameLoad();'\></iframe>");
Maybe you should use
$($('.dl-frame')[0].contentWindow.document).ready(function () {...})
Try this (pattern)
$(function () {
var session = function (url, filename) {
// `url` : URL of resource
// `filename` : `filename` for resource (optional)
var iframe = $("<iframe>", {
"class": "dl-frame",
"width": "150px",
"height": "150px",
"target": "_top"
})
// `iframe` `load` `event`
.one("load", function (e) {
$(e.target)
.contents()
.find("html")
.html("<html><body><div>"
+ $(e.target)[0].nodeName
+ " loaded" + "</div><br /></body></html>");
alert($(e.target)[0].nodeName
+ " loaded" + "\nClick link to download file");
return false
});
var _session = $.when($(iframe).appendTo("body"));
_session.then(function (data) {
var link = $("<a>", {
"id": "file",
"target": "_top",
"tabindex": "1",
"href": url,
"download": url,
"html": "Click to start {filename} download"
});
$(data)
.contents()
.find("body")
.append($(link))
.addBack()
.find("#file")
.attr("download", function (_, o) {
return (filename || o)
})
.html(function (_, o) {
return o.replace(/{filename}/,
(filename || $(this).attr("download")))
})
});
_session.always(function (data) {
$(data)
.contents()
.find("a#file")
.focus()
// start 6 second `download` `session`,
// on `link` `click`
.one("click", function (e) {
var timer = 6;
var t = setInterval(function () {
$(data)
.contents()
.find("div")
// `session` notifications
.html("Download session started at "
+ new Date() + "\n" + --timer);
}, 1000);
setTimeout(function () {
clearInterval(t);
$(data).replaceWith("<span class=session-notification>"
+ "Download session complete at\n"
+ new Date()
+ "</span><br class=session-notification />"
+ "<a class=session-restart href=#>"
+ "Restart download session</a>");
if ($("body *").is(".session-restart")) {
// start new `session`,
// on `.session-restart` `click`
$(".session-restart")
.on("click", function () {
$(".session-restart, .session-notification")
.remove()
// restart `session` (optional),
// or, other `session` `complete` `callback`
&& session(url, filename ? filename : null)
})
};
}, 6000);
});
});
};
// usage
session("http://www.ecma-international.org/publications/files/ECMA-ST/Ecma-262.pdf", "ECMA_JS.pdf")
});
jsfiddle http://jsfiddle.net/guest271314/frc82/
Regarding your comment about wanting a better way to close it than setTimeout: you could use jQuery's fadeOut, or any of the transitions, and remove the element in the 'complete' callback. Below is an example you can dump right into a fiddle; you only need to reference jQuery.
I also wrapped it inside a listener for the 'load' event so the fade doesn't start until the iframe has loaded, as the question originally asked.
// plugin your URL here
var url = 'http://jquery.com';
// create the iFrame, set attrs, and append to body
var ifr = $("<iframe>")
.attr({
"src": url,
"width": 300,
"height": 100,
"frameborder": 0
})
.addClass("dl-frame")
.appendTo($('body'))
;
// log to show its part of DOM
console.log($(".dl-frame").length + " items found");
// create listener for load
ifr.one('load', function() {
console.log('iframe is loaded');
// call $ fadeOut to fade the iframe
ifr.fadeOut(3000, function() {
// remove iframe when fadeout is complete
ifr.remove();
// log after, should no longer exist in DOM
console.log($(".dl-frame").length + " items found");
});
});
If you are doing a file download from an iframe, the load event won't fire :) I was doing this a week ago. The only solution I found is to call a download proxy script with a token and have it return that token in a cookie once the file is loaded. Meanwhile, you need a setInterval on the page which watches for that specific cookie.
// Just to clarify
var token = new Date().getTime(); // ticks
$('<iframe>',{src:"yourproxy?file=somefile.file&token="+token}).appendTo('body');
var watcher = setInterval(function(){
    var cookie = $.cookie(token);
    if(typeof cookie != "undefined"){
        // File has been downloaded
        $.removeCookie(token);
        clearInterval(watcher);
    }
}, 400);
In your proxy script, set the cookie's name to the string sent via the token URL parameter.
If you control the server script that generates the Excel file (or whatever you are sending to the iframe), why don't you set a UID flag and store it in the session with value 0? Then, when the iframe is created and the server script is called, set the UID flag to 1, and when the script finishes (and the iframe loads), set it to 2.
Then you only need a timer and a periodic AJAX call to the server to check the UID flag: if it's 0 the process hasn't started, if it's 1 the file is being created, and if it's 2 the process has ended.
What do you think? If you need more information about this approach just ask.
What you are saying could be done for images and other media formats using $(iframe).load(function() {...});
For PDF files or other rich media, you can use the following Library:
http://johnculviner.com/jquery-file-download-plugin-for-ajax-like-feature-rich-file-downloads/
Note: You will need JQuery UI
You can use this library. The code snippet for your purpose would be something like:
window.onload = function () {
    rajax_obj = new Rajax('',
    {
        action : 'http://mywebsite.com/my_export_route',
        onComplete : function(response) {
            // This will only be called if you have returned a response
            // instead of a file from your export script
            // In your case 2
            // Text data : Error message => Display message to user
        }
    });
}
Then you can call rajax_obj.post() on your download link's click event.
Download
NB: You should add some headers to your PHP script so it forces a file download:
header('Content-Disposition: attachment; filename="'.$file.'"');
header('Content-Transfer-Encoding: binary');
There are two solutions that I can think of. Either you have PHP post its progress to a MySQL table, from which the frontend pulls information using AJAX calls to check the progress of the generation. Using some kind of unique key generated when the page is accessed would be ideal so that multiple people can generate Excel files at the same time.
Another solution would be to use Node.js: have PHP post the progress of the Excel file via cURL or a socket to a Node.js service. Then, when receiving updates from PHP, Node.js simply writes the progress of the Excel file to the right socket. This will cut off some browser support, though, unless you use external libraries to bring WebSocket support to pretty much all browsers & versions.
Hope this answer helped. I had the same issue the previous year and ended up doing AJAX polling, having PHP post progress on the fly.
Try this:
Note: You should be on the same domain.
var url = 'http://mywebsite.com/my_export_route',
    iFrameElem = $('body')
        .append('<iframe class="dl-frame" src="' + url + '" width="0" height="0" frameborder="0"></iframe>')
        .find('.dl-frame').get(0),
    iDoc = iFrameElem.contentDocument || iFrameElem.contentWindow.document;
$(iDoc).ready(function (event) {
    console.log('iframe ready!');
    // do stuff here
});
I'm trying to make a cross-device/browser image and audio preloading scheme for a GameAPI I'm working on. An audio file will preload, and issue a callback once it completes.
The problem is that audio will not start to load on slow page loads, but it will usually work on the second try, probably because the browser cached it and knows it exists.
I've narrowed it down to the audio.load() function. Getting rid of it solves the problem, but interestingly, my Motorola Droid needs that function.
What are some experiences you've had with HTML5 audio preloading?
Here's my code. Yes, I know loading images in a separate function could cause a race condition :)
var resourcesLoading = 0;
function loadImage(imgSrc) {
//alert("Starting to load an image");
resourcesLoading++;
var image = new Image();
image.src = imgSrc;
image.onload = function() {
//CODE GOES HERE
//alert("A image has been loaded");
resourcesLoading--;
onResourceLoad();
}
}
function loadSound(soundSrc) {
//alert("Starting to load a sound");
resourcesLoading++;
var loaded = false;
//var soundFile = document.createElement("audio");
var soundFile = document.createElement("audio");
console.log(soundFile);
soundFile.autoplay = false;
soundFile.preload = false;
var src = document.createElement("source");
src.src = soundSrc + ".mp3";
soundFile.appendChild(src);
function onLoad() {
loaded = true;
soundFile.removeEventListener("canplaythrough", onLoad, true);
soundFile.removeEventListener("error", onError, true);
//CODE GOES HERE
//alert("A sound has been loaded");
resourcesLoading--;
onResourceLoad();
}
//Attempt to reload the resource 5 times
var retrys = 4;
function onError(e) {
retrys--;
if(retrys > 0) {
soundFile.load();
} else {
loaded = true;
soundFile.removeEventListener("canplaythrough", onLoad, true);
soundFile.removeEventListener("error", onError, true);
alert("A sound has failed to loaded");
resourcesLoading--;
onResourceLoad();
}
}
soundFile.addEventListener("canplaythrough", onLoad, true);
soundFile.addEventListener("error", onError, true);
}
function onResourceLoad() {
if(resourcesLoading == 0)
onLoaded();
}
It's hard to diagnose the problem because it shows no errors and only fails occasionally.
I got it working. The solution was fairly simple actually:
Basically, it works like this:
channel.load();
channel.volume = 0.00000001;
channel.play();
If it isn't obvious, the load function tells browsers and devices that support it to start loading, and then the sound immediately tries to play with the volume virtually at zero. So, if the load function isn't enough, the fact that the sound 'needs' to be played is enough to trigger a load on all the devices I tested.
The load function may actually be redundant now, but given the inconsistency of audio implementations, it probably doesn't hurt to have it.
Edit: After testing this on Opera, Safari, Firefox, and Chrome, it looks like setting the volume to 0 will still preload the resource.
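A small usage sketch of that trick (the file path and function names are placeholders): kick off the load with a near-silent play, then pause and restore the volume once canplaythrough fires, so the sound is buffered and ready to play for real:
function preloadSound(src, onReady) {
    var channel = new Audio(src);
    channel.load(); // explicit load for the devices that need it
    channel.volume = 0.00000001; // effectively silent
    var p = channel.play(); // playing forces the download to start
    if (p && p.catch) p.catch(function () {}); // ignore autoplay rejections in newer browsers
    channel.addEventListener("canplaythrough", function onLoad() {
        channel.removeEventListener("canplaythrough", onLoad);
        channel.pause();
        channel.currentTime = 0;
        channel.volume = 1; // restore volume for real playback later
        onReady(channel);
    });
}
preloadSound("sfx/jump.mp3", function (sound) {
    // sound is buffered; play it later at full volume
});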
canplaythrough fires when enough data has buffered that it probably could play non-stop to the end if you started playing on that event. The HTML Audio element is designed for streaming, so the file may not have completely finished downloading by the time this event fires.
Contrast this to images which only fire their event once they are completely downloaded.
If you navigate away from the page and the audio has not finished completely downloading, the browser probably doesn't cache it at all. However, if it has finished completely downloading, it probably gets cached, which explains the behavior you've seen.
I'd recommend the HTML5 AppCache to make sure the images and audio are certainly cached.
The AppCache, as suggested above, might be your only solution to keep the audio cached from one browser session to another (that's not what you asked for, right?). But keep in mind the limited amount of space some browsers offer. Safari, for instance, allows the user to change this value in the settings, but the default is 5MB - hardly enough to store a bunch of songs, especially if other websites frequented by your users use AppCache as well. Also, IE < 10 does not support AppCache.
Alright, so I ran into the same problem recently, and my trick was to use a simple AJAX request to load the file entirely once (so it ends up in the cache), and then load the sound again directly from the cache and bind to the canplaythrough event.
Using Buzz.js as my HTML5 audio library, my code is basically something like this:
var self = this;
$.get(this.file_name + ".mp3", function(data) {
    self.sound = new buzz.sound(self.file_name, {formats: [ "mp3" ], preload: true});
    self.sound.bind("error", function(e) {
        console.log("Music Error: " + this.getErrorMessage());
    });
    self.sound.decreaseVolume(20);
    self.sound.bind("canplaythrough", function(){ self.onSoundLoaded(self); });
});