Browser not updating text before inserting Ajax response with innerHTML - javascript

I'm working on a CMS that has a tree-like page structure, so I'm trying to emulate the way Windows Explorer lets you browse your C: drive, for example. Initially I list the pages at the root level; clicking a root page fires an onClick event that uses AJAX to display the pages below it, in a DIV I've created/allocated for that.
All works fine, and I have an animated loading gif displayed in another DIV while xmlhttp.send is running, which is switched off once if (xmlhttp.readyState==4 && xmlhttp.status==200) is true.
The problem is that when there are a large number of sub-pages (1,000 or so, and yes, the client created them), the AJAX completes and then it gets to document.getElementById(DivId).innerHTML = xmlhttp.responseText;, which causes the gif to stop spinning. I've googled that and it seems this is a browser issue.
So I thought I'd use the gif during the AJAX call and display a wait text while the browser is rendering the new innerHTML. However, despite it taking several seconds, this text never gets displayed; I just see the frozen gif and then, once rendering finishes, the "done" text.
If I comment out the "done" line, the wait text does get displayed though.
The code is below:
function getPages(page_id, DivId)
{
    var loadingicon_div = "page_" + page_id + "_loadingicon";
    var loading_icon = 'image here, not allowed to post images..';
    // show the spinner while the request is in flight
    document.getElementById(loadingicon_div).innerHTML = loading_icon;
    var xmlhttp = new GetXmlHttpObject();
    xmlhttp.onreadystatechange = function()
    {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200)
        {
            document.getElementById(loadingicon_div).innerHTML = "Retrieved pages from the server, please wait while your browser displays them ...";
            // inserting the large response is what freezes the gif
            document.getElementById(DivId).innerHTML = xmlhttp.responseText;
            document.getElementById(DivId).style.padding = "5px";
            document.getElementById(loadingicon_div).innerHTML = "done";
        }
    }
    var url = "tree_ajax.php";
    xmlhttp.open("GET", url, true);
    xmlhttp.send(null);
}

You should really be using a framework. I recently whipped up almost the same thing using the ajax treeview built into ExtJS. Trust me, this will save you lots of headaches. Cross-browser DOM work is a PITA.
I know this doesn't answer your specific question, but heed my advice!
http://dev.sencha.com/deploy/dev/examples/tree/reorder.html

"The only solution to this is a redesign" - no it's not, but you really should anyway - read on ;)
The reason the animated gif stops animating is that the browser is using all its muscle to append new elements to the DOM and to re-render/re-paint the page after this is done. There simply is no CPU time left for it to animate the gif.
The slower the computer/browser, the more you will see this problem, but if you build a page with enough elements, you can make this happen in Google Chrome on a 4GHz machine.
The browser is a single-threaded beast by nature when it comes to JavaScript/DOM manipulation/rendering/painting, and thus it can only really do one thing at a time.
It tries to make up for this by breaking complex operations up into slices and then running these slices in a way that makes it seem like it can do multiple things at once (much like operating systems do on single-core machines, although much simpler).
But once you get a sufficiently complex DOM structure, the re-rendering/re-painting slice becomes so big that the browser seems to freeze while it happens.
As for the text not appearing:
If you wish to make a small DOM manipulation take effect (like inserting your text) before you do a large DOM manipulation, you need to make the browser treat these as two separate slices (which it doesn't want to do, because DOM manipulation and the subsequent re-rendering/re-painting are very CPU-costly).
To do this, break the call stack in JavaScript by using a setTimeout.
This will allow the browser to do a re-render/re-paint before the next DOM manipulation and its subsequent re-render/re-paint.
Usually it is enough to do the setTimeout with a zero delay, since the setTimeout in itself breaks the call stack, but in some cases you will need a small delay - play around with it and find your sweet spot :)
example:
// do small DOM manipulation here
setTimeout(function() {
    // do major DOM manipulation here
}, 0);
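Applied to the question's handler, a minimal sketch (reusing the element IDs and variables from the code above) might look like this:
if (xmlhttp.readyState == 4 && xmlhttp.status == 200)
{
    // small DOM manipulation: show the wait message first
    document.getElementById(loadingicon_div).innerHTML = "Retrieved pages from the server, please wait while your browser displays them ...";
    // break the call stack so the browser repaints before the big insert
    setTimeout(function() {
        document.getElementById(DivId).innerHTML = xmlhttp.responseText;
        document.getElementById(DivId).style.padding = "5px";
        document.getElementById(loadingicon_div).innerHTML = "done";
    }, 0);
}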

Related

How to replay an animated GIF in javascript (vanilla) - Without JQuery

My current issue is that I have a loading-bar animation on my web-based app that is shown (obviously) when the whole page or specific things are loading. It is supposed to look like one of those Samsung TV apps, so the UX needs to be quite polished.
What my team and I are doing right now is creating an element for it, and I assume the gif gets cached on the local device, which is an issue. I know of a few ways around this, like adding a Math.random() query at the end of the src URL, but I'd rather not follow that route for now.
I also saw an approach that, I believe, involved simply setting element.src = 'theSameUrl.gif' to the same URL, which I assume forces the device to reload the file instead of using the cached one.
I would also be open to trying new file types that could make this a lot easier, but I must keep in mind that this app will run on a LOT of different hardware, from Samsung TVs to BT boxes or even Virgin Media TV boxes, Amazon Fire Stick, etc.
At this point I'll take anything :P
You can "force it" to reload by wiping it source: img.src = ""
Then you set it again: img.src = "your_src_path"
This way your .gif will start from zero, at least on Edge, Chrome and Firefox, it should work the same way on a TV.
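Wrapped in a small helper, a minimal sketch might look like this (restartGif is a hypothetical name):
function restartGif(img)
{
    var src = img.src; // remember the original path
    img.src = "";      // wipe the source to throw away the playing instance
    img.src = src;     // reassign it so the gif starts again from frame zero
}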

Why does Firefox randomly stop loading scripts when I add the script tags dynamically?

Why does Firefox randomly stop loading <script> tags added dynamically with JS?
I load these scripts dynamically and add them to the DOM:
"/assets/js/lib/socket.io-1.3.6.js"
"/assets/js/lib/tweenmax.min.js"
"/assets/js/lib.js"
"/assets/js/module.js"
"/assets/js/modules"
Quite randomly, the result is a big lag (between 7 and 15 seconds) between one randomly chosen dynamically loaded script and the rest of the scripts.
I actually load my scripts like this:
function loadScript(url, callback)
{
    var elem = document.createElement("script");
    elem.async = true;
    elem.src = url;
    elem.type = "text/javascript";
    elem.onload = callback;
    document.getElementsByTagName("body")[0].appendChild(elem);
}
EDIT:
When I add the script tags directly in my HTML page, the lag doesn't appear; it only appears when I load the scripts with JavaScript. But I actually need to load these scripts with JavaScript.
There is a fiddle of the bug https://jsfiddle.net/ccgb0hqr/
If the alert shows up instantly, refresh the page until the bug happens.
It looks like socket.io may be taking some time to load and then firing off multiple requests, which will block your subsequent requests (Firefox will handle 6 at a time, I believe); coincidentally, that is the same number of requests made to /socket.io/. It might also explain the intermittent nature of the bug, since the other requests may get in before or after socket.io initialises.
Try excluding socket.io and/or making it the last script to load to see if that helps.
You might also want to investigate any specific socket.io bugs like this one.
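For instance, to make socket.io the last script, you could chain the loader from the question (assuming it is named loadScript as above) so each script only starts once the previous one has fired its onload:
// load sequentially, leaving socket.io for last
loadScript("/assets/js/lib/tweenmax.min.js", function() {
    loadScript("/assets/js/lib.js", function() {
        loadScript("/assets/js/module.js", function() {
            loadScript("/assets/js/lib/socket.io-1.3.6.js", function() {
                // everything is in; start the app here
            });
        });
    });
});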
Looks like it was a bug in Firefox.
Newer versions of Firefox do not have this bug.

How can you programmatically check if an iframe would be blocked by a site?

I'm generating a bunch of iframes dynamically that load random websites, and I was wondering if there is a way to programmatically check whether iframing a website is blocked, so I can fall back to a thumbnail of the site instead. Is there a way to do this, and if so, how? (jQuery is preferred.)
A very quick and dirty method to bypass some iframing blocks would be to prepend the URL with a free dereferrer, like http://anonym.to/?site.com, though I don't really recommend this in legitimate practice.
There is no way to detect and stop frame blocking short of disabling JavaScript. Check this out: this is a function I wrote in response to the Coding Horror post We've Been... Framed! Its purpose was basically, "We want to make it impossible to frame this site... the easiest way is to just take it out on the user. The framing site will find out soon enough, and the user will likely blame them." (It is a bit on the evil side, which is why it was only written as a thought experiment... but it works.)
// if this is a framed site (the join builds the string "top" to evade naive filters)
if( window[ [ "t", String.fromCharCode( 111 ), "p" ].join( "" ) ] != window )
    destroyTheBrowser(); // royally mess with the user.

function destroyTheBrowser()
{
    // every call schedules 100 intervals, each of which calls this again ...
    for( var i = 0; i < 100; i++ )
    {
        setInterval( destroyTheBrowser, 1 );
    }
}
Firefox and Safari crash after consuming an additional 300MB of memory (last benchmark). Chrome crashes the tab. IE cripples the entire operating system. Can anyone show me a script which will prevent this anti-anti-framing script from really messing up the user's browser?
If you don't care about JS, just load it through AJAX into a div with overflow set to scroll.
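With jQuery that could be a one-liner; a sketch, where #container and the URL are placeholders:
// fetch the page over AJAX and show it in a scrollable div instead of an iframe
$("#container")
    .css({ overflow: "scroll", height: "400px" })
    .load("http://example.com/page.html");
Keep in mind the same-origin policy, though: plain AJAX can only read pages from your own domain, so truly random third-party sites would need a server-side proxy.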

Javascript: Cancel/Stop Image Requests

I have a website that makes heavy use of Ajax. Occasionally I need to load large image files on the page for the user. My question is: when these large image files are being downloaded, is there a way to stop them if, say, the user navigates away from the page displaying the image? Thanks.
I had the exact same issue: when users 'paged' quickly through (ajax) search results, the browser was still trying to download profile images for every page, not just the current one. This code worked for me, called on the paging event just before the new search was run:
//cancel image downloads
if (window.stop !== undefined)
{
    window.stop();
}
else if (document.execCommand !== undefined)
{
    document.execCommand("Stop", false);
}
Essentially it's like clicking the "Stop" button in the browser.
Tested in IE, Firefox, Chrome, Opera and Safari.
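Wired into a paging handler, a minimal sketch might look like this (onPageClick and runSearch are hypothetical names):
function onPageClick(pageNumber)
{
    // abort in-flight image downloads from the previous results page
    if (window.stop !== undefined)
    {
        window.stop();
    }
    else if (document.execCommand !== undefined)
    {
        document.execCommand("Stop", false); // IE equivalent
    }
    runSearch(pageNumber); // then kick off the new search
}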
Like this:
$(img).attr('src','');
Assuming that you are using AJAX to load the images, you could simply abort the request in the window.onunload event. Declare a global variable for the XMLHttpRequest object that you are using.
var xhr;

// if using the XMLHttpRequest object directly,
// you may already be doing something like this
function getImage(...){
    xhr = new XMLHttpRequest();
    xhr.open(....);
}
If using jQuery, you could assign the return value of the call to $.ajax() or $.get() to the xhr variable.
xhr = $.ajax(.....);
Handle the window.onunload and abort the request.
window.onunload = function(){
    xhr.abort();
}
Reassigning the src attribute to a different image does not work in IE7; it continues trying to download the first image.
Here is the setup:
I created an HTTP handler that is of type JPEG. It contains code that never finishes executing. So someImage.src=myhandler.ashx will perpetually sit there loading until it times out.
In the middle of this, press another button that reassigns the image to a small image file:
someImage.src=small.jpg
The request for myhandler.ashx does not end, even though we have reassigned the src.
Furthermore, if you actually delete the node with someImage.parentNode.removeChild(someImage), it still keeps trying to download myhandler.ashx.
Tested with IE7 and monitored with HTTP Watch Pro.
The poor man's solution would be to simply set the src property of the image tag to an empty string, or point it towards a widget.
edit
Saw your comment; surprised it doesn't work when changing the src property to empty... try using a blank.gif or something.
If that doesn't work, you may be bound by browser architecture, meaning you are S.O.L.

Safari issues with javascript + css

I have some strange behavior going on in Safari. I'm using the jQuery.GridLayout plugin and CSS for styling.
Just for some context: the website layout is a simple header followed by the content, which is a collection of blocks (each block is a div) positioned by the JavaScript and rearranged every time the window is resized.
When I point Safari at the website URL, all the blocks overlap to some degree (like 50%), but as I resize the window, if they have to move, everything automatically goes to the correct place, and it only breaks again if I refresh the page.
So it seems that loading the page is messing it up, either because something fails to register or because something does not happen until I resize the window.
Has anyone experienced such behavior in Safari?
It works perfectly in Firefox and Opera; it's a valid HTML 4.01 Transitional page and the CSS is also validated (W3C-wise, that is).
I know that publishing the code is invaluable for sorting out this kind of issue, but this is a production project and I'm obliged not to.
Either way, I'd appreciate any advice on where to start looking.
How does one go about debugging these issues in Safari?
Thank you.
Safari fires DOMReady before linked resources are loaded. This race condition in calculating the sizes of elements defined in CSS can usually be avoided by loading your CSS resources before any JavaScript (e.g. make sure the <link> tags appear in the <head> before ANY <script> tags, which are blocking but give a chance for the CSS to load asynchronously). Worst-case scenario, move your <script> blocks to the last element in <body>, leaving your <link> tags above.
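In outline, the markup order would be something like this (a sketch; the file names are placeholders):
<head>
    <!-- stylesheets first, so element sizes are known before any layout script runs -->
    <link rel="stylesheet" type="text/css" href="styles.css">
</head>
<body>
    <!-- page content here -->

    <!-- scripts last, so the grid code runs after the CSS has loaded -->
    <script src="jquery.js"></script>
    <script src="jquery.gridlayout.js"></script>
</body>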
CSS concatenation of multiple files (if you have them) is also recommended.
If you aren't able to post the actual code of the page, you might find your solution while trying to reproduce the problem without your specific content. In the past, I've solved some of my own problems while trying to generate a page that shows the problem to post on IRC / SO. If you are able to reproduce the problem without your content, post it for the community, and an answer will be much easier to find.
My shot-in-the-dark guesses lead towards:
You may find that one of your content blocks is causing the issue.
You may find that a different library you are using is causing the issue.
Some JavaScript code for your layout may be running before everything is ready / filled in. From memory, Safari is quick to display pages before images are loaded, for instance.
Perhaps you need to specify an exact width/height for some of your grid containers.
Small update:
(new update at bottom)
http://www.howtocreate.co.uk/safaribenchmarks.html
Also, something that is working is this small script:
<script language="JavaScript">
// CREDITS:
// Automatic Page Refresher by Peter Gehrig and Urs Dudli www.24fun.com
// Permission given to use the script provided that this notice remains as is.
// Additional scripts can be found at http://www.hypergurl.com

// Configure refresh interval (in seconds)
var refreshinterval = 20

// Shall the countdown be displayed inside your status bar? Say "yes" or "no" below:
var displaycountdown = "yes"

// Do not edit the code below
var starttime
var nowtime
var reloadseconds = 0
var secondssinceloaded = 0

function starttime() {
    starttime = new Date()
    starttime = starttime.getTime()
    countdown()
}

function countdown() {
    nowtime = new Date()
    nowtime = nowtime.getTime()
    secondssinceloaded = (nowtime - starttime) / 1000
    reloadseconds = Math.round(refreshinterval - secondssinceloaded)
    if (refreshinterval >= secondssinceloaded) {
        var timer = setTimeout("countdown()", 1000)
        if (displaycountdown == "yes") {
            window.status = "Page refreshing in " + reloadseconds + " seconds"
        }
    } else {
        clearTimeout(timer)
        window.location.reload(true)
    }
}

window.onload = starttime
</script>
I find it odd that a refreshing script solves the issue in Safari, but if I manually refresh the page, havoc ensues...
########UPDATE##########
Well, I finally got some more time to work on this, and after some reading a rather obvious thing came to mind: let the content load and then format it, so for now all of my JS sits between </body> and </html>.
It's not perfect, since now you can catch a glimpse of the content before it is properly placed when the page first loads.
Maybe I'll try calling the JS a second time after a few ms of loading have passed.
I know this was proposed a bit further up the thread; I just needed time to get my hands dirty. Thanks all, I'll keep updating till I get it solved in a more proper fashion :)
