Effectively and repeatedly updating DOM with data without breaking page - javascript

I'm creating a SPA which of course uses AJAX to dynamically load pages into a div. My layout is a side accordion menu; when you click an item, it loads content into a div next to it. My problem is that if someone clicks menu items as fast as they can, I start getting blank content in my pages. So my question is: what are some best practices (if any) to keep pages from breaking when people click that fast?
Things I'm doing for performance:
Using HTML templates and JSON files to build pages before the application starts, keeping the (simple) pages in memory to grab, instead of making an AJAX call on every click
For now, on each click I'm using a clearTimeout and setTimeout pair set to some milliseconds to slow down the process of switching pages - not ideal
Unfortunately, I can't post code because it's too long, but I'm hoping I can get some help with this information.
Thanks guys!
Note: this is front-end work only; there are no databases because the load isn't heavy.
EDIT:
Here is a little snippet of my setTimeout and clearTimeout:
if (clickTimeout) { clearTimeout(clickTimeout); }
clickTimeout = setTimeout(function () {
    clickLink(menuItem.attr('href'));
}, 500);

I usually just disable the UI element that caused the event before sending the AJAX request, and have the client-side code re-enable it once the data has arrived. This of course only works if you have code that runs when the data arrives. You might also be making mistakes when updating the DOM with the data, if that is how you display it; sometimes blank divs only appear after a while.
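For example, a rough sketch of that disable/re-enable pattern with jQuery (the #menu and #content selectors here are made up):

// Sketch only: ignore clicks on a menu item while its request is in flight.
$('#menu').on('click', 'a', function (e) {
    e.preventDefault();
    var $item = $(this);
    if ($item.hasClass('loading')) { return; }               // already fetching, ignore the click
    $item.addClass('loading').css('pointer-events', 'none');
    $.get($item.attr('href'), function (html) {
        $('#content').html(html);                            // update the target div with the result
    }).always(function () {
        $item.removeClass('loading').css('pointer-events', '');  // re-enable the item
    });
});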
I think the title of the question is wrong and should be more along the lines of effectively repeatedly updating the DOM with data without breaking the page.

Related

JavaScript: run a function in the background regardless of the page

I need to run a function periodically, regardless of which page I'm on. This function will fetch some data periodically.
I don't think this works:
function myFunc()
{
    // your code
}

// set the interval
setInterval(myFunc, 2000); // this runs the function every 2 seconds
Because it only works for the page I'm currently on; if I go to another page, the function is no longer executed.
I would like to write a function that starts running when the user is on the index page and is then called periodically until the user closes the page.
Any idea? Thanks in advance!
That's not possible with JavaScript in the browser. When you navigate away from the page, the script stops. You have to include a script on every page that initializes this periodic update, or you could rewrite your application as a "single page application", which seems to be popular nowadays.
You'll need a backend application or cron-job to do that.
Another way to do that would be to make an AJAX-only single page application. I guess Twitter uses that model.
Depending on what you're doing in the function, you may be best off using a JS Worker, which runs on a separate thread and lets you keep processing as much as you want in the background without having to worry about JS timeouts.
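A minimal sketch of that Worker idea, assuming a separate worker.js file and a /data.json endpoint (both names are placeholders); note that the worker still stops when its page is closed:

// worker.js - runs off the main thread
self.setInterval(function () {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/data.json');           // assumed endpoint
    xhr.onload = function () {
        self.postMessage(xhr.responseText);  // hand the result back to the page
    };
    xhr.send();
}, 2000);

// main page
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
    console.log('got data', e.data);
};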
The main point here is that what you're asking for is near enough impossible within JS unless you use something like jQuery and dynamically load your pages into a div. That way you still get the effect (visually) of changing page, but the browser only loads the data in.
It's actually very easy to load content into a div using jQuery:
$('#elementoloadid').load('/path/to/load');
You could achieve this without jQuery, but it will take you longer.
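For reference, a plain-JavaScript sketch of the same thing (using the same placeholder id and URL as above) is only slightly longer:

var xhr = new XMLHttpRequest();
xhr.open('GET', '/path/to/load');
xhr.onload = function () {
    document.getElementById('elementoloadid').innerHTML = xhr.responseText;  // inject the response into the div
};
xhr.send();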

Page Renders before data pull is complete

I have two columns.
The column on the left renders first (obviously). The column on the right is injected code built from a data pull. The right column finishes rendering before the data is loaded and visible, so it comes up blank unless I refresh the page several times.
Is there a way to slow down the page rendering? The page renders properly in IE, but not in Firefox or Chrome.
I doubt you actually want to slow down the page render, since that will annoy your users. What you probably want to do is improve your code so that it properly replaces the content of the div (or whatever) that holds your right column. Use a library like jQuery to make the request for the content as an AJAX call. Without code examples I can't give you anything more specific.
No, but you can delay rendering the right column until your AJAX calls have completed. Depending on the JavaScript library you're using, or if you're using raw JavaScript, there should be a way to register a callback function that is called when the data has finished downloading from the server.
Load your data in that callback instead of relying on a delay of a specific amount of time, because a fixed delay will always be wrong: too slow for some users and too fast for others.
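A minimal jQuery sketch of that callback approach (the URL and #right-column id are placeholders):

$.get('/right-column-data', function (html) {
    $('#right-column').html(html);               // runs only once the data has arrived
}).fail(function () {
    $('#right-column').text('Could not load data.');
});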

Will infinite scroll cause browser crashes?

I implemented infinite scroll like so:
new_page_value = 1;
$(window).scroll(function() {
if($(window).scrollTop() >= $(document).height() - $(window).height() - 200) {
new_page_value = parseInt(new_page_value) + 1;
get_page(new_page_value);
}
});
When the user almost reaches the bottom of the page (200px left), the get_page() function is called. It contains an AJAX call that gets the contents of the next page and appends it to the <body> of the document.
Now I just realized that if my site gets big and, instead of 10 small pages, I have a gazillion giant pages, the user's browser might crash if they are persistent enough to keep infinite-scrolling for a long time.
Would this be a possible solution to the problem:
I keep appending new pages to the document <body> until the 10th page; after that I replace the <body> content entirely instead of appending, i.e. using html() rather than append().
I just don't know if this will actually prevent crashes. Will .html() clear the "memory" of prior HTML that was brought in via AJAX?
I really think this is a common issue for many sites with AJAX list content, so let's look at how some of the most popular websites (think scale = experience) solve it:
Google Images
If you check out images.google.com and search for, e.g., "guiness", you see a page full of results (actually the images are AJAX-loaded, not the HTML, so the page has a fixed height), and when you scroll to the bottom there is a "Show more results" button. That could be one solution to your problem, but is it really necessary to place a button at the bottom after, say, the 10th page? I think it is generally a good solution for usability and memory, but it is not strictly necessary, as we can see in:
Facebook
The Facebook news feed is another story. There is a "Show more posts" button, but I really don't know exactly when it is displayed instead of just loading the next page of posts. It once took me 10-15 pages of posts loaded only by scrolling. And you know Facebook posts include videos, photos, AJAX comments and a lot more fancy JavaScript, which takes a lot of memory. I think they settled on this after a lot of research into how many users actually scroll to the bottom.
Youtube
YouTube has a "Load more videos" button on every page, so the solution is basically the same as Google's, except that Google renders the whole HTML of the page and only loads the images on scroll.
Twitter
Twitter supports infinite scrolling, maybe because a tweet is only 140 characters and they don't need to worry about memory so much. After all, who is willing to read more than 1000 pages of tweets in one page load? So they don't have a "load more" button, and they don't need one.
So there are two solutions:
Use infinite scrolling (consider how much content you load and how rich it is)
Use a "Load more" button
Above all, you should not delete already-loaded content from the list.
Nowadays everything is JavaScript, and JavaScript is garbage-collected, so it is very hard to unload DOM that has JavaScript attached (rather than plain text) and actually get that garbage collected, which means you won't free all of the memory the unloaded content allocated in the browser.
Also think about your requests: why load something again that you have already loaded in the first place? It costs another server request, which means another database request, and so on.
I have worked with this before and here are some of my thoughts:
a) If you are appending data one page at a time, it is not an issue. Some browsers might not respond well, but most of the latest browsers will render without any problem as long as there is enough memory on the target machine. You can watch the RAM usage increase as you append pages; use Chrome for this, since each tab is a separate process and it has a built-in task manager.
b) Regarding html(): it does remove the markup, but at a heavy cost, because it tries to take care of special conditions and walks all the controls nested within the container you are replacing (not sure about that last part), so it has overhead. A simpler way to clear the DOM is to set the innerHTML property to an empty string; jQuery does this too, but only at a later point in the html() API. Open up the API source and look at the method.
Using innerHTML:
$(selector)[0].innerHTML = "";
Also, deleting pages sounds weird to me as a user: what if I want to go back to the initial comments? And please don't think about making that an infinite scroller too; I have tried it and gave up after the number of bugs it raised. We had a genuine use case for it, so I had to stick a button up there, but that wasn't for when the user scrolled away from the first page; it was for when the user landed on a 3rd page and then needed to see the results above it.
Hope that answers your question. By the way, infinite scrolling is your friend, so use it; don't over-engineer a case that will probably only ever be exercised by your QA team. It's better to spend your effort somewhere else.
Yes, it will. If I may suggest an idea: after, let's say, 5 pages, just delete the first page and append the new one, instead of deleting all of the previous pages. Good luck :)
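A sketch of that sliding-window idea, assuming each page is appended inside its own wrapper div (the class name and limit are made up):

var MAX_PAGES = 5;

function append_page(html) {
    $('<div class="page">').html(html).appendTo('body');
    if ($('body > .page').length > MAX_PAGES) {
        $('body > .page').first().remove();      // drop the oldest page to cap memory use
    }
}

Bear in mind that removing content above the viewport shifts the scroll position, so you may need to compensate for the removed height.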

Page elements don't visibly update during load

I'm probably missing something really obvious here...
I'm showing a dialog box with progress bar during page load. The dialog and progress bar are both jQueryUI widgets. There are a couple of phases of loading - the page makes a load of jQuery $.get() requests to load resources, then on the $(document).ajaxStop() event, does things with those resources. I'm updating the progress bar and some status text throughout this process.
The issue is that as soon as the ajaxStop event fires, updates stop. The code works nicely during resource loading, but then freezes and I don't see any of the updates during processing. If I put a breakpoint on a post-ajaxStop update in Chrome and step through the code, the screen updates correctly so I know that the code works.
Can anyone explain why everything updates nicely during my AJAX loading phase, but then stops on the ajaxStop event? Is there an easy way to make updates continue afterwards?
Thanks!
Several hours of searching later, the following blog pointed me in the right direction:
There's a jQuery extension described in the entry which allows you to define two functions, one to compute and one to update the UI. It schedules them alternately using the setTimeout function.
I've had to rewrite my code in something akin to continuation passing style so that each function schedules its continuation to run using setTimeout. This returns control to the browser for long enough for the screen to be updated.
This feels like a bit of a hack though to get round browser/Javascript limitations. Anyone know of a better way?
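For what it's worth, the usual shape of that workaround looks something like this (processItem and updateProgressBar are placeholders for the real work and the progress bar update):

function processChunked(items, index) {
    index = index || 0;
    var end = Math.min(index + 50, items.length);
    for (var i = index; i < end; i++) {
        processItem(items[i]);                   // placeholder: the actual processing step
    }
    updateProgressBar(end / items.length);       // placeholder: update the progress widget
    if (end < items.length) {
        setTimeout(function () { processChunked(items, end); }, 0);  // yield so the browser can repaint
    }
}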

How can I update a webpage when a change occurs on the server?

Is there a way to push pages on change rather than putting a timer on a web page to refresh every x minutes? I guess what I'm trying to do is avoid refreshing an entire page when only a portion of it may have changed. I have seen on Facebook that when an update happens, it shows a message saying new content is available.
Perhaps you could MD5 the page; when an update happens the MD5 changes, and the page could check for this. Not exactly push, but it would avoid the traffic of re-fetching an entire page.
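That check could be as simple as polling a tiny endpoint that returns the current hash; a sketch, assuming a /page-hash URL and a one-minute interval:

var lastHash = null;
setInterval(function () {
    $.get('/page-hash', function (hash) {        // assumed endpoint returning the page's MD5
        if (lastHash !== null && hash !== lastHash) {
            $('#notice').text('New content available');   // or re-fetch just the changed section
        }
        lastHash = hash;
    });
}, 60000);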
A good practice to "reduce the traffic" is to load content through AJAX requests.
The "timer" you mentioned above is my preferred method, combined with my previous comment and a bit of extra logic. This is known as long-polling.
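A bare-bones long-polling loop might look like this (the /updates endpoint is an assumption; the server holds the request open until something changes):

function poll() {
    $.ajax({
        url: '/updates',
        timeout: 30000,
        success: function (data) {
            $('#content').html(data);            // update only the affected portion of the page
        },
        complete: function () {
            poll();                              // immediately issue the next long-poll
        }
    });
}
poll();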
One way is to watch for specific keyboard and/or mouse events and update the page if certain criteria are met within those events.
