I'm probably missing something really obvious here...
I'm showing a dialog box with a progress bar during page load. The dialog and progress bar are both jQuery UI widgets. Loading happens in two phases: the page makes a batch of jQuery $.get() requests to load resources, then, on the $(document).ajaxStop() event, does things with those resources. I'm updating the progress bar and some status text throughout this process.
The issue is that as soon as the ajaxStop event fires, updates stop. The code works nicely during resource loading, but then freezes and I don't see any of the updates during processing. If I put a breakpoint on a post-ajaxStop update in Chrome and step through the code, the screen updates correctly so I know that the code works.
Can anyone explain why everything updates nicely during my AJAX loading phase, but then stops on the ajaxStop event? Is there an easy way to make updates continue afterwards?
Thanks!
Several hours of searching later, a blog post pointed me in the right direction:
The post describes a jQuery extension that lets you define two functions, one to do a chunk of computation and one to update the UI, and schedules them to run alternately using setTimeout.
I've had to rewrite my code in something akin to continuation passing style so that each function schedules its continuation to run using setTimeout. This returns control to the browser for long enough for the screen to be updated.
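As a rough sketch of the pattern (every function and variable name here is illustrative, not from the extension):

function processChunk(items, index) {
    // Do a small slice of the heavy work.
    var end = Math.min(index + 100, items.length);
    for (var i = index; i < end; i++) {
        processItem(items[i]); // hypothetical per-item work
    }
    updateProgress(end / items.length); // hypothetical progress-bar update
    if (end < items.length) {
        // Hand control back to the browser long enough for a repaint,
        // then continue with the next chunk.
        setTimeout(function () { processChunk(items, end); }, 0);
    }
}
processChunk(allItems, 0); // allItems: the resources loaded via $.get()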
This feels like a bit of a hack to get around browser/JavaScript limitations, though. Does anyone know of a better way?
I'm trying to identify roughly when the DOM is finished updating after a page is loaded via AJAX on any arbitrary website.
My current method first listens for the chrome.webNavigation.onHistoryStateUpdated event in a background script, then executes a content script in which a MutationObserver detects changes to the website's body. From there, unfortunately, it seems like it's a bit more finicky. If I just wait for the first mutation where nodes are added to the DOM, I wind up in many cases (YouTube, to give one example) where the page is still blank. Other more hacky approaches I've considered include things like just using setTimeout or waiting for the page to reach a certain length, but those seem clearly wide open to exception cases.
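Simplified, the content-script side looks something like this (the callback name is illustrative):

var observer = new MutationObserver(function (mutations) {
    for (var i = 0; i < mutations.length; i++) {
        if (mutations[i].addedNodes.length > 0) {
            // First mutation that adds nodes -- but on sites like YouTube
            // the page can still be visually blank at this point.
            observer.disconnect();
            onDomRoughlyReady(); // hypothetical callback
            return;
        }
    }
});
observer.observe(document.body, { childList: true, subtree: true });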
Is there a more foolproof way to detect that the DOM has roughly finished updating? It doesn't have to be perfectly precise, and in my use case erring on the side of triggering late is better than triggering early. Also, it isn't important at all that resources like video and images be fully loaded, just that the text content of the page is basically in place.
Thanks for your help!
So I have a very specific problem that presented itself recently (right before our planned launch day tomorrow) and I am not completely sure how to solve it. I built our website from an HTML template with my modest front-end skills and we are very pleased with it. However, I can't seem to solve this.
The problem:
I have a filter system that allows a user to filter articles that are presented on a page. A user can even fill in this filter on the home page, get directed to the page with the articles, and have the filter applied. However, if the filter is then broadened (made less strict) and new articles appear, their pictures do not show up. I found out this is because the flexslider behind them has to be initialized again, which normally only happens on window load or resize. The function that controls the initialization of the flexslider is in an external JS file, and I am not sure whether I can call it from my own custom.js file, so I am thinking of just triggering a window resize/reload to activate it.
The question:
Can I trigger a window resize (or something else that activates the flexslider) without hindering the user experience, that is, without actually resizing or reloading the window? I would run this whenever the filter changes.
I know this is a very specific question but hopefully somebody can help me out.
Take care!
P.S. It would be ideal if I could run the actual function that loads the flexslider, but it is located in an external JS file.
EDIT:
Briefly, some additional info. If I go straight to the article page, it has no filter active and thus shows all articles; if I then start flipping through the filter, all is good. The problems only arise if I arrive from the homepage with a filter already set. You then land on the article page showing only the articles within the filter's boundaries, and when the filter is relaxed it has trouble loading the images of the newly shown articles, as if they were never loaded because they were not visible at the first window load.
You can trigger a resize event by creating a new event and passing it into the dispatchEvent method on window. You'll want the type of the event to be resize, since that's what the slider is listening for.
window.dispatchEvent(new Event('resize'))
This will work for events that were added via jQuery as well as events added via addEventListener.
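For example, a handler bound with jQuery fires just the same (the handler here is purely illustrative):

$(window).on('resize', function () {
    console.log('resize handler ran'); // e.g. re-initialize the flexslider here
});

// Dispatching a synthetic resize event reaches that handler
// without changing the window's actual size.
window.dispatchEvent(new Event('resize'));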
I managed to solve it after all by delaying the function that drops the filter values into my inputs, so all the images load first and the filter is applied afterwards. It happens so fast it's hardly noticeable.
Also, I did try triggering a window resize event; it worked without actually resizing anything, but unfortunately the images did not load in properly (overlap and such).
Anyway, it has been solved. Thanks for all the input!
I'm creating an SPA which of course uses AJAX to dynamically load pages into a div. My layout is a side accordion menu: when you click an item, it loads content into a div next to it. My problem is that if a person clicks menu items as fast as they can, I start getting blank content in my pages. So my question is: what are some best practices (if any) to avoid pages breaking when people click that fast?
Things I'm doing for performance:
Using HTML templates and JSON files to build pages before the application starts, keeping the (simple) pages in memory to grab, instead of calling AJAX on every click
For now, on each click I'm using a clearTimeout and setTimeout set to some milliseconds to slow down the process of switching pages, which is not ideal
Unfortunately, I can't post code because it's too long, but I'm hoping I can get some help with this information.
Thanks guys!
Note: this is only front-end work; there are no databases because the load isn't heavy.
EDIT:
Here is a little snippet of my setTimeout and clearTimeout:
if (clickTimeout) { clearTimeout(clickTimeout); }
clickTimeout = setTimeout(function () {
    clickLink(menuItem.attr('href'));
}, 500);
I usually just disable the UI element that caused the event before sending the AJAX request, and have the client-side code re-enable it after the data has arrived. Of course, this can only be done if you have code that runs when the data arrives. You might also be making mistakes when updating the DOM with the data, if that is how you display it; sometimes blank divs only appear after a while.
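As a sketch of that pattern (the selectors and markup are placeholders):

$('#menu a').on('click', function (e) {
    e.preventDefault();
    var $link = $(this);
    if ($link.hasClass('loading')) { return; } // ignore repeat clicks
    $link.addClass('loading');
    $.get(this.href, function (html) {
        $('#content').html(html); // hypothetical content pane
    }).always(function () {
        $link.removeClass('loading'); // re-enable once the data has arrived
    });
});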
I think the title of the question is wrong and should be more along the lines of "effectively updating the DOM repeatedly with data without breaking the page".
I am writing a backbone.js application.
When the user presses a "Search" button, I show a loading.gif image (by setting its display to block) while letting the JavaScript code continue. Once the JavaScript code is complete, I hide the loading image (changing its display to none).
I can see it working in Firefox. In Safari and Chrome, the CSS change doesn't get applied until the JavaScript code has completed, so the user doesn't see the loading image while the search is being performed.
Any way to fix this?
Thanks
A couple of things strike me as odd, but to answer your question first:
Most DOM/CSS changes do not get applied until the executing JavaScript returns. To get around this, you can make your DOM change and then set a timeout to execute the rest of your JavaScript code.
ex:
// make your image visible
function continuation() {
    // Put the javascript task that you need to execute here
}
// setTimeout will release execution control back to the browser so your CSS/DOM updates
// can be applied. Once those updates are applied, continuation will be called
// by the browser and your remaining javascript can run.
setTimeout(continuation, 0);
Now, it seems odd that you would have any JavaScript that takes so long to run that you'd have time to even see a loading gif. It would make sense to see a loading image if you are firing an XHR (AJAX) request, but if you are doing that then you shouldn't be having the issue you describe. What exactly is this JavaScript task of yours doing?
I had a similar issue with a loading image: it turned out the image hadn't been loaded into the browser yet, and for whatever reason it didn't display until something else completed. I believe in my case an XHR was somehow blocking the loading or display of the image. From memory, this only happened the first time the loading image was displayed; after that it was fine. I ended up adding an element to the page HTML to load the loading image and then hiding it with JavaScript. This solved the problem.
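One way to get the same effect purely from JavaScript is to preload the image up front (the path is a placeholder):

// Fetch the spinner as soon as the script runs, so it is already
// in the browser cache by the time it needs to be shown.
var preload = new Image();
preload.src = '/images/loading.gif'; // placeholder path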
Is JavaScript intended to run as little as possible on a website/webapp? By that I mean: is the usual intention to run through all your JS files as soon as the page loads, set everything up, and then, when functions come up, execute them right away and be done with them?
I'm working on a project using Google Maps, and I have a custom marker object scripted out; a debugger has shown me that the browser runs through all my JS files before anything even appears on the page.
My problem comes in here: I wanted to animate certain markers to bounce up and down continuously with jQuery (similar to OS X icons in the dock), and my several attempts at infinite-loop functions all just crashed the browser. So I understand that the browser doesn't like that, but is there a way to have a simple script repeat itself in the background while the user navigates the page? Or is JavaScript just not supposed to be used that way?
(I worked with Flash for a long time so my mindset is still there.)
Yes, JavaScript functions should just do their bit and exit as soon as possible. The GUI and the scripts run on the same single thread, so as long as you are inside a JavaScript function, nothing updates in the browser. If you try to use an infinite loop, the browser will appear to freeze.
You use the window.setInterval and window.setTimeout methods to trigger code that runs at a specific time. By running an interval that updates something several times a second, you can create an animation.
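As a crude example (the element and the numbers are placeholders; the element must be positioned, e.g. position: relative):

var el = document.getElementById('marker'); // placeholder element
var start = new Date().getTime();
var bounceTimer = setInterval(function () {
    // Each callback runs briefly and returns, so the page stays responsive.
    var t = (new Date().getTime() - start) / 300;
    el.style.top = -Math.abs(Math.sin(t)) * 10 + 'px';
}, 30);
// clearInterval(bounceTimer) stops the bouncing.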
You have to set a timer to execute a function after a defined delay.
var timer = setTimeout(callback, milliseconds);
will run callback after that many milliseconds. Each run of the callback can set a new timer to run it again.
You can cancel a pending timer using clearTimeout(timer).
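For a repeating task, each run can schedule the next one (a sketch):

function tick() {
    // ...do one short step of work here...
    timer = setTimeout(tick, 100); // schedule the next run
}
var timer = setTimeout(tick, 100);
// clearTimeout(timer) cancels whichever run is pending next.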
Use setTimeout() or setInterval(). The MDC articles on them are pretty good.
You'll need to do your updates inside functions that run quickly but get called many times, instead of updating inside a loop.
Since you said that you are using jQuery, consider using its effects API (e.g., jQuery.animate()), it will make your life much easier!
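For instance, a continuous bounce with the effects API could look like this (the selector is a placeholder, and the element must be positioned):

function bounce() {
    $('#marker')
        .animate({ top: '-=10' }, 300)          // up...
        .animate({ top: '+=10' }, 300, bounce); // ...down, then re-queue
}
bounce();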
Personally, I save as much code as possible for execution after the page has loaded, partly by putting all my <script>s at the bottom of <body>. This means a (perceived) reduction in page load time, while still having all my JS ready to run when needed.
I wouldn't recommend going through everything you need to do at the beginning of the document. Instead, bind things to events such as clicks of buttons, etc.
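For example, with jQuery (the IDs and the handler are placeholders):

$(document).ready(function () {
    // Nothing heavy runs at load time; work happens when the user acts.
    $('#search-button').on('click', function () {
        runSearch(); // hypothetical function
    });
});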