Wait Until Infinite Scroll Finishes Loading - JavaScript

I'm working with an "infinite scroll" page that calls up to 40 elements at a time when a user scrolls to the bottom of the page. How do I detect the moment at which all the content of the most recent set has been loaded?
This only works for initial page load:
$(window).load(function() {
    // Do something
});
Is there something similar that fires when content loads long after the page itself has finished loading?
Sorry if this is a repeat of another question. I was unable to find a solution anywhere else.

After some more digging around, I've used a variation of this question/answer.
My end result looks something like this:
function ($container, previousCount, callback) {
    var _imgs = $container.find('img'),
        _newImages = _imgs.slice(previousCount), // 'start index' for new images
        imgCount = _newImages.length,            // count how many new ones exist
        counter = 0;

    // bind to the newly appended images only, counting successes...
    _newImages.on('load', function() {
        runCallback(20, callback);
    }).on('error', function() {
        // ...and errors too, so we still reach the total
        runCallback(20, callback);
    });

    function runCallback(counterLimit, callback) {
        // if counter meets the new-image count or hits the counter limit, run callback
        counter++;
        if (counter == imgCount || counter == counterLimit) {
            callback();
        }
    }
}
Some quirks explain why I needed it this way: previousCount can actually be greater than the total number of images currently in the DOM, because people can jump between different portions of the infinite scroll. That is also why there's the check for counter == counterLimit.
Also, I need to run a script when the new set of images is done loading, regardless of whether they succeeded or failed. This is why I'm using both .on('load') and .on('error') to run up the count. I noticed that sometimes an image can error out (which is fine in this particular case), and that would otherwise throw off the count and the callback would never fire.
==================
EDIT 04/27/16: I've since discovered a plugin that can handle this for you and seems to work well (except when several subsequent loads fire before the previous load has finished). Check out imagesLoaded.
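For reference, a minimal sketch of how imagesLoaded could replace the hand-rolled counter above. The container selector and callback name are placeholders, and this assumes the jQuery build of the plugin is loaded:
var $container = $('#infinite-scroll-container'); // hypothetical container

// .always() fires once every image in the container has either loaded or errored,
// which matches the "run my script on success or failure" requirement above
$container.imagesLoaded()
    .always(function (instance) {
        runMyCallback(); // hypothetical - whatever `callback` did above
    })
    .progress(function (instance, image) {
        console.log('image finished:', image.img.src, 'loaded?', image.isLoaded);
    });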

Related

setTimeout firing too soon(?)

Alright, right now I'm writing a little JavaScript snippet that I can simply copy-paste into the Firefox console and run. (I'm sorry, I'm still a massive noob; I want to write a little script that basically opens a certain web page and collects information from certain divs in it.)
However, I'm struggling at the moment. I would like to open a certain webpage and then, after it is entirely loaded, execute a certain function. (For simplifying reasons, this function just counts from 0 to 99.)
function openGroupsPage() {
    window.location.replace(groupURL);
    setTimeout(function() {
        for (i = 0; i < 100; i++) {
            console.log(i)
        }
    }, 10000)
}
openGroupsPage()
My problem is: the incrementing function never gets called (or at least it seems that way, because I never see any kind of output in the console). Why is my setTimeout not working, or what is another option to accomplish what I would like to do? I would just really like to run a specific function when the newly accessed website has finished loading entirely.
When you change the location, the window object and all of its associated things (including timers) are discarded and a new one created. You can't schedule code to run in the new document from within the old document (not even from the browser console). You'll have to paste and execute your code after navigating to the new page, not before, which means you can't navigate to it from within your code.
You might look at tools like TamperMonkey or GreaseMonkey and such that let you run code in response to pages loading that match certain URLs.
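For example, a minimal Tampermonkey/Greasemonkey-style userscript sketch that runs the counting loop once a matching page has loaded (the @match URL is a placeholder):
// ==UserScript==
// @name     Run after groups page loads
// @match    https://example.com/groups*
// @run-at   document-idle
// @grant    none
// ==/UserScript==

(function () {
    'use strict';
    // executes in the context of the new page, after it has finished loading
    for (var i = 0; i < 100; i++) {
        console.log(i);
    }
})();
You would navigate to the page manually (or via a bookmark); the userscript manager then injects and runs this code for you.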
window.location.replace() exits the current page and loads a new one, so any remaining JavaScript in the current page is no longer executed.
Your function works fine; the only problem is that the window.location line reloads the website with the URL you provided, so the entire page is loaded from scratch and your pending function is lost.
Try the code below to see this:
function openGroupsPage() {
    //window.location.replace('http://www.google.com');
    setTimeout(function() {
        for (i = 0; i < 100; i++) {
            console.log(i)
        }
    }, 1000)
}
openGroupsPage()
You could add an event listener to the document of the page which fires when the page is loaded.
document.addEventListener('DOMContentLoaded', function(e) {
    // page is loaded, do your fancy stuff in here
    // no timeout needed
});
EDIT: Overlooked that he wants to do it from the console on an arbitrary page. This won't work because, on a location change, all current scripts are stopped and global objects are destroyed.
Use something like Greasemonkey for that.

Why does the function only run once? Trying to run a function multiple times, after the previous invocation completes, using a counter function

I am currently working on a book with a page-turn effect in jQuery (no plugin). The page-turn effect works fine so far, as long as you click through the pages one by one. But now I want to include a dropdown selection (i.e. a select element) so the user can jump directly to the selected content. I tried to make this work with loops and with the .each() method, so that the turnRightPage/turnLeftPage function is called repeatedly until the page with the selected content is shown. But after quite a bit of trial and error and a lot of research, I think loops iterate too fast for my turnRightPage/turnLeftPage functions (which are the transform functions that turn the respective page), in that the loop finishes before the functions have completed. I think what I need to do is find a way to pause the loop until the function has finished executing and then resume with the next iteration. I think the most promising approach would be using a function with an iteration counter, like it was suggested here:
Javascript: wait for function in loop to finish executing before next iteration (Thanks to jfriend00 at this point) I have also read
Invoking a jQuery function after .each() has completed and
wait for each jQuery
among others, where similar solutions were suggested.
Below is how I tried to implement jfriend00's callback. I added a return statement to break out of that "callback loop", once the number of page turns is completed.
//determine whether to flip pages forward or back - first forward
if (currentPagePos < foundPagePos) { // => turn right page
    //determine how many times need to turn page
    if (pageDifference > 1 && pageDifference % 2 != 0) {
        var numPageTurns = (pageDifference - 1) / 2;
        pageForward(numPageTurns);
    } //else if ... rest omitted for brevity
}

function pageForward(numPageTurns) {
    var i = 0;
    function next() {
        i++;
        if (i <= numPageTurns) {
            turnRightPage();
        } else {
            return;
        }
    }
    next();
};
The full code can be seen here: http://jsfiddle.net/snshjyxr/1/
It DOES turn the page, but only once! What am I missing?
I am still very new to javascript / jQuery so my apologies, if the problem seems all too obvious. Any pointers appreciated. Thx!
The thing is, all the page turns are fired, but all at once. You have to wait until each transition is finished before starting the next one.
Use a callback function in your turnRightPage and turnLeftPage functions. Example for turnRightPage:
function turnRightPage(callback) {
    [...]
    //change class AFTER transition (frm. treehouse-site)
    $page.on('webkitTransitionEnd otransitionend oTransitionEnd msTransitionEnd transitionend', function () {
        //need to double-set z-index or else secondtime turning page open setting z-index does not work (tried in Chrome 38.0.2125.111 m)
        $page.css("z-index", turnedZindex + 1);
        $(".turned").removeClass("turned");
        $page.addClass("turned");
        if (typeof callback == "function") {
            callback();
        }
    });
};
And in your pageForward function, use turnRightPage recursively:
function pageForward(numPageTurns) {
    console.log("number of FORWARD page turns: " + numPageTurns);
    if (numPageTurns > 0) {
        turnRightPage(function() {
            pageForward(numPageTurns - 1);
        });
    }
};
Here is your updated jsfiddle. As you can see, there's a remaining bug when you make several page changes: you're adding listeners for the transition end every time a page is turned and never removing them, so they all execute on every turn.
EDIT: jsfiddle updated again without the annoying last bug. As you can see, all it took was to unbind the event listener as soon as it's fired.
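A sketch of that fix (not the exact fiddle code): unbind the transition-end handler inside the handler itself, so each call to turnRightPage leaves at most one listener behind.
var transitionEvents = 'webkitTransitionEnd otransitionend oTransitionEnd msTransitionEnd transitionend';

function turnRightPage(callback) {
    [...] // start the CSS transition as before

    $page.on(transitionEvents, function () {
        // unbind immediately so handlers don't pile up across page turns
        $page.off(transitionEvents);
        $page.css("z-index", turnedZindex + 1);
        $(".turned").removeClass("turned");
        $page.addClass("turned");
        if (typeof callback === "function") {
            callback();
        }
    });
}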

Is it possible to show an element just before entering a long running sync process?

This is a very simple use case: show an element (a loader), run some heavy calculations that tie up the thread, and hide the loader when done. I am unable to get the loader to actually show up before the long-running process starts. It ends up showing and hiding after the long-running process. Is adding CSS classes an async process?
See my jsbin here:
http://jsbin.com/voreximapewo/12/edit?html,css,js,output
To explain what a few others have pointed out: This is due to how the browser queues the things that it needs to do (i.e. run JS, respond to UI events, update/repaint how the page looks etc.). When a JS function runs, it prevents all those other things from happening until the function returns.
Take for example:
function work() {
    var arr = [];
    for (var i = 0; i < 10000; i++) {
        arr.push(i);
        arr.join(',');
    }
    document.getElementsByTagName('div')[0].innerHTML = "done";
}
document.getElementsByTagName('button')[0].onclick = function() {
    document.getElementsByTagName('div')[0].innerHTML = "thinking...";
    work();
};
(http://jsfiddle.net/7bpzuLmp/)
Clicking the button here will change the innerHTML of the div, and then call work, which should take a second or two. And although the div's innerHTML has changed, the browser doesn't get a chance to update how the actual page looks until the event handler has returned, which means waiting for work to finish. But by that time, the div's innerHTML has changed again, so when the browser does get a chance to repaint the page, it simply displays 'done' without ever displaying 'thinking...'.
We can, however, do this:
document.getElementsByTagName('button')[0].onclick = function() {
    document.getElementsByTagName('div')[0].innerHTML = "thinking...";
    setTimeout(work, 1);
};
(http://jsfiddle.net/7bpzuLmp/1/)
setTimeout works by putting a call to a given function at the back of the browser's queue after the given time has elapsed. The fact that it's placed at the back of the queue means that it'll be called after the browser has repainted the page (since the previous HTML changing statement would've queued up a repaint before setTimeout added work to the queue), and therefore the browser has had chance to display 'thinking...' before starting the time consuming work.
So, basically, use setTimeout.
Let the current frame render and start the process afterwards with setTimeout(..., 1).
Alternatively you could read a layout property to force a synchronous reflow, like this: element.clientWidth.
More of a "what is possible" answer: you can run your calculations on a separate thread using HTML5 Web Workers.
This will not only let your loading icon appear but also keep it animating.
More info about web workers : http://www.html5rocks.com/en/tutorials/workers/basics/
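A rough sketch of that pattern (the file name, loader element and message shape are made up, not taken from the original jsbin):
// main script: show the loader, hand the heavy work to a worker, hide the loader when done
var loader = document.getElementById('loader'); // hypothetical loader element
loader.classList.remove('hidden');

var worker = new Worker('heavy-work.js'); // hypothetical worker file
worker.onmessage = function (e) {
    console.log('result:', e.data);
    loader.classList.add('hidden'); // the UI (and the spinner) stayed responsive throughout
};
worker.postMessage({ iterations: 1e7 });

// heavy-work.js: runs on a separate thread, so it never blocks painting
self.onmessage = function (e) {
    var sum = 0;
    for (var i = 0; i < e.data.iterations; i++) {
        sum += i;
    }
    self.postMessage(sum);
};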

How to throttle "actions" (updates) to take on a changed source on a page

I searched a lot for a solution to this certainly-not-unique problem, but I have not found anything that will work in my context of an HTML page.
I have a text input that contains some kind of source code that generates something, and I can show a preview of that something on the same HTML page (by updating the background image, for example). Note that the source could be a LaTeX file, an email, a Java program, a ray-trace code, etc. The "action" that generates the preview has a certain cost to it, so I don't want to generate the preview on every modification to the source. But I'd like the preview to auto-update (the action to fire) without the user having to explicitly request it.
Another way to phrase the problem is to keep a source and sink synchronized with a certain reasonable frequency.
Here's my solution that's too greedy (updates at every change):
$('#source-text').keyup(function(){
    updatePreview(); // update on a change
});
I tried throttling this by using a timestamp:
$('#source-text').keyup(function(){
    if (nextTime "before" Now) {    // pseudocode
        updatePreview();            // update on a change
    } else {
        nextTime = Now + some delay // pseudocode
    }
});
It's better, but it can miss the last updates once a user stops typing in the source-text field.
I thought of a "polling loop" for updates that runs at some reasonable interval and looks for changes or a flag meaning an update is needed. But I wasn't sure if that's a good model for an HTML page (or even how to do it in javascript).
Use setTimeout, but store the reference so you can prevent it from executing if further editing has occurred. Basically, only update the preview once 5 seconds have passed since the last keystroke (at least in the example below).
// maintain out of the scope of the event
var to;
$('#source-text').on('keyup', function(){
    // if a timer exists, clear it and prevent it from occurring
    if (to) clearTimeout(to);
    // reassign it a new timeout that will expire (assuming the user doesn't
    // type before it's timed out)
    to = setTimeout(function(){
        updatePreview();
    }, 5e3 /* 5 seconds or whatever */);
});
References:
clearTimeout
setTimeout
And not to self-bump, but here's another [related] answer: How to trigger an event in input text after I stop typing/writing?
I tweaked #bradchristie's answer, which wasn't quite the behavior I wanted (only one update occurs after the user stops typing - I want them to occur during typing, but at a throttled rate).
Here's the solution (demo at http://jsfiddle.net/p4u2mhb9/3/):
// maintain out of the scope of the event
var to;
var updateCount = 0;
var timerInProgress = false;

$('#source-text').on('keyup', function () {
    // reassign a new timeout that will expire
    if (!timerInProgress) {
        timerInProgress = true;
        to = setTimeout(function () {
            timerInProgress = false;
            updatePreview();
        }, 1e3 /* 1 second */ );
    }
});
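The question also floated a polling-loop model; for completeness, a hedged sketch of that alternative (the interval length is arbitrary, and sourceDirty is a made-up flag name):
var sourceDirty = false;

$('#source-text').on('keyup', function () {
    sourceDirty = true; // just record that something changed
});

// poll at a reasonable rate and only do the expensive work when something changed
setInterval(function () {
    if (sourceDirty) {
        sourceDirty = false;
        updatePreview();
    }
}, 2000);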

Javascript : setTimeout and interface freezing

Context
I've got about 10 complex graphs which take 5 seconds each to refresh. If I loop over these 10 graphs, it takes about 50 seconds to refresh. During these 50 seconds, the user can move a scrollbar. If the scrollbar is moved, the refresh must stop, and when the scrollbar stops moving, the refresh occurs again.
I'm using the setTimeout function inside the loop to let the interface refresh.
The algorithm is:
render the first graph
setTimeout(render the second graph, 200)
when the second graph is rendered, render the third one in 200ms, and so on
The setTimeout allows us to catch the scrollbar event and to clearTimeout the next refresh, to avoid waiting 50 seconds before being able to move the scrollbar...
The problem is that it does not always work.
Take the simple following code (you can try it in this fiddle : http://jsfiddle.net/BwNca/5/) :
HTML :
<div id="test" style="width: 300px;height:300px; background-color: red;">
</div>
<input type="text" id="value" />
<input type="text" id="value2" />
Javascript :
var i = 0;
var j = 0;
var timeout;
var clicked = false;

// simulate the scrollbar update: each mouse move is equivalent to a scrollbar move
document.getElementById("test").onmousemove = function() {
    // ignore the first move (because onclick sends a mousemove event)
    if (clicked) {
        clicked = false;
        return;
    }
    document.getElementById("value").value = i++;
    clearTimeout(timeout);
}

// a click simulates the drawing of the graphs
document.getElementById("test").onclick = function() {
    // ignore multiple clicks
    if (clicked) return;
    complexAlgorithm(1000);
    clicked = true;
}

// simulate a complex algorithm which takes some time to execute (the graph drawing)
function complexAlgorithm(milliseconds) {
    var start = new Date().getTime();
    for (var i = 0; i < 1e7; i++) {
        if ((new Date().getTime() - start) > milliseconds) {
            break;
        }
    }
    document.getElementById("value2").value = j++;
    // launch the next graph drawing
    timeout = setTimeout(function() { complexAlgorithm(1000); }, 1);
}
The code does the following:
when you move your mouse over the red div, it updates a counter
when you click on the red div, it simulates a big processing job of 1 second (so it freezes the interface, because JavaScript is single-threaded)
after the freeze, it waits 1 ms, then re-simulates the processing, and so on until the mouse moves again
when the mouse moves again, it clears the timeout to avoid an infinite loop.
The problem
When you click once and then move the mouse during the freeze, I expected that the next code executed when a setTimeout fires would be the mousemove handler (which would cancel the timeout and the freeze), BUT sometimes the counter driven by the click gains 2 or more points instead of only 1 before the mousemove handler gets to run...
Conclusion of this test: setTimeout does not always yield so that other code (here, the mousemove handler) can run; sometimes it keeps the thread and executes the code inside the setTimeout callback before executing anything else.
The impact of this is that in our real example, the user may wait 10 seconds (2 graphs are rendered) instead of 5 seconds before being able to use the scrollbar. This is very annoying, and we need to avoid it and be sure that only one graph is rendered (and the others canceled) when the scrollbar is moved during a render phase.
How can I be sure to break the timeout when the mouse moves?
PS: in the simple example above, if you set the timeout to 200 ms everything runs perfectly, but that is not an acceptable solution (the real problem is more complex, and it still occurs with a 200 ms timer and a complex interface). Please do not offer "optimize the rendering of the graphs" as a solution; that is not the problem here.
EDIT : cernunnos has a better explanation of the problem :
Also, by "blocking" the process on your loop you are ensuring no event can be handled until that loop has finished, so any event will only be handled (and the timeout cleared) inbetween the execution of each loop (hence why you sometimes have to wait for 2 or more full executions before interrupting).
The problem is exactly contained in those words: I want to be sure to interrupt the execution when I want, not to wait for 2 or more full executions before interrupting.
Second EDIT :
In summary: take this jsfiddle: http://jsfiddle.net/BwNca/5/ (the code above).
Update this jsfiddle and provide a solution so that:
you move the mouse over the red div, then click and keep moving: the right counter must increase only once. But sometimes it increases 2 or 3 times before the first counter can run again... this is the problem; it must increase only once!
The BIG problem here is that setTimeout is unpredictable once it has started, especially when it is doing some heavy lifting.
You can see the demo here:
http://jsfiddle.net/wao20/C9WBg/
var secTmr = setTimeout(function(){
    $('#display').append('Timeout Cleared > ');
    clearTimeout(secTmr);
    // this will always be shown
    $('#display').append('I\'m still here! ');
}, 100);
There are two things you can do to minimize the impact on the browser performance.
Store all the setTimeout IDs, and loop through them when you want to stop:
var timers = [];

// When starting the worker thread
timers.push( setTimeout(function () { sleep(1000); }, 1) );

// When you try to clear
while (timers.length > 0) {
    clearTimeout(timers.pop());
}
Set a flag when you try to stop the process, and check that flag inside your worker function, just in case clearTimeout failed to stop the timer:
// Your flag
var STOPForTheLoveOfGod = false;

// When you try to stop
STOPForTheLoveOfGod = true;
while (timers.length > 0) {
    clearTimeout(timers.pop());
}

// Inside the for loop in the sleep function
function sleep(milliseconds) {
    var start = new Date().getTime();
    for (var i = 0; i < 1e7; i++) {
        if (STOPForTheLoveOfGod) {
            break;
        }
        // ...
    }
}
You can try out this new script.
http://jsfiddle.net/wao20/7PPpS/4/
I may have misunderstood the problem, but assuming you are trying to block the interface after a click for a minimum of 1 second and unblock it by moving the mouse (after that 1-second minimum):
This is not a good implementation of sleep, as you are keeping the process running the whole time (doing nothing != sleeping), which results in a waste of resources.
Why not create an overlay (a semi/fully transparent div), put it on top of the rest of the interface (position fixed, full width and full height) and use it to prevent any interaction with the underlying interface. Then destroy it when the conditions are right (a second has passed and the user moved the mouse).
This behaves more like a sleep (it has some initial processing time but then releases the processor for a given amount of time) and should help you achieve the behavior you need (assuming I understood it correctly).
It has the added bonus of allowing you to give the user some visual cue that some processing is being done.
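A hedged sketch of that overlay idea (the id, styling, and helper names are illustrative only):
function blockInterface() {
    var overlay = document.createElement('div');
    overlay.id = 'busy-overlay'; // hypothetical id
    // cover the whole viewport and swallow interaction with the UI underneath
    overlay.style.cssText =
        'position:fixed;top:0;left:0;width:100%;height:100%;' +
        'background:rgba(255,255,255,0.5);z-index:9999;';
    document.body.appendChild(overlay);
    return overlay;
}

function unblockInterface(overlay) {
    overlay.parentNode.removeChild(overlay);
}

// usage: block before rendering, unblock once the conditions are met
var overlay = blockInterface();
// ... render graphs ...
// later, when a second has passed and the mouse has moved:
// unblockInterface(overlay);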
Edit:
Also, by "blocking" the process on your loop you are ensuring no event can be handled until that loop has finished, so any event will only be handled (and the timeout cleared) inbetween the execution of each loop (hence why you sometimes have to wait for 2 or more full executions before interrupting).
Surprisingly enough, you have not figured out that after you call setTimeout() you can add a check: if a flag variable is true, discard the wait. You can set that flag from the scrollbar's scroll handler. Note that the scroll handler fires many times while the user drags the bar, which would trigger many of those 5-second executions; to fix this, add a 1-second wait so it doesn't repeat excessively. You're welcome :)
Any long-running function is going to tie up your browser window. Consider moving your complexAlgorithm() out of your main JavaScript code using Web Workers.
The answer is in your question
...the refresh must stop and when the scrollbar stops to move, the
refresh occurs again.
You should write complexAlgorithm in such a way that you can break it almost instantly in the middle (as soon as you know you will have to re-run).
So the main code should look something like:
stopAllRefresh; //should instantly(or after completing small chunk) stop refresh
setTimeout(startRefresh, 100);
and render the graph in small chunks (each taking < 1 sec) inside setTimeout, like:
var curentGraph = 0;
var curentChunk = 0;
function renderGraphChunk() {
    if (needToBreak) { // check if we should break rendering
        return;
    }
    // Render chunk here
    render(curentGraph, curentChunk);
    curentChunk += 1;
    setTimeout(renderGraphChunk, 1);
}
This is just an idea sketch; the real implementation can be completely different.
What you want to do cannot be done without a Web Worker, which is only implemented in some of the latest browsers, notably Chrome.
Otherwise, you have to break your algorithm into a queue, just like jQuery UI puts each subsequent animation calculation in a queue. http://api.jquery.com/jQuery.queue/
It is a simple queue, and the next instruction set is queued with the help of setTimeout.
for (i = 0; i < 1000; i++) {
    process(i);
}
Can be translated to
function queue(s, n, f) {
    this.i = s;
    this.n = n;
    this.f = f;
    this.step = function () {
        if (this.i < this.n) {
            this.f(this.i);
            this.i = this.i + 1;
            var t = this;
            setTimeout(function () { t.step(); }, 5);
        }
    }
    this.step();
}

queue(0, 1000, function (i) {
    process(i);
});
This is just an example of how a synchronous for loop can be rewritten to execute the same logic asynchronously using smaller, independent iterations.
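If you'd rather lean on jQuery's built-in queue instead of rolling your own, a hedged sketch of the same idea (renderGraph and the 200 ms pause are placeholders):
var $q = $({}); // plain object wrapped by jQuery, used purely as a queue holder

for (var g = 0; g < 10; g++) {
    (function (graphIndex) {
        $q.queue('graphs', function (next) {
            renderGraph(graphIndex); // hypothetical per-graph render
            setTimeout(next, 200);   // yield to the UI before the next graph
        });
    })(g);
}

$q.dequeue('graphs'); // start processing the queue

// when the scrollbar moves, drop whatever is still pending:
// $q.clearQueue('graphs');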
Try checking out Web Workers. I think they will be useful in this situation.
http://en.wikipedia.org/wiki/Web_worker
http://www.html5rocks.com/en/tutorials/workers/basics/
