jQuery $.each() problem - javascript

I'm making a WordPress plugin, and I have a function that imports images. It works with a $.each() loop that calls .load() on every iteration. The page the load function requests downloads the image and returns a number, which is inserted into a span element. The source and destination arrays are read from the LI elements of hidden ULs.
This way the user sees a counter counting from zero up to the total number of images being imported. You can see my jQuery code below:
jQuery(document).ready(function($) {
    $('#mrc_imp_img').click(function(){
        var dstA = [];
        var srcA = [];
        $("#mrc_dst li").each(function() { dstA.push($(this).text()) });
        $("#mrc_src li").each(function() { srcA.push($(this).text()) });
        $.each(srcA, function (i,v) {
            $('#mrc_imgimport span.fc').load('/wp-content/plugins/myplugin/imp.php?num='+i+'&dst='+dstA[i]+'&src='+srcA[i]);
        });
    });
});
This works pretty well, but sometimes it looks like the load function isn't updating the DOM as fast as it should: sometimes the number the span is updated with is lower than the previous one, and almost every time a lower number replaces the last number at the end. How can I prevent this from happening, and how can I make it hide '#mrc_imp_img' when the $.each loop is done?

AJAX calls that are started earlier are not guaranteed to finish earlier, so a smaller number can overwrite a bigger one. One solution is to simply increment the counter on each successful call:
jQuery(function($) {
    $('#mrc_imp_img').click(function(){
        var dstList = $("#mrc_dst li");
        var srcList = $("#mrc_src li");
        dstList.each(function(i) {
            var dst = $(this).text();
            var src = srcList.eq(i).text(); // .eq(i) keeps it a jQuery object; srcList[i] would be a bare DOM node without .text()
            $.post('/wp-content/plugins/myplugin/imp.php?num='+i+'&dst='+dst+'&src='+src, function() {
                var counter = $('#mrc_imgimport span.fc');
                counter.text(parseInt(counter.text(), 10) + 1); // parse first; .text() returns a string and + would concatenate
            });
        });
    });
});
(I changed the code to avoid unnecessary array operations, changed the onready call to use the shorthand form, and changed the AJAX call to use POST, which should be used for operations that change state.)
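To also hide '#mrc_imp_img' once every import has finished, one option is to count completed calls. Here is a minimal sketch of the click handler's body, building on the code above (the selectors come from the question; the rest is illustrative):

var done = 0;
dstList.each(function(i) {
    var dst = $(this).text();
    var src = srcList.eq(i).text();
    $.post('/wp-content/plugins/myplugin/imp.php?num='+i+'&dst='+dst+'&src='+src, function() {
        done++;
        $('#mrc_imgimport span.fc').text(done); // the counter is simply the number of finished calls
        if (done === dstList.length) {
            $('#mrc_imp_img').hide(); // every import has completed
        }
    });
});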

Most servers have a finite number of worker threads. If you fire off ten calls at once and your server only has five threads, five of them are liable to fail.
Also, once you max out all the running threads, no other users can access the server, so you're essentially DoS-ing the server.
If you don't mind slowing it down to one call at a time, do what Tgr recommended and serialize the calls, waiting until each one completes before starting the next one, as in the sketch below.
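A minimal sketch of that serialization, assuming the same imp.php endpoint and the srcA/dstA arrays from the question (everything else is illustrative):

function importNext(i) {
    if (i >= srcA.length) {
        $('#mrc_imp_img').hide(); // nothing left to import
        return;
    }
    $.post('/wp-content/plugins/myplugin/imp.php?num='+i+'&dst='+dstA[i]+'&src='+srcA[i], function() {
        $('#mrc_imgimport span.fc').text(i + 1); // strictly sequential, so the counter can never go backwards
        importNext(i + 1);                       // start the next call only after this one has finished
    });
}
importNext(0);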
I would prefer what Yoda suggested: turn it into one server call that processes the entire array. If you really want to update a counter client-side, that one server call can update a counter in the database, and a second AJAX call can poll the server every few seconds to find out where the counter is. It obviously won't be guaranteed to be sequential, but it will be better for your server's health. You could also fake the sequential aspect (if you're on #3 and the next poll yields #6, increment it client-side one by one). A rough sketch of the polling variant follows.
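Here, import-all.php and progress.php are assumed endpoints for illustration, not something from the question (srcA is the source array from the question):

// kick off one server call that imports everything and keeps a counter in the database
$.post('/wp-content/plugins/myplugin/import-all.php');

// poll the assumed progress endpoint every few seconds
var total = srcA.length; // number of images, known up front
var poll = setInterval(function() {
    $.get('/wp-content/plugins/myplugin/progress.php', function(count) {
        $('#mrc_imgimport span.fc').text(count);
        if (parseInt(count, 10) >= total) {
            clearInterval(poll); // all done, stop polling
        }
    });
}, 3000);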
As far as not seeing an alert goes, there is probably a JavaScript error before or on the alert line. Try using Firebug and console.log, or even better, step through it with the Firebug debugger.

Related

Refreshing UI parts on each loop of an AJAX sync call

Okay, it might be a simple question, but so far I haven't found anything helpful by searching, so I am giving it a try here.
I am using plain old JavaScript/jQuery on ASP.NET Core on a project I am working on.
I am currently performing some actions on some employees in a foreach loop.
For each employee I am calling an API synchronously via AJAX.
What I want is for the UI to update, showing the current employee being processed in a progress bar.
While debugging, the process seems to work fine, but during a normal run the UI thread is not updated until all the work has been done. As soon as I start processing the employees, the screen freezes and then closes after the work has finished; no progress bar is shown.
I only managed to show the progress animation and the first employee using the trick below:
$('#applyYes').click(function (e) {
    e.preventDefault();
    var year = $('#yearCombo').val();
    $('#applyInfo').hide();
    $('#applyLoad').show();
    $('#applyAction').prop('innerText', 'Calculating...');
    setTimeout(function () {
        var employeeIDs = multipleEmployees_Array();
        for (var i = 1; i <= employeeIDs.length; i++) {
            employeeID = employeeIDs[i - 1];
            applyAmount(employeeID, year); //1. Updates progress bar 2. Ajax sync call
        }
    }, 0);
})
As far as I understand, wrapping the loop in setTimeout moves the processing to after the UI thread gets a chance to repaint, which is why the first update shows.
What is the optimal way of achieving what I want?
PS: No external libraries if possible. I am working on a third-party platform, which makes it difficult to add external libraries (policies, permissions, etc.).
Synchronous HTTP requests are deprecated for precisely this reason. Don't use them; use asynchronous requests instead.
If you want to run the calls sequentially, then either:
Trigger request i+1 in request i's success handler, or
Use a Promise-based API and await each result, as sketched below
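A minimal sketch of the await approach; multipleEmployees_Array comes from the question, while updateProgressBar and the /api/applyAmount route are assumed placeholders:

$('#applyYes').click(async function (e) {
    e.preventDefault();
    var year = $('#yearCombo').val();
    var employeeIDs = multipleEmployees_Array();
    for (var i = 0; i < employeeIDs.length; i++) {
        updateProgressBar(i + 1, employeeIDs.length); // the browser can repaint while we await below
        // hypothetical async endpoint; substitute the real API route
        await fetch('/api/applyAmount?employeeID=' + employeeIDs[i] + '&year=' + year);
    }
    $('#applyAction').prop('innerText', 'Done');
});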

Executing a Function from a separate page

I'm not entirely sure that this is possible, but here's what I'm looking at doing.
I have a list of buttons, that when pressed, modify a database that they are attached to, (for example, clicking button 1 would add 10 points to the score of player 1)
What I am looking to do in tandem, is to call a javascript function that lives on a separate page, a sort of a visual confirmation that the points were added to player 1's account.
Why am I organizing it like this?
I am running a Twitch channel where I am building a series of web views to display stats and information. I wish to control WHAT the audience sees from one page, and have the results display on a different page.
Here's what I have so far:
HTML (admin page):
<button class="addPoints" data-id="player1" data-name="Player 1">Player 1</button>
JS (admin page):
$(".addPoints").on('click', function(){
var $id = $(this).data('id');
var $name = $(this).data('name');
//AJAX MAGIC TO INSERT POINTS INTO DATABASE SO I CAN KEEP SCORE//
tallyPopup($id, $name);
});
HTML (display page):
<div class="displayScreen"></div>
JS (display page):
function tallyPopup(member, name){
    $('.displayScreen').append("<div class='tallyPopup' id='" + member + "'><div class='portrait'></div><span class='name'>" + name + "</span><span class='score'>+10</span></div>");
    $('.tallyPopup').animate({
        opacity: 1
    }, 3000, function(){
        $(this).remove();
    });
}
I know what I have does not connect the two pages, because I haven't the first clue on how to do that, if it's even possible. OR, is there a way for the Display HTML to check if the database has been updated and THEN run the tallyPopup function?
Thanks in advance.
You cannot call a function on another client (including your own clients) running your website.
To continuously check for points on the display page, use var intv = setInterval(function () {}, nMilliseconds) to repeatedly run a function until you call clearInterval(intv). You might never clear it, since you may want this to run forever, but perhaps only once every minute (nMilliseconds = 60000):
setInterval(function () { $.get('/point-count').then(tallyPopup) }, 60000)
tallyPopup would receive the data argument from the AJAX response.
Of course, on the admin side you must fill in that line to update the amount of points via AJAX, by PUT, POST, or PATCH; I would use PATCH just as a matter of semantics.
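A rough sketch of that admin-side call; the /points/... URL and the payload shape are made up for illustration:

$(".addPoints").on('click', function(){
    var $this = $(this); // cache the jQuery object instead of re-wrapping `this`
    $.ajax({
        url: '/points/' + $this.data('id'),
        method: 'PATCH',        // semantically a partial update of the player record
        data: { delta: 10 }
    }).then(function(){
        // no tallyPopup() here; the display page discovers the change when it polls
    });
});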
Also consider storing the result of $(this) once (var $this = $(this)) instead of calling it multiple times, use promises, and use CSS animations instead of $.animate (they perform much better). Consider making the element invisible and then visible again (perhaps moving it off screen while invisible) instead of using $.remove (also a performance improvement).

Angular Timeout Queue

I have an Angular service that makes a Web API call to retrieve my search results. The problem is that the Angular controller and UI are set up in a way that allows the search to be called multiple times per second, causing service calls to queue up. I tried resolving/deferring the HTTP call when a new one comes in, but that doesn't seem like the best solution. I would rather collect all the search calls made within a certain time period and then only execute the last one. Any ideas on how I could do that?
$timeout(function(){
    var length = queue.length;
    var item = queue[length - 1]; // take only the newest request
    queue.splice(0, length);      // and discard the rest
    processItem(item);
}, <yourtime:number>)
Keep adding your requests to the queue, and put the actual processing logic in the processItem function.
This might do what you need.
*note - please treat this as pseudo code; it might have compilation errors
Alternatively, you can create a boolean variable that is checked every time a request is about to be made, and only make the request while it's true. Something like this:
function processItem(item){
    if(process){
        process = false;
        //YOUR ACTUAL PROCESSING CODE
    }
}
$timeout(function(){
    process = true;
}, <yourtime in milli seconds>)
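What the question describes is essentially a debounce. A minimal sketch of that with AngularJS's $timeout (assuming $timeout is injected; doSearchRequest is a hypothetical function that calls the Web API):

var pending = null;

function search(query) {
    if (pending) {
        $timeout.cancel(pending);   // a newer search arrived: drop the queued one
    }
    pending = $timeout(function() {
        pending = null;
        doSearchRequest(query);     // only the last call within 300 ms actually runs
    }, 300);
}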

How to do nested looping over many pages in CasperJS

I don't have a clue where to start with this. Basically, I need CasperJS to run through about 15 different pages. For each page, it needs to get the data for 150 different locations that have to be set as cookie values, and for each location I need to check the data for 5 different dates.
Any one of these seems pretty straight forward, but trying to get all three to happen is confusing me.
I tried to set it up this way:
for(Iterate through URLs){
    for(Iterate through locations){
        for(Iterate through dates){
            phantom.addCookie({
                // Cookie data here based on location and date
            });
            casper.start(url)
                .then(function(){
                    // Do some stuff here
                })
                .run();
        }
    }
}
Essentially, it loops through everything and then only loads the page for the last link, at the last location, on the last date; every other location gets skipped. Is there an easier way to do this? Better yet, is there a way to tell my JavaScript loop to wait for Casper to finish what it needs to do before jumping to the next loop iteration?
I'm happy to provide more details if needed. I tried to simplify the process as best I can without cutting out needed info.
That's pretty much it. Two things to look out for:
casper.start() and casper.run() should only be called once per script. You can use casper.thenOpen() to open different URLs.
Keep in mind that all casper.then*() and casper.wait*() functions are asynchronous step functions and are only scheduled for execution after the current step. Since JavaScript has function-level scope, you need to "fix" the iteration variables for each iteration; otherwise you will only get the last URL. (More information)
Example code:
casper.start(); // deliberately empty

for (var i = 0; i < urls.length; i++) {
    for (var j = 0; j < locations.length; j++) {
        for (var k = 0; k < dates.length; k++) {
            (function(url, location, date){
                casper.then(function(){
                    phantom.addCookie({
                        // Cookie data here based on location and date
                    });
                }).thenOpen(url)
                .then(function(){
                    // Do some stuff here
                });
            })(urls[i], locations[j], dates[k]); // the IIFE fixes the loop variables for this step
        }
    }
}

casper.run(); // start all the scheduled steps
If you use Array.prototype.forEach instead of the for-loop, you can safely skip the IIFE that fixes the variables, as in the sketch below.
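A sketch of the same scheduling with forEach; each callback gets its own scope, so no IIFE is needed:

casper.start();
urls.forEach(function(url){
    locations.forEach(function(location){
        dates.forEach(function(date){
            casper.then(function(){
                phantom.addCookie({ /* cookie data based on location and date */ });
            }).thenOpen(url)
            .then(function(){
                // Do some stuff here
            });
        });
    });
});
casper.run();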
I'm not sure, but you may need to open a page for a domain before adding a cookie for it. PhantomJS may only accept a cookie when a page of that cookie's domain is currently open.

Getting large AJAX responses in Chrome without crashing

I am working on a project aimed at the Chrome browser. The goal we would like to accomplish is to get a one-million-record array into the browser to work with the data. A test file I generated with a million records was a bit more than one gigabyte.
For reasons I will explain, I believe we can accomplish the goal if we can get the browser to collect the garbage when necessary. I believe the browser holds on to the text of the AJAX responses when it doesn't need to, and crashes for that reason.
Now, I can generate a million records within the browser and manipulate them as I need to. However, I have trouble sending the AJAX to the browser without crashing it.
Since sending one million at once crashes it, I tried sending batches of one hundred thousand. I can get two such batches across and parse the JSON. If I do not set an onreadystatechange handler on my AJAX call, I can make the call a number of times. Also, once I have received a hundred thousand records, I can loop over them ten times to build the full array.
Because I seem to be able to hold one million records, I believe that, as I said, holding on to the response texts is what overwhelms the browser.
In order to get better memory management, I have pushed the AJAX requests and parsing into a web worker. When the web worker gets the AJAX response and builds the hundred-thousand-record array, it pushes it to the DOM thread. When the DOM thread has taken the data, it has the web worker make another AJAX call.
However, it still crashes.
I am open to using websockets or something else, if that would help somehow.
Here is the code in the DOM thread:
var iterations = 3;
var url = 'hunthou.json';
var worker = new Worker('src/worker.js');
var count = 0;
var bigArr = []; // holds one chunk of records per worker message

worker.addEventListener('message', function(e){
    alert('count: ' + count);
    //bigArr = bigArr.concat(e.data);
    console.log('e.data length: ' + e.data.length);
    bigArr[count] = e.data;
    console.log('bigArr length: ' + bigArr.length);
    if(count < (iterations - 1)){
        worker.postMessage(url); // ask the worker for the next chunk
    } else {
        alert('done');
        console.log('done');
        worker.terminate();
        console.log('bye');
    }
    count++;
});
worker.postMessage(url);
Here is the webworker:
var arr = [];
var request = new XMLHttpRequest();
request.onreadystatechange = function () {
    var DONE = this.DONE || 4;
    if (this.readyState === DONE){
        arr = JSON.parse(request.responseText);
        self.postMessage(arr);
        arr.length = 0;
        request.responseText.length = 0; // note: responseText is a read-only getter, so this assignment has no effect
        console.log('okay');
    }
};
self.addEventListener('message', function(e) {
    var url = e.data;
    console.log('url: ' + url);
    request.open("GET", '../' + url, true);
    request.send(null);
}, false);
Instead of requesting all of the data at once, create multiple requests that each retrieve a chunk of the data. Keeping each individual response small will prevent your browser from crashing.
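A minimal sketch of such chunking inside the worker; the offset/limit query parameters are an assumption about the server, not something from the question:

// worker.js: fetch the records in slices and hand each slice to the page
var CHUNK = 10000;
var TOTAL = 1000000;

self.addEventListener('message', function(e) {
    var url = e.data;
    var offset = 0;

    function next() {
        var request = new XMLHttpRequest();
        request.onload = function () {
            self.postMessage(JSON.parse(request.responseText)); // one small array at a time
            offset += CHUNK;
            if (offset < TOTAL) {
                next(); // the previous request object can now be garbage collected
            }
        };
        // assumed server-side paging parameters:
        request.open('GET', '../' + url + '?offset=' + offset + '&limit=' + CHUNK, true);
        request.send(null);
    }
    next();
});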
