I have a series of OData calls which I make from a JavaScript file (myfile.js) using Promises, as shown below. The main entry point is the function MakePreferredCustomer(AccountNo). This function is called on load of a web page, Page1.htm, which also loads myfile.js. This series of OData calls (there are more than the 5 shown below) takes around 180-200 seconds to complete. Since it is async, the page load of Page1.htm is not affected, and the user does not notice that these calls are happening in the background (which is a requirement: this must happen without asking the user). But the problem is: what happens if the user switches to some other page before the 180-200 seconds are up? I am not sure whether the async tasks keep running in the background or stop partway through (depending on when the user moves away from Page1.htm).
Is this behaviour well defined, or does it change depending on the browser being used or some other external criteria? Please guide.
function MakePreferredCustomer(AccountNo)
{
    GetAccountDataFromServer() //This internally does "return Common.MakeWebApiRequest"
    .then(function HandleAccountDataResponse(request) {
        //Parse the response using JSON.parse() and get the account no
        //Make a WebApiRequest to get more details using the account no
        return Common.MakeWebApiRequest("GET", uri, additionalAccountData, headers);
    })
    .then(function HandleAdditionalAccountDetails(request) {
        //Parse the response using JSON.parse() and get additional account details
        //Store these details in variables in this function's scope
        //Make an OData call to get product details
        return Common.MakeWebApiRequest("GET", uri, Productdata, headers);
    })
    .then(function HandleProductDetails(request) {
        //Parse the response using JSON.parse() and get product details
        //Check if this account has previously purchased this product
        //Make an OData call to get the number of times this product must be purchased to become a preferred customer
        return Common.MakeWebApiRequest("GET", uri, PolicyData, headers);
    })
    .then(function HandlePolicyDetails(request) {
        //Parse the response using JSON.parse() and get policy details
        //Compare how many times the product must be bought with how many times this account has bought it
        //If the condition is met, update the preferred-customer details with this account via the OData call below
        return Common.MakeWebApiRequest("POST", uri, ThisAccountIsNewPreferredCustomerData, headers);
    })
    .catch(function HandleException(e) {
    });
}
Each of the functions calls a common function, Common.MakeWebApiRequest:
Common.MakeWebApiRequest = function (action, uri, data, headers)
{
    //Do basic checks on input arguments
    return new Promise(function (resolve, reject)
    {
        var request = new XMLHttpRequest();
        request.open(action, encodeURI(uri), true); //async call
        //Set OData-specific headers using request.setRequestHeader
        request.onreadystatechange = function ()
        {
            if (this.readyState === 4)
            {
                //Resolve or reject based on the HTTP status
                if (this.status >= 200 && this.status < 300) {
                    resolve(this);
                } else {
                    reject(this);
                }
            }
        };
        if ("POST" === action && data === "")
        {
            request.send();
        }
        else if ("GET" === action)
        {
            request.send();
        }
        else
        {
            request.send(JSON.stringify(data));
        }
    });
};
When the user navigates away, the XMLHttpRequests will be aborted (or at least ignored on the client side), and the load handlers will be dropped and never executed. So no, JavaScript does not continue to run in the background.
The async requests, once initiated, will continue. As well, the promise handlers will also get invoked and run when the dependent Promise fulfills (or is rejected).
If these processes are independent of the content on Page1.htm, then you should be okay. If there is a dependency (for example, one of your promise handlers kicks off and relies on data available within Page1), then you will have to move those dependencies into the handlers or preserve what you need through closures.
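As a minimal sketch of the closure approach, you can snapshot every page-dependent value before the chain starts, so the handlers never read the live page (here `getAccountData` is an illustrative stub standing in for the real `Common.MakeWebApiRequest` chain):

```javascript
// Stub standing in for the real Common.MakeWebApiRequest chain
// (an assumption so the sketch is self-contained).
function getAccountData(accountNo) {
  return Promise.resolve({ id: accountNo, name: 'ACME' });
}

function makePreferredCustomer(accountNo) {
  // Snapshot every page-dependent value *before* the async chain starts;
  // the handlers close over this copy instead of reading the live page.
  var snapshot = { accountNo: accountNo };

  return getAccountData(snapshot.accountNo).then(function (account) {
    // Only `snapshot` and the resolved value are used here, so nothing
    // breaks if the page content has changed in the meantime.
    return { accountNo: snapshot.accountNo, account: account };
  });
}
```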
Related
I have a function that should only continue after an AJAX call has completed, but I want to be able to skip that AJAX call if I already have the data from the last session in my localStorage.
My current code:
$.when(getAddresses()).done(function (data) {
    addresses = data.data;
    localStorage['addresses'] = JSON.stringify(addresses);
    // Rest of the code that should be executed after the AJAX call
});
Thanks in advance!
Do it the other way around.
Check for the data locally and don't even send the request if you already have it.
Wrap the data in a promise so it will always have the same API no matter where you fetch it from.
async function get_data() {
let addresses = localStorage.getItem('addresses');
if (addresses) {
return JSON.parse(addresses);
}
let ajaxData = await getAddresses();
addresses = ajaxData.data;
localStorage.setItem('addresses', JSON.stringify(addresses));
return addresses;
}
get_data().then(data => {
// Rest of the code that should be executed after the AJAX call
});
Another approach would be to forget about localStorage and just have the web service set suitable caching headers. Then you make the HTTP request as normal, but if the cache information shows that the browser cache contains up-to-date data, no request actually goes over the network.
You don't need to reinvent local caching of data. HTTP has it baked in.
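As a sketch of that approach, the address endpoint's response would carry a caching header along these lines (the one-hour max-age is an illustrative assumption, not a recommendation):

```
HTTP/1.1 200 OK
Content-Type: application/json
Cache-Control: max-age=3600
```

With this in place, any repeat GET to the same URL within the hour is answered from the browser cache, and the JavaScript code needs no caching logic at all.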
I am working on a scenario where I need to block a third-party library's requests based on some condition, and then unblock them when the condition evaluates to false. Let us assume that the third-party library is loaded from the following URL:
https://cdn-thirdparty.com/...
Once it is loaded, it captures user clicks on the application and sends data to another URL:
https://cdn-info-thirdpart.com/...
Now, say I want to block all requests to URLs which contain "thirdparty" in them. How do I achieve this in vanilla JavaScript?
P.S.: I do not have access to remove the library from the code; instead I have to engineer it so that requests get blocked based on some condition (we can assume any) and then unblocked when the condition is false.
The code I tried in order to intercept all XMLHttpRequests is below. I do get the URL and method of the call, but I now need to block it and then unblock it.
let oldXHROpen = window.XMLHttpRequest.prototype.open;
window.XMLHttpRequest.prototype.open = function(method, url, async, user, password) {
// do something with the method, url and etc.
this.addEventListener('load', function() {
// do something with the response text
console.log('load: ' + url, method);
});
return oldXHROpen.apply(this, arguments);
}
Source of above code: https://medium.com/#gilfink/quick-tip-creating-an-xmlhttprequest-interceptor-1da23cf90b76
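Building on that interceptor, one way to block and later unblock matching requests is to also wrap `send()` and abort anything whose URL matches a pattern, gated by a flag. This is a sketch, not a definitive solution: the `/thirdparty/` pattern, the `blockingEnabled` flag, and the `_interceptedUrl` property name are all assumptions.

```javascript
// Pure predicate so the blocking policy can be toggled and tested in
// isolation. Pattern and flag are assumptions for this sketch.
var blockingEnabled = true;

function shouldBlock(url) {
  return blockingEnabled && /thirdparty/.test(String(url));
}

// Patch XMLHttpRequest only when running in a browser.
if (typeof window !== 'undefined' && window.XMLHttpRequest) {
  var origOpen = window.XMLHttpRequest.prototype.open;
  var origSend = window.XMLHttpRequest.prototype.send;

  window.XMLHttpRequest.prototype.open = function (method, url) {
    this._interceptedUrl = url; // remember the target for the send() check
    return origOpen.apply(this, arguments);
  };

  window.XMLHttpRequest.prototype.send = function () {
    if (shouldBlock(this._interceptedUrl)) {
      this.abort(); // never reaches the network; listeners see an abort event
      return;
    }
    return origSend.apply(this, arguments);
  };
}
```

Flipping `blockingEnabled` to false lets subsequent requests through again without re-patching anything.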
I'm writing some JavaScript/AJAX code.
Is there any way to ensure that the server receives the XML requests in the order that they are sent?
If not with plain Ajax, do I get this guarantee if I send everything over a single WebSocket?
Thanks!
If it is of utmost importance that they're received in the proper order, and attaching an iterating id to the form isn't enough:
msg_number = 1; sendAJAX(msg_number); msg_number++;
Then I'd suggest building your own queue-system, and send each subsequent file as the callback of the previous one.
Rather than each element having its own AJAX-access, create one centralized spot in your application to handle that.
Your different AJAX-enabled sections don't even need to know that it is a queue:
AJAX.send({ url : "......", method : "post", success : function () {}, synchronous : true });
On the other side of that, you could have something like:
AJAX.send = function (obj) {
if (obj.synchronous) {
addToSyncQueue(obj); checkQueue();
} else { fireRequest(); }
};
Inside of your sync queue, all you'd need to do is wrap a new function around the old callback:
callback = (function (old_cb) {
return function (response) {
checkQueue();
old_cb(response);
};
}(obj.success));
obj.success = callback;
AJAX.call(obj);
Inside of checkQueue, you'd just need to see if it was empty, and if it wasn't, use
nextObj = queue.shift(); (if you're .push()-ing objects onto the queue -- so first-in, first-out, like you wanted).
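Putting the pieces above together, a minimal self-contained sketch of the queue could look like this (here `fireRequest` is a stub that completes asynchronously, standing in for the real AJAX call):

```javascript
// Minimal sketch of the centralized queue described above.
var AJAX = {
  queue: [],
  busy: false,

  send: function (obj) {
    if (obj.synchronous) {
      this.queue.push(obj);
      this.checkQueue();
    } else {
      this.fireRequest(obj);
    }
  },

  checkQueue: function () {
    if (this.busy || this.queue.length === 0) { return; }
    this.busy = true;
    var obj = this.queue.shift();          // first-in, first-out
    var self = this;
    var old_cb = obj.success;
    obj.success = function (response) {    // wrap the old callback
      self.busy = false;
      self.checkQueue();                   // kick off the next queued request
      if (old_cb) { old_cb(response); }
    };
    this.fireRequest(obj);
  },

  fireRequest: function (obj) {
    // Real code would issue the XMLHttpRequest here; this stub just
    // completes asynchronously so the sketch is runnable on its own.
    setTimeout(function () { obj.success('ok:' + obj.url); }, 0);
  }
};
```

Because each request is fired only from the previous one's wrapped callback, the server sees them strictly in submission order.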
A couple of options come to mind:
Send them synchronously, by waiting for a successful response from the server after each XML request is received (i.e. make a queue).
If you know the number of requests you'll be sending beforehand, send the request number as a tag with each request, e.g. <requestNum>1</requestNum><numRequests>5</numRequests>. This doesn't guarantee the order that they're received in, but guarantees that they can be put back in order afterwards, and has the added benefit of being sure that you have all the data.
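The second option can be sketched as a small tagging helper (the function and tag layout are illustrative; only the `<requestNum>`/`<numRequests>` tag names come from the description above):

```javascript
// Prefix each XML payload with its position and the total count,
// so the server can restore the original order after receipt.
function tagRequests(payloads) {
  return payloads.map(function (xml, i) {
    return '<requestNum>' + (i + 1) + '</requestNum>' +
           '<numRequests>' + payloads.length + '</numRequests>' +
           xml;
  });
}
```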
At my company we use this little ajaxQueue plugin, written by one of the core jQuery contributors:
http://gnarf.net/2011/06/21/jquery-ajaxqueue/
I have a background script that is responsible for getting and setting data to a localStorage database. My content scripts must communicate with the background script to send and receive data.
Right now I send a JSON object to a function; the object contains the command and the data. So if I'm trying to add an object to the database, I'll create JSON that has a command attribute of addObject and another object that is the data. Once this is completed, the background script sends a response back stating that it was successful.
Another use case of the function would be to ask for data in which case it would send an object back rather than a success/fail.
The code gets kind of hacky once I start trying to retrieve the returned object from the background script.
It seems like there is probably a simple design pattern to follow here that I'm not familiar with. Some people have suggested future/promise design patterns, but I haven't found a very good example.
Content Script
function sendCommand(cmdJson){
chrome.extension.sendRequest(cmdJson, function(response){
//figure out what to do with response
});
}
Background script
if (request.command == "addObject"){
db[request.id]= JSON.stringify(request.data);
sendResponse("success");
}
else if(request.command == "getKeystroke"){
var keystroke = db[request.id];
sendResponse(keystroke);
}
Your system looks OK; here are some minor improvements.
For each remote command send back the same type of object (with possibly empty fields):
var response = {
success: true, // or false
data: {},
errors: [],
callback: ''
}
Also, if you have multiple different commands which send back data, you may replace if-else with an object lookup:
var commands = {
addObject: function () { /* ... */ },
getKeystroke: function (request, response) {
response.data = db[request.id]
}
}
Then if you have any data to respond with, just add it to the object. And send the same object for any command:
var fn = commands[request.command]
fn(request, response)
As for figuring out what to do with response, I'd pass a callback into the sendCommand function and let the content scripts request and process the response data as they see fit:
function sendCommand(cmdJson, callback){
chrome.extension.sendRequest(cmdJson, callback)
}
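Putting the uniform response object and the lookup-table dispatch together, a runnable sketch of the background-script side might look like this (the `db` variable and command set are illustrative stand-ins for the real extension code):

```javascript
// Illustrative in-memory store standing in for the real localStorage db.
var db = {};

// Every command gets back the same response shape, possibly with empty fields.
function makeResponse() {
  return { success: true, data: {}, errors: [], callback: '' };
}

// Object lookup replaces the if-else chain.
var commands = {
  addObject: function (request, response) {
    db[request.id] = JSON.stringify(request.data);
  },
  getKeystroke: function (request, response) {
    response.data = db[request.id];
  }
};

function handleRequest(request, sendResponse) {
  var response = makeResponse();
  var fn = commands[request.command];
  if (fn) {
    fn(request, response);
  } else {
    response.success = false;
    response.errors.push('unknown command: ' + request.command);
  }
  sendResponse(response); // same object shape for every command
}
```

In the real extension, `handleRequest` would be wired up as the `chrome.extension.onRequest` listener, and the content script's callback would inspect `response.success` and `response.data`.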
We've all seen some examples in AJAX tutorials where some data is sent. They all (more or less) look like:
var http = createRequestObject(); // shared between printResult() and doAjax()
function createRequestObject() { /* if FF/Safari/Chrome/IE ... */ ... }
function printResult()
{
if (http.readyState == 4) { ... }
}
function doAjax() {
var request = 'SomeURL';
http.open('post', request);
http.onreadystatechange = printResult;
data = ...; // fill in the data
http.send(data);
}
// trigger doAjax() from HTML code, by pressing some button
Here is the scenario I don't completely understand: what if the button is pressed several times very quickly? Should doAjax() somehow re-initialize the http object? And if the object is re-initialized, what happens to the requests that are already in flight?
Since AJAX is asynchronous in nature, each button click raises an async event that GETs/POSTs some data from/to the server. You provide one callback, so it will be triggered as many times as the server finishes processing data.
This is the default behaviour; you should not re-initialize the http object. If you want to prevent multiple send operations, you have to do that manually (e.g. by disabling the button when the first call is made).
I also suggest using jQuery's $.ajax because it encapsulates many of these details.
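The "disable while in flight" guard mentioned above can be sketched as a small wrapper; `doRequest` and the function names here are illustrative, not from the original code:

```javascript
// Wrap a request-starting function so only one request can be in
// flight at a time; extra clicks are ignored until it completes.
function makeSingleFlight(doRequest) {
  var inFlight = false;
  return function handleClick() {
    if (inFlight) { return false; }        // ignore the extra click
    inFlight = true;
    doRequest(function done() {            // called when the request finishes
      inFlight = false;
    });
    return true;
  };
}
```

In a page you would attach the returned function as the button's click handler (and typically also toggle the button's `disabled` attribute inside it for visual feedback).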
Sure, numerous libraries exist nowadays that do a decent job and should be used in a production environment. However, my question was about the under-the-hood details. So here I've found the lambda-calculus-like way to have a dedicated request object per request. That object is passed to the callback function which is called when the response arrives, etc.:
function printResult(http) {
if (http.readyState == 4) { ... }
...
}
function doAjax() {
var http = createRequestObject();
var request = 'SomeURL';
http.open('get', request);
http.onreadystatechange = function() { printResult(http); };
http.send(null);
return false;
}
Successfully tested under Chrome and IE9.
I've used a per-page request queue to deal with this scenario (to suppress duplicate requests and to ensure the sequential order of requests), but there may be a more standardized solution.
Since this is not provided by default, you would need to implement it in JavaScript within your page (or a linked script). Instead of starting an Ajax request, clicking a button would add a request to a queue. If the queue is empty, execute the Ajax request, with a callback that removes the queued entry and executes the next (if any).
See also: How to implement an ajax request queue using jQuery