Ajax server side setup and teardown - javascript

I have the following situation:
Server-side, I have a relational database, and I am running some computationally intensive requests against it. Each request requires a resource-intensive setup task. Many of my requests share this same setup step, which I don't want to repeat, and that argues for a single call. However, each request also takes a long time, which argues for issuing them asynchronously as several calls, so that the user gets results as they become available. I could try to just "keep the setup step around" on the server, but I don't really want the server to guess when the client is done, and it can't rely on the client to tell it when to clean up.
Here's what I would LIKE to happen:
- Client-side, I gather up the requests that I want to make: A, B, and C. They all require the same setup step.
- So that I don't have to repeat the setup step, I issue one ajax call covering A, B, and C.
- So that the user doesn't have to wait forever for results, I return the answers for A, B, and C asynchronously, handing them back to the client as they become available.
Something like:
$.ajax({
    type: "GET",
    url: "/?A=1&B=2&C=3",
    // hypothetical option: would fire once per completed sub-task
    partialSuccess: function (data) {
        if (data.which == "A") {
            doStuffForTaskA();
        }
    },
    success: function (data) {
        console.log("all tasks complete!");
    }
});
I'm pretty sure the sort of thing I have in the code above is not possible. Any thoughts on the best way to accomplish what I'm trying to do? (I am also the author of the server-side code. It happens to be c sharp, but I'm somewhat more interested in this as a 'which protocol, and how does it work' question)
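For what it's worth, one protocol that fits this "one request, many partial answers" shape is Server-Sent Events: the server performs the setup once, keeps a single HTTP response open, and writes an event as each task finishes. The sketch below is only a hedged illustration; the /run endpoint and the event names are assumptions, not anything from the question.

// A sketch, assuming a hypothetical SSE endpoint that performs the setup
// once and then streams one "result" event per completed task.
var source = new EventSource("/run?A=1&B=2&C=3");

// Fires once per task, as each result becomes available.
source.addEventListener("result", function (event) {
    var result = JSON.parse(event.data); // e.g. { which: "A", value: ... }
    if (result.which === "A") {
        doStuffForTaskA(result.value);
    }
});

// The server signals overall completion with a final "done" event.
source.addEventListener("done", function () {
    console.log("all tasks complete!");
    source.close(); // otherwise the browser will reconnect automatically
});

On the server side this maps to a single long-lived response with Content-Type: text/event-stream, which keeps the expensive setup alive for exactly as long as the tasks need it and tears it down when the stream closes.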


NodeJS Returning data to client browser

I think we need some help here. Thanks in advance.
I have been programming desktop applications in .NET and used Timer objects to wait for a task to complete before the task results are shown in a data grid. Recently, we switched over to NodeJS and find it pretty interesting. We could design a small application that executes some tasks using PowerShell scripts and returns the data to the client browser. However, I would have to run a Timer in the client browser (started when someone clicks a button) to check whether the file the Timer fetches from the server contains "ENDOFDATA" or not. Once the Timer sees ENDOFDATA, it triggers another function to populate a DIV with the data that was received from the server.
Is this the right way to get the data from a server? We really don't want to block the event loop. We run PowerShell scripts on NodeJS to collect users from Active Directory and then send the data back to the client browser. The PowerShell scripts are executed as a Job so the event loop is not blocked.
Here is an example of the code on NodeJS:
In the code below, can we insert something that won't block the event loop but will still respond to the client once the task is completed? As you can see, we would like to send the ADUsers.CSV file to the client browser once GetUsers.PS1 has finished executing. Since GetUsers.PS1 takes about five minutes to complete, the event loop is blocked and the server can no longer accept any other requests.
app.post("/LoadDomUsers", (request, response) => {
    // Check whether the request is an AJAX one and prefers JSON
    if (request.xhr || request.accepts("json, html") === "json") {
        var ThisAD = request.body.ThisAD;
        console.log(ThisAD);

        // Queue the PowerShell script (ScriptPara is defined elsewhere)
        ps.addCommand("./public/ps/GetUsers.PS1", [{
            name: 'AllParaNow',
            value: ScriptPara
        }]);

        // Read the CSV the script produced and echo it back
        ps.addCommand(`$rc = gc ` + __dirname + "/public/TestData/AD/ADUsers.CSV");
        ps.addCommand(`$rc`);

        ps.invoke().then((output) => {
            response.send({ message: output });
            console.log(output);
        });
    }
});
Thank you.
The way you describe your problem isn't that clear; I had to read some of the comments on your initial question just to be sure I understood the issue. Honestly, you could just utilize one of the various CSV NPM packages to read and write the CSV files your Active Directory scripts produce.
I/O is non-blocking in NodeJS, so you're not actually blocking the event loop. NodeJS can handle multiple I/O requests at once: it hands each operation off to its thread pool and continues executing on the main thread; when an operation completes, its callback is pushed onto the call stack and execution resumes from there. Once you have the I/O data, you just send it back to the client through the response object. There should be no timers needed.
So is the issue that once the PowerShell script runs, you have to wait for that initial script to complete before being able to handle pending requests? I'm still a bit unclear...
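If the blocking really does come from how the script is invoked, one option is to hand the script to a child process, which by construction cannot block the Node event loop. This is only a sketch under assumptions: it presumes powershell is on the PATH and it ignores the AllParaNow parameter from the original code.

// A sketch, not the original code: run the PowerShell script in a child
// process so Node keeps serving other requests while it executes.
const { execFile } = require("child_process");
const path = require("path");

app.post("/LoadDomUsers", (request, response) => {
    const script = path.join(__dirname, "public/ps/GetUsers.PS1");

    // execFile is asynchronous: the callback fires only when the script
    // exits, and the event loop stays free for the five minutes in between.
    execFile("powershell", ["-File", script], (error, stdout) => {
        if (error) {
            return response.status(500).send({ message: error.message });
        }
        response.send({ message: stdout }); // whatever the script wrote to stdout
    });
});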

Several asynchronous requests from JavaScript to PHP/Apache not being asynchronous

This behavior was not present all the time; it appeared out of nowhere about a month ago and then disappeared just as suddenly. The problem is that I can't identify what happened, and I have no server debugging tools because it only occurs in production.
Roughly 100 ajax requests are triggered at (nearly) the same time using a loop like:
let url = "example.com/";
var methods = ["method1", "method2", "method3", "method4"]; // Roughly 100

$.each(methods, function (index, value) {
    $.ajax({
        url: url + value,
        method: "POST",
        data: { params: "whatever", otherParams: "whatever" }
    }).done(function (data) {
        console.log(data);
    });
});
On the server side (Apache + PHP) there are selects, updates, and inserts against a relational database. Apache handles each request on its own thread or process as it listens.
When I watch the network console, all the requests start at (roughly) the same time, but here is the problem: the responses come back strictly one after another. If request 1 starts at 0 and takes 5 seconds, request 2 starts at 5, and request 3 starts only when request 2 has finished. Every browser shows the same behavior.
The best explanation I can think of is that the database is locking some table when it performs an update or insert. Some tables are huge and, without indexes, queries against them could take a long time. Yet the staging environment points to the same database and works perfectly asynchronously. So what is going on? Is it possible that PHP or Apache could get stuck this way for some reason? I also had the crazier idea that it's some problem with writing log files in the OS (Debian), but I have no idea how that would work. I would be glad for any suggestion; maybe I could then reproduce the problem in a controlled environment and do something to prevent it from happening again.
Some additional information: the API has two clients, one in Angular and the other in JavaScript+PHP. The behavior is exactly the same with both clients.

How to propagate the ajax.abort() to the controller

I have a problem with propagating the aborting of my ajax call to the controller.
I have a JavaScript function in my View code that may be called by a user at any point. It transmits a value to the controller via ajax. The controller then performs a time-consuming operation on the input and returns a result.
What I want is that, when the user calls the function while the time-consuming operation is already running, one of two things happens:
It stops and starts again with the new input. In essence, I need to propagate the abort call up to my controller code and deal with it accordingly,
OR
I need to be able to run multiple simultaneous instances of the controller function.
Is this possible? And what is the best way to do it?
View Code
var AJAXSetPalette = null;

function DoSomething(Input) {
    // Abort any request that is still in flight before starting a new one
    if (AJAXSetPalette)
        AJAXSetPalette.abort();

    AJAXSetPalette = $.ajax({
        type: "POST",
        url: "ImagesAnalysis/DoSomething",
        dataType: "json",
        traditional: true,
        data: Input,
        success: function (Data) {
            DoJSFunction(Data);
        }
    });
}
Controller
public int DoSomething(int Input)
{
    int RetVal = 0;
    // Calculate RetVal from Input, very time consuming
    return RetVal;
}
This is a client-server issue. In HTTP, the client opens a connection to the server, sends its request, and waits for the response to come back over that same connection; once the exchange completes, the connection can be closed. The server treats each request as an independent exchange.
Note: As of HTTP 1.1, connections are usually kept alive and reused for subsequent requests, but merely to avoid the delay of re-establishing them. In principle, both the client and the server still behave as if each request/response exchange stands on its own.
The point is that once your AJAX request is sent, the server is merrily on its way processing the request. If the client should abort the request, there's no notification given to the server. When the server attempts to send the response, it will simply be refused, and the server will disregard it and move on to the next request.
That's how the TCP/IP and HTTP protocols were designed to behave, and it's what makes the Internet possible as a loosely connected network of nodes that can drop off or come online at will.
Long and short, there's no way to cancel the request on the server-side from the client once it's been sent.
For your scenario, the best thing would be simply to disable the user's ability to issue another request until the server has responded or some timeout period has elapsed. If the request is this resource intensive and it can be called as many times and as fast as anyone wants, that's a huge opportunity for a DoS attack.
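As a sketch of that mitigation (the button selector, input field, and markup are assumptions, not the asker's actual code):

// Disable the trigger until the server answers, so only one
// expensive request can be in flight at a time.
$("#analyzeButton").on("click", function () {
    var $btn = $(this).prop("disabled", true);

    $.post("ImagesAnalysis/DoSomething", { input: $("#paletteInput").val() })
        .done(function (data) { DoJSFunction(data); })
        .always(function () {
            $btn.prop("disabled", false); // allow the next request only now
        });
});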

What is the right way to handle asynchronous behavior on Meteor's client side?

I'm using Meteor specifically.
I'd like to make a call to a Facebook API (using Meteor's HTTP) to display pictures on Meteor's client side. I've seen the use of Fiber Futures, the storage of data in Sessions, and using the client to invoke a synchronous server call, but I'm not sure what is currently the best way or if other methods are now obsolete.
This is a common use case and it's acceptably solved; no need to think too deeply.
Make the HTTP request and use the result in its callback. If you get a URL back, save it to the Session. Make a template depend on that Session variable, and it will automatically re-render once the callback updates the Session.
http://docs.meteor.com/#http
HTTP.call("POST", "http://api.twitter.com/xyz",
    { data: { some: "json", stuff: 1 } },
    function (error, result) {
        if (result.statusCode === 200) {
            Session.set("twizzled", true);
        }
    });
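The other half of the pattern is the template side. A minimal sketch, assuming a hypothetical "photos" template; Session.get is reactive, so the template re-renders whenever the HTTP callback above calls Session.set:

// Hypothetical template helper: re-runs automatically whenever
// the "twizzled" Session variable changes.
Template.photos.helpers({
    twizzled: function () {
        return Session.get("twizzled");
    }
});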
To the rabid downvoters: This is a brand new technology, rapidly changing, and just getting started. It is not hard to find info from a few months back that is inaccurate. Let's be a little less quick to criticize newbies, until the flow of questions becomes harder to handle. Just skip a question quicker if it bores you.

UI unresponsive during AJAX calls

I have a dashboard screen that needs to make about 20 AJAX requests on load, each returning a different statistic. In total, it takes about 10 seconds for all the requests to come back, and during those 10 seconds the UI is pretty much locked.
I recall reading a JS book by Nick Zakas that described techniques for maintaining UI responsiveness during intensive operations (using timers). I'm wondering if there is a similar technique for dealing with my situation?
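For reference, the technique from that book is roughly this (a from-memory sketch, not the book's exact code): split a long-running loop into slices and yield back to the event loop between slices with setTimeout.

// Process a large array in small slices so the UI thread can
// repaint and handle input between slices.
function processInChunks(items, processItem, done) {
    var i = 0;
    (function chunk() {
        var end = Math.min(i + 50, items.length); // 50 items per slice is arbitrary
        for (; i < end; i++) {
            processItem(items[i]);
        }
        if (i < items.length) {
            setTimeout(chunk, 0); // yield, then continue with the next slice
        } else if (done) {
            done();
        }
    })();
}

That said, plain asynchronous $.ajax calls shouldn't lock the UI by themselves, so the freeze here may well come from what the success handlers do to the DOM, which the answers below address.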
*I'm trying to avoid combining the AJAX calls for a number of reasons
$(".report").each(function(){
var container = $(this)
var stat = $(this).attr('id')
var cache = db.getItem(stat)
if(cache != null && cacheOn)
{
container.find(".value").html(cache)
}
else
{
$.ajax({
url: "/admin/" + stat,
cache: false,
success: function(value){
container.find(".value").html(value.stat)
db.setItem(stat, value.stat);
db.setItem("lastUpdate", new Date().getTime())
}
});
}
})
If you have access to jQuery, you can utilize the $.Deferred object to make multiple async calls simultaneously and perform a callback when they all resolve.
http://api.jquery.com/category/deferred-object/
http://api.jquery.com/deferred.promise/
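A minimal sketch of that approach, using stat ids and URLs shaped like the ones in the question (the ids themselves are hypothetical):

// Fire all the requests in parallel, then run one callback when
// every request has resolved.
var stats = ["visits", "signups", "errors"]; // hypothetical stat ids
var requests = $.map(stats, function (stat) {
    return $.ajax({ url: "/admin/" + stat, cache: false });
});

$.when.apply($, requests).done(function () {
    // Each argument is one request's [data, statusText, jqXHR] triple.
    $.each(arguments, function (i, args) {
        console.log(stats[i], args[0]);
    });
});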
If each of these callbacks is making modifications to the DOM, you should accumulate the changes in some temporary location (such as in-memory DOM objects) and then append them all at once, since DOM manipulation calls are very time consuming.
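For example (a sketch; the results array and the #dashboard container are assumptions):

// Build the new nodes off-document, then attach them in one operation,
// paying for a single reflow instead of one per result.
var fragment = document.createDocumentFragment();

results.forEach(function (result) {
    var row = document.createElement("div");
    row.textContent = result.stat;
    fragment.appendChild(row); // cheap: the fragment is not in the live DOM
});

document.getElementById("dashboard").appendChild(fragment);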
I've had similar problems working heavily with SharePoint web services - you often need to pull data from multiple sources to generate input for a single process.
To solve it I embedded this kind of functionality into my AJAX abstraction library. You can easily define a request which will trigger a set of handlers when complete. However each request can be defined with multiple http calls. Here's the component (and detailed documentation):
DPAJAX at DepressedPress.com
This simple example creates one request with three calls and then passes that information, in the call order, to a single handler:
// The handler function
function AddUp(Nums) { alert(Nums[1] + Nums[2] + Nums[3]) };
// Create the pool
myPool = DP_AJAX.createPool();
// Create the request
myRequest = DP_AJAX.createRequest(AddUp);
// Add the calls to the request
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [5,10]);
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [4,6]);
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [7,13]);
// Add the request to the pool
myPool.addRequest(myRequest);
Note that, unlike many of the other solutions offered, this method does not force single-threading of the calls being made: each will still run as quickly (or as slowly) as the environment allows, but the single handler will only be called when all are complete. It also supports setting timeout values and retry attempts if your service is a little flaky.
In your case you could make a single request (or group related requests - for example a quick "most needed" request and a longer-running "nice to have" request) to call all your data and display it all at the same time (or in chunks if multiple requests) when complete. You can also specifically set the number of background objects/threads to utilize which might help with your performance issues.
I've found it insanely useful (and incredibly simple to understand from a code perspective). No more chaining, no more counting calls and saving output. Just "set it and forget it".
Oh - concerning your lockups - are you, by any chance, testing this on a local development platform (running the requests against a server on the same machine as the browser)? If so it may simply be that the machine itself is working on your requests and not at all indicative of an actual browser issue.
