Several asynchronous requests from JavaScript to a PHP/Apache backend are not behaving asynchronously - javascript

This behavior was not always present; it appeared out of nowhere about a month ago and then disappeared just as suddenly. The problem is that I can't identify what happened, and I have no server debugging tools because it only takes place in production.
Roughly 100 AJAX requests are triggered at the same time using a loop like:
let url = "example.com/";
var methods = ["method1", "method2", "method3", "method4"]; // roughly 100 in practice

$.each(methods, function(index, value) {
    $.ajax({
        url: url + value,
        method: "POST",
        data: { params: "whatever", otherParams: "whatever" }
    }).done(function(data) {
        console.log(data);
    });
});
On the server side (Apache + PHP) there are selects, updates and inserts against a relational database. Each request runs in an individual thread, since Apache is listening for each connection separately.
When I look at the network console, all requests start at (roughly) the same time, but here is the problem: the responses come back one after another, each only after the previous one finishes. If request 1 starts at 0 and takes 5 seconds, request 2 completes at 5, and request 3 only completes once request 2 has finished. Every browser shows the same behavior.
The best logical explanation I have come up with is that the database is locking some table when it performs an update or insert. Some tables are huge and, without indexes, could take a long time. However, the staging environment points to the same database and works perfectly asynchronously, so what is going on? Is it possible that PHP or Apache could get stuck this way for some reason? I also had the crazier idea that it is some write-contention problem with log files in the OS (Debian), but I have no idea how that works. I would be glad for any suggestion. Maybe I could reproduce the problem in a controlled environment and do something to prevent it from happening again.
Some additional information: the API has two clients, one in Angular, the other in JavaScript + PHP. The behavior is exactly the same with both clients.
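For what it's worth, one well-known PHP behavior that produces exactly this waterfall is session locking: session_start() takes an exclusive lock on the session file, so concurrent requests from the same browser session are served one at a time until each script finishes or calls session_write_close(). Whether or not that applies here, a client-side timing sketch like the one below can at least confirm that the serialization happens on the server; the method names are the ones from the question, everything else is illustrative:

// Stamp each response so the waterfall can be read from the console alone.
var methods = ["method1", "method2", "method3"]; // same list as above
var t0 = performance.now();

$.each(methods, function(index, value) {
    $.ajax({
        url: "example.com/" + value,
        method: "POST",
        data: { params: "whatever" }
    }).done(function() {
        // If the finish times step up in fixed increments (5s, 10s, 15s, ...)
        // while all requests start at ~0, the server is serializing them.
        console.log(value + " finished at +" +
            Math.round(performance.now() - t0) + "ms");
    });
});

If the per-request times printed here sum to the total wall-clock time, the contention is server-side (sessions, database locks, or a saturated worker pool), not in the browser.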

NodeJS Returning data to client browser

I think we need some help here. Thanks in advance.
I have been programming in .NET for desktop applications and have used Timer objects to wait for a task to complete before the task results are shown in a data grid. Recently we switched over to NodeJS and find it pretty interesting. We were able to design a small application that executes some tasks using PowerShell scripts and returns the data to the client browser. However, I have to run a Timer on the client browser (when someone clicks a button) to check whether the file the Timer receives from the server contains "ENDOFDATA" or not. Once the Timer sees ENDOFDATA, it triggers another function to populate a DIV with the data that was received from the server.
Is this the right way to get the data from a server? We really don't want to block the EventLoop. We run PowerShell scripts on NodeJS to collect users from Active Directory and then send the data back to the client browser. The PowerShell scripts are executed as a Job so that the EventLoop is not blocked.
Here is an example of the code on the NodeJS side:
In the code below, can we insert something that won't block the EventLoop but will still respond to the client once the task is completed? As you can see, we would like to send the ADUsers.CSV file to the client browser once GetUsers.PS1 has finished executing. Since GetUsers.PS1 takes about five minutes to complete, the EventLoop is blocked and the server can no longer accept any other requests.
app.post("/LoadDomUsers", (request, response) => {
    // check whether the request is an AJAX one and accepts JSON
    if (request.xhr || request.accepts("json, html") === "json") {
        var ThisAD = request.body.ThisAD;
        console.log(ThisAD);
        ps.addCommand("./public/ps/GetUsers.PS1", [{
            name: "AllParaNow",
            value: ScriptPara
        }]);
        // read back the CSV the script produced and echo it to the client
        ps.addCommand(`$rc = gc ` + __dirname + "/public/TestData/AD/ADUsers.CSV");
        ps.addCommand(`$rc`);
        ps.invoke().then((output) => {
            response.send({ message: output });
            console.log(output);
        });
    }
});
Thank you.
The way you describe your problem isn't that clear; I had to read some of the comments on your initial question just to be sure I understood the issue. Honestly, you could just use one of the various CSV NPM packages to read and write your Active Directory data with NodeJS.
I/O is non-blocking in NodeJS, so you're not actually blocking the EventLoop. You can handle multiple I/O requests at once: NodeJS hands each one off (to the OS, or to libuv's worker pool), continues executing on the main thread, and when an operation completes its callback is pushed onto the call stack and run with the resulting data. After you get the I/O data, you just send it back to the client through the response object. There should be no timers needed.
So is the issue that once the PowerShell script runs, you have to wait for that initial script to complete before you're able to handle pending requests? I'm still a bit unclear...
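If it turns out that invoking the script really is pinning the event loop, one option is to launch PowerShell as a separate OS process rather than running it in-process. Here is a minimal sketch of the same Express route using Node's built-in child_process module; it assumes a Windows host with powershell.exe on the PATH and reuses the script and CSV paths from the question:

const { execFile } = require("child_process");
const path = require("path");

app.post("/LoadDomUsers", (request, response) => {
    // PowerShell runs in its own process, so the Node event loop stays
    // free to serve other requests during the ~5 minutes the script needs.
    execFile("powershell.exe",
        ["-NoProfile", "-File", "./public/ps/GetUsers.PS1"],
        { maxBuffer: 1024 * 1024 * 64 }, // allow a large CSV on stdout
        (err, stdout, stderr) => {
            if (err) {
                return response.status(500).send({ error: stderr || err.message });
            }
            // once the script is done, send back the CSV it produced
            response.sendFile(path.join(__dirname, "public/TestData/AD/ADUsers.CSV"));
        });
});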

Ajax server side setup and teardown

I have the following situation:
Server-side, I have a relational database and am running some rather computationally intensive requests against it. Each request first needs a resource-intensive setup task. Many of my requests require this same setup step, which I don't want to repeat, and that pushes me toward a single call. However, each request also takes a lot of time, which pushes me toward issuing them asynchronously as several calls, so that the user gets results as they become available. I could try to just "keep the setup step around" on the server, but I don't really want the server to guess when the client is done, and it can't rely on the client to tell it when to clean up.
Here's what I would LIKE to happen:
- Client-side, I gather up the requests that I want to make: A, B and C. They all require the same setup step.
- So that I don't have to repeat the setup step, I issue one AJAX call for A, B and C simultaneously.
- So that the user doesn't have to wait forever for results, I return answers for A, B and C asynchronously, handing them back to the client as the results become available.
Something like:
$.ajax({
    type: "GET",
    url: "/?A=1&B=2&C=3",
    partialSuccess: function (data) {   // hypothetical option, not real jQuery
        if (data.which == "A") {
            doStuffForTaskA();
        }
    },
    success: function (data) {
        console.log("all tasks complete!");
    }
});
I'm pretty sure the sort of thing I have in the code above is not possible. Any thoughts on the best way to accomplish what I'm trying to do? (I am also the author of the server-side code. It happens to be C#, but I'm more interested in this as a "which protocol, and how does it work" question.)
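One protocol-level answer, sketched here under assumptions: do the expensive setup once in its own call, have the server hand back an identifier for the cached setup, run A, B and C as parallel requests that reference that identifier, and tear the setup down explicitly once the last one settles. The /setup, /task and /teardown endpoints below are hypothetical names, not an existing API:

// Hypothetical protocol: the server caches the setup under setup.id.
$.post("/setup").then(function (setup) {
    var names = ["A", "B", "C"];
    var pending = names.length;

    names.forEach(function (name) {
        $.getJSON("/task", { setupId: setup.id, task: name })
            .done(function (data) {
                // each result is handled as soon as it arrives
                console.log("task " + name + " done", data);
            })
            .always(function () {
                // success or failure, tear down after the last task settles
                if (--pending === 0) {
                    $.post("/teardown", { setupId: setup.id });
                }
            });
    });
});

Alternatives that avoid the explicit teardown call are to stream all three answers back over one chunked response or Server-Sent Events, or simply to have the server expire cached setups after a timeout so a lost client can't leak them.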

AJAX queries failing for exactly 60 seconds per time

I have a JavaScript function that runs every 5 seconds and requests information from the same server via a jQuery AJAX call. The function runs indefinitely once the page is loaded.
For some reason the AJAX query fails about once every minute or two, showing
ERR_EMPTY_RESPONSE
in the console. The odd thing is that it fails for exactly 60 seconds, then works fine again for another minute or two.
So far I've tried, with no success:
- A different browser
- A different internet connection
- Changing the polling interval (it still fails for 60 seconds' worth of calls at a time: polling every 10 seconds it fails 6 in a row, every 5 seconds 12 in a row, every 60 seconds once)
- Web searches, which suggested flushing the IP settings on my computer
I never had any problems on my last server, which was a VPS. I'm now running this off shared hosting with GoDaddy and wonder if there's a problem at that end. Other sites and AJAX calls to the same server keep working fine during the downtimes, though.
I also used to run the site over HTTPS; now it's over plain HTTP only. Not sure if that's relevant.
Here's the guts of the function:
var interval = null;

function checkOrders() {
    interval = window.setInterval(function () {
        $.ajax({
            type: "POST",
            dataType: "json",
            url: "http://www.chipshop.co.nz/ajax/check_orders.php",
            data: { shopid: 699 },
            error: function (errorData) {
                // handle error
            },
            success: function (data) {
                // handle success
            }
        });
    }, 5000); // repeat until switched off, polling every 5 seconds
}
Solved: it turned out the problem was with GoDaddy hosting. Too many POST requests resulted in a 60-second "ban" from accessing that file. Changing to GET avoided this.
This page contains the answer, from user emrys57:
For me, the problem was caused by the hosting company (GoDaddy) treating POST operations which had substantial response data (anything more than tens of kilobytes) as some sort of security threat. If more than 6 of these occurred in one minute, the host refused to execute the PHP code that responded to the POST request during the next minute. I'm not entirely sure what the host did instead, but I did see, with tcpdump, a TCP reset packet coming as the response to a POST request from the browser. This caused the HTTP status code returned in a jqXHR object to be 0.
Changing the operations from POST to GET fixed the problem. It's not clear why GoDaddy imposes this limit, but changing the code was easier than changing the host.
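Applied to the polling function above, the fix is essentially a one-line change; the sketch below also enables jQuery's cache-buster, since unlike POSTs, GET responses may be cached:

$.ajax({
    type: "GET",   // was "POST"; the host's limit only affected POSTs
    dataType: "json",
    cache: false,  // appends a timestamp parameter so the GET isn't cached
    url: "http://www.chipshop.co.nz/ajax/check_orders.php",
    data: { shopid: 699 },
    error: function (errorData) {
        // handle error
    },
    success: function (data) {
        // handle success
    }
});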

Is data cached on server or client or not at all in AngularJS when an error occurs in a promise?

Keep in mind I'm running an old version of AngularJS (1.0?), so things may have changed, but I have code that looks like this:
promise = $http.get(urlFormattedString).success(function (data) {
    data.forEach(function (result) {
        // do something with result and $scope
    });
});
promises.push(promise);

$q.all(promises).then(function (data) {
    // do something good when everything works!
});
When no errors are thrown, everything "works", but my question is: what happens when one of the promises throws an error (say 1 out of 20)? Let's make this more interesting (and closer to my application) and assume that each promise is requesting data from a database (MongoDB in my case).
If I have to re-run everything, does that necessarily mean all the data needs to be fetched again? Am I relying on the database to cache the data so the repeated requests run much faster? Or maybe the server (NodeJS in my case) caches the data? Along these lines, when is the data actually sent to the client from the server? Only upon success of all promises, or is it returned to the client by each request separately? And if so, does Angular do the caching?
Just laying this out makes me realize it's pretty complex, with lots of scenarios to consider. I would appreciate any help or pointers to reading/documentation on this subject. Thanks!
Suggest you familiarize yourself with the network tab of your browser's dev tools.
You can see every request made for all resources, from HTML to images to scripts, as well as AJAX requests. This will give you a far better feel for how the app works in general.
As for when the data travels: each $http request is returned to the client separately, as soon as that request completes; $q.all doesn't batch the network traffic, it just waits until all the promises have settled.
As for errors: unless you implement error handling, your app will simply fail for that request with no indication given to the user that anything went wrong.
As for caching: your AJAX requests won't be cached client-side, and unless you have implemented caching mechanisms on the server they won't be cached there either.
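If client-side caching is actually wanted, $http does support it through the cache config flag, and a per-request error handler keeps a single failure from rejecting the whole $q.all batch. A minimal sketch, reusing urlFormattedString and promises from the question (the rest is illustrative):

// With { cache: true }, a repeated GET to the same URL is answered from
// AngularJS's in-memory cache instead of hitting the server again.
var promise = $http.get(urlFormattedString, { cache: true }).then(
    function (response) {
        return response.data; // success: pass the data through
    },
    function (err) {
        return null;          // failure: swallow it so $q.all still resolves
    }
);
promises.push(promise);

$q.all(promises).then(function (results) {
    // results holds data for the successes and null for the failures,
    // so one bad request out of twenty no longer rejects the whole batch
});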

Display number of messages dynamically using JavaScript or ASP.NET

I will do my best to explain my problem, to avoid people pointing me in different directions.
I got an assignment from the business people, and I don't know the terminology for it. It is very similar to email notifications, message notifications on Facebook, or notifications in social media games.
For example, people sent 20 email messages 5 minutes ago, so the screen displays 20 (see attachment). Now 3 more messages have arrived, and the web page should update the number to 23.
Facebook has a similar concept: when friends like or comment on a post, the notification count changes. The same is true of social media games; any status change in the game is reflected on the page.
I have some idea of how to do it cosmetically (in CSS), but how do I do it using JavaScript/ASP.NET? Do I need a postback in order to refresh the number? I've never paid attention to how Facebook/Yahoo mail/social media games do it; all I know is that something happens and the web page updates the status.
Sorry that the question is so broad. If someone can point me in the right direction, I'd appreciate any help.
HTML5 introduced a very interesting concept called Server-Sent Events (SSE), in which the server can push data to the client over a long-lived connection.
E.g.:
var source = new EventSource("demo_sse.asp");
source.onmessage = function(event) {
    document.getElementById("result").innerHTML = event.data + "<br>";
};
And on the server side you can write:
<%
Response.ContentType = "text/event-stream"
Response.Expires = -1
' The "data:" prefix is required by the SSE format, and every
' message must be terminated by a blank line.
Response.Write("data: The server time is: " & now() & vbLf & vbLf)
Response.Flush()
%>
However, some old browsers may not support this.
One other way to accomplish this task is to use an AJAX call:
function checkNewMessages(totalMessages) {
    return $.ajax({
        url: "demo.asp",
        type: "GET",
        cache: false,
        data: {
            totalMessages: totalMessages
        }
    });
}

// Check for new messages every 5 seconds
setInterval(function () {
    checkNewMessages(totalMessages).success(function (data) {
        // display the data wherever you want
    });
}, 5000);
Whatever you write within Response.Write() on the server side is what arrives as data here. To keep checking for new messages, the AJAX function is called repeatedly with the help of setInterval(), as shown above.
There are many ways to do this, depending on how real-time you need it to be.
The most common way is to use JavaScript with an XMLHttpRequest to call an ASP.NET page which returns the number of messages; I recommend a JSON response for this. The benefit of this approach is that you can request data from the server without the user experiencing a full page refresh, and you can have JavaScript call it every x seconds depending on your requirements.
Collectively this is known as AJAX; using a JavaScript library such as jQuery can make it much easier.
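A minimal sketch of that approach, assuming a hypothetical ASP.NET endpoint MessageCount.aspx that returns JSON like {"count": 23} and a badge element with id="msgCount" (both names are made up for illustration):

// Poll the server every 5 seconds and update the notification badge.
setInterval(function () {
    $.getJSON("MessageCount.aspx", function (data) {
        document.getElementById("msgCount").textContent = data.count;
    });
}, 5000);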
