jQuery $.get() is blocking other requests - javascript

I'm developing a web application and use jQuery to make asynchronous HTTP requests to my API. I have a detail view where you can see a lot of information about a specific object stored in the database. Because there is a lot of information, and some of the data is linked to other objects, I make several calls to my API to gather the information for my views.
In the detail view I have several widgets that show the requested information. For that, I make about 5-7 HTTP GET requests to my API. In the debugger (both Safari and Firefox) I can see that some requests block other requests, and the page takes a long time until everything is loaded and shown to the user.
I make a request like this:
$.get("api/api.php?object=myobject&endpoint=someendpoint", function(data) {
// data is JSON formatted
$("#my-widget input").val(data["name"]);
});
And another one e.g. like this:
$.get("api/api.php?object=anotherobject&endpoint=anotherendpoint", function(data) {
// data is JSON formatted
$("#other-widget input").val(data["somekey"]);
});
If the first request takes a little longer to finish, it blocks the second request until the callback function of the first request has finished. But why? I thought those calls were asynchronous and non-blocking.
I want to build a fast web application for a company where requests are only made inside the local network, so a request should only take about 10-50 ms (or even less). But the page takes about 10 seconds to show up with all the information.
Am I doing something wrong? Or is there a JavaScript framework that can be used for exactly this problem? Any help is appreciated!
EDIT: As you can see in the screenshot, the requests have to wait several seconds before they are even sent, and once a request is fired, it takes a few more seconds until the response comes back.
If I call the URL directly in my browser or do a GET request using curl it is a lot faster.
EDIT2: Thanks @CBroe! The session file write lock was the problem. As long as the session file is locked, no other script that uses the same session can run until the previous script has finished. I just called session_write_close() immediately after session_start(), and it runs a lot faster now.
Attention: use session_write_close() only if you don't need to write to the $_SESSION array afterwards. Reading is still possible after the call, but writes are no longer saved. (See this answer for further details: https://stackoverflow.com/a/50368260/1427878)
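For reference, the fix amounts to two lines at the top of the API entry point (a minimal sketch; it assumes nothing after this point needs to write to the session):

<?php
// api/api.php
session_start();       // load the session data as usual
session_write_close(); // release the session file lock immediately, so
                       // parallel requests to this script are no longer
                       // serialized; $_SESSION stays readable, but writes
                       // after this point are not saved

// ... handle ?object=...&endpoint=... and return the JSON response ...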

Related

Why would replacing a div via a GET request through AJAX cause my website to slow down so much compared to no JavaScript at all?

So the whole reason I am using AJAX is to make page changes seem smoother. However, I have realized that using AJAX is actually slowing down the website significantly. I am using localhost with Apache, running PHP on the backend to access a database for various pages.
It's taken up to 5 seconds just to load a single page.
Here is some AJAX:
$(function() {
    $(".menu_nav").click(function() {
        $.ajax({
            type: "GET",
            url: "menu.php",
            dataType: 'html',
            success: function(data) {
                var toInsert = $(data).filter(".placeholder");
                var style = $(data).filter("#style");
                $("#style").replaceWith(style);
                $(".placeholder").replaceWith(toInsert);
                window.scrollTo(0,0);
            }
        });
    });
});
'menu_nav' and 'home_nav' are both divs with click events attached to them; on click they perform a GET request to the server, asking for a div in the .php file as well as its style sheet. The script then replaces the div and style sheet on the current page with what it retrieved from the GET request. What I am having trouble understanding is why this takes up to 5 seconds, whereas without any JavaScript I get minuscule load times, just less "pretty".
I looked at the timeline and network tabs in the web inspector and noticed that every time I perform one of these requests, I get a new file from the server rather than reading the one I've already got. That makes sense, because there might be new data in the page since the last visit; however, without AJAX there is only one entry, and I don't see a duplicate being added to the list of sources. That also makes sense, since I am initiating a GET request to the server, but the same thing happens when you click a link without AJAX.
Regardless, I still don't understand what makes it so slow compared to not using JavaScript. I understand it is doing more than just a GET request, but is filtering and replacing text after a response really what is causing this issue?
Side question: this is outside the scope of this question, but in regards to AJAX, when I perform a request to the server, is the PHP within the file still executed before it gives me the HTML? So on pages where a user must have certain permissions, will the PHP that checks that still be run?
EDIT: I am hosting a MySQL database through a free subscription to a cloud hosting service. This issue occurs both when I access my website through localhost and when I access the deployed version on the cloud hosting service, though it is much slower when I use the cloud service. I am also using various resources from my MAMP (Mac, Apache, MySQL, PHP; on Windows, WAMP is also available for free) installation.
I'm not sure what is causing your slowness issues, but you could try doing some profiling to narrow down the issue. My guess is that while changing your code to use ajax, you also introduced or revealed some bug that's causing this slowness issue.
Is there an issue with the JavaScript? You can place console.time() and console.timeEnd() in different places to see how long a chunk of JavaScript takes to execute (e.g. at the start and end of your ajax callback); see the sketch after these checks. Based on what you posted, this is likely not the issue, but you can always double-check.
Is it PHP that's running slow? You can use similar profiling functions in PHP to make sure it's not hanging on something.
Are there network issues? You could, for example, log the timestamp of when JavaScript sent the request and when PHP received it, and vice versa. (This should work fine on localhost, but in other environments you have to be careful of clocks being out of sync.)
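For example, a minimal sketch of the console.time() approach, wrapped around the AJAX callback from the question:

$.ajax({
    type: "GET",
    url: "menu.php",
    dataType: 'html',
    success: function(data) {
        console.time("menu-callback");    // start a named timer
        var toInsert = $(data).filter(".placeholder");
        var style = $(data).filter("#style");
        $("#style").replaceWith(style);
        $(".placeholder").replaceWith(toInsert);
        window.scrollTo(0,0);
        console.timeEnd("menu-callback"); // logs e.g. "menu-callback: 4823ms"
    }
});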
There's a lot that could be going wrong here, so it's hard to give a more specific answer, but hopefully that gives you some tools to help you start looking.
As for your side question: you are correct - PHP will start sending the HTML while it continues to execute. For example:
<div>
    <?php someLongBlockingFunction(); ?>
</div>
The opening <div> will get sent to the browser, then PHP will stall on the long-running function before it finally sends out the closing </div>. The browser will piece the chunks together, and your event listener won't get called until PHP has finished sending the entire file.

Dojo: all xhr / ajax calls seem to be synchronous and block other calls

I am working on a CRM we inherited. Long story short: there is a button that calls a PHP script which should run in the background, and we don't need to wait for the response.
request(idata + 'transferInTimeExec.php', {
    sync: false,
    preventCache: true,
    method: 'GET'
});
Now, transferInTimeExec.php takes an hour to run. It's a very complex script that deals with weekly timesheets for a recruitment company, processes them, does a lot of DB operations, etc.
I'm using Chrome. Every time I press the button to run it, it blocks all other XHR calls until it finishes. The CRM is "ajax heavy", and while the script is running the user can't do anything: if they navigate to another subpage, no XHR requests will resolve until the process we started has finished. Even when I open a new browser tab and try to do something there, nothing happens. If I open the CRM in another browser (Firefox) while the script is running, I can use the CRM.
In the Network tab, the first request is pending, and as you can see, all the subsequent calls to different AJAX endpoints wait (all have sync: false).
I even replaced the whole logic with a PHP sleep(30) call, to make the script do nothing for 30 seconds before returning anything - same issue.
I tried plain-JavaScript XHR inside an onclick attribute on the button's HTML markup, rather than Dojo's methods - same issue.
I've done a brutal search/replace on the whole project, replacing sync: true with sync: false - nothing changed.
I have run out of ideas; maybe someone here can help figure this out? Is there a global switch for sync/async? What else could it be, if not an AJAX issue?
Your script transferInTimeExec.php is probably using the session. When that's the case, other AJAX calls will not start; they wait for this call to finish so that they don't overwrite its session data. Setting the AJAX call to asynchronous does not change this behavior.
If the script takes an hour to run, it is a bad idea to call it via AJAX from the UI. You should set up a cron job, hourly or daily, and perform all the operations in the backend. Of course, you will need to make some adjustments to the script if it is using the session.
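For example, a crontab entry along these lines would run it hourly, outside of any web request (the PHP binary and script path here are assumptions):

# process the timesheets every hour, on the hour
0 * * * * /usr/bin/php /path/to/transferInTimeExec.php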
EDIT
You could call session_write_close() at the top of your script. This tells PHP that the script will not write anything to the session, so other AJAX calls are free to use it. Be careful not to write anything to the session after this point, though, as those changes will no longer be saved.

Node.js Request drops before Response is received

The project I am working on receives a request in which the main part of the body consists of data coming from a database. Upon receiving it, my system parses all the data, concatenates the needed information to form a query, and then inserts that data into my local database using the query.
It is working fine, with no issues at all, except for the fact that it takes too long to process when the request has over 6,000,000 characters and over 200,000 lines (or maybe fewer, but still large numbers).
I have tested this with my system used as a server (the intended production setup), and with Postman as well, but both drop the connection before the final response is built and sent. I have already verified that although the connection drops, my system still proceeds with processing the data, up to and including the query, and even until it sends its supposed response. But since the connection dropped somewhere in the middle of the processing, the response is ignored.
Is this about the connection timeout in Node.js?
Or the limit in app.use(bodyParser.json({limit: '10mb'}))?
I really only see one way around this; I have done something similar in the past. Allow the client to send as much as you need/want. However, instead of having the client wait around for some undetermined amount of time (at which point it may time out), send an immediate response that basically says "we got your request and we're processing it".
Now for the not-so-great part, but it's the only way I've ever solved this type of issue: in your "processing" response, send back some sort of id. The client can then check once in a while whether its request has finished by sending you that id. On the server end, you store the result for the client under the id you gave them. You'll have to make a few decisions about things like how long a result is kept around and whether it can be requested more than once.
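A minimal Express sketch of that accept-then-poll pattern (the route names, the in-memory job store, and the processData() helper are illustrative assumptions, not part of the question):

const express = require('express');
const crypto = require('crypto');

const app = express();
app.use(express.json({ limit: '10mb' }));

const jobs = {}; // id -> { done, result }; swap for a real store in production

app.post('/import', (req, res) => {
    const id = crypto.randomUUID();
    jobs[id] = { done: false, result: null };
    res.status(202).json({ id }); // answer immediately: "we got it, we're processing it"
    processData(req.body)         // hypothetical long-running parse-and-insert step
        .then(result => { jobs[id] = { done: true, result }; });
});

// The client checks back once in a while with the id it was given.
app.get('/import/:id', (req, res) => {
    const job = jobs[req.params.id];
    if (!job) return res.sendStatus(404);
    res.json(job.done ? { done: true, result: job.result } : { done: false });
});

app.listen(3000);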

Last-modified of a file - loop

I have the following webpage that pulls in the "last-modified" date of each file loaded:
http://f150.atwebpages.com/list.html
Now, do you see how it loads the dates one by one as it makes the header calls? If it didn't do that for you, hit F5 to reload the page.
Is there any way to call a global function, ajax/jquery call or something that sends all of the requests at once so it doesn't cause the page to load slowly?
(some of my pages have around 300 documents on them)
Thanks!
No. Browsers limit how many requests you can make at once, and it doesn't make sense to have an implementation that requires you to hammer the web server like that.
Instead, make a handler on the backend, using your favorite language, from which you can fetch the whole list of dates in a single request.
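If the backend returned all the dates as one JSON object, the client side could be as simple as this sketch (the dates.php endpoint, its response shape, and the data-file markup are assumptions):

// One request for all ~300 dates instead of one request per document.
$.getJSON('dates.php', function(dates) {
    // assume the response maps each filename to its last-modified date
    $.each(dates, function(filename, lastModified) {
        $('[data-file="' + filename + '"] .date').text(lastModified);
    });
});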

jQuery: Using a single Ajax call, receive progressive statuses instead of one single response?

I'm just wondering: is it possible to receive multiple responses from a single ajax call?
I'm thinking purely for aesthetic purposes, to update the status on the client side.
I have a single ajax method that's called on form submit:
$.ajax({
    url: 'ajax-process.php',
    data: data,
    dataType: 'json',
    type: 'post',
    success: function (j) {
    }
});
I can only get one response from the server side. Is it possible to retrieve intermediate statuses as the work progresses? Such as:
Default (first): Creating account
Next: Sending email confirmation
Next: Done
Thanks for your help! :)
From a single ajax call, I don't think it is possible.
What you could do is check frequently where the process is (this is what's used for the upload bars in Gmail, for example). You make a first ajax request to launch the process, and then a series of ajax requests to ask the server how it is doing. When the server answers "I'm done", you're good to go; until then, you can have the server respond with the current state.
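A sketch of that pattern with jQuery (the status.php endpoint, the job id, and the response fields are assumptions):

// Launch the process, then poll the server for its current state.
$.post('ajax-process.php', data, function(job) {
    var timer = setInterval(function() {
        $.getJSON('status.php', { id: job.id }, function(state) {
            $('#status').text(state.message); // e.g. "Sending email confirmation"
            if (state.done) {
                clearInterval(timer);         // stop polling once the server is done
            }
        });
    }, 1000); // ask once per second
}, 'json');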
There is something called Comet which you can set up to "push" data to the client; however, it is probably way more than you want to invest in, time-wise.
You can open up a steady stream from the server, so that it continues to output, but I'm not sure how a client-side script can handle this as individual "messages". Think about it like a server that outputs some info to the browser, does more work, outputs some more to the browser, does more work, etc. This shows up more or less in real time in the browser as printed text. It is one long response, but it is still one response. I think ajax only handles a response once it has finished being sent, but maybe someone else knows more than me on the topic.
But you couldn't have the server output several individual responses without reloading itself, at least not with PHP: once you start outputting the response, the response has begun, and you can't chop it up without finishing it, which happens when the script finishes executing.
Your best bet is with the steady stream, but again, I'm not sure how ajax handles getting responses in chunks.
Quick Update
Based on the notes for this plugin:
[http://plugins.jquery.com/project/ajax-http-stream]
things don't look promising. Specifically:
Apparently the trend is to disallow access to the xmlhttprequest.responseText before the request is complete (stupid imo). Sorry there's nothing I can do to fix this
Thus, not only can you not get what you want in one request, you probably can't get it in multiple requests either, unless you break the actual server-side process into several parts and only have it continue to the next step when an ajax call triggers it.
Another option would be to have your script write its status at specific points to another file on the server, call it "status.xml" or "status.txt". Have your first ajax function initialize the process, and a second ajax function that queries this status file and shows it to the user.
It is possible, but it has more to do with your backend script. As Anthony mentioned, there is a technique called Comet; another term I've heard for it is "long polling". The idea is that you delay the point at which your PHP (insert language of choice) script finishes processing.
In PHP you can do something like this (with a hypothetical status check filled in so the loop can terminate):
while ($response !== 'done') {
    sleep(1);                       // rest for one second before checking again
    $response = getCurrentStatus(); // hypothetical helper that re-checks progress
}
echo $some_value;
exit();
This code stops your script from finishing right away. sleep(1) pauses the script and lets the server rest for one second before it loops back through. You can adjust the sleep time based on your needs. In PHP, the amount of time the script sleeps is not counted against your server timeout time.
You'll obviously need to add more checks to your code. You'll probably also want to allow for an abort call: something like sending a GET request to kill the backend script, maybe on the JavaScript unload event.
In the tests I've done, I made the initial ajax call, and when the value was returned, I made another ajax call; that way your backend script won't time out.
I've only played around with this on my local server, so I'm not sure how well it holds up in the real world, but it works.
