Browser locks up during server-side expensive process from async request - javascript

So I have a quite expensive and complex PHP process whose execution takes a long time; let's call it "expensive_process()".
I have an interface where a button press fires an AJAX request to a PHP script, which in turn initiates "expensive_process()". Here's the JavaScript code:
$('#run_expensive_process_button').click(function() {
    var url = "initiate_expensive_process.php";
    $.ajax({
        url: url
    });
});
And initiate_expensive_process.php code:
<?php
session_start();
run_expensive_process();
?>
Simple and trivial. Now the issue is that while expensive_process() is running, the browser loses the ability to navigate the domain. If I refresh the browser window, it hangs indefinitely for as long as the process lasts. If I redirect to a different URL under the same domain, same thing. This happens in all browsers. However, if I relaunch the browser (close and open a new window, not a tab), navigation works normally, even though expensive_process() is still running.
I've inspected the network traffic, and the HTTP request to initiate_expensive_process.php doesn't get a response while expensive_process() is running, but I'm assuming this shouldn't be locking the browser given the asynchronous nature of the request.
One more thing, which I believe is relevant. This situation is happening on a replica server. On my local machine, where I run WAMP and the same source code, this is not happening, i.e., while expensive_process() is running, I'm still able to navigate the hosting domain without having to relaunch the browser. This seems to be an indication of a server configuration problem of some sort, but I'm not sure I can rule out other possible reasons.
Anyone know what might be causing this or what can be done to figure out the source of the problem?
Thanks

Most likely the other PHP scripts also use session variables. Only one script can have a given session open at a time; if a second script tries to access the session while the first script is still running, it will be blocked until the first script finishes.
The first script can unlock the session by calling session_write_close() when it's done using the session. See "If call PHP page via ajax that takes a while to run/return (and it sets session variables), will a 2nd ajax call see those session changes?" for more details about how you can construct the script.
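Applied to the script in the question, that would look roughly like this (a sketch only, assuming run_expensive_process() only needs to read from the session and not write to it afterwards):
<?php
// initiate_expensive_process.php
session_start();

// ... read anything you need from $_SESSION here ...

// Release the session lock so other requests from the same browser
// (refreshes, navigation, further AJAX calls) are no longer blocked.
session_write_close();

// The long-running work happens after the lock has been released.
run_expensive_process();
?>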

I wonder whether it might be due to AJAX; the JavaScript is being executed client-side.
Maybe you might consider a stringified JSON call instead of AJAX?

Related

Why would replacing a div via a GET request through AJAX cause my website to slow down so much compared to no JavaScript at all?

So the whole reason I am using AJAX is to make page changes seem smoother. However, I have realized that using AJAX is actually slowing down the website significantly. I am using localhost with Apache. I am running PHP on the backend to access a database for various pages.
It's taken up to 5 seconds just to load a single page.
Here is some AJAX:
$(function() {
    $(".menu_nav").click(function() {
        $.ajax({
            type: "GET",
            url: "menu.php",
            dataType: 'html',
            success: function(data) {
                var toInsert = $(data).filter(".placeholder");
                var style = $(data).filter("#style");
                $("#style").replaceWith(style);
                $(".placeholder").replaceWith(toInsert);
                window.scrollTo(0, 0);
            }
        });
    });
});
'menu_nav' and 'home_nav' are both divs with click events attached to them; on click they perform a GET request to the server, asking for a div in the .php file as well as its style sheet. The script then replaces the div and style sheet on the current page with what it retrieved from the GET request. What I am having trouble understanding is why this takes up to 5 seconds to perform the GET request, whereas without any JavaScript I get minuscule load times, just less "pretty".
I looked at the timeline and network tabs in the web inspector and noticed that every time I perform one of these requests, I get a new file from the server rather than reading the one I've already got. This makes sense, because there might be new data in the page since the last visit; however, I don't see a duplicate being added to the list of sources when I am not using AJAX.
Without AJAX, there is only one entry. This makes sense since I am initiating a GET request to the server, but the same thing happens when you click a link without AJAX.
Regardless, I still don't understand what is making it so slow as opposed to not using JavaScript. I understand it is doing more in addition to just a GET request, but is filtering and replacing text after a response really what is causing this issue?
Side question: This is outside the scope of this question, but in regards to AJAX, when I perform a request to the server, is the PHP within the file still executed before it gives me the HTML? So on pages where a user must have certain permissions, will the PHP that handles that still be run?
EDIT: I am hosting a MySQL database through a free subscription to a cloud hosting service. This issue occurs both when I access my website through localhost and when I access the deployed website via the free cloud hosting service, though it is way slower when I use the cloud service. I am also using various resources from the MAMP (macOS Apache, MySQL, PHP; if you're on Windows and interested, WAMP is also available for free) installation.
I'm not sure what is causing your slowness issues, but you could try doing some profiling to narrow down the issue. My guess is that while changing your code to use ajax, you also introduced or revealed some bug that's causing this slowness issue.
Is there an issue with the JavaScript? You can place console.time() and console.timeEnd() in different places to see how long a chunk of JavaScript takes to execute (e.g. at the start and end of your AJAX callback). Based on what you posted, this is likely not the issue, but you can always double check.
Is it PHP that's running slow? You can use similar profiling functions in PHP to make sure it's not hanging on something.
Are there network issues? You could, for example, log the timestamp of when javascript sent the request and when PHP received it, and vice versa. (this should work OK on localhost, but in other environments you have to be careful of clocks being out of sync)
There's a lot that could be going wrong here, so it's hard to give a more specific answer, but hopefully that gives you some tools to help you start looking.
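For the PHP side of that profiling, a quick-and-dirty sketch using microtime() could look like this (getMenuFromDatabase() is just a stand-in for whatever menu.php actually does; it is not part of your code):
<?php
// menu.php - rough timing to see where the 5 seconds are going
$start = microtime(true);

$menu = getMenuFromDatabase(); // placeholder for the real database work

error_log('menu query took ' . round(microtime(true) - $start, 3) . ' s');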
As for your side question: you are correct - PHP will start sending the HTML while it continues to execute. For example:
<div>
<?php someLongBlockingFunction(); ?>
</div>
<div> will get sent to the browser, then PHP will stall on the long-running function before it finally sends out the closing </div>. The browser will piece together the chunks, and your event listener won't get called until PHP has finished sending the entire file.

Dojo: all xhr / ajax calls seem to be synchronous and block other calls

I am working on a CRM we inherited. Long story short: there is a button that calls a PHP script which should run in the background, and we don't need to wait for the response.
request(idata + 'transferInTimeExec.php', {
    sync: false,
    preventCache: true,
    method: 'GET'
});
Now, transferInTimeExec.php takes an hour to run; it's a very complex script that deals with weekly timesheets for a recruitment company, processes them, does a lot of DB operations, etc.
I'm using Chrome. Every time I press the button to run it, it blocks all XHR calls until it finishes. The CRM is "AJAX-heavy", and while the script is running the user can't do anything; if they navigate to another subpage, no XHR requests will resolve until the process we started has finished. Even when I open a new browser tab and try to do something, it won't work. If I open the CRM in another browser (Firefox) while the script is running, I can use the CRM.
In the Network tab, the first request is pending, and all the subsequent calls to a different AJAX endpoint wait (all have sync: false).
I even replaced the whole logic with the PHP function sleep(30) to make it just do nothing for 30 seconds before returning anything - same issue.
I tried XHR in plain JavaScript, inside an onclick in the button's HTML markup, rather than Dojo methods - same issue.
I've done a brutal search/replace on the whole project, replacing sync: true with sync: false - nothing changed.
I have run out of ideas; maybe someone here can help figure this out? Is there a global switch for sync/async? What else could it be, if not an AJAX issue?
Your script transferInTimeExec.php is probably using the session. When that's the case, other AJAX calls will not start and will instead wait for this AJAX call to finish, so as not to overwrite the session data. Setting the AJAX call to asynchronous does not change this behavior.
If this script takes an hour to run, it is a bad idea to call it as an AJAX request from the UI. You should set up a cron job, hourly or daily, and perform all the operations in the backend (a sketch of this follows below). Of course, you will need to make some adjustments to the script if it is using the session.
EDIT
You could call session_write_close(); at the top of your script. This tells PHP that the script will not write anything to the session, so other AJAX calls are free to use it. However, be careful not to write anything to the session after this, as it will no longer be saved.
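A rough sketch of the cron approach mentioned above (the job_queue table, the transferInTime() function and the DB credentials are all made-up placeholders, not names from the actual CRM):
<?php
// run_transfer_jobs.php - run by cron instead of from the browser, e.g. hourly:
//   0 * * * * php /path/to/run_transfer_jobs.php
// The button's AJAX endpoint would now only insert a 'pending' row into
// job_queue and return immediately, so the browser never waits.
$pdo = new PDO('mysql:host=localhost;dbname=crm', 'user', 'pass');

$job = $pdo->query("SELECT id FROM job_queue WHERE status = 'pending' LIMIT 1")->fetch();

if ($job) {
    $pdo->prepare("UPDATE job_queue SET status = 'running' WHERE id = ?")
        ->execute([$job['id']]);

    transferInTime(); // the hour-long timesheet processing, moved out of the web request

    $pdo->prepare("UPDATE job_queue SET status = 'done' WHERE id = ?")
        ->execute([$job['id']]);
}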

What causes there to be only 1 concurrent connection?

I have a PHP web app.
When multiple simultaneous AJAX requests occur, it seems they are queued on the server side as if only one process runs at a time. It only happens when all the requests come from the same browser.
The weirdest thing is that sometimes it runs as it should, simultaneously (screen: https://imgur.com/8oDGV8t ), and then after about 10 minutes it waits for one process to be done and only then runs another, handling them one by one (screen: https://imgur.com/OPkzYNh ).
The code for test screenshots:
sleep(5);
exit();
P.S. When these AJAX requests are queued, normal HTML requests are also 'waiting in the queue'.
I think it is highly likely that this has something to do with session management.
What happens is that a new request waits for the session in the previous request to be closed.
This only happens because session data is accessed, and thus a lock is obtained on the session file.
You can avoid this by not starting the session in the first place. If you need the session, close it right after it has been started. If you need to set $_SESSION variables, do so before closing the session. You can do this like so:
session_start();
$_SESSION['some'] = 'value';
session_write_close(); // From here on out, concurrent requests are no longer blocked
$_SESSION variables will still be available after closing the session.
See also: https://codingexplained.com/coding/php/solving-concurrent-request-blocking-in-php

Server sent response but Client not loading the sent content

I have a simple Node server (Node.js + Express + EJS rendering); when a user tries to load a particular page, the server compiles the HTML and sends it in the response.
In order to track users I have added 2 counters:
a. a counter which increments when the server receives a request for that page
b. when the client loads the page, it contains some code which makes an HTTP request back to my server, which I use as the second counter
Now the issue is that, as time passes, the difference between the sentResponse counter and the clientLoad counter increases, so much so that I get sentResponse = 7000 and clientLoad = 3600.
Any suggestions on what could cause that kind of behavior?
Note: I also have Cloudflare set up in front of my server; I paused it, but I was still getting huge differences.
Note: I also noticed that lots of users are making requests to the page frequently, like multiple times within 4 seconds, but I am sure that I am sending valid HTML, and the difference is almost 50%, so I don't think that every user visiting the page is pressing Ctrl+R two or more times within 4 seconds.
Code:
server.js
app.get('/dashboard', (req, res) => {
    ...
    ...
    ...
    Tracker.incrementCount('sentResponse');
    res.render('page.ejs', {...});
});
app.get('/client-load-start', (req, res) => {
    Tracker.incrementCount('clientLoadStart');
    res.send(200);
});
page.ejs
<html>
<head>
    ...
    <script src='/client-load-start?version=<some_random_4_digit_number>'></script>
    ...
</head>
<body>
    ...
    ...
</body>
</html>
I can think of the following possibilities as to why the counts could differ.
Setting caching headers (Cache-Control) could cause the browser to cache requests.
JS is not enabled in the client's browser, so there is no way to make the call back to the server that updates the clientLoad counter (this is less likely, but still a possibility).
People interacted with the page before it finished loading, causing the call not to fire, or reloaded the page before it could load.
If you are simply looking to track:
Page hits, you can use the "sentResponse" counter.
Active users, think about web sockets (see "Counting the number of hits to my webpage").
You can try to use cache-busting methods. Every time the browser loads /dashboard, it gets a script tag pointing to your /client-load-start route. The problem is that /client-load-start is only requested when the browser's cache is cleared or the first time the site loads. This causes the problem where the two counts can differ greatly over time.
Try to make the browser think /client-load-start is a new script that needs to be downloaded. You can do this by adding query parameters to the src.
<script src='/client-load-start?<%- Date.now()%>=1'></script>
Every time the browser goes to /dashboard, it'll attempt to download the new script because the query parameters have changed, thus busting the cache.
Sequence of Events
Browser requests /dashboard
Server increases sentResponse, renders and sends page.ejs to the browser
Browser receives page.ejs and detects the <script> tag
Browser checks if the script URL is in the browser cache
The sequence ends here if the script URL is found in the cache; clientLoad remains unchanged
Browser requests script from your server
Server handles the GET request, increases clientLoad
Problem
The "problem" here is that the browser - or any proxy between the browser and your server - will cache the returned script unless you instruct it otherwise.
Note that caching is actually expected and intended behaviour in most cases. It's also the browser's default behaviour.
Solution
To instruct the browser (or proxies) to avoid caching the file, you can set cache control headers.
app.get('/client-load-start', (req, res) => {
    Tracker.incrementCount('clientLoadStart');
    res
        .set('cache-control', 'max-age=0, private, no-cache')
        .send(200);
});
This would instruct the browser to never cache the request and always get a new copy, thus increasing clientLoadStart with every request.
Another way is "cache busting", meaning you intentionally change the script’s URL, e.g. by appending a random number as you did above, thus "breaking" the cache key.

Forcing an HTTP request to fail in browser

Is it possible to make an HTTP request that has been sent to a server by the browser fail, without having to alter the JavaScript?
I have a POST request that my website sends to the server, and we are trying to test how our code reacts when the request fails (e.g. an HTTP 500 response). Unfortunately, the environment that I need to test it in has uglified and compressed JavaScript, so inserting a breakpoint or altering the JavaScript isn't an option. Is there a way for us to use any browser to simulate a failed request?
The request takes a long time to complete, so using the browser's console to run a JavaScript command is a possibility.
I have tried using window.stop(); however, this does not work since I need the failure code to execute.
I am aware of the option of setting up a proxy server, but would like to avoid this if possible.
In Chrome (just checked in v63), you can actually block a specific URL (or even a whole domain) from the Network tab. You only need to right-click on the entry and select Block request URL (or Block request domain).
One possible solution is to modify the XMLHttpRequest object that the browser will use. Running this code in the JavaScript console will cause all future AJAX calls on the page to be redirected to a different URL (which will probably give a 404 error):
XMLHttpRequest.prototype._old_open =
    XMLHttpRequest.prototype._old_open || XMLHttpRequest.prototype.open;
XMLHttpRequest.prototype.open = function(method, url, async, user, pass) {
    return XMLHttpRequest.prototype._old_open.call(
        this, method, 'TEST-' + url, async, user, pass);
};
Don't overlook the simplest solution: disconnect your computer from the Internet, and then trigger the AJAX call.
Chrome's dev tools have an option to "beautify" (i.e. re-indent) minified JavaScript (press the "{}" button at the bottom left). This can be combined with the "XHR breakpoint" option to break when the request is made. XHR breakpoints don't support modifying the response though AFAIK, but you should be able to find a way to do it via code.
To block a specific URL and make an API call fail, you just need to follow the steps below:
Go to the Network tab in your browser's dev tools.
Find the API call that needs to fail (as per your requirement).
Right-click on that API call.
Click 'Block request URL'; you can unblock it in the same manner, as the option will turn into 'Unblock'.
Just type a modified URL into the browser, e.g. take a well-formed URL such as http://thedomain.com/welcome/ and insert "XX": http://thedomain.com/welcomeXX/ ; that will cause a 404 (not found) error.
