How to avoid synchronous AJAX without spawning sessions

My page loads all necessary data from the server at startup via AJAX. This includes user's language settings, various classifiers, some business data etc.
The problem I am facing is that when the user first visits the page, all of these AJAX calls are kicked off at the same time. On the server side, most of them are therefore assigned different JSESSIONIDs (I am using Spring on Tomcat 8 without any complex configuration). As a result, some of the data is initialized in one server-side session, but the browser may end up keeping a different session, which does not have access to the data set up by the earlier AJAX calls.
I wanted to solve this by using a fast synchronous AJAX call in the very beginning so that after it returns and gets a JSESSIONID, all subsequent calls would be made in this original session.
$.ajax("api/language", {
    type: "GET",
    cache: false,
    async: false,
    success: function(data) {
        // do stuff
    }
});
// more AJAX calls
It works, but I get warning messages that synchronous XMLHttpRequest on the main thread is deprecated. Now, I understand why such a synchronous call is bad for the UI in general, but what other options are available to me if I want to force all AJAX calls to use the same server-side session?
I could achieve the same result by using a callback and placing all the rest of my page initialization code in the 'success' handler of the first AJAX call, but wouldn't that have exactly the same effect as the synchronous call?

I'd initiate the session when loading the HTML document rather than when requesting something from the API.
Alternatively, trigger the subsequent API calls from the success callback of the first one.
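The second option can be sequenced with plain promises: let the first call settle before fanning out the rest. A minimal sketch, where the loader functions are hypothetical stand-ins for the real $.ajax wrappers (jqXHR objects are promise-compatible, so the same shape works in the page):

```javascript
// Hypothetical loaders standing in for the real $.ajax calls.
function initSession()      { return Promise.resolve("session ready"); } // e.g. $.get("api/language")
function loadClassifiers()  { return Promise.resolve(["c1", "c2"]); }
function loadBusinessData() { return Promise.resolve({ items: 3 }); }

function startup() {
    // The first call runs alone; only after it resolves (and the browser has
    // stored the JSESSIONID cookie) do the remaining calls run in parallel.
    return initSession().then(function () {
        return Promise.all([loadClassifiers(), loadBusinessData()]);
    });
}
```

Only the first request pays the serialization cost; everything after it still runs concurrently within the now-established session.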

"Hacky" solution
You really gave your own solution at the end: wrap everything in the callback of an asynchronous AJAX call. It is similar to the synchronous solution, but this way you can show a loading animation or something similar in the meantime.
"Nice" solution
Another, possibly nicer solution: when the user arrives, redirect to the starting page of your web application with the generated jsessionid. This can be done with a servlet, and I am fairly sure that Tomcat can be configured to do it without writing your own code.

Related

jQuery $.get() is blocking other requests

I'm developing a web application and use jQuery to make asynchronous HTTP requests to my API. I have a detail view where you can see a lot of information about a specific object stored in the database. Because there is a lot of information and data linked to other objects, I make several calls to my API to gather the information for my views.
In the 'detail view' I have some kind of widgets that show the requested information. For that, I make about 5-7 HTTP GET requests to my API. When using the debugger (both Safari and Firefox), I can see that some requests are blocking other requests and the page takes a lot of time until everything is loaded and shown to the user.
I make a request like this:
$.get("api/api.php?object=myobject&endpoint=someendpoint", function(data) {
    // data is JSON formatted
    $("#my-widget input").val(data["name"]);
});
And another one e.g. like this:
$.get("api/api.php?object=anotherobject&endpoint=anotherendpoint", function(data) {
    // data is JSON formatted
    $("#other-widget input").val(data["somekey"]);
});
If the first request takes a little longer to finish, it blocks the second request until the callback of the first request has finished. But why? I thought those calls were asynchronous and non-blocking.
I want to build a fast web application for a company where the requests are only made inside the local network, so a request should only take about 10-50ms (or even less). But the page takes about 10 seconds to show up with all information.
Am I doing something wrong? Or is there a JavaScript framework that can be used for exactly this problem? Any help is appreciated!
EDIT: As you can see in the screenshot, the requests have to wait some seconds before they are fired, and once a request is fired it takes a few seconds until a response comes back.
If I call the URL directly in my browser or do a GET request using curl it is a lot faster.
EDIT2: Thanks @CBroe! The session file write lock was the problem. As long as the session file is locked, no other script using the same session can run until the previous script has finished. I just called session_write_close() immediately after session_start() and it runs a lot faster now.
Attention: use session_write_close() only if you don't need to write to the $_SESSION array afterwards. Reading is still possible after that, but writing is not. (See this answer for further details: https://stackoverflow.com/a/50368260/1427878)

How to fix jQuery AJAX call preventing user from leaving page? [duplicate]

This question already has an answer here:
Asynchronous ajax request locking browser
(1 answer)
Closed 4 years ago.
I have page that has two main queries - an item and similar items.
The item is fetched as normal but the similar items are loaded as a separate jQuery ajax call which automatically triggers as the page loads. The similar items are then appended to the page. The second query is separated because it can occasionally be slow and affect user experience.
$.post(url, function(output) {
    if (output) {
        var data = $.parseJSON(output);
        if (data.success) {
            $("div").html(data.html);
        }
    }
});
I'm now noticing that the ajax query holds up the page even when trying to navigate away. The browser won't allow a user to leave the page until the ajax query returns something - either similar items or no results.
I haven't dealt with this scenario before. Is there a way to structure this jQuery AJAX call so that it doesn't force the user to wait for it to complete? Addressing the speed of the server-side response is not an option at the moment.
UPDATE/ANSWER:
Posted below as an answer.
The behavior you describe looks to be "the browser is locked while the request is active".
You possibly have something like the code below somewhere:
$.ajaxSetup({
    async: false
    // And other parameters...
});
From the documentation:
Note that synchronous requests may temporarily lock the browser, disabling any actions while the request is active.
So... the only way to revert that is to find that code chunk and change it, OR to redefine the async parameter again, just before the $.post() executes:
$.ajaxSetup({
    async: true
});
I eventually found the answer here:
Asynchronous ajax request locking browser
PHP on the server is locking the session, and it can be unlocked by calling session_write_close() in the server-side code invoked by the AJAX request. Although nothing has been returned by the AJAX call yet, clicking on a link needs the session for the next page, and the session is still held by the script processing the AJAX call.

What is the right way for Searching functions in a Website with Javascript?

It's well known that direct interactions between JavaScript and SQL databases are not very secure, but most websites use JavaScript because the page doesn't have to reload to show search matches.
With PHP alone it isn't possible to change page contents without a complete page refresh.
What is the right way to get data from SQL with JavaScript without neglecting security, especially for a search function that shows matches directly in a list?
There are two ways to get data from the DB using JS:
1. Ajax:
function refresh() {
    $.ajax({
        url: "your url",
        method: "GET",
        data: your_params,
        success: function(response) {
            $("#specific_div_id").html(response);
        }
    });
}
You can run this on an interval, like setInterval(refresh, 5000); in order to fetch fresh content every 5 seconds.
2. Websockets
With AJAX, you are sending a request every 5 seconds to get updated content from the server. With websockets, you are not pulling content: the server pushes updated content to you. In other words, the server notifies you of any updated data. You can have a look at Socket.io for an example implementation of websockets. When the server notifies you, you can take the data and put it in the related HTML area.
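A minimal sketch of that push model: the render logic only reacts to incoming messages, so it can be attached to anything with an onmessage hook (a native WebSocket, a Socket.io-style wrapper, etc.). The message shape here is an assumption:

```javascript
// Attach a render callback to any socket-like object exposing onmessage.
// With a native WebSocket this would be: listenForUpdates(new WebSocket(url), render)
function listenForUpdates(socket, render) {
    socket.onmessage = function (event) {
        // The server pushes updated data; no polling interval needed.
        render(JSON.parse(event.data));
    };
    return socket;
}
```

Because the socket is passed in, the same handler works unchanged whether the transport is a raw WebSocket or a library wrapper.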
As mentioned in the comments, the best way is to use AJAX, an acronym that stands for Asynchronous JavaScript and XML.
The last part, XML, is a bit misleading. The name stuck because that's what AJAX was first used for, but AJAX can now be used to make HTTP requests and interface with any server-side language, including PHP.
Depending on the technology you are built on, several implementations are available. Chances are you already have jQuery installed; in that case, jQuery's Ajax helpers, and particularly jQuery.get(), would address your concerns.
If you are using a router on the backend, you can simply call a route, specifying it as the url (the first argument of the function). Otherwise, you can call a file directly by using its path relative to the HTML page the JavaScript is embedded in.
jQuery.get will return anything you echo within your server script, in other words anything that is directly rendered on the page. You can use a callback to catch the returned data and process it.
Example :
$.get('/path/to/file.php', function (data) {
    console.log('Here is the data received from the server!', data);
    // Process data here
});
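For the live-search use case specifically, it is common to debounce the keystrokes so the API is not hit on every character. A sketch, where the endpoint and element IDs are assumptions and the jQuery wiring is guarded so the helper itself stays framework-free:

```javascript
// Generic debounce: collapses a burst of calls into one, fired after `delay` ms.
function debounce(fn, delay) {
    var timer = null;
    return function () {
        var args = arguments, self = this;
        clearTimeout(timer);
        timer = setTimeout(function () { fn.apply(self, args); }, delay);
    };
}

// Hypothetical wiring in the page: query the server only after typing pauses.
if (typeof $ !== "undefined") {
    $("#search").on("input", debounce(function () {
        $.get("search.php", { q: $("#search").val() }, function (html) {
            $("#results").html(html); // server echoes the rendered match list
        });
    }, 300));
}
```

A 200-400 ms delay is usually enough to feel instant while cutting the request count drastically.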

how to silently guarantee executing an ASP.NET MVC3 action on page unload

I need to execute an action of a controller when a user leave a page (close, refresh, go to link, etc.). The action code is like:
public ActionResult WindowUnload(int token)
{
    MyObjects[token].Dispose();
    return Content("Disposed");
}
On window unload I make an AJAX request to the action:
$(window).unload(function () {
    $.ajax({
        type: "POST",
        url: "@Url.Action("WindowUnload")",
        data: { token: "@ViewData["Token"]" },
        cache: false,
        async: true
    });
    //alert("Disposing.");
});
The above AJAX request never reaches my controller, i.e., the action is not executed.
To make the code above work I have to uncomment the alert line, but I don't want to show an alert to the user.
If I change the async option to false (with the alert still commented out), it sometimes works, but not reliably: for example, if I refresh the page several times too quickly, the action is not executed for every unload.
Any suggestions on how to execute the action on every unload without an alert?
Note that I don't need to return anything from the action to the page.
Updated: summary of answers
It is not possible to reliably make a request on unload, since that is not proper or expected behavior during unload. It is better to redesign the application and avoid making an HTTP request on window unload.
If that is unavoidable, there are two common workarounds (described in the question):
Call AJAX synchronously, i.e., async: false.
Pros: works in most cases; silent.
Cons: does not work in some cases, e.g., when a user refreshes the window several times too quickly (observed in Firefox).
Use an alert on success or after the AJAX call.
Pros: seems to work in all cases.
Cons: not silent; fires a pop-up alert.
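For what it's worth, browsers later added navigator.sendBeacon for exactly this fire-on-unload case: it queues a small POST and lets the page unload immediately, with no alert and no synchronous blocking. A hedged sketch; the URL is illustrative (the original used a Razor-generated one):

```javascript
// Queue a small POST that survives page unload, where sendBeacon is supported.
// Returns false when the API is unavailable (very old browsers, non-browser runtimes).
function disposeOnUnload(token) {
    if (typeof navigator !== "undefined" && navigator.sendBeacon) {
        var body = new URLSearchParams({ token: String(token) });
        return navigator.sendBeacon("/Home/WindowUnload", body);
    }
    return false;
}

// Hypothetical wiring:
// window.addEventListener("unload", function () { disposeOnUnload(42); });
```

Even sendBeacon is best-effort (a killed browser still sends nothing), so the server-side redesign advice above still applies.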
According to the unload documentation, with async: false it should work as expected. However, this will always be a bit shaky: for example, the user can leave your page by killing or crashing the browser, and you will not receive any callback. Browser implementations also vary. I fear you won't get anything failproof.
HTTP is stateless and you can never get a reliable way to detect that the user has left your page.
Suggested events:
Session timeout (if you are using sessions)
The application is going down
A timer (needs to be combined with the previous suggestion)
Remove the previous token when a new page is visited.
Why does this need to happen at all?
From the code snippet you posted, you appear to be using this to dispose of objects server side. You are supposed to call Dispose to free up any unmanaged resources your objects are using (such as database connections).
This should be done while processing each request. There shouldn't be any unmanaged resources awaiting disposal when the client closes the browser window.
If you are attempting it in the manner noted above, the code needs to be reworked.
Have you tried onbeforeunload()?
$(window).bind('beforeunload', function() {
    alert('unloading!');
});
or
window.onbeforeunload = function() {
    alert('unloading!');
};
From the comment you made on @Frazzell's answer, it sounds like you are trying to manage concurrency. On the chance that this is the case, here are two common methods for managing it.
Optimistic concurrency
Optimistic concurrency adds a timestamp to the table. When the client edits the record, the timestamp is included in the form; when they post their update, the timestamp is sent along and checked to make sure it is the most recent in the table. If it is, the update succeeds. If it is not, someone else got in sooner with an update, so yours is discarded. How you handle that is then up to you.
Pessimistic concurrency
If you often experience concurrency clashes, pessimistic concurrency may be better. Here, when the client edits the record, a flag is set on that row to lock it. The lock remains until the client completes the edit, and no other user can edit that row in the meantime. This method avoids users losing changes, but adds administrative overhead to the application: you now need a way to release unwanted locks, and you have to inform the user through the UI that a row is locked for editing.
In my experience it is best to start with optimistic concurrency. If lots of people report problems, I try to find out why they are having these conflicts. It may be that I have to break some entities down into smaller types, as they have become responsible for doing too many jobs.
This won't work, and even if you are somehow able to make it work it will give you lots of headaches later on, because this is not how the browser/HTTP is supposed to be used. When the page is unloading, the browser fires the unload event and then unloads the page (you cannot make it wait, not even by making synchronous AJAX calls). If a request is still in flight when the browser unloads the page, the request gets cancelled, which is why you sometimes see the call on the server and sometimes don't. If you could tell us why you want to do this, we could suggest a better approach.
You can't. The only thing you can do is prompt the user to stay and hope for the best. There are a whole host of security concerns here.

jQuery: Using a single Ajax call, receive progressive statuses instead of one single response?

I'm just wondering: is it possible to receive multiple responses from a single AJAX call?
I'm thinking purely for aesthetic purposes to update the status on the client side.
I have a single ajax method that's called on form submit
$.ajax({
    url: 'ajax-process.php',
    data: data,
    dataType: 'json',
    type: 'post',
    success: function (j) {
    }
});
I can only get one response from the server-side. Is it possible to retrieve intermittent statuses? Such as:
Default (first): Creating account
Next: Sending email confirmation
Next: Done
Thanks for your help! :)
From a single AJAX call, I don't think it is possible.
What you could do is check frequently where the process is (this is what the upload bars in Gmail use, for example). You make a first AJAX request to launch the process, and then a series of AJAX requests asking the server how it is doing. When the server answers "I'm done", you're good to go; until then, you can have the server respond with the current state.
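That check-frequently approach can be sketched as a small polling loop. Here checkStatus() stands in for the AJAX call that asks the server how it is doing, and the status strings are assumptions:

```javascript
// Poll checkStatus() until it reports "done", surfacing each intermediate
// status through onUpdate (e.g. to update a progress label on the page).
function pollStatus(checkStatus, onUpdate, intervalMs) {
    return new Promise(function (resolve) {
        (function tick() {
            checkStatus().then(function (status) {
                onUpdate(status);
                if (status === "done") {
                    resolve(status);
                } else {
                    setTimeout(tick, intervalMs);
                }
            });
        })();
    });
}
```

In the page, checkStatus would be something like a $.get to a status endpoint; the loop stops itself as soon as the final state arrives.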
There is something called comet which you can set up to "push" requests to the client, however it is probably way more than you want to invest in, time-wise.
You can open up a steady stream from the server, so that it continues to output, however I'm not sure how client-side script can handle these as individual "messages". Think of a server that outputs some info to the browser, does more work, outputs some more, does more work, and so on. This shows up more or less in real time in the browser as printed text. It is one long response, but it is still one response. I think AJAX only handles a response once it has finished being sent, but maybe someone else knows more than me on the topic.
But you couldn't have the server output several individual responses without reloading itself, at least not with PHP, because once you start outputting the response it has begun, and you can't chop it up without finishing it, which happens when the script finishes executing.
Your best bet is the steady stream, but again, I'm not sure how AJAX handles getting responses in chunks.
Quick Update
Based on the notes for this plugin:
[http://plugins.jquery.com/project/ajax-http-stream]
things don't look promising. Specifically:
Apparently the trend is to disallow access to the xmlhttprequest.responseText before the request is complete (stupid imo). Sorry there's nothing I can do to fix this
Thus, not only can you not get what you want in one request, you probably can't get it in multiple requests either, unless you break the actual server-side process into several parts and only have it continue to the next step when an AJAX call triggers it.
Another option would be to have your script write its status at specific points to another file on the server, say "status.xml" or "status.txt". Have your first AJAX function initialize the process, and a second AJAX function that queries this status file and outputs it to the user.
It is possible, but it has more to do with your backend script. As Anthony mentioned, there is a technique called comet; another term I've heard is "long polling". The idea is to delay the time at which your PHP (insert language of choice) script finishes processing.
In PHP you can do something like this:
while ($response !== 'done') {
    // pause, then re-check whether the work has produced a new $response
    sleep(1);
}
return $some_value;
This code stops your script from finishing immediately. sleep(1) pauses the script and lets the server rest for 1 second before it loops back through. You can adjust the sleep time based on your needs. In PHP, the time the script spends sleeping is not counted against your server's execution-time limit.
You'll obviously need more checks in your code. You'll probably also want to allow for an abort call: something like sending a GET request to kill the backend script, maybe on the JavaScript unload event.
In the tests I've done, I made the initial AJAX call, and when the value was returned I made another AJAX call; that way your backend script won't time out.
I've only played around with this on my local server, so I'm not sure how real-world it is, but it works.
