I'm trying to do a simple Ajax call that refreshes part of the page, using this code snippet:
$("a.ajaxify-watched").bind("click", function(event) {
$item = $(this);
$.ajax({
url: $(this).attr("href"),
global: false,
type: "GET",
data: ({
callback : "inline"
}),
dataType: "json",
async:false,
success: function(msg){
if (msg.status == 200) {
toggleStatus($item, msg)
}
}
});
return false;
});
This works perfectly, and there's nothing in the code itself to worry about; the problem is the speed at which it executes.
The first time, everything works really well: 47 ms for the operation. But after that, every other Ajax call gets a constant delay of 2.6 seconds, every time. I checked with Firebug and found that the delay is shown as "Waiting Time".
I can't really say what's happening here.
We recently switched from pure Apache2 to an Nginx caching reverse proxy with load balancing, with Apache as the backend PHP interpreter. We saw a huge performance boost and everything is working really well. I can't tell when my problem first appeared, or whether it really has something to do with our new server setup.
I only found out about this problem today, and since it involves jQuery I wanted to give as much info as possible.
Thank you and let me know if I should provide additional information.
If Firebug indicates waiting, it looks like a server issue.
Is there a way to call the pages directly, without Ajax? If so, try that and check whether the pages load fast or slow.
Also, check that you do not have any events triggering on Ajax that might affect this.
Most servers will only handle a single page load from the same client at a time, so multiple page loads are handled in a queue; if you have parallel Ajax calls, they might have to wait for each other.
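If you suspect the queuing, one way to test it is to serialize the calls yourself by chaining on the returned jqXHR promise; a minimal sketch, where urlA and urlB stand in for your real endpoints:
// Fire the second request only after the first has completed, so the two
// requests never compete for the same connection or server worker.
$.get(urlA).then(function (firstResult) {
    // handle firstResult...
    return $.get(urlB);
}).then(function (secondResult) {
    // handle secondResult...
});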
My page loads all necessary data from the server at startup via AJAX. This includes user's language settings, various classifiers, some business data etc.
The problem I am facing is that when the user first comes to the page, all these different AJAX calls are kicked off at the same time. This means that on the server side, most of them are assigned different JSESSIONIDs (I am using Spring on Tomcat 8 without any complex configuration). As a result, some of the data is initialized on the server side in one session, but the browser might end up using a different session in the end, and so it does not have access to the data set up by the earlier AJAX calls.
I wanted to solve this by using a fast synchronous AJAX call in the very beginning so that after it returns and gets a JSESSIONID, all subsequent calls would be made in this original session.
$.ajax("api/language", {
type: "GET",
cache: false,
async: false,
success: function(data) {
//do stuff;
}
});
// more AJAX calls
It works, but I get warning messages that synchronous XMLHttpRequest on the main thread is deprecated. Now, I understand the reasons why such a synchronous call is bad for the UI in general, but what other options are available to me if I want to force all AJAX calls to use the same server-side session?
I can achieve the same result by using a callback and just placing all the rest of my page initialization code there, executing it in the 'success' section of the first AJAX call, but wouldn't that have exactly the same effect as synchronizing on the main thread?
I'd initiate the session when loading the HTML document rather than when requesting something from the API.
Alternatively, trigger the subsequent API calls from the success callback of the first one.
"Hacky" solution
You really give your own solution at the end: wrap everything in an asynchronous AJAX call. It is similar to the synchronous solution, but this way you can set up a loading animation or something similar, as in the sketch below.
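A minimal sketch of that wrapper, assuming a hypothetical #loading element and initClassifiers()/initBusinessData() helpers that wrap the other startup requests:
$("#loading").show(); // show a loading animation while we bootstrap

// The first call establishes the session; everything else runs from its
// success callback, so all later requests reuse the same JSESSIONID cookie.
$.ajax("api/language", {
    type: "GET",
    cache: false,
    success: function(data) {
        // ...apply the language settings...

        // The remaining calls can now safely run in parallel, because the
        // session cookie is already set in the browser.
        $.when(initClassifiers(), initBusinessData()).always(function() {
            $("#loading").hide(); // remove the loading animation
        });
    }
});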
"Nice" solution
Another, possibly nicer solution: when the user arrives, you can redirect to the starting page of your web application with the generated jsessionid. This can be done with a servlet. I am quite sure that Tomcat can be configured to do this without writing your own code.
First of all, I would like to apologize for my bad English.
I am using jQuery v2.0.0 in the latest versions of Google Chrome, Mozilla Firefox, and Opera.
Today I ran into a problem:
timer_multy_update = setInterval(
    function() {
        $.get(
            'test.php',
            function (result) {
                parseAndUpdateData(result);
            },
            "json"
        );
    }, 500);
The problem is this: if the server is slow to respond, i.e. it takes more than 0.5 seconds to answer, the timer does not wait; it keeps sending requests, so 2-4 requests can be in flight before the server answers. All of those answers then come back within a short time, and here is the problem: in Firebug every request looks correct and each one returns a different answer, but the result variable always contains the first answer from the server, for all 2-4 requests. That is a big problem.
I tried to find information on the Internet, but found nothing.
I do not know why, but my first thought was that the error was in jQuery, so I began to look at the source code and found some mention of headers and their hashes.
So I tried to change my script and found a way:
$.get(
    '/php/mine/update_cells.php',
    't=' + Math.random(),
    function (result) {
        parseAndUpdateData(result);
    },
    "json"
);
It works correctly, so I want to know: is this a bug, or is it my mistake and misunderstanding?
This isn't a bug, it's caching. It's a lot more efficient for a browser to cache a resource for a while than to go and fetch it every time someone wants it. This is fine for static resources, i.e. ones that don't change much, but for a web service that frequently returns different results for the same URL, you will want to disable the caching. If you control the server-side code, add a Cache-Control: no-cache header to the response. You can also disable caching in your jQuery, but as far as I know you have to use the ajax() function; there's no way to do it with get():
$.ajax({
    url: "/php/mine/update_cells.php",
    success: function(result) {
        parseAndUpdateData(result);
    },
    cache: false,
    dataType: 'json'
});
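For completeness: jQuery can also disable caching globally via $.ajaxSetup(), and that setting applies to later $.get() calls as well; a minimal sketch:
// Apply cache-busting to every subsequent jQuery Ajax request, including
// $.get(); jQuery appends a unique "_={timestamp}" parameter to each URL.
$.ajaxSetup({ cache: false });
$.get('/php/mine/update_cells.php', function (result) {
    parseAndUpdateData(result);
}, 'json');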
I have a single-page application that handles business data. I'm using jQuery .ajax().
On submit of a form, we load in a request via AJAX. No problem there; the "page" is now fully loaded. When it finishes loading, I send off another AJAX request that contains a lot of information: it's a box that loads a bunch of business statistics. This can take up to 15 seconds. During this time, the browser will not process another AJAX request; in this case, that means a series of navigation links that load other pages through AJAX. It will start the request, but it appears to be unable to load anything until the statistics AJAX call finishes. This is true even when I am accessing a navigation page that is merely HTML (so it should load instantly).
I am not using async: false. A request looks like:
$.ajax({
    type: 'POST',
    url: 'load_navigation',
    data: passed_data,
    success: function(data) {
        $('#div-to-put-result').html(data);
    }
});
The only idea I have is a connection limitation.
Please refer to: Max parallel http connections in a browser?.
I have read that the limit can be even stricter when you are using a low-bandwidth connection, but I do not have a reference at this time.
As a test, you can use a hidden IFRAME with a GET request instead of the POST (serialize the data with $.param(passed_data)). To get the data, use something like:
// Create a hidden iframe and point it at the same resource via GET
var uploadIframe = $("<iframe>").hide().appendTo("body");
uploadIframe.on("load", function() {
    var iframeBody = uploadIframe.contents().find("body");
    var data = iframeBody.text();
    // use the data here
});
uploadIframe.attr("src", "load_navigation?" + $.param(passed_data));
Good luck!
I am using jQuery for all my Ajax work; I don't know if that is fine, but I'm using it for now.
I have one text input; when the user types characters into it, I call the server side, get some values, and add them to the view.
The code I use below works fine, but I want to improve it a little.
How can I make this Ajax call so that users who want to investigate my page source can't see what I'm calling here?
So basically, I want to hide from the page source which URL, request type, and data I send here. Is that possible?
$(function () {
    $("#txtSearch").keyup(function (evt) {
        $.ajax({
            url: "/Prethors/Users/SearchUsers",
            type: "POST",
            data: "text=" + this.value,
            success: function (result) {
                $("#searchResult").prepend("<p>" + result + "</p>");
            }
        });
    });
});
No, a user will always be able to figure out what calls you are making if you include them in JavaScript.
You can compress and minify the JavaScript, but a determined person will always be able to find your URL calls.
Here's a JS compression site, for example:
http://jscompress.com/
Overall, you shouldn't worry about this. There is no way I'm aware of to hide your Ajax calls, but you shouldn't need to.
- You could encrypt the info.
- You could use Comet to stream the data over a persistent connection (super complicated).
- Follow good server security practices and don't worry about it.
Source: here
If you are really worried about this, you could set up a kind of anonymous URL, which will then redirect to where you really want to go based on some arbitrary variable.
For example, instead of going to "/Prethors/Users/SearchUsers", go to "/AnonymousCall?code=5", from which you could execute the code you want for SearchUsers.
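A sketch of what the client side of that could look like; the /AnonymousCall endpoint and the code parameter are hypothetical, as above:
$("#txtSearch").keyup(function () {
    $.ajax({
        url: "/AnonymousCall",
        type: "POST",
        // The server maps code=5 back to the real SearchUsers action,
        // so the real URL never appears in the page source.
        data: { code: 5, text: this.value },
        success: function (result) {
            $("#searchResult").prepend("<p>" + result + "</p>");
        }
    });
});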
You can't hide client-side code. You can disguise it with minification, but sensitive data should always be stored and processed on the server side.
Use console.clear(); after your Ajax calls :P
It just clears the requests from the console, but you still cannot hide client-side calls.
The answer on Auto refreshing with Javascript? seemed like exactly what I needed, but after a while I found something not working like I wanted it to :( When I made a timer to check the web page every 5 seconds, it kept returning the same string even after the page changed. I think this is happening because it's doing the equivalent of F5: re-downloading the page only if the PHP script has been changed by me, and otherwise just handing my JavaScript what's in the browser's cache, if that makes any sense. The problem is, the page isn't actually being re-uploaded every five seconds; the reason the page would change is the database content the PHP is displaying. What I would like is a function similar to $.get that acts more like Ctrl+F5: not using any cache, just re-downloading the whole page. Sorry if this doesn't make any sense.
UPDATE: What I'm asking for is not a JavaScript script that Ctrl+F5's the page; what I'm asking for is a function like $.get that downloads from the server no matter what. See, $.get only downloads from the server if the page has been edited since some time x (and if it hasn't, it returns a copy of the page from the browser's cache), but I don't want that time check at all; I just want it to download the page regardless of its last-edited time.
I always throw in a useless query parameter like this:
$.get(url + '?v=' + Math.random(), function (data) {
    // stuff
});
That Math.random() basically tricks the browser into not caching the request, because the URL is different every time.
Use jQuery.ajax() and pass cache: false as one of the parameters.
For example, straight from the documentation:
$.ajax({
    url: "test.php",
    cache: false,
    success: function() {
        //whatever
    }
});
Source: http://api.jquery.com/jQuery.ajax/
The best solution I can imagine is to update/redirect to a page that redirects you back to the current one. I'm not sure if JavaScript has a Ctrl+F5 function.
If you're using jQuery, use the cache option with the .ajax call. Alternatively, append a date stamp to the URL to force the browser to fetch fresh content each time, as in the sketch below.
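For example, a date-stamp version of that idea (a sketch; url is a placeholder for your own endpoint):
// new Date().getTime() makes every request URL unique, so the browser
// can never satisfy the request from its cache.
$.get(url + '?_=' + new Date().getTime(), function (data) {
    // handle the fresh response
});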