I need to perform semi-continuous AJAX requests to display data based on the latest entry in a DB. This all works fine with setInterval(), but now I notice the continuously increasing number of resources and the growing size in the Web Inspector (see image). I imagine this may become an issue if the app is left open for long periods of time? Or is the size displayed (1) merely network activity? How can I prevent this? I have already set the jQuery AJAX cache option to false.
Update:
I didn't post any code because there's nothing special there: just a basic jQuery AJAX function and a PHP script that queries the DB based on the data sent by the AJAX call and echoes the result back in the response.
So is the number of KB shown in the Web Inspector (1) network traffic, or cached data?
$(document).ready(function(){
    setInterval(refresh, 2000);
});

function refresh(){
    $.ajax({
        type: "POST",
        cache: false,
        url: "../update.php",
        data: dataString, // dataString is defined elsewhere
        success: function(msg){
            if(msg == 'same'){
                // do nothing
            }else{
                $('#result').html(msg);
            }
        }
    });
}
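As an aside on long-lived polling pages: one way to keep requests from piling up when a response is ever slower than the interval is to chain setTimeout from the request's completion instead of using setInterval. A sketch based on the code above (dataString is assumed to be defined elsewhere, as in the question):

function refresh() {
    $.ajax({
        type: "POST",
        cache: false,
        url: "../update.php",
        data: dataString, // assumed to be defined elsewhere, as in the question
        success: function(msg) {
            if (msg !== 'same') {
                $('#result').html(msg);
            }
        },
        complete: function() {
            // Schedule the next poll only after this one has finished,
            // so slow responses never overlap.
            setTimeout(refresh, 2000);
        }
    });
}

$(document).ready(refresh);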
I am rendering some stats on my page. As this takes a bit of time, I made this request an AJAX call that runs after the page loads:
<script type="text/javascript">
    $(document).ready(function () {
        $.ajax({
            url: '#Url.RouteUrl(Routes.MyAds.AjxCallFoAbc, new {advertId = Model.CreateAdvertHeader.SelectedAdvert.Id})',
            type: 'GET',
            cache: false,
            success: function (response) {
                $('.advert-performance').replaceWith(response);
            }
        });
    });
</script>
This works perfectly for me, but it causes grief when the user has an ad-blocker installed: the content is blocked. I debugged the code-base and found that the AJAX route is never hit when the ad-blocker is enabled in the browser.
What is the work-around for this? I need to show the stats even if an ad-blocker is installed.
Resolved it.
The reason: the route the AJAX call pointed to had advert-stats as part of the URL, which caused the blocker to filter it out. Simply changing the route fixed it.
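As a sketch of the fix (the new route name below is hypothetical), the idea is simply to keep ad-related keywords such as "advert" out of the URL so it no longer matches the blocker's filter lists:

$(document).ready(function () {
    $.ajax({
        // Previously this resolved to something like '/advert-stats/123', which
        // ad-blocker filter lists match on. The neutral name below is hypothetical.
        url: '/performance-summary/123',
        type: 'GET',
        cache: false,
        success: function (response) {
            $('.advert-performance').replaceWith(response);
        }
    });
});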
I've searched SO, and every question seems to be asking how to wait for an AJAX call to complete. I am asking how not to wait for an AJAX call to complete.
I have an e-commerce site that does some heavy image manipulation via PHP. The manipulation is triggered via AJAX calls.
I am trying to speed up the user experience, so I have modified the code to first have PHP render a thumbnail (the operation completes quickly), then trigger a second AJAX call that tells PHP to render the full image (which can take a few minutes).
The "Add To Cart" operation is also an AJAX call. The problem is that the "Add to Cart" call is unable to complete until the previous AJAX call is completed.
Sequence:
1. AJAX call A requests that the thumbnail be generated.
2. The success callback for call A:
   a. Displays/enables the "Add to Cart" button.
   b. Triggers AJAX call B, for the full image to be generated.
3. Clicking "Add to Cart" triggers AJAX call C, which does not complete until call B completes.
Relevant JavaScript:
/** Call A - make call for thumbnail generation **/
$.ajax({
    url: 'index.php?route=decorable/image/thumbnail',
    type: 'post',
    data: image_data,
    dataType: 'json',
    success: function(json) {
        if (json['success']) {
            $('#button-cart').show();
            /** Call B - make call for *full* layout generation **/
            $.ajax({
                url: 'index.php?route=decorable/image',
                type: 'post',
                data: image_data,
                dataType: 'json'
            });
        }
    }
});
/** Call C - this AJAX call starts, but does not complete, until call B completes **/
$('#button-cart').click(function() {
    $.ajax({
        url: 'index.php?route=checkout/cart/add',
        type: 'post',
        data: cart_data,
        dataType: 'json',
        success: function(json) {
            if (json['success']) {
                $('.success').fadeIn('slow');
            }
        }
    });
});
Am I misunderstanding, or should call C be able to complete even if call B is not complete?
If you believe that the PHP is holding up the experience, is there a good way for me to trigger PHP to begin executing, but still send the response back for the thumbnail AJAX call?
In my experience this is caused by PHP sessions being used. When you're sure that the code (in all AJAX requests) will not need to modify the session, you should call:
session_write_close()
This will then allow the other requests to be handled simultaneously.
Further reading: http://www.php.net/manual/en/function.session-write-close.php
Most browsers cap the number of concurrent requests to the same host (historically as few as two, around six in modern browsers). You can do nothing about that limit itself.
Be aware that applying session_write_close() alone (Jono20201's answer) may not resolve the problem if you have output buffering enabled (the default in PHP 7+). You have to set output_buffering = Off in php.ini, otherwise the session won't be closed immediately.
Thanks in advance for any help.
Is it bad practice and/or inefficient to use multiple $.ajax calls in one JavaScript function? I've been working on one and have a simple testing environment set up on my computer (an Apache server with PHP/MySQL), and I've noticed that the server will crash (and restart) if I have multiple AJAX calls.
I currently have two AJAX calls: one passes 4 pieces of data to the PHP file and returns about 3 lines of code (pulling info from my MySQL database); the other simply gets the total number of rows from the table I'm working with and assigns that number to a JavaScript variable.
Is it just that my basic testing setup is too weak, or am I doing something wrong? See below for the two ajax calls I'm using:
$.ajax({
    type: "GET",
    url: "myURLhere.php",
    cache: false,
    data: { img: imageNumber, cap: imageNumber, width: dWidth, height: dHeight }
})
.done(function(htmlpic) {
    $("#leftOne").html(htmlpic);
});

$.ajax({
    type: "GET",
    url: "myotherURLhere.php",
    cache: false,
    success: function(data) {
        lastImage = data;
    }
});
Short answer: two AJAX requests on a page are absolutely fine.
Longer answer:
You have to find a balance between minimising the number of AJAX calls to the backend (reducing traffic and communication overhead) and keeping a maintainable architecture (so don't pass dozens of parameters in one call to retrieve everything at once, unless you collect the parameters to send in a well-designed way).
Also, most likely there's something wrong with your backend setup; try looking into the web server logs.
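If the two requests really belong together, a minimal sketch (reusing the URLs and variables from the question) is to fire them in parallel and handle both results in one place with $.when, so a slow response in one doesn't leave the page half-updated:

// Start both requests; each $.ajax call returns a jqXHR (a promise).
var picRequest = $.ajax({
    type: "GET",
    url: "myURLhere.php",
    cache: false,
    data: { img: imageNumber, cap: imageNumber, width: dWidth, height: dHeight }
});

var countRequest = $.ajax({
    type: "GET",
    url: "myotherURLhere.php",
    cache: false
});

// Runs once both requests have succeeded; with multiple jqXHRs, each
// argument is an array of [data, statusText, jqXHR].
$.when(picRequest, countRequest).done(function (picResult, countResult) {
    $("#leftOne").html(picResult[0]);
    lastImage = countResult[0];
});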
I want to execute my AJAX POST only one time. I'm trying to prevent the user from refreshing the page and executing the AJAX POST over and over again.
I thought about creating a cookie, but I'm not sure how. Does anybody know how to do this?
This is my jQuery:
var t = jQuery.noConflict();
t(document).ready(function() {
    t.cookie("example", "foo", { expires: 7 }); // Sample 2
    console.log("ready!");
    alert(t.cookie("example"));
    var data = '<?php echo json_encode($json_full);?>';
    t.ajax({
        url: 'my api url',
        type: 'POST',
        success: function(r) { alert(JSON.stringify(r)); },
        dataType: 'JSON',
        data: { data: data }
    });
});
I need to run this AJAX call only one time because this is a checkout page that sends the order; if I refresh the page, it sends the same order every time, and I don't want that.
Thanks a lot!
Things like these cannot be safely controlled in the client's browser. Any user with minimal knowledge of JavaScript will be able to open up their browser's developer tools and manipulate the code or any values you might have stored (such as deleting the cookie you have set).
This limitation should be implemented on the server.
It really depends on the scope of your application. You might be able to limit the requests per IP address, but that might prevent multiple people from the same office for example loading the page at the same time.
Using user authentication and persistent server storage you'll be able to limit the effect of the request, but you probably won't be able to prevent the actual request from being sent as anyone can make that request even from outside the browser. You could store the user_id of the user that initiated the request and only allow the resulting action to occur if a certain time has passed since the last request.
A better solution to avoid double submits is to use a POST request for the submit and let the server respond with a redirect to a normal (harmless) receipt/thank-you page.
Then if the user refreshes the receipt page, they will simply repeat the GET request for the receipt page, not the POST.
You should still add some server-side checks to avoid multiple POST requests somehow (using sessions, timestamps or something), in case a malicious user deliberately tries to resubmit.
This will only work on IE8 and above, but you can use localStorage:
var t = jQuery.noConflict();
t(document).ready(function() {
    t.cookie("example", "foo", { expires: 7 }); // Sample 2
    console.log("ready!");
    alert(t.cookie("example"));
    if (localStorage['submitted'] === undefined) {
        var data = '<?php echo json_encode($json_full);?>';
        t.ajax({
            url: 'my api url',
            type: 'POST',
            success: function(r) {
                localStorage['submitted'] = true;
                alert(JSON.stringify(r));
            },
            dataType: 'JSON',
            data: { data: data }
        });
    }
});
This way, the first time it will run the AJAX call because the localStorage flag hasn't been set; upon success you set it, so it will not resubmit on a page refresh.
If you want the ability to send again on a future visit, just use sessionStorage instead of localStorage. The syntax is exactly the same.
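For completeness, a minimal sketch of the same guard using sessionStorage (the flag disappears when the browser session ends), keeping the placeholder 'my api url' endpoint from the question:

t(document).ready(function() {
    // Only submit once per browser session.
    if (sessionStorage.getItem('submitted') === null) {
        var data = '<?php echo json_encode($json_full);?>';
        t.ajax({
            url: 'my api url',
            type: 'POST',
            dataType: 'JSON',
            data: { data: data },
            success: function(r) {
                sessionStorage.setItem('submitted', 'true');
                alert(JSON.stringify(r));
            }
        });
    }
});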
I have a Python script that performs around 8 or 9 specific steps. These steps are logged to a file. For the web GUI to display status changes or error messages, I am using the jQuery PeriodicalUpdater plugin with the script below.
I need both parts to run simultaneously, so that as the value in the file changes it gets polled and displayed.
Please find my jQuery code below.
Note that the URL "/primary_call/" takes around two and a half minutes to execute. The problem is that async: false is not working: the browser waits for the 2.5 minutes and only then moves on to the next part.
I tried Firefox and Chrome and get the same result.
When I call the URL from another browser tab, it works perfectly, but I am unable to run both script components simultaneously when calling from the same page.
What should I do so that the browser initiates "/primary_call/", which runs a Python script in the background, while at the same time moving ahead to the PeriodicalUpdater portion?
$(document).ready(function() {
    $.ajax({
        type: 'GET', // Or any other HTTP Verb (Method)
        url: '/primary_call/',
        async: false,
        success: function(r) {
            return false;
        },
        error: function(e) {
        }
    });

    $.PeriodicalUpdater({
        url: '/static/12.txt',
        method: 'post',
        maxTimeout: 6000
    }, function(data) {
        var myHtml = data + ' <br />';
        $('#results').append(myHtml);
    });
});
Setting async: false means you are making the request synchronous, so the browser will hang on it until it is finished; it can't move on to your other method. Removing that option will make the call asynchronous (which it is by default, as it should be), at which point the browser can run the requests concurrently without blocking the page.
In short, remove async: false.
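A sketch of the corrected setup, which is simply the question's code with async: false removed so the long-running call no longer blocks the PeriodicalUpdater polling:

$(document).ready(function() {
    // Kick off the long-running call; asynchronous by default, so the page is not blocked.
    $.ajax({
        type: 'GET',
        url: '/primary_call/',
        success: function(r) {
            // Optionally react when the Python script finally finishes.
        },
        error: function(e) {
            // Optionally report the failure somewhere visible.
        }
    });

    // Meanwhile, poll the log file and append new status lines as they arrive.
    $.PeriodicalUpdater({
        url: '/static/12.txt',
        method: 'post',
        maxTimeout: 6000
    }, function(data) {
        $('#results').append(data + ' <br />');
    });
});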