Prevent browser from caching AJAX requests - javascript

I've set up an app and it works fantastically on Opera and Firefox, but on Google Chrome it caches the AJAX request and serves stale data!
http://gapps.qk.com.au is the app. When run in Chrome it doesn't even send the AJAX requests, but in any other browser it always performs the AJAX request and returns fresh data.
Is there any method (Apache/PHP/HTML/JS) to stop Chrome from behaving this way?
The AJAX call:
function sendAjax(action, domain, idelement) {
    // Create the variables
    var xmlhttp,
        tmp,
        params = "action=" + action
               + "&domain=" + encodeURIComponent(domain);
    xmlhttp = new XMLHttpRequest();
    // Check whether the AJAX request has completed successfully
    xmlhttp.onreadystatechange = function () {
        if (xmlhttp.readyState === 4 && xmlhttp.status === 200) {
            $('#' + idelement).html(xmlhttp.responseText);
        }
    };
    xmlhttp.open("GET", "ajax.php?" + params, true);
    xmlhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    //console.log(params);
    xmlhttp.send(params);
}
sendAjax('gapps','example.com','gapps');

The browser cache behaves differently depending on the browser and its settings, so you should not rely on the user's browser or configuration; it's even possible to make a browser ignore cache headers.
There are two ways to prevent caching:
--> Change the AJAX request to POST. Browsers don't cache POST requests.
--> The better way: add an extra parameter to your request containing the current timestamp or any other unique value.
params = "action=" + action
+ "&domain=" + encodeURIComponent(domain)
+ "&preventCache="+new Date();

Another alternative to the JavaScript solution is to send caching headers from the server.
In PHP it would look like this:
<?php
header("Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0");//Dont cache
header("Pragma: no-cache");//Dont cache
header("Expires: Thu, 19 Nov 1981 08:52:00 GMT");//Make sure it expired in the past (this can be overkill)
?>

I was using a jQuery ajax request when I ran into this problem.
According to the jQuery API, adding "cache: false" appends a timestamp, as explained in the accepted answer:
This only works with GET and HEAD requests, but if you're using POST the browser doesn't cache your ajax request anyway. There's a caveat for IE8; check it out in the link if needed.
$.ajax({
    url: "ajax.php", // whatever endpoint you are requesting (placeholder)
    type: "GET",
    cache: false
});

The line of code below worked for me.
$.ajaxSetup({ cache: false });
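For context, a small usage sketch (the endpoint, parameters and element ID are borrowed from the question above and are only placeholders): call $.ajaxSetup() once, and every subsequent GET-style jQuery request gets the timestamp parameter appended automatically.
// Disable AJAX caching globally, then issue requests as usual.
$.ajaxSetup({ cache: false });

$.get("ajax.php", { action: "gapps", domain: "example.com" }, function (html) {
    $("#gapps").html(html); // same target element as in the question
});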

Related

XMLHttpRequest strange on IE11, ok on Edge and Chrome

I try to do the following, but it doesn't work on IE11 in my case:
var request = new XMLHttpRequest();

function doStuff() {
    console.log("next request");
    doRequest();
}

function doRequest() {
    request.open("GET", "http://127.0.0.1/poll.php", true);
    request.onloadend = doStuff;
    request.send();
}

doRequest();
The PHP script poll.php sleeps for one second.
Now the point: Edge and Chrome issue roughly one request per second, but IE doesn't; it spams the log with 1000 requests per second and they don't even get executed.
If I remove the endless loop, IE performs a single request; with the endless loop, IE does nothing except spam the log.
I hope you can understand me and give me a hint on how to solve my problem.
Best regards.
It seems IE caches the request, so the second and subsequent requests finish immediately; after running the code once, even the first request would do so.
As a side note, I'm actually surprised other browsers don't cache XHR requests - at least, they don't seem to, if the question is an accurate reflection of browser behaviour!
One solution is to add a random "search" (or query) string to the URL
function doRequest() {
    request.open("GET", "http://127.0.0.1/poll.php?_" + Math.random(), true);
    //                                            ^^^^^^^^^^^^^^^^^^^
    request.onloadend = doStuff;
    request.send();
}
Another would be to have the server respond with headers that tell the browser "don't cache this" - in the case of PHP (because the request clearly is made to PHP):
header("Cache-Control: no-store, no-cache, must-revalidate, max-age=0");
header("Cache-Control: post-check=0, pre-check=0", false);
header("Pragma: no-cache");
(shamelessly "borrowed" from https://stackoverflow.com/a/13640164/5053002 - so if it's wrong don't blame me, I haven't used PHP in a few years)

Ajax request: Refused to set unsafe header

I am trying to play an audio using Google Text-To-Speech. Therefore I need to post a request to their endpoint with the Referer and the User-Agent properly set. This call should return an MP3 that I can play.
However, I get "Refused to set unsafe header" errors. This is my code; how can I make this work?
$.ajax({
    url: 'http://translate.google.com/translate_tts?ie=UTF-8&q=Hello&tl=en&client=t',
    beforeSend: function (xhr) {
        xhr.setRequestHeader("Referer", "http://translate.google.com/");
        xhr.setRequestHeader("User-Agent", "stagefright/1.2 (Linux;Android 5.0)");
    },
    success: function (data) {
        el.mp3 = new Audio(data);
        el.mp3.play();
    }
});
You can't. It is impossible.
The specification requires that the browser abort the setRequestHeader method if you try to set the Referer header (it used to be that User-Agent was also forbidden, but that has changed).
If you need to set Referer manually then you'll need to make the request from your server and not your visitor's browser.
(That said, if you need to be deceptive about the user agent or referer then you are probably trying to use the service in a fashion that the owner of it does not want, so you should respect that and stop trying).
Note that while jQuery wraps XHR, the same rules apply to fetch.
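As a rough sketch of the server-side approach (the /tts-proxy.php endpoint and its parameters are made up for illustration): the browser only talks to your own server, and the server-side script forwards the request to the third-party service with whatever Referer/User-Agent it needs.
// The browser never sets the restricted headers itself; the hypothetical
// proxy script adds them on the server side before forwarding the request.
$.ajax({
    url: "/tts-proxy.php",
    data: { q: "Hello", tl: "en" },
    success: function (data) {
        // handle the proxied response here
    }
});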
Empty Origin and Referer headers with GET XMLHttpRequest from <iframe>
Well actually, it is possible; at least for ordinary web pages.
The trick consists in injecting an XMLHttpRequest
function into an empty <iframe>.
The origin of an empty <iframe> happens to be about:blank, which results in empty Origin and Referer HTTP headers.
HTML:
<iframe id="iframe"></iframe>
JavaScript:
const iframe = document.getElementById('iframe');
const iframeWin = iframe.contentWindow || iframe;
const iframeDoc = iframe.contentDocument || iframeWin.document;

let script = iframeDoc.createElement('SCRIPT');
script.append(`function sendWithoutOrigin(url) {
    var request = new XMLHttpRequest();
    request.open('GET', url);
    request.onreadystatechange = function() {
        if (request.readyState === XMLHttpRequest.DONE) {
            if (request.status === 200) {
                console.log('GET succeeded.');
            }
            else {
                console.warn('GET failed.');
            }
        }
    };
    request.send();
}`);
iframeDoc.documentElement.appendChild(script);
JavaScript invocation:
var url = 'https://api.service.net/';
url += '?api_key=' + api_write_key;
url += '&field1=' + value;
iframeWin.sendWithoutOrigin(url);
Having the possibility of sending empty Origin and Referer HTTP headers is important to safeguard privacy when using third-party API services. There are instances where the originating domain name may reveal sensitive personal information; like being suggestive of a certain medical condition for example. Think in terms of https://hypochondriasis-support.org :-D
The code was tested by inspecting the requests in a .har file, saved from the Network tab in the F12 Developer View in Vivaldi.
No attempt at setting the User-Agent header was made. Please comment if this also works.
There are some headers whose value the browser does not allow a programmer to set from any JavaScript framework (jQuery, Angular, etc.) or from XMLHttpRequest when making an AJAX request. These are called the forbidden headers: Forbidden Header
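A quick illustration of that behaviour (not from the original answers; the URL and custom header name are placeholders): in Chrome/WebKit an attempt to set a forbidden header is simply ignored and a "Refused to set unsafe header" warning is logged, while an ordinary custom header is sent normally.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/ajax.php", true);
xhr.setRequestHeader("Referer", "http://example.com/"); // forbidden: ignored, warning logged
xhr.setRequestHeader("X-Custom-Header", "allowed");     // not forbidden: sent with the request
xhr.send();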

js php, ajax remembers previously loaded content

I have a weird issue and have been looking for a solution for a while with no result. I'm developing a website and decided to load every subpage (content) dynamically via AJAX (also the .js and .css files for each subpage). Now, when I'm working on it and change the scripts/CSS for some file and refresh the page, it doesn't load the changes. It's like AJAX remembers the previous version, because when I turn it all off and come back after a few hours it changes (!). Any ideas? I want to avoid the site remembering anything so I can work on it normally. I don't use jQuery; I use pure JS and my own function for the AJAX connection. Maybe I should add something there? Here it is:
function sendQuery(data) {
    var http = new XMLHttpRequest();
    var url = data["url"];
    var params = "";
    var query = data["params"];
    for (var key in query)
        params += key + "=" + query[key] + "&";
    params = params.substring(0, params.length - 1);
    http.open("POST", url, true);
    http.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    http.setRequestHeader("Content-length", params.length);
    http.setRequestHeader("Connection", "close");
    http.send(params);
    http.onreadystatechange = function () {
        if (http.readyState == 4 && http.status == 200) {
            data["result"](http.responseText);
        }
    };
}
I'm not entirely sure what you mean, but this sounds like a caching issue. Why not append the current time or a similar unique string to the request as a separate parameter? That level of uniqueness should prevent the response from being cached by the browser.
e.g. '?nocache='+new Date().getTime()
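A sketch of that applied to the sendQuery() function from the question (the endpoint and page parameter are made up; only the nocache entry matters): since the function builds the request body from the params object, adding a timestamp entry is enough to make every request unique.
var data = {
    url: "load_page.php",                                      // hypothetical endpoint
    params: { page: "about", nocache: new Date().getTime() },  // timestamp defeats caching
    result: function (html) { /* handle the response */ }
};
sendQuery(data);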
You may also want to prevent the response from being cached by setting the appropriate headers on the server side. The example below uses a generic servlet-style API; the PHP header() calls shown earlier do the same job:
response.setHeader( "Pragma", "no-cache" );
response.setHeader( "Cache-Control", "no-cache" );
response.setDateHeader( "Expires", 0 );
This seems highly related to your caching policy.
If you're hosting the site with Apache, check the .htaccess in your root directory; you might see something like this:
# Set up 2 Hour caching on commonly updated files
<FilesMatch "\.(xml|txt|html|js|css)$">
ExpiresDefault A7200
Header append Cache-Control "proxy-revalidate"
</FilesMatch>
The setting above sets the expiry time to 7200 seconds = 2 hours.
To disable caching during development:
# Force no caching for dynamic files
<FilesMatch "\.(xml|txt|html|js|css)$">
ExpiresActive Off
Header set Cache-Control "private, no-cache, no-store, proxy-revalidate, no-transform"
Header set Pragma "no-cache"
</FilesMatch>
Then it should work properly.
Another way is to change this line:
var url = data["url"];
To:
var url = data["url"]+"&ts="+(new Date().valueOf());
To avoid the cache. Note: it's just pseudocode; if the URL doesn't already have a query string, use "?" instead of "&" (see the sketch below).
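A small sketch of that "?" handling (the helper name is made up for illustration):
// Append a timestamp, choosing "?" or "&" depending on whether
// the URL already contains a query string.
function withTimestamp(url) {
    var sep = url.indexOf("?") === -1 ? "?" : "&";
    return url + sep + "ts=" + new Date().valueOf();
}

var url = withTimestamp(data["url"]);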
Hope it helps :)

XMLHttpRequest receiving no data or just "undefined"

I'm trying to make a Firefox add-on that runs an XMLHttpRequest in JavaScript. I want to get the data from this request and write it into *.body.innerHTML.
That's my code so far...
var xhr = new XMLHttpRequest();
xhr.open("GET", "http://xx.xxxxx.com", true);
xhr.send();
setTimeout(function() { set_body(xhr.responseHtml); }, 6000);
Instead of receiving the data, I get "undefined". If I change xhr.responseHtml to responseText I get nothing. I don't know why I'm getting nothing. I'm working on Ubuntu 12.04 LTS with Firefox 12.0.
If you need any more details on the script please ask!
Update:
The set_body function:
function set_body(body) {
    document.body.innerHTML = '';
    document.body.innerHTML = body;
    document.close();
}
Update SOLVED:
I had to set the request headers (right after xhr.open):
xhr.setRequestHeader("Host", "xxx");
For the following items: Host, Origin and Referer. So it seems there really was a problem with the same-origin policy.
But now it works! Thanks to all!
When you set the last param of open to true, you are asking for an asynchronous request. So you need to add a callback to xhr like so:
xhr.onreadystatechange = function () {
    // define what you want to happen when the server returns
};
That callback is invoked when the server responds. To test this without async, set the third param to false; then send() will block and wait until the response comes back. Setting an arbitrary timeout of 6 seconds is not the right way to handle this.
This code should work:
var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function () {
    if (xhr.readyState == 4) {
        set_body(xhr.responseText);
    }
};
xhr.open("GET", "http://xx.xxxxx.com", true);
xhr.send();
Make sure that you are getting a correct response from URL http://xx.xxxxx.com. You may have a problem with cross-domain calls. If you have a page at domain http://first.com and you try to do XMLHttpRequest from domain http://second.com, Firefox will fail silently (there will be no error message, no response, nothing). This is a security measure to prevent XSS (Cross-site scripting).
Anyway, if you do XMLHttpRequest from a chrome:// protocol, it is considered secure and it will work. So make sure you use this code and make the requests from your addon, not from your localhost or something like that.

Determining what jQuery .ajax() resolves a string of redirects to

I'm aware that redirects are followed automatically, and that I have little/no control over that process. This is fine, but I'm still very interested in where my request ultimately ends up. Is it possible to see what url my request finally ends up at?
I do not want to rely on the returned HTML itself to tell me where I am.
Sample Code:
var originalURL = '/this/will/be/redirected';
$.ajax({
    url: originalURL,
    dataType: "html",
    success: function (data, statusText, jqXHR) {
        var endPointURL = insertMagicHere();
        alert("Our query to " + originalURL + " ended up at " + endPointURL + "!");
    }
});
I'm looking around in jqXHR for it, but no luck so far. (Though, I'm new to all this, probably right under my nose)
So far as I know (and have tested), it's only possible to detect IF there has been a redirect and how many redirects were made (but not TO where).
You can have a look at my code:
var xhr = $.ajax({
    url: originalURL,
    dataType: "html",
    success: function (data, statusText, jqXHR) {
        console.log(data);
        console.log(statusText);
        console.log(jqXHR.getAllResponseHeaders());
    }
});
The jqXHR.getAllResponseHeaders() output on my dev machine looks like this:
Date: Fri, 05 Aug 2011 01:29:20 GMT
Server: ...
X-Powered-By: ...
Content-Length: 5
Keep-Alive: timeout=15, max=98
Connection: Keep-Alive
Content-Type: text/html
The Keep-Alive: timeout=15, max=98 header is worth a closer look. No redirect results in max=99, while ONE redirect results in max=98.
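If you wanted to read that value programmatically, a rough sketch (treat it as fragile: the Keep-Alive counter is connection- and server-specific, and many servers don't send it at all):
// Inside the success callback: pull the Keep-Alive header out of the
// response and read its max= value.
var keepAlive = jqXHR.getResponseHeader("Keep-Alive");   // e.g. "timeout=15, max=98"
var match = keepAlive && keepAlive.match(/max=(\d+)/);
var remaining = match ? parseInt(match[1], 10) : null;
// On the server above, 99 meant no redirect and 98 meant one redirect.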
XMLHttpRequest.responseXML is a document meaning that it has a baseURI property which will be the location that the data was downloaded from. The main problem is that responseXML will only be set if you get an XML document back. In Firefox using overrideMimeType() works, despite reporting a syntax error in the document:
var r = new XMLHttpRequest();
r.open("GET", "http://google.com");
r.overrideMimeType("text/xml");
r.onload = function () {
    alert(r.responseXML.baseURI);
};
r.send(null);
Unfortunately, in Chrome you need a real XML document, overrideMimeType() doesn't help. And MSIE doesn't even implement this method (which isn't a big deal given that determining document source there seems impossible).
I'm not sure what magic they're using server side, but www.longURL.com does what you're talking about.
Their code sends a request to their server.
They have a jQuery plugin: http://plugins.jquery.com/project/longurl
I'm not sure how to get the intermediate steps from it, but their website shows the redirects, so they must have figured out some way of doing it, which means they likely have some way of getting that data.
To use it you'll have to look at their jQuery plugin to figure out where they request the actual data.
EDIT
Allow me to correct that abysmally inadequate answer:
http://www.longURL.com has a service that expands shortened URLs.
Their main website (upon expanding a URL) tracks every redirect until you reach your final destination.
If you did whatever they were doing you might be able to do the same.
Unfortunately I don't know what they're doing (apart from sending their request to a server that probably listens specifically for 303s).
Their jQuery plugin may or may not be useful. If it exposes the redirects and you can figure out how to jimmy-rig the system, you might be able to get it through their service; otherwise you could create a shortened link to the initial link and get the results through their service anyway... sounds painful, but if you're unable or unwilling to do the server-side work, that's probably your best option.
