How to get progress from XMLHttpRequest - javascript

Is it possible to get the progress of an XMLHttpRequest (bytes uploaded, bytes downloaded)?
This would be useful to show a progress bar when the user is uploading a large file. The standard API doesn't seem to support it, but maybe there's some non-standard extension in any of the browsers out there? It seems like a pretty obvious feature to have after all, since the client knows how many bytes were uploaded/downloaded.
Note: I'm aware of the "poll the server for progress" alternative (it's what I'm doing right now). The main problem with this (other than the complicated server-side code) is that typically, while uploading a big file, the user's connection is completely hosed, because most ISPs offer poor upstream bandwidth. So making extra requests is not as responsive as I'd hoped. I was hoping there'd be a way (maybe non-standard) to get this information, which the browser has at all times.

For the bytes uploaded it is quite easy. Just monitor the xhr.upload.onprogress event. The browser knows the size of the files it has to upload and the size of the uploaded data, so it can provide the progress info.
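For example, a minimal upload-side sketch (the 'upload.php' endpoint and the form element are just placeholders):
var xhr = new XMLHttpRequest();
xhr.upload.onprogress = function (evt) {
    if (evt.lengthComputable) {
        // evt.loaded = bytes sent so far, evt.total = total bytes to send
        var percent = Math.round((evt.loaded / evt.total) * 100);
        console.log('uploaded: ' + percent + '%');
    }
};
xhr.open('POST', 'upload.php', true); // hypothetical upload endpoint
xhr.send(new FormData(document.getElementById('uploadForm'))); // hypothetical form holding the file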
For the bytes downloaded (when getting the data with xhr.responseText), it is a little more difficult, because the browser doesn't know how many bytes will be sent in the server response. The only thing the browser knows in this case is the size of the bytes it has received so far.
There is a solution for this: it's sufficient to set a Content-Length header in the server script, so the browser knows the total number of bytes it is going to receive.
For more go to https://developer.mozilla.org/en/Using_XMLHttpRequest .
Example:
My server script reads a zip file (it takes 5 seconds):
<?php
$filesize = filesize('test.zip');
header("Content-Length: " . $filesize); // set the Content-Length header
// if this header is not set then evt.lengthComputable is false and evt.total will be 0
readfile('test.zip');
exit;
Now I can monitor the download process of the server script, because I know its total length:
function updateProgress(evt)
{
    if (evt.lengthComputable)
    {
        // evt.loaded: the bytes the browser received
        // evt.total: the total bytes set by the Content-Length header
        // jQuery UI progress bar to show the progress on screen
        var percentComplete = (evt.loaded / evt.total) * 100;
        $('#progressbar').progressbar("option", "value", percentComplete);
    }
}
function sendreq(evt)
{
    var req = new XMLHttpRequest();
    $('#progressbar').progressbar();
    req.onprogress = updateProgress;
    req.open('GET', 'test.php', true);
    req.onreadystatechange = function (aEvt) {
        if (req.readyState == 4)
        {
            // run any callback here
        }
    };
    req.send();
}

Firefox supports XHR download progress events.
EDIT 2021-07-08 10:30 PDT
The above link is dead. Doing a search on the Mozilla WebDev site turned up the following link:
https://developer.mozilla.org/en-US/docs/Web/API/ProgressEvent
It describes how to use the progress event with XMLHttpRequest and provides an example. I've included the example below:
var progressBar = document.getElementById("p"),
    client = new XMLHttpRequest()
client.open("GET", "magical-unicorns")
client.onprogress = function(pe) {
    if (pe.lengthComputable) {
        progressBar.max = pe.total
        progressBar.value = pe.loaded
    }
}
client.onloadend = function(pe) {
    progressBar.value = pe.loaded
}
client.send()
I also found this link, which is what I think the original link pointed to.
https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/progress_event

One of the most promising approaches seems to be opening a second communication channel back to the server to ask it how much of the transfer has been completed.

For the total uploaded there doesn't seem to be a way to handle that, but there's something similar to what you want for download. Once readyState is 3, you can periodically query responseText to get all the content downloaded so far as a String (this doesn't work in IE), up until all of it is available at which point it will transition to readyState 4. The total bytes downloaded at any given time will be equal to the total bytes in the string stored in responseText.
For an all-or-nothing approach to the upload question: since you have to pass a string for upload (and it's possible to determine the total bytes of that), the total bytes sent for readyState 0 and 1 will be 0, and the total for readyState 2 will be the total bytes in the string you passed in. The total bytes both sent and received in readyState 3 and 4 will be the sum of the bytes in the original string plus the total bytes in responseText.
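A minimal sketch of the responseText approach described above (the URL is a placeholder, and the character count only approximates the byte count for non-ASCII responses):
var xhr = new XMLHttpRequest();
xhr.open('GET', '/some/large/resource', true); // placeholder URL
xhr.onreadystatechange = function () {
    if (xhr.readyState == 3) {
        // content received so far; responseText keeps growing as data arrives
        // (reading responseText during readyState 3 does not work in IE)
        console.log('received so far: ' + xhr.responseText.length + ' characters');
    } else if (xhr.readyState == 4) {
        console.log('done, total: ' + xhr.responseText.length + ' characters');
    }
};
xhr.send();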

<!DOCTYPE html>
<html>
<body>
<p id="demo">result</p>
<button type="button" onclick="get_post_ajax();">Change Content</button>
<script type="text/javascript">
function update_progress(e)
{
  if (e.lengthComputable)
  {
    var percentage = Math.round((e.loaded / e.total) * 100);
    console.log("percent " + percentage + '%');
  }
  else
  {
    console.log("Unable to compute progress information since the total size is unknown");
  }
}
function transfer_complete(e) { console.log("The transfer is complete."); }
function transfer_failed(e) { console.log("An error occurred while transferring the file."); }
function transfer_canceled(e) { console.log("The transfer has been canceled by the user."); }
function get_post_ajax()
{
  var xhttp;
  if (window.XMLHttpRequest) { xhttp = new XMLHttpRequest(); } // code for modern browsers
  else { xhttp = new ActiveXObject("Microsoft.XMLHTTP"); }     // code for IE6, IE5
  xhttp.onprogress = update_progress;
  xhttp.addEventListener("load", transfer_complete, false);
  xhttp.addEventListener("error", transfer_failed, false);
  xhttp.addEventListener("abort", transfer_canceled, false);
  xhttp.onreadystatechange = function()
  {
    if (xhttp.readyState == 4 && xhttp.status == 200)
    {
      document.getElementById("demo").innerHTML = xhttp.responseText;
    }
  };
  xhttp.open("GET", "http://it-tu.com/ajax_test.php", true);
  xhttp.send();
}
</script>
</body>
</html>

If you have access to your Apache install and trust third-party code, you can use the Apache upload progress module (if you use Apache; there's also an nginx upload progress module).
Otherwise, you'd have to write a script that you can hit out of band to request the status of the file (checking the filesize of the tmp file for instance).
There's some work going on in Firefox 3, I believe, to add upload progress support to the browser, but that's not going to get into all the browsers and be widely adopted for a while (more's the pity).

The only way to do that with pure javascript is to implement some kind of polling mechanism.
You will need to send ajax requests at fixed intervals (each 5 seconds for example) to get the number of bytes received by the server.
A more efficient way would be to use Flash. The Flex FileReference component periodically dispatches a 'progress' event holding the number of bytes already uploaded.
If you need to stick with javascript, bridges are available between ActionScript and javascript.
The good news is that this work has already been done for you :)
swfupload
This library allows you to register a javascript handler on the Flash progress event.
This solution has the huge advantage of not requiring additional resources on the server side.

Related

How to replace ajax with webrtc data channel

** JAVASCRIPT question **
I regularly use ajax via XMLHttpRequest. But in one case, I need one ajax call per second...
Long term, and with a growing number of simultaneous users, it could easily bloat...
I'm reading about the webRTC data channel and it seems interesting and promising.
Here is my working AJAX function, as an example of how few lines of code it takes to communicate from the browser to the server and vice versa:
function xhrAJAX(divID, param2) {
    // random value for each call to avoid cache
    var pcache = (Math.floor(Math.random() * 100000000) + 1);
    // parameters
    var params = "divID=" + encodeURIComponent(divID) + "&param2=" + encodeURIComponent(param2);
    // setup XMLHttpRequest with pcache
    var xhr = new XMLHttpRequest();
    xhr.open("POST", "/file.php?pcache=" + pcache, true);
    // setup headers
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    // prepare onready scripts
    xhr.onreadystatechange = function(e) {
        if (xhr.readyState == 4) { $("#" + divID).html(e.currentTarget.responseText); }
    };
    // send the ajax call
    xhr.send(params);
}
How can I "transpose" or "convert" this ajax workflow into a webRTC data channel ? in order to avoid to setup a setInterval 1000...
Note: I mean how to replace the javascript portion of the code. PHP here is only to illustrate, I don't want to do a webRTC via PHP...
Is there a simple, few-lines-of-code way to push/receive data like this ajax function does?
The answer I'm looking for is more like a simple function to push and receive (see the rough sketch further down)
(once the connection with STUN, ICE, TURN is established and working...).
If I need to include a javascript library like jQuery or the equivalent for webRTC, I welcome any good and simple solution.
*** The main goal is this kind of scenario :
I have a webapp: users on desktop and users within a webview on Android and iOS.
Right now I have this workflow => ajax every 3 seconds to "tell" the main database that the user is still active and using the browser (or the app).
But I'd like to replace it with something like this: when the user uses the browser => keep a webRTC data channel open in the background between the browser and the server.
From what I've read on the web, I think webRTC is a better solution than websockets.
** I did a bit of searching and found PeerJS....
https://github.com/jmcker/Peer-to-Peer-Cue-System/blob/main/send.html
I'll do some testing, but in the meantime, if someone can throw out ideas, it could be fun.
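For anyone wondering what I mean, here is roughly the kind of push/receive sketch I'm after (it assumes the RTCPeerConnection pc is already connected, i.e. signaling and STUN/ICE are done, for instance through a library like PeerJS; the channel label and element id are made up):
var channel = pc.createDataChannel("status"); // "status" is an arbitrary label
channel.onopen = function () {
    // push data to the other side, roughly like xhr.send(params) in the function above
    channel.send(JSON.stringify({ divID: "myDiv", param2: "some value" }));
};
channel.onmessage = function (event) {
    // receive data pushed from the other side, without any setInterval polling
    document.getElementById("myDiv").innerHTML = event.data;
};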
Cheers

JavaScript: Effects of multiple AJAX in modern browsers [duplicate]

In Firefox 3, the answer is 6 per domain: as soon as a 7th XmlHttpRequest (on any tab) to the same domain is fired, it is queued until one of the other 6 finishes.
What are the numbers for the other major browsers?
Also, are there ways around these limits without having my users modify their browser settings? For example, are there limits to the number of jsonp requests (which use script tag injection rather than an XmlHttpRequest object)?
Background: My users can make XmlHttpRequests from a web page to the server, asking the server to run ssh commands on remote hosts. If the remote hosts are down, the ssh command takes a few minutes to fail, eventually preventing my users from performing any further commands.
One trick you can use to increase the number of concurrent connections is to host your images on a different subdomain. These will be treated as separate hosts, and the concurrent-connection limit is applied per domain.
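As a rough sketch of the idea (the subdomain names are made up, and for XHR - unlike images - the extra subdomains would also need to permit cross-origin requests, e.g. via CORS):
var hosts = ["a1.example.com", "a2.example.com", "a3.example.com"]; // hypothetical CNAMEs of the same server
function spreadRequest(i, path, callback) {
    var xhr = new XMLHttpRequest();
    // the per-host connection limit applies to each hostname separately
    xhr.open("GET", "//" + hosts[i % hosts.length] + path, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState == 4 && xhr.status == 200) callback(xhr.responseText);
    };
    xhr.send();
}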
IE6 and IE7 have a limit of two. IE8 allows 6 if you have broadband, 2 if it's dial-up.
The network results at Browserscope will give you both Connections per Hostname and Max Connections for popular browsers. The data is gathered by running tests on users "in the wild," so it will stay up to date.
With IE6 / IE7 one can tweak the number of concurrent requests in the registry. Here's how to set it to four each.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
"MaxConnectionsPerServer"=dword:00000004
"MaxConnectionsPer1_0Server"=dword:00000004
I just checked with www.browserscope.org and with IE9 and Chrome 24 you can have 6 concurrent connections to a single domain, and up to 17 to multiple ones.
According to IE 9 – What’s Changed? on the HttpWatch blog, IE9 still has a 2 connection limit when over VPN.
Using a VPN Still Clobbers IE 9 Performance
We previously reported about the scaling back of the maximum number of concurrent connections in IE 8 when your PC uses a VPN connection. This happened even if the browser traffic didn't go over that connection.
Unfortunately, IE 9 is affected by VPN connections in the same way:
I have written a single-file AJAX tester. Enjoy it!!!
Just because I have had problems with my hosting provider
<?php /*
Author: Luis Siquot
Purpose: Check ajax performance and errors
License: GPL
site5: Please don't drop json requests (nor delay)!!!!
*/
$r = (int)$_GET['r'];
$w = (int)$_GET['w'];
if($r) {
sleep($w);
echo json_encode($_GET);
die ();
} //else
?><head>
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script type="text/javascript">
var _settimer;
var _timer;
var _waiting;
$(function(){
clearTable();
$('#boton').bind('click', donow);
})
function donow(){
var w;
var estim = 0;
_waiting = $('#total')[0].value * 1;
clearTable();
for(var r=1;r<=_waiting;r++){
w = Math.floor(Math.random()*6)+2;
estim += w;
dodebug({r:r, w:w});
$.ajax({url: '<?php echo $_SERVER['SCRIPT_NAME']; ?>',
data: {r:r, w:w},
dataType: 'json', // 'html',
type: 'GET',
success: function(CBdata, status) {
CBdebug(CBdata);
}
});
}
doStat(estim);
timer(estim+10);
}
function doStat(what){
$('#stat').replaceWith(
'<table border="0" id="stat"><tr><td>Request Time Sum=<th>'+what+
'<td> /2=<th>'+Math.ceil(what/2)+
'<td> /3=<th>'+Math.ceil(what/3)+
'<td> /4=<th>'+Math.ceil(what/4)+
'<td> /6=<th>'+Math.ceil(what/6)+
'<td> /8=<th>'+Math.ceil(what/8)+
'<td> (seconds)</table>'
);
}
function timer(what){
if(what) {_timer = 0; _settimer = what;}
if(_waiting==0) {
$('#showTimer')[0].innerHTML = 'completed in <b>' + _timer + ' seconds</b> (aprox)';
return ;
}
if(_timer<_settimer){
$('#showTimer')[0].innerHTML = _timer;
setTimeout("timer()",1000);
_timer++;
return;
}
$('#showTimer')[0].innerHTML = '<b>don\'t wait any more!!!</b>';
}
function CBdebug(what){
_waiting--;
$('#req'+what.r)[0].innerHTML = 'x';
}
function dodebug(what){
var tt = '<tr><td>' + what.r + '<td>' + what.w + '<td id=req' + what.r + '> '
$('#debug').append(tt);
}
function clearTable(){
$('#debug').replaceWith('<table border="1" id="debug"><tr><td>Request #<td>Wait Time<td>Done</table>');
}
</script>
</head>
<body>
<center>
<input type="button" value="start" id="boton">
<input type="text" value="80" id="total" size="2"> concurrent json requests
<table id="stat"><tr><td> </table>
Elapsed Time: <span id="showTimer"></span>
<table id="debug"></table>
</center>
</body>
Edit:
r means row and w waiting time.
When you initially press the start button, 80 (or any other number of) concurrent ajax requests are launched by javascript, but as is known they are queued by the browser. They are also requested from the server in parallel (limited to a certain number, which is the subject of this question). Here the requests are resolved server side with a random delay (established by w). At start time, the total time needed to resolve all ajax calls is calculated. When the test is finished, you can see whether it took a half, a third, a quarter, etc. of that total time, from which you can deduce the degree of parallelism of the calls to the server. This is not strict, nor precise, but it is nice to see in real time how the ajax calls complete (seeing the crosses come in). And it is a very simple, self-contained script to show ajax basics.
Of course, this assumes that the server side is not introducing any extra limit.
Preferably use in conjunction with firebug net panel (or your browser's equivalent)
Wrote my own test. I tested the code on stackoverflow and it works fine; it tells me that Chrome/FF can do 6.
var change = 0;
var simultanius = 0;
var que = 20; // number of tests

Array(que).join(0).split(0).forEach(function(a, i) {
    var xhr = new XMLHttpRequest;
    xhr.open("GET", "/?" + i); // cacheBust
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 2) {
            change++;
            simultanius = Math.max(simultanius, change);
        }
        if (xhr.readyState == 4) {
            change--;
            que--;
            if (!que) {
                console.log(simultanius);
            }
        }
    };
    xhr.send();
});
It works for most websites that can trigger the readystatechange event at different times (aka flushing).
I noticed on my node.js server that I had to output at least 1025 bytes to trigger the event/flush. Otherwise the events would just fire for all three states at once when the request completes, so here is my backend:
var app = require('express')();

app.get("/", function(req, res) {
    res.write(Array(1025).join("a"));
    setTimeout(function() {
        res.end("a");
    }, 500);
});

app.listen(80);
Update
I noticed that you can now have up to 2x the requests if you are using both the xhr and fetch API at the same time:
var change = 0;
var simultanius = 0;
var que = 30; // number of tests

Array(que).join(0).split(0).forEach(function(a, i) {
    fetch("/?b" + i).then(r => {
        change++;
        simultanius = Math.max(simultanius, change);
        return r.text();
    }).then(r => {
        change--;
        que--;
        if (!que) {
            console.log(simultanius);
        }
    });
});

Array(que).join(0).split(0).forEach(function(a, i) {
    var xhr = new XMLHttpRequest;
    xhr.open("GET", "/?a" + i); // cacheBust
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 2) {
            change++;
            simultanius = Math.max(simultanius, change);
        }
        if (xhr.readyState == 4) {
            change--;
            que--;
            if (!que) {
                document.body.innerHTML = simultanius;
            }
        }
    };
    xhr.send();
});
I believe there is a maximum number of concurrent http requests that browsers will make to the same domain, which is in the order of 4-8 requests depending on the user's settings and browser.
You could set up your requests to go to different domains, which may or may not be feasible. The Yahoo guys did a lot of research in this area, which you can read about (here). Remember that every new domain you add also requires a DNS lookup. The YSlow guys recommend between 2 and 4 domains to achieve a good compromise between parallel requests and DNS lookups, although this is focusing on the page's loading time, not subsequent AJAX requests.
Can I ask why you want to make so many requests? There are good reasons for the browsers limiting the number of requests to the same domain. You will be better off bundling requests if possible.
A good reason to move to HTTP/2:
With HTTP/2 the maximum number of connections per host is virtually unlimited. See: Is the per-host connection limit raised with HTTP/2?

Server doesn't respond to an xmlHTTP request using the get method

I'm doing a project with Arduino in which I send different requests from a webpage to the server (the Arduino board) using XMLHttpRequest and the GET method. Except for one of the requests, they are used only for sending orders to the server, so I don't expect an XML response. The remaining one is a request sent at intervals of 5 seconds to get different values from the server.
The problem is with this last one. The webpage does send the request every 5 seconds (I can see it in the browser console and the Arduino serial monitor), but it doesn't get anything back: just the headers of the answer confirming the response, but nothing of the XML file. Surprisingly, when I make an equivalent GET request by typing the URL into the browser, I instantly get the XML file with the values, and that happens every time I do it.
Here is the javascript code I'm using on the webpage:
setInterval(function tiempo()
{
    var request = new XMLHttpRequest();
    request.onreadystatechange = function()
    {
        if (this.readyState == 4) {
            if (this.status == 200) {
                if (this.responseXML != null) {
                    // extract XML data from XML file (containing switch states and analog value)
                    document.getElementById("input1").innerHTML = this.responseXML.getElementsByTagName('dato')[0].childNodes[0].nodeValue;
                    document.getElementById("input2").innerHTML = this.responseXML.getElementsByTagName('dato')[1].childNodes[0].nodeValue;
                    document.getElementById("input3").innerHTML = this.responseXML.getElementsByTagName('dato')[2].childNodes[0].nodeValue;
                    document.getElementById("input4").innerHTML = this.responseXML.getElementsByTagName('dato')[3].childNodes[0].nodeValue;
                    document.getElementById("input5").innerHTML = this.responseXML.getElementsByTagName('dato')[4].childNodes[0].nodeValue;
                    document.getElementById("input6").innerHTML = this.responseXML.getElementsByTagName('dato')[5].childNodes[0].nodeValue;
                    document.getElementById("input7").innerHTML = this.responseXML.getElementsByTagName('dato')[6].childNodes[0].nodeValue;
                }
            }
        }
    };
    request.open("GET", "URL" + Math.random(), true);
    request.send(null);
}
, 5000);
On the other hand, if I just type the URL into the browser, I get the XML without any problem.
One last thing: right now I'm using a webpage stored on my computer, but before, I was using a webpage stored on the Arduino (on an SD card) and also served over the network by the Arduino. In that case the same code worked perfectly. The reason I changed it is that Arduino Ethernet is not very fast and it took too much time. With the webpage stored on my computer it goes faster, because the Arduino only needs to send the orders.
Thanks!!
Finally, I figured out the problem. It is the browser. For some reason only Internet Explorer works correctly with the webpage. Neither Firefox nor other web browsers get the XML file. I don't know the reason, but I would like to find it.
If someone knows something about it, I would be glad to try to resolve the problem.
Thanks!!

How to show loading status in percentage for ajax response?

I want to show the user percentage of the ajax response loaded with a progressbar.
Is there a way to achieve it?
Right now I am showing just an image.
Here is my code sample :
$('#loadingDiv').show();
$.ajax({
type : 'Get',
url : myUrl,
success : function(response) {
$('#loadingDiv').hide();
populateData(response);
},
error: function(x, e) {
$('#loadingDiv').hide();
if (x.status == 500 || x.status == 404) {
alert("no data found");
}
}
});
HTML code:
<div id="loadingDiv">
<img src="loading-img.png"/>
</div>
There are two ways to show real percentage. Briefly...
One - old-school native JavaScript or jQuery ajax, for which you need server support as well: a different URL that can give you updates, and you keep hitting that URL on an interval.
Two - modern native JavaScript in HTML5 browsers, supporting XMLHttpRequest2, also known as AJAX 2, defined by the new Web and HTML5 standards.
If two, welcome to the new web!!
Multiple features that enhance connectivity have been added to browsers as part of HTML5.
XMLHttpRequest2 adds events to AJAX that help with monitoring progress, as well as a lot of other things, from JavaScript itself. You can show the real percentage by monitoring the actual progress:
var oReq = new XMLHttpRequest();
oReq.addEventListener("progress", updateProgress, false);
oReq.addEventListener("load", transferComplete, false);
oReq.addEventListener("error", transferFailed, false);
oReq.addEventListener("abort", transferCanceled, false);
oReq.open("GET", myUrl, true); // myUrl as in your original $.ajax call
oReq.send();
Then you can define the handlers attached above (progress in your case):
function updateProgress(oEvent) {
    if (oEvent.lengthComputable) {
        var percentComplete = oEvent.loaded / oEvent.total;
        // ...
    } else {
        // Unable to compute progress information since the total size is unknown
    }
}
jQuery can be used in the second case as well. After all, jQuery is for helping you with less code, more doing!
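For instance, here is a sketch of wiring the same progress handler into your existing jQuery call, using $.ajax's xhr factory option (adapt as needed; updateProgress is the handler defined above):
$.ajax({
    type: 'GET',
    url: myUrl,
    xhr: function () {
        var xhr = $.ajaxSettings.xhr(); // create the underlying XMLHttpRequest
        xhr.addEventListener("progress", updateProgress, false); // download progress
        return xhr;
    },
    success: function (response) {
        $('#loadingDiv').hide();
        populateData(response);
    }
});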
Hoping that you are focusing on HTML5 and the new web solution, I would point you to Mozilla DOC - Monitoring Progress in AJAX from where I have taken this solution.
Every browser now has documentation for the web (like the one above from Mozilla), and additionally all of them, together with other influential Web and Internet companies, are contributing to a common venture called Web Platform, for a common, updated web documentation. It is a work in progress, so not complete.
Also, there is no native functionality in the old AJAX to monitor progress.
In the old-school way, you would have to create an interval function that keeps hitting a separate URL to get the progress update. Your server also has to track the progress and send it as the response from that URL (possibly served from a different port).
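A minimal sketch of that old-school approach (the progress URL, the id parameter, and the JSON shape are assumptions; your server has to expose something equivalent):
var jobId = 'abc123'; // hypothetical identifier shared with the server-side handler
var poll = setInterval(function () {
    $.getJSON('/progress.php', { id: jobId }, function (info) {
        // e.g. info = { loaded: 12345, total: 99999 }
        var percent = Math.round((info.loaded / info.total) * 100);
        $('#loadingDiv').text(percent + '%');
        if (percent >= 100) clearInterval(poll);
    });
}, 1000);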

Intermittent Cloudfront CDN failures (monitoring) - CDN Failover

For the past 2 months I have been experiencing intermittent Amazon CloudFront failures (2-3 times a week) whereby the page loads from my web server but all the assets from the CDN block in pending for minutes at a time (I confirmed that with curl from different datacenters: some work, some don't, depending on the edge location - London?). Once the pending requests succeed, all goes back to normal.
We have been reporting this to Amazon but they always reply with a "Don't expect a reply from us; only if a gazillion people complain will we consider looking into this" kind of message. Often normal operation resumes before I'm done writing the support request.
I came to the conclusion that, given the lack of development time for migrating to another CDN, the best way to proceed is to add a script in the HTML header that will let us know whenever something similar happens. Say, in the header, try to download a tiny gif from the CDN; if the request takes longer than N msec, then call an arbitrary URL within the root domain (for monitoring).
The question:
How does one reliably, across all popular browsers, request a file with callback on timeout. i.e.:
request file from CDN using AJAX - will not work due to cross-domain limitations?
setTimeout("callbackTimeout",2000) callbackTimeout(){getElementById() else ...HttpWebRequest...} - would that be blocked by pending HttpWebRequest request or will it work?
How else?
Thanks.
This has been briefly tested in IE 7 & 8, up-to-date FF on Windows & OSX, as well as Chrome. I suggest you test it yourself. Minify! If you know a better way of doing this, please suggest your improvements. The alternative of using e.g. a script element instead of an image was considered and decided against, probably mostly due to my ignorance.
The next version will write a cookie on timeout, and future requests will be handled on the server side (using a relative asset path). The cookie will expire after, say, 30 minutes. Every consecutive timeout will renew that cookie. Not sure how I'll handle the first failover. It could be a redirect (not very elegant but simple). Perhaps I will figure out a smarter way (possibly more elegant but more complex too).
<script type="text/javascript">
//<![CDATA[
// Absolute path to a picture on your CDN to be monitored
cdnImagePath = "http://YOURCDNADDRESS.net/empty.gif";
//this is relative path (cross domain limitation)
//will be followed by "timeout" or "other" as a reason i.e. /cdnMonitor.php?message=timeout
cdnMonitoringPath = "/cdnMonitor.php?message=";
// Recommended 3000 for 3 second(s) timeout
cdnTimeoutMilisec = 3000;
// Set to true to be notified after timeout (provides extra information)
cdnNotifyAfterTimeout = false;
// Handler methods
cdnOK = function() {
    if (!cdnTimer && cdnNotifyAfterTimeout) cdnNotify('success');
};
cdnFail = function(reason) {
    if (reason != "timeout") {
        if (cdnTimer) clearTimeout(cdnTimer);
        message = "error";
    } else {
        message = reason;
    }
    cdnNotify(message);
};
cdnTimeout = function() {
    cdnTimer = false;
    if (cdnImage.complete == false) {
        cdnFail("timeout");
    }
};
cdnNotify = function(message) {
    var xmlhttp;
    if (window.XMLHttpRequest) {
        xmlhttp = new XMLHttpRequest();
    } else { // code for IE6, IE5
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    }
    xmlhttp.open("GET", cdnMonitoringPath + message, true);
    xmlhttp.send();
};
// Load test image and define event handlers
cdnTimer = setTimeout("cdnTimeout()", cdnTimeoutMilisec);
cdnImage = new Image();
cdnImage.onload = cdnOK;
cdnImage.onerror = cdnFail;
cdnImage.src = cdnImagePath + "?" + Math.floor(Math.random()*1000000);
//]]>
</script>
Also this is what I'll use for ad hoc monitoring on the server side cdnMonitor.php:
error_log(date('Y-m-d H:i:s.') .next(explode('.',microtime(1))). ' - '. $_GET['message'] . ' - '. $_SERVER['HTTP_X_REAL_IP']. ' - ' . $_SERVER['HTTP_USER_AGENT'] ."\n", 3, '/tmp/cdnMonitor.log');
You will need to change "HTTP_X_REAL_IP" to REMOTE_ADDR or whatever suits your needs. I use a reverse proxy, so that's what I do.
Lastly, I made some last-minute changes in the post editor and might have broken something. Fingers crossed.
