Find out whether current request is a proxy request in firefox? - javascript

I need to automatically perform authorization with a proxy server using an addon.
I also checked out the source of other extensions that do that, and they do it like this:
var httpRequestObserver =
{
observe: function(subject, topic, data)
{
if (topic === "http-on-modify-request") {
var channel = subject.QueryInterface(Components.interfaces.nsIHttpChannel);
channel.setRequestHeader("Proxy-Authorization", "Basic myauthorizationtoken" , false);
}
}
};
A request observer adds the authorization header to every outgoing request. That works fine if a regular proxy is set; however, in my case I have a proxy PAC URL where only specific requests go through the proxy and others do not.
In that case the authorization header, which essentially contains the credentials in the clear, is transmitted to every website that is not accessed via the proxy.
Obviously that is unacceptable, so I need to find out whether the current request is a proxy request, and only then set the header.
Or find another way altogether...
In chrome I did it like this:
chrome.webRequest.onAuthRequired.addListener(handleAuthRequest,
{urls: ["<all_urls>"]}, ["asyncBlocking"]);
function handleAuthRequest(details, callback) {
if (details.isProxy === true){
callback({authCredentials: {username: localStorage['login'],
password: localStorage['pass']}});
}
callback();
}
Clearly this is optimal! However, I cannot see a nice way to do it in Firefox.

It turns out that subject has a proxyInfo attribute which is either null or set; proxyInfo also has further attributes to examine.
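Based on that finding, a minimal sketch (assuming the channel also implements nsIProxiedChannel, which is what exposes proxyInfo) could look like this:
var httpRequestObserver =
{
    observe: function(subject, topic, data)
    {
        if (topic === "http-on-modify-request") {
            var channel = subject.QueryInterface(Components.interfaces.nsIHttpChannel);
            // proxyInfo is null when the request goes out directly
            // (i.e. the PAC file returned DIRECT for this URL)
            var proxyInfo = subject.QueryInterface(Components.interfaces.nsIProxiedChannel).proxyInfo;
            if (proxyInfo) {
                // proxyInfo.host, proxyInfo.port and proxyInfo.type can be inspected here
                // to restrict the header to one specific proxy
                channel.setRequestHeader("Proxy-Authorization", "Basic myauthorizationtoken", false);
            }
        }
    }
};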

Failed posting data with axios [duplicate]

I'm trying to load a cross-domain HTML page using AJAX but unless the dataType is "jsonp" I can't get a response. However using jsonp the browser is expecting a script mime type but is receiving "text/html".
My code for the request is:
$.ajax({
type: "GET",
url: "http://saskatchewan.univ-ubs.fr:8080/SASStoredProcess/do?_username=DARTIES3-2012&_password=P#ssw0rd&_program=%2FUtilisateurs%2FDARTIES3-2012%2FMon+dossier%2Fanalyse_dc&annee=2012&ind=V&_action=execute",
dataType: "jsonp",
}).success( function( data ) {
$( 'div.ajax-field' ).html( data );
});
Is there any way of avoiding using jsonp for the request? I've already tried using the crossDomain parameter but it didn't work.
If not is there any way of receiving the html content in jsonp? Currently the console is saying "unexpected <" in the jsonp reply.
jQuery Ajax Notes
Due to browser security restrictions, most Ajax requests are subject to the same origin policy; the request can not successfully retrieve data from a different domain, subdomain, port, or protocol.
Script and JSONP requests are not subject to the same origin policy restrictions.
There are some ways to overcome the cross-domain barrier:
CORS Proxy Alternatives
Ways to circumvent the same-origin policy
Breaking The Cross Domain Barrier
There are some plugins that help with cross-domain requests:
Cross Domain AJAX Request with YQL and jQuery
Cross-domain requests with jQuery.ajax
Heads up!
The best way to overcome this problem is to create your own proxy in the back-end, so that your proxy points to the services in other domains, because the same-origin policy restriction does not exist in the back-end. But if you can't do that in the back-end, then pay attention to the following tips.
Warning!
Using third-party proxies is not a secure practice, because they can keep track of your data, so they should only be used with public information, never with private data.
The code examples shown below use jQuery.get() and jQuery.getJSON(), both of which are shorthand methods of jQuery.ajax().
CORS Anywhere
2021 Update
The public demo server (cors-anywhere.herokuapp.com) will be very limited by January 31st, 2021.
The demo server of CORS Anywhere (cors-anywhere.herokuapp.com) is meant to be a demo of this project. But abuse has become so common that the platform where the demo is hosted (Heroku) has asked me to shut down the server, despite efforts to counter the abuse. Downtime becomes increasingly frequent due to abuse and its popularity.
To counter this, I will make the following changes:
The rate limit will decrease from 200 per hour to 50 per hour.
By January 31st, 2021, cors-anywhere.herokuapp.com will stop serving as an open proxy.
From February 1st, 2021, cors-anywhere.herokuapp.com will only serve requests after the visitor has completed a challenge: the user (developer) must visit a page at cors-anywhere.herokuapp.com to temporarily unlock the demo for their browser. This allows developers to try out the functionality, to help with deciding on self-hosting or looking for alternatives.
CORS Anywhere is a node.js proxy which adds CORS headers to the proxied request.
To use the API, just prefix the URL with the API URL. (Supports https: see github repository)
If you want to automatically enable cross-domain requests when needed, use the following snippet:
$.ajaxPrefilter( function (options) {
if (options.crossDomain && jQuery.support.cors) {
var http = (window.location.protocol === 'http:' ? 'http:' : 'https:');
options.url = http + '//cors-anywhere.herokuapp.com/' + options.url;
//options.url = "http://cors.corsproxy.io/url=" + options.url;
}
});
$.get(
'http://en.wikipedia.org/wiki/Cross-origin_resource_sharing',
function (response) {
console.log("> ", response);
$("#viewer").html(response);
});
Whatever Origin
Whatever Origin is an open source alternative to anyorigin.com that provides cross-domain JSONP access.
To fetch the data from google.com, you can use this snippet:
// It is good specify the charset you expect.
// You can use the charset you want instead of utf-8.
// See details for scriptCharset and contentType options:
// http://api.jquery.com/jQuery.ajax/#jQuery-ajax-settings
$.ajaxSetup({
scriptCharset: "utf-8", //or "ISO-8859-1"
contentType: "application/json; charset=utf-8"
});
$.getJSON('http://whateverorigin.org/get?url=' +
encodeURIComponent('http://google.com') + '&callback=?',
function (data) {
console.log("> ", data);
//If the expected response is text/plain
$("#viewer").html(data.contents);
//If the expected response is JSON
//var response = $.parseJSON(data.contents);
});
CORS Proxy
CORS Proxy is a simple node.js proxy to enable CORS request for any website.
It allows javascript code on your site to access resources on other domains that would normally be blocked due to the same-origin policy.
CORS-Proxy gr2m (archived)
CORS-Proxy rmadhuram
How does it work?
CORS Proxy takes advantage of Cross-Origin Resource Sharing, which is a feature that was added along with HTML 5. Servers can specify that they want browsers to allow other websites to request resources they host. CORS Proxy is simply an HTTP Proxy that adds a header to responses saying "anyone can request this".
This is another way to achieve the goal (see www.corsproxy.com). All you have to do is strip http:// and www. from the URL being proxied, and prepend the URL with www.corsproxy.com/
$.get(
'http://www.corsproxy.com/' +
'en.wikipedia.org/wiki/Cross-origin_resource_sharing',
function (response) {
console.log("> ", response);
$("#viewer").html(response);
});
The http://www.corsproxy.com/ domain now appears to be an unsafe/suspicious site. NOT RECOMMENDED TO USE.
CORS proxy browser
Recently I found this one; it involves various security-oriented Cross-Origin Resource Sharing utilities. But it is a black box with Flash as the backend.
You can see it in action here: CORS proxy browser
Get the source code on GitHub: koto/cors-proxy-browser
You can use Ajax-cross-origin, a jQuery plugin.
With this plugin you can use jQuery.ajax() cross-domain. It uses Google services to achieve this:
The AJAX Cross Origin plugin uses Google Apps Script as a proxy JSON getter where JSONP is not implemented. When you set the crossOrigin option to true, the plugin replaces the original URL with the Google Apps Script address and sends it as an encoded URL parameter. The Google Apps Script uses Google Servers resources to get the remote data and returns it back to the client as JSONP.
It is very simple to use:
$.ajax({
crossOrigin: true,
url: url,
success: function(data) {
console.log(data);
}
});
You can read more here:
http://www.ajax-cross-origin.com/
If the external site doesn't support JSONP or CORS, your only option is to use a proxy.
Build a script on your server that requests that content, then use jQuery ajax to hit the script on your server.
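For example, the client-side half of that approach could look like this (a sketch; proxy.php is a hypothetical script on your own server that fetches the remote content and echoes it back):
// Ask your own server-side proxy to fetch the remote page, then inject the result.
$.get('/proxy.php', { url: 'http://example.com/page.html' }, function (html) {
    $('div.ajax-field').html(html);
});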
Just put this in the header of your PHP page and it will work without an API:
header('Access-Control-Allow-Origin: *'); //allow everybody
or
header('Access-Control-Allow-Origin: http://codesheet.org'); //allow just one domain
or
$http_origin = $_SERVER['HTTP_ORIGIN']; //allow multiple domains
$allowed_domains = array(
'http://codesheet.org',
'http://stackoverflow.com'
);
if (in_array($http_origin, $allowed_domains))
{
header("Access-Control-Allow-Origin: $http_origin");
}
I'm posting this in case someone faces the same problem I am facing right now. I've got a Zebra thermal printer, equipped with the ZebraNet print server, which offers an HTML-based user interface for editing multiple settings, seeing the printer's current status, etc. I need to get the status of the printer, which is displayed on one of those HTML pages served by the ZebraNet server, and, for example, alert() a message to the user in the browser. This means that I have to get that HTML page in JavaScript first. Although the printer is within the LAN of the user's PC, the Same Origin Policy is still standing firmly in my way. I tried JSONP, but the server returns HTML and I haven't found a way to modify its functionality (if I could, I would have already set the magic header Access-Control-Allow-Origin: *). So I decided to write a small console app in C#. It has to be run as Admin to work properly, otherwise it throws an exception. Here is some code:
// Requires: using System; using System.IO; using System.Net;
// Create a listener.
HttpListener listener = new HttpListener();
// Add the prefixes.
//foreach (string s in prefixes)
//{
// listener.Prefixes.Add(s);
//}
listener.Prefixes.Add("http://*:1234/"); // accept connections from everywhere,
//because the printer is accessible only within the LAN (no portforwarding)
listener.Start();
Console.WriteLine("Listening...");
// Note: The GetContext method blocks while waiting for a request.
HttpListenerContext context;
string urlForRequest = "";
HttpWebRequest requestForPage = null;
HttpWebResponse responseForPage = null;
string responseForPageAsString = "";
while (true)
{
context = listener.GetContext();
HttpListenerRequest request = context.Request;
urlForRequest = request.RawUrl.Substring(1, request.RawUrl.Length - 1); // remove the slash, which separates the portNumber from the arg sent
Console.WriteLine(urlForRequest);
//Request for the html page:
requestForPage = (HttpWebRequest)WebRequest.Create(urlForRequest);
responseForPage = (HttpWebResponse)requestForPage.GetResponse();
responseForPageAsString = new StreamReader(responseForPage.GetResponseStream()).ReadToEnd();
// Obtain a response object.
HttpListenerResponse response = context.Response;
// Send back the response.
byte[] buffer = System.Text.Encoding.UTF8.GetBytes(responseForPageAsString);
// Get a response stream and write the response to it.
response.ContentLength64 = buffer.Length;
response.AddHeader("Access-Control-Allow-Origin", "*"); // the magic header in action ;-D
System.IO.Stream output = response.OutputStream;
output.Write(buffer, 0, buffer.Length);
// You must close the output stream.
output.Close();
//listener.Stop();
}
All the user needs to do is run that console app as Admin. I know it is way too ... frustrating and complicated, but it is sort of a workaround to the Domain Policy problem in case you cannot modify the server in any way.
Edit: from JS I make a simple Ajax call:
$.ajax({
type: 'POST',
url: 'http://LAN_IP:1234/http://google.com',
success: function (data) {
console.log("Success: " + data);
},
error: function (e) {
alert("Error: " + e);
console.log("Error: " + e);
}
});
The html of the requested page is returned and stored in the data variable.
To get the data from an external site using a local proxy, as suggested by jherax, you can create a PHP page that fetches the content for you from the respective external URL and then send a GET request to that PHP page.
var req = new XMLHttpRequest();
req.open('GET', 'http://localhost/get_url_content.php', false); // synchronous request
req.send(null);
if (req.status == 200) {
    alert(req.responseText);
}
As a PHP proxy you can use https://github.com/cowboy/php-simple-proxy
Your URL doesn't work these days, but your code can be updated with this working solution:
var url = "http://saskatchewan.univ-ubs.fr:8080/SASStoredProcess/do?_username=DARTIES3-2012&_password=P#ssw0rd&_program=%2FUtilisateurs%2FDARTIES3-2012%2FMon+dossier%2Fanalyse_dc&annee=2012&ind=V&_action=execute";
url = 'https://google.com'; // TEST URL
$.get("https://images"+~~(Math.random()*33)+"-focus-opensocial.googleusercontent.com/gadgets/proxy?container=none&url=" + encodeURI(url), function(data) {
$('div.ajax-field').html(data);
});
<div class="ajax-field"></div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
You need a CORS proxy, which proxies your request from your browser to the requested service with the appropriate CORS headers. A list of such services is in the code snippet below. You can also run the provided code snippet to see the ping to such services from your location.
$('li').each(function() {
var self = this;
ping($(this).text()).then(function(delta) {
console.log($(self).text(), delta, ' ms');
});
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<script src="https://cdn.rawgit.com/jdfreder/pingjs/c2190a3649759f2bd8569a72ae2b597b2546c871/ping.js"></script>
<ul>
<li>https://crossorigin.me/</li>
<li>https://cors-anywhere.herokuapp.com/</li>
<li>http://cors.io/</li>
<li>https://cors.5apps.com/?uri=</li>
<li>http://whateverorigin.org/get?url=</li>
<li>https://anyorigin.com/get?url=</li>
<li>http://corsproxy.nodester.com/?src=</li>
<li>https://jsonp.afeld.me/?url=</li>
<li>http://benalman.com/code/projects/php-simple-proxy/ba-simple-proxy.php?url=</li>
</ul>
Figured it out.
Used this instead.
$('.div_class').load('http://en.wikipedia.org/wiki/Cross-origin_resource_sharing #toctitle');

Why chrome.storage.local.get can't execute in time?

I am writing an extension for Chrome with which I hope to modify Chrome's HTTP request headers.
I want the user to be able to customize their headers; then, when Chrome sends an HTTP request, the extension should first intercept the request, read the user's custom configuration, modify the request headers according to that configuration, and then let the request go out.
But in this process I ran into a problem: when I intercept a Chrome request I fetch the user configuration at that moment, but fetching the configuration takes some time, and by the time it is available the HTTP request has already gone out. How can I make the interception wait (block) until reading the user configuration is done, and only then let the HTTP request go?
Here is my code to intercept the HTTP request:
(Please forgive my poor English :P, thank you very much~)
chrome.webRequest.onBeforeSendHeaders.addListener(function(details){
var headers = details.requestHeaders,
blockingResponse = {};
// now I am trying to modify blockingRequest
chrome.storage.local.get(null, function(results) {
for(var result in results){
headers.push({"name":result, "value":results[result]});
}
}); // but the code above is useless.
// if you print the new headers , you will find the new headers doesn't change.
blockingResponse.requestHeaders = headers;
return blockingResponse;
},
{urls: [ "<all_urls>" ]},['requestHeaders','blocking']);

xhr caching values from getResponseHeader?

I'm running up against a very frustrating bug. I'm not exactly sure what is happening, but I think xhr is doing some kind of cache on the response headers.
My app is using devise_token_auth for the backend authentication service. We're using it with rotating access-tokens, and so I have written a function that runs after every request.
function storeAndGetResponseHeaders(xhr) {
const headersObj = {};
headerKeys.filter((key) => xhr.getResponseHeader(key))
.forEach((key) => {
headersObj[key] = xhr.getResponseHeader(key);
window.sessionStorage.setItem(key, xhr.getResponseHeader(key));
});
return headersObj;
}
where headerKeys is ['access-token', 'client', 'expiry', 'uid', 'token-type']. So for any response that has these headers, it should save them into sessionStorage and then return them in an object, which gets stored within the AJAX service that I wrote and added to every request. We're using rxjs, and this service is just a thin wrapper around it. This is what RxAjax.ajax looks like.
ajax(urlOrRequest) {
const request = typeof urlOrRequest === 'string' ? { url: urlOrRequest } : urlOrRequest;
request.headers = Object.assign({}, this.headers, urlOrRequest.headers);
request.url = `${this.baseUrl}${request.url}`;
return Observable.ajax(request).map(this.afterRequest, this);
}
where this.headers is the stored headers from last request (or the loaded headers from sessionStorage). this.afterRequest is what sets the headers from the response xhr.
My problem is that I'm getting bad values into my headers object (specifically old access tokens). What I've noticed is that when I add a logging statement of headersObj after assignment, sometimes it will have old response headers from a past request. However when I look at the request itself in the dev console Network tab, it doesn't show any of the auth headers in the response headers ('access-token', 'client', etc...). This gets fixed for a little while if I do a hard refresh on the browser, but comes back seemingly inexplicably.
Note we're using rxjs to make our requests, which might be relevant (but I don't think it is the cause of this problem, as I'm trying to read the headers from the original xmlhttprequest object). Thanks!
As Barmar suggested in the comments, it was a caching issue. There may be a bug in the Chrome console where it wasn't showing the cached headers that were on the cached request; hence, even though it looked like there were no auth headers, there really were.
It looks like if you're using jQuery you can add the option cache: false to the request in order to prevent caching. Because I'm not, the first thing I did was try adding ?cache=${new Date().toJSON()} to each request, which successfully busted the cache and fixed my problem (that is essentially what cache: false in jQuery does).
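For reference, the jQuery option mentioned above would look roughly like this (a sketch; the URL is just a placeholder):
// Disable caching for a single request...
$.ajax({ url: '/auth/validate_token', cache: false });
// ...or globally, for every subsequent jQuery request:
$.ajaxSetup({ cache: false });
// In both cases jQuery appends a "_=<timestamp>" parameter to bust the cache.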
Our backend is in rails, and so I ended up adding
before_action :set_cache_headers
...
private
def set_cache_headers
response.headers["Cache-Control"] = "no-cache, no-store, must-revalidate"
end
to my application controller. Now no requests are cached by the browser. Not sure if this will be our long term solution or not

how to detect a proxy using javascript

In a web page, is there a way to detect via JavaScript whether the web browser is using a PAC file http://xxx.xx.xx.xxx/toto.pac ?
Notes: the same page can be viewed behind many PACs, I don't want to use a server-side language, and I can edit the toto PAC file if necessary. Regards
You could make an Ajax request to a known external server (google.com) and then read the headers of the response to see if the proxy added its own headers to it...
var proxyHeader = 'via';
var req = new XMLHttpRequest();
req.open('GET', document.location, false);
req.send();
var header = req.getResponseHeader(proxyHeader);
if (header) {
// we are on a proxy
}
Change proxyHeader to whatever your proxy adds to the response.
EDIT: You will have to add a conditional for supporting the IE implementation of XMLHttpRequest
EDIT:
I am on a proxy at work and I have just tested this code in jsfiddle and it works. It could be made prettier so that it supports IE and does an async GET, but the general functionality is there... http://jsfiddle.net/unvHW/
It turns out that detecting 'via' is much better...
Note that this solution will not work on every proxy and would probably only work if you are BEHIND the proxy:
Some proxies append a field to the response headers of an HTTP request which is called X-Forwarded-For.
Maybe you can achieve what you are trying to do with an AJAX request to google.com for example and check if the field is there.
Something like this :
$.ajax({
type: 'POST',
url:'http://www.google.com',
data: formData,
success: function(data, textStatus, request){
if (request.getResponseHeader('X-Forwarded-For') !== null)
alert("Proxy detected !");
}
});
Edit: As Michael said, X-Forwarded-For is only appended to requests. You'd better check for the header your own proxy puts in the response.
No.
Browsers do not expose that sort of configuration data to websites.

Detect the Internet connection is offline?

How to detect the Internet connection is offline in JavaScript?
Almost all major browsers now support the window.navigator.onLine property, and the corresponding online and offline window events. Run the following code snippet to test it:
console.log('Initially ' + (window.navigator.onLine ? 'on' : 'off') + 'line');
window.addEventListener('online', () => console.log('Became online'));
window.addEventListener('offline', () => console.log('Became offline'));
document.getElementById('statusCheck').addEventListener('click', () => console.log('window.navigator.onLine is ' + window.navigator.onLine));
<button id="statusCheck">Click to check the <tt>window.navigator.onLine</tt> property</button><br /><br />
Check the console below for results:
Try setting your system or browser in offline/online mode and check the log or the window.navigator.onLine property for the value changes.
Note however this quote from Mozilla Documentation:
In Chrome and Safari, if the browser is not able to connect to a local area network (LAN) or a router, it is offline; all other conditions return true. So while you can assume that the browser is offline when it returns a false value, you cannot assume that a true value necessarily means that the browser can access the internet. You could be getting false positives, such as in cases where the computer is running a virtualization software that has virtual ethernet adapters that are always "connected." Therefore, if you really want to determine the online status of the browser, you should develop additional means for checking.
In Firefox and Internet Explorer, switching the browser to offline mode sends a false value. Until Firefox 41, all other conditions return a true value; since Firefox 41, on OS X and Windows, the value will follow the actual network connectivity.
(emphasis is my own)
This means that if window.navigator.onLine is false (or you get an offline event), you are guaranteed to have no Internet connection.
If it is true however (or you get an online event), it only means the system is connected to some network, at best. It does not mean that you have Internet access for example. To check that, you will still need to use one of the solutions described in the other answers.
I initially intended to post this as an update to Grant Wagner's answer, but it seemed too much of an edit, especially considering that the 2014 update was already not from him.
You can determine that the connection is lost by making failed XHR requests.
The standard approach is to retry the request a few times. If it doesn't go through, alert the user to check the connection, and fail gracefully.
Sidenote: Putting the entire application in an "offline" state may lead to a lot of error-prone work of handling state; wireless connections may come and go, etc. So your best bet may be to just fail gracefully, preserve the data, and alert the user, allowing them to eventually fix the connection problem if there is one, and to continue using your app with a fair amount of forgiveness.
Sidenote: You could check a reliable site like google for connectivity, but this may not be entirely useful as just trying to make your own request, because while Google may be available, your own application may not be, and you're still going to have to handle your own connection problem. Trying to send a ping to google would be a good way to confirm that the internet connection itself is down, so if that information is useful to you, then it might be worth the trouble.
Sidenote: Sending a ping could be achieved in the same way that you would make any kind of two-way Ajax request, but sending a ping to Google, in this case, would pose some challenges. First, we'd have the same cross-domain issues that are typically encountered in making Ajax communications. One option is to set up a server-side proxy, wherein we actually ping Google (or whatever site) and return the results of the ping to the app. This is a catch-22, because if the internet connection is actually the problem, we won't be able to get to the server, and if the connection problem is only on our own domain, we won't be able to tell the difference. Other cross-domain techniques could be tried, for example embedding an iframe in your page which points to google.com and then polling the iframe for success/failure (examine the contents, etc.). Embedding an image may not really tell us anything, because we need a useful response from the communication mechanism in order to draw a good conclusion about what's going on. So again, determining the state of the internet connection as a whole may be more trouble than it's worth. You'll have to weigh these options out for your specific app.
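As an illustration of the retry-then-fail-gracefully idea above, a minimal sketch (the /ping endpoint is a hypothetical lightweight URL on your own server):
function checkConnection(retriesLeft, callback) {
    var xhr = new XMLHttpRequest();
    // Cache-busting query parameter so we never get a cached response.
    xhr.open('HEAD', '/ping?nocache=' + Date.now(), true);
    xhr.onload = function () { callback(true); };
    xhr.onerror = function () {
        if (retriesLeft > 0) {
            // Wait a second and try again before declaring the connection dead.
            setTimeout(function () { checkConnection(retriesLeft - 1, callback); }, 1000);
        } else {
            callback(false); // give up: alert the user and preserve their data
        }
    };
    xhr.send();
}

checkConnection(3, function (online) {
    if (!online) alert('Please check your internet connection.');
});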
IE 8 will support the window.navigator.onLine property.
But of course that doesn't help with other browsers or operating systems. I predict other browser vendors will decide to provide that property as well given the importance of knowing online/offline status in Ajax applications.
Until that happens, either XHR or an Image() or <img> request can provide something close to the functionality you want.
Update (2014/11/16)
Major browsers now support this property, but your results will vary.
Quote from Mozilla Documentation:
In Chrome and Safari, if the browser is not able to connect to a local area network (LAN) or a router, it is offline; all other conditions return true. So while you can assume that the browser is offline when it returns a false value, you cannot assume that a true value necessarily means that the browser can access the internet. You could be getting false positives, such as in cases where the computer is running a virtualization software that has virtual ethernet adapters that are always "connected." Therefore, if you really want to determine the online status of the browser, you should develop additional means for checking.
In Firefox and Internet Explorer, switching the browser to offline mode sends a false value. All other conditions return a true value.
if(navigator.onLine){
alert('online');
} else {
alert('offline');
}
There are a number of ways to do this:
AJAX request to your own website. If that request fails, there's a good chance it's the connection at fault. The JQuery documentation has a section on handling failed AJAX requests. Beware of the Same Origin Policy when doing this, which may stop you from accessing sites outside your domain.
You could put an onerror in an img, like <img src="http://www.example.com/singlepixel.gif" onerror="alert('Connection dead');" />.
This method could also fail if the source image is moved / renamed, and would generally be an inferior choice to the ajax option.
So there are several different ways to try and detect this, none perfect, but in the absence of the ability to jump out of the browser sandbox and access the user's net connection status directly, they seem to be the best options.
As olliej said, using the navigator.onLine browser property is preferable to sending network requests and, according to developer.mozilla.org/En/Online_and_offline_events, it is even supported by old versions of Firefox and IE.
Recently, the WHATWG has specified the addition of the online and offline events, in case you need to react on navigator.onLine changes.
Please also pay attention to the link posted by Daniel Silveira which points out that relying on those signal/property for syncing with the server is not always a good idea.
You can use $.ajax()'s error callback, which fires if the request fails. If textStatus equals the string "timeout" it probably means connection is broken:
function (XMLHttpRequest, textStatus, errorThrown) {
// typically only one of textStatus or errorThrown
// will have info
this; // the options for this ajax request
}
From the doc:
Error: A function to be called if the request fails. The function is passed three arguments: the XMLHttpRequest object, a string describing the type of error that occurred, and an optional exception object, if one occurred. Possible values for the second argument (besides null) are "timeout", "error", "notmodified" and "parsererror". This is an Ajax Event.
So for example:
$.ajax({
type: "GET",
url: "keepalive.php",
success: function(msg){
alert("Connection active!")
},
error: function(XMLHttpRequest, textStatus, errorThrown) {
if(textStatus == 'timeout') {
alert('Connection seems dead!');
}
}
});
window.navigator.onLine
is what you are looking for, but there are a few things to add. First, if it's something in your app that you want to keep checking (for example, to see if the user suddenly goes offline), then you also need to listen for changes; for that, you add an event listener to window to detect any change. To check whether the user goes offline, you can do:
window.addEventListener("offline",
()=> console.log("No Internet")
);
and for checking if online:
window.addEventListener("online",
()=> console.log("Connected Internet")
);
The HTML5 Application Cache API specifies navigator.onLine, which is currently available in the IE8 betas, WebKit (eg. Safari) nightlies, and is already supported in Firefox 3
I had to make a web app (Ajax based) for a customer who works a lot with schools. These schools often have a bad internet connection, so I use this simple function to detect if there is a connection; it works very well!
I use CodeIgniter and jQuery:
function checkOnline() {
setTimeout("doOnlineCheck()", 20000);
}
function doOnlineCheck() {
//if the server can be reached it returns 1, other wise it times out
var submitURL = $("#base_path").val() + "index.php/menu/online";
$.ajax({
url : submitURL,
type : "post",
dataType : "msg",
timeout : 5000,
success : function(msg) {
if(msg==1) {
$("#online").addClass("online");
$("#online").removeClass("offline");
} else {
$("#online").addClass("offline");
$("#online").removeClass("online");
}
checkOnline();
},
error : function() {
$("#online").addClass("offline");
$("#online").removeClass("online");
checkOnline();
}
});
}
An Ajax call to your own domain is the easiest way to detect if you are offline:
$.ajax({
type: "HEAD",
url: document.location.pathname + "?param=" + new Date(),
error: function() { return false; },
success: function() { return true; }
});
This is just to give you the concept; it should be improved.
E.g. error = 404 should still mean that you are online.
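A slightly improved sketch along those lines, treating any HTTP response (even an error status) as proof of connectivity:
$.ajax({
    type: "HEAD",
    url: document.location.pathname + "?param=" + new Date().getTime(),
    complete: function(xhr) {
        // status 0 means the request never reached a server;
        // any other status means something answered, so we are online.
        var online = xhr.status > 0;
        console.log(online ? "online" : "offline");
    }
});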
I know this question has already been answered, but I would like to add my 10 cents explaining what's better and what's not.
Window.navigator.onLine
I noticed some answers spoke about this option but they never mentioned anything concerning the caveat.
This option involves the use of "window.navigator.onLine" which is a property under Browser Navigator Interface available on most modern browsers. It is really not a viable option for checking internet availability because firstly it is browser centric and secondly most browsers implement this property differently.
In Firefox: The property returns a boolean value, with true meaning online and false meaning offline, but the caveat here is that "the value is only updated when the user follows links or when a script requests a remote page." Hence, if the user goes offline and you query the property from a JS function or script, the property will always return true until the user follows a link.
In Chrome and Safari: If the browser is not able to connect to a local area network (LAN) or a router, it is offline; all other conditions return true. So while you can assume that the browser is offline when it returns a false value, you cannot assume that a true value necessarily means that the browser can access the internet. You could be getting false positives, such as in cases where the computer is running virtualization software that has virtual ethernet adapters that are always "connected".
The statements above simply try to let you know that browsers alone cannot tell. So basically this option is unreliable.
Sending Request to Own Server Resource
This involves making an HTTP request to your own server resource; if it is reachable, assume internet availability, otherwise the user is offline. There are a few caveats to this option.
No server's availability is 100% guaranteed, so if for some reason your server is not reachable it would be falsely assumed that the user is offline, whereas they are actually connected to the internet.
Multiple requests to the same resource can return a cached response, making the HTTP response result unreliable.
If you agree your server is always online then you can go with this option.
Here is a simple snippet to fetch own resource:
// This fetches your website's favicon, so replace path with favicon url
// Notice the appended date param which helps prevent browser caching.
fetch('/favicon.ico?d='+Date.now())
.then(response => {
if (!response.ok)
throw new Error('Network response was not ok');
// At this point we can safely assume the user has connection to the internet
console.log("Internet connection available");
})
.catch(error => {
// The resource could not be reached
console.log("No Internet connection", error);
});
Sending Request to Third-Party Server Resource
We all know CORS is a thing.
This option involves making an HTTP request to an external server resource; if it is reachable, assume internet availability, otherwise the user is offline. The major caveat to this is Cross-Origin Resource Sharing, which acts as a limitation. Most reputable websites block CORS requests, but for some you can have your way.
Below a simple snippet to fetch external resource, same as above but with external resource url:
// Firstly you trigger a resource available from a reputable site
// For demo purpose you can use the favicon from MSN website
// Also notice the appended date param which helps skip browser caching.
fetch('https://static-global-s-msn-com.akamaized.net/hp-neu/sc/2b/a5ea21.ico?d='+Date.now())
.then(response => {
// Check if the response is successful
if (!response.ok)
throw new Error('Network response was not ok');
// At this point we can safely say the user has connection to the internet
console.log("Internet available");
})
.catch(error => {
// The resource could not be reached
console.log("No Internet connection", error);
});
So, finally, for my personal project I went with the 2nd option, which involves requesting your own server resource, because basically there are many factors in telling whether there is an "Internet Connection" on a user's device; you cannot tell from your website container alone, nor from a limited browser API.
Remember your users can also be in an environment where some websites or resources are blocked, prohibited and not accessible, which in turn affects the logic of the connectivity check. The best bet will be:
Try to access a resource on your own server, because this is your users' environment (typically I use the website's favicon because the response is very light and it is not frequently updated).
If there is no connection to the resource, simply say "Error in connection" or "Connection lost" when you need to notify the user, rather than assume a broad "No internet connection", which depends on many factors.
I think it is a very simple way.
var x = confirm("Are you sure you want to submit?");
if (x) {
if (navigator.onLine == true) {
return true;
}
alert('Internet connection is lost');
return false;
}
return false;
The problem with some methods like navigator.onLine is that they are not compatible with some browsers and mobile versions. An option that helped me a lot was to use the classic XMLHttpRequest method and also foresee the possible case that the file is stored in cache, with a response XMLHttpRequest.status greater than 200 and less than 304.
Here is my code:
var xhr = new XMLHttpRequest();
//index.php is in my web
xhr.open('HEAD', 'index.php', true);
xhr.send();
xhr.addEventListener("readystatechange", processRequest, false);
function processRequest(e) {
if (xhr.readyState == 4) {
//If you use a cache storage manager (service worker), it is likely that the
//index.php file will be available even without internet, so do the following validation
if (xhr.status >= 200 && xhr.status < 304) {
console.log('On line!');
} else {
console.log('Offline :(');
}
}
}
I was looking for a client-side solution to detect if the internet was down or my server was down. The other solutions I found always seemed to be dependent on a 3rd party script file or image, which to me didn't seem like it would stand the test of time. An external hosted script or image could change in the future and cause the detection code to fail.
I've found a way to detect it by looking for an xhrStatus with a 404 code. In addition, I use JSONP to bypass the CORS restriction. A status code other than 404 shows the internet connection isn't working.
$.ajax({
url: 'https://www.bing.com/aJyfYidjSlA' + new Date().getTime() + '.html',
dataType: 'jsonp',
timeout: 5000,
error: function(xhr) {
if (xhr.status == 404) {
//internet connection working
}
else {
//internet is down (xhr.status == 0)
}
}
});
How about sending an opaque http request to google.com with no-cors?
fetch('https://google.com', {
method: 'GET', // *GET, POST, PUT, DELETE, etc.
mode: 'no-cors',
}).then((result) => {
console.log(result)
}).catch(e => {
console.error(e)
})
The reason for setting no-cors is that I was receiving CORS errors even when disabling the network connection on my PC. So I was getting CORS blocked with or without an internet connection. Adding no-cors makes the request opaque, which apparently bypasses CORS and allows me to simply check whether I can connect to Google.
FYI: I'm using fetch here for making the HTTP request.
https://www.npmjs.com/package/fetch
My way.
<!-- the file named "tt.jpg" should exist in the same directory -->
<script>
function testConnection(callBack)
{
document.getElementsByTagName('body')[0].innerHTML +=
'<img id="testImage" style="display: none;" ' +
'src="tt.jpg?' + Math.random() + '" ' +
'onerror="testConnectionCallback(false);" ' +
'onload="testConnectionCallback(true);">';
testConnectionCallback = function(result){
callBack(result);
var element = document.getElementById('testImage');
element.parentNode.removeChild(element);
}
}
</script>
<!-- usage example -->
<script>
function myCallBack(result)
{
alert(result);
}
</script>
<a href="#" onclick="testConnection(myCallBack);">Am I online?</a>
Just use navigator.onLine; if this is true then you're online, else offline.
Send a HEAD request inside the error handler of your request:
$.ajax({
url: /your_url,
type: "POST or GET",
data: your_data,
success: function(result){
//do stuff
},
error: function(xhr, status, error) {
//detect if user is online and avoid the use of async
$.ajax({
type: "HEAD",
url: document.location.pathname,
error: function() {
//user is offline, do stuff
console.log("you are offline");
}
});
}
});
You can try this; it will return true if the network is connected:
function isInternetConnected(){return navigator.onLine;}
Here is a snippet of a helper utility I have. This is namespaced javascript:
network: function() {
var state = navigator.onLine ? "online" : "offline";
return state;
}
You should use this with feature detection, or else fall back to an 'alternative' way of doing this. The time is fast approaching when this will be all that is needed. The other methods are hacks.
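A sketch of the feature detection suggested above:
function isOnline() {
    // Use the property where the browser provides it...
    if (typeof navigator !== 'undefined' && 'onLine' in navigator) {
        return navigator.onLine;
    }
    // ...otherwise fall back to an alternative check (e.g. a request-based one).
    return true;
}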
There are 2 answers to this for two different scenarios:
If you are using JavaScript on a website (i.e. any front-end part),
The simplest way to do it is:
<h2>The Navigator Object</h2>
<p>The onLine property returns true if the browser is online:</p>
<p id="demo"></p>
<script>
document.getElementById("demo").innerHTML = "navigator.onLine is " + navigator.onLine;
</script>
But if you're using JS on the server side (i.e. Node etc.), you can determine that the connection is lost by making failed XHR requests.
The standard approach is to retry the request a few times. If it doesn't go through, alert the user to check the connection, and fail gracefully.
