I've tried a lot of things to check whether my app's users have an internet connection. My first research brought me to the navigator.onLine property. It works sometimes, but sometimes it doesn't, and this is nothing new: I've seen multiple people complaining about it.

Then I tried an XHR request. It worked on every device, whatever the browser (unlike the previous method), but I got warnings from Chrome and Firefox because the request was synchronous and could slow down the whole app.

So I converted my function into an asynchronous one:
function verification() {
    var xhr = new XMLHttpRequest();
    // The random query parameter prevents the favicon from being served from cache
    xhr.open("GET", "//" + window.location.hostname + "/ressources/favicon.ico?rand=" + Date.now(), true);
    xhr.onload = function (e) {
        if (xhr.readyState === 4 && xhr.status === 200) {
            reconnection();
        }
    };
    xhr.onerror = function (e) {
        deconnexion();
    };
    xhr.send(null);
}
The idea is simple: I check whether I can access the favicon (with a random parameter to make each request unique and avoid the cached resource). If the request succeeds, I consider myself connected; if not, I'm not connected. So far it seems to work pretty well.

My question is: is this the best way to do it in pure JS? Or should I consider using fetch instead? Or is there a better way I haven't found?
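Since fetch came up, the fetch equivalent I have in mind would look something like this (an untested sketch, reusing the same favicon path and the reconnection()/deconnexion() handlers from above):

function verification() {
    // HEAD keeps the transfer small; cache: "no-store" replaces the rand trick
    fetch("//" + window.location.hostname + "/ressources/favicon.ico?rand=" + Date.now(), {
        method: "HEAD",
        cache: "no-store"
    }).then(function (response) {
        if (response.ok) {
            reconnection();
        } else {
            deconnexion();
        }
    }).catch(function () {
        // The promise rejects on network failure (offline, DNS error, etc.)
        deconnexion();
    });
}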
I'm trying to use the API from https://www.themoviedb.org/. (The key is free and can be changed easily, so I'll include it, because without it you can't even test their functions.)

My JavaScript works fine in Firefox when it's hosted locally, but not on GitHub Pages.

Here is a function that doesn't work. The error is:

NetworkError: A network error occurred.

…and it appears to happen after bhttp.send();.
function getMovieDetails() {
    var reqURL = "https://api.themoviedb.org/3/movie/latest?api_key=afe4e10abbb804e2b4a4f8a3ef067ad5&language=en-US";
    var bhttp = new XMLHttpRequest();
    bhttp.open("GET", reqURL, false);
    bhttp.setRequestHeader("Content-type", "json");
    bhttp.send();
    var response = JSON.parse(bhttp.responseText);
    var str = JSON.stringify(response, null, 2);
    return response;
}
console.log(getMovieDetails());
It works fine in Chrome. Googling suggests it's a CORS problem, but as far as I know GitHub Pages supports CORS, so I don't know what I'm doing wrong.
I'm not a Firefox user, so you will need to test this, but if the theory that the synchronous call is being blocked is true, this should work.

I've modified it to use a simple callback. Personally I wouldn't use callbacks but would turn this into promises, but that's another question :)
function getMovieDetails(callback) {
    var reqURL = "https://api.themoviedb.org/3/movie/latest?api_key=afe4e10abbb804e2b4a4f8a3ef067ad5&language=en-US";
    var bhttp = new XMLHttpRequest();
    bhttp.open("GET", reqURL, true);
    bhttp.setRequestHeader("Content-type", "json");
    bhttp.onload = function() {
        if (bhttp.readyState === 4) {
            if (bhttp.status === 200) {
                callback(JSON.parse(bhttp.responseText));
            } else {
                console.error(bhttp.statusText);
            }
        }
    };
    bhttp.send();
}

getMovieDetails(function (movie) {
    console.log(movie);
});
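If you did go the promise route, a rough sketch (the same request, just wrapped in a Promise; untested) might look like:

function getMovieDetails() {
    return new Promise(function (resolve, reject) {
        var reqURL = "https://api.themoviedb.org/3/movie/latest?api_key=afe4e10abbb804e2b4a4f8a3ef067ad5&language=en-US";
        var bhttp = new XMLHttpRequest();
        bhttp.open("GET", reqURL, true);
        bhttp.onload = function () {
            if (bhttp.status === 200) {
                resolve(JSON.parse(bhttp.responseText));
            } else {
                reject(new Error(bhttp.statusText));
            }
        };
        bhttp.onerror = function () {
            reject(new Error("Network error"));
        };
        bhttp.send();
    });
}

getMovieDetails().then(function (movie) {
    console.log(movie);
}).catch(function (err) {
    console.error(err);
});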
Alright, looks like skirtle was correct: the issue was the Firefox add-on Privacy Badger blocking the API. I feel pretty stupid now, but at least my code is now pretty clean.
I'm currently working on a project where we fetch a bunch of images over AJAX.

We do quite a lot of these requests, and IE11 seems to lose quite a few of them.

We get the image, call URL.createObjectURL, and get a valid URL, but by the next line the image is gone. This works fine in other browsers, so I'm assuming we're hitting some limit in IE.

Is there a nicer way of detecting whether this URL is valid than trying to load it?

It seems a little redundant to have to:

AJAX a file.
Get its address on the user's computer.
AJAX that address to make sure it's there.
Code:
var urlObj = window.URL || window.webkitURL;
var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function() {
    if (this.readyState == 4) {
        active--; // request counter declared in the surrounding (elided) code
        if (this.status == 200) {
            var blobUrl = urlObj.createObjectURL(new Blob([this.response], {type: 'image/jpeg'}));
            image.setSource(blobUrl);
        }
    }
};
xhr.open('GET', url);
xhr.responseType = 'arraybuffer';
xhr.send();
So if you call this a lot (with different URLs), the blobUrl we pass to setSource is a sensible-looking object URL, but when you try to use it, you get an error.

IE has either cleaned up the memory, lost the image, or something similar. Bear in mind we're not changing the page, losing the session, or revoking the blobUrl.

The only way I can think of to check whether this has happened (i.e. that the blobUrl no longer points at anything) is to fire another AJAX request at the blobUrl and check that it returns 200, etc.

Is there a better way? I'm imagining not.

I've carved this code chunk out of a larger block, so if there are any obvious mistakes, that's why. It works around 95% of the time; if I fire the AJAX check afterwards and reload the image on failure, that fixes the issue.

Weird IE behaviour. Has anyone else seen this?

More info: we're not using pdf.js, but the same problem is reported here: https://connect.microsoft.com/IE/feedback/details/813485/resource-blob-not-found-when-using-url-createobjecturl-blob
This is (more or less) the check that we are using:
var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function() {
    if (this.readyState == 4) {
        if (this.status == 200 || (this.response && this.response.type && this.response.type == "image/jpeg")) {
            success(blobUrl);
        } else {
            fail(blobUrl);
        }
    }
};
xhr.open('GET', blobUrl);
xhr.responseType = 'blob';
xhr.send();
We have to use that odd response.type check because Firefox comes back with status == 0.
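For completeness, the retry flow mentioned above looks roughly like this (a sketch only; fetchImage stands for the arraybuffer request from the question and checkBlobUrl for the check above, and both names are placeholders):

// Hypothetical wrapper: fetch the image, verify the blob URL, refetch on failure
function loadImageWithCheck(url, image, retriesLeft) {
    fetchImage(url, function (blobUrl) {
        checkBlobUrl(blobUrl, function () {
            image.setSource(blobUrl); // the blob URL still points at the image
        }, function () {
            if (retriesLeft > 0) {
                loadImageWithCheck(url, image, retriesLeft - 1); // IE lost the blob: fetch it again
            }
        });
    });
}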
My script sends a GET request to a page (http://example.org/getlisting/), and the page, in turn, responds with a JSON object ({"success":true, "listingid":"123456"}).
Here's an example snippet:
var listingAjax = new XMLHttpRequest();
listingAjax.addEventListener("load", listingCallback, false);

function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
}

listingAjax.open("GET", "http://example.org/getlisting/", true);
listingAjax.send();
Simple enough. The script works perfectly too! The issue arises when I want to do this:
var listingAjax = new XMLHttpRequest();
listingAjax.addEventListener("load", listingCallback, false);

function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
}

window.setInterval(function() {
    listingAjax.open("GET", "http://example.org/getlisting/", true);
    listingAjax.send();
}, 250);
What I imagine should happen is that my script creates a steady flow of GET requests to the server, and the server responds to each one. My script then receives the responses and passes them to the callback.

To be more exact, say I let this script run for 5 seconds and it sends out 20 GET requests in that time. I would expect my callback (listingCallback) to be called 20 times as well.

The issue is, it isn't. It almost seems that if I send out two GET requests before receiving a response from the server, the response is ignored or discarded.

What am I doing wrong or misunderstanding here?
Many browsers have a built-in maximum number of open HTTP connections per server; you might be hitting that wall.

Here is an example from Mozilla, but most browsers have something like this built in: http://kb.mozillazine.org/Network.http.max-connections-per-server

An earlier question regarding Chrome:

Increasing Google Chrome's max-connections-per-server limit to more than 6

If you're on Windows, take a look at a tool like Fiddler; you might be able to see whether all of the requests are actually being issued, or whether the browser is queueing or killing some of them.
You can't reuse the same XMLHttpRequest object to open a new connection while one is still in progress; doing so abruptly aborts the pending request (tested in Chrome). Using a new XMLHttpRequest object for each call solves that:
function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
}

window.setInterval(function() {
    var listingAjax = new XMLHttpRequest();
    listingAjax.addEventListener("load", listingCallback, false);
    listingAjax.open("GET", "http://example.org/getlisting/", true);
    listingAjax.send();
}, 250);
This will work nicely, queueing a new ajax request each interval.

Fiddle

Note that overly frequent calls may cause slowdowns due to the browser's inherent limit on concurrent ajax calls. Modern browsers have a fairly generous limit and very good parallelism, though, so as long as you're fetching just a small JSON object, they should be able to keep up even over dial-up.

The last time I wrote an ajax polling script, I started the next request in the success handler of the previous one instead of using an interval, to minimize ajax calls. I'm not sure whether that logic applies to your app, though.
I've been working on a Windows gadget (meaning the "browser" is Internet Explorer) that queries specified subnet addresses for information. It sometimes does this at a relatively quick pace (roughly every 5 seconds) and it works well enough.

However, sometimes it gets stuck at ready state 1 and just stays there forever. Whenever the gadget then re-runs the function that creates the XMLHttpRequest and reads information from it, it stays at state 1. This is easy to reproduce by opening multiple instances of the gadget and then closing all but one of them; at that point, the one still open will almost always get stuck in this state.

I feel like it might have something to do with them all accessing the same website, or it may just be that an XMLHttpRequest stopped mid-transmission prevents another from working. Below is the relevant code.
//Reference to this for the inner function
var me = this;
var request = new XMLHttpRequest();
request.onreadystatechange = onReadyStateChange;
var url = this.url;
//Make the URL unique to prevent the response from being cached
url += ("&a=" + ((new Date()).getTime()));
Trace(DEBUG_COMM, "Sase.updateStatus url: " + url);
request.open("GET", url, true);
request.send(); // fire off the request; onReadyStateChange is called as things continue
Trace(DEBUG_COMM, "Request sent: " + request.readyState);

function onReadyStateChange() {
    Trace(DEBUG_COMM, "Sase.httpRequestReadyStateChange: state=" + request.readyState);
    if (4 == request.readyState) {
        Trace(DEBUG_COMM, "Sase.httpRequestReadyStateChange: status=" + request.status);
        if (request.status == 200) {
            Trace(DEBUG_COMM, "retrieved html: " + request.responseText);
            var results = request.responseText;
            var resultsString = request.responseText.toString();
            Trace(DEBUG_COMM, "results String: " + resultsString);
            me.ParseStatusData(resultsString);
        }
        else {
            //me.commError(request.status);
        }
        request = null;
    }
}
Well, it looks like I figured it out. I had a feeling it was an unresolved request, since the problem only happens when instances of the gadget are closed: if one of them is closed while communicating with the server, it stays "in communication" forever and no one else can access the server. That appears to be the case.

I made modifications in several places, but what it comes down to is that when the gadget closes, it makes sure to abort all of its requests. The requests are now instance variables (to allow aborting them) but are still created fresh every time they are needed.
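In outline, the teardown looks something like this (a sketch only; the variable and handler names here are placeholders, not the gadget's real code):

// Track every in-flight request so it can be aborted when the gadget closes
var pendingRequests = [];

function sendStatusRequest(url) {
    var request = new XMLHttpRequest();
    pendingRequests.push(request); // keep a handle for later aborting
    request.open("GET", url, true);
    request.send();
    return request;
}

// Called from the gadget's close/unload handler
function abortAllRequests() {
    for (var i = 0; i < pendingRequests.length; i++) {
        pendingRequests[i].abort(); // cancel anything still talking to the server
    }
    pendingRequests = [];
}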
For those who stumble across this and need a concrete code example, here you go.
I had the same problem, and the solution was to reuse the XMLHttpRequest object, ensuring that any previous request was cancelled before initiating a new one. This won't work if you want multiple AJAX requests in flight at once, but in my case triggering a new request meant the last one was no longer needed.

All of the requests on my page went through the same XMLHttpRequest wrapper method, which looked like this:
//Declare the XMLHttpRequest object to be re-used
var global_xHttpRequest = null;

function xmlHttpHandler(type, params, complete, postData) {
    //Prevents an issue where previous requests get stuck and new ones then fail
    if (global_xHttpRequest == null) {
        global_xHttpRequest = new XMLHttpRequest();
    } else {
        global_xHttpRequest.abort();
    }

    //Parse out the current URL
    var baseURL = window.location.host;
    var svc = "https://" + baseURL + "/processAction?";
    var url = svc + params;

    global_xHttpRequest.open(type, url, true);

    //Add the callback
    global_xHttpRequest.onreadystatechange = complete;
    global_xHttpRequest.send(postData);
}
Which can be used like this:
xmlHttpHandler("GET", params, completeFnc);
I have an issue, mainly with IE.

I need to be able to handle n queries one after another, but if I simply call my function below in a for loop, IE does some strange things (like completing only some of the calls).

If I use an alert box, it proves that the function receives all of the calls, and surprisingly IT WORKS!

My guess is that IE needs more time than other browsers, and the alert box buys it exactly that.

Here is my code:
var Ajax = function(all) {
    this.xhr = new XMLHTTPREQUEST(); // user-defined factory: returns an xhr object / ActiveX
    var xhr = this.xhr;
    var uri = this.uri = function(queries) { // Takes an object and formats a query string
        var qs = "", i = 0, len = size(queries); // size(): helper counting the object's keys
        for (var value in queries) {
            qs += value + "=" + queries[value];
            if (++i < len) { qs += "&"; } // no trailing "&" after the last pair
        }
        return qs;
    };
    xhr.onreadystatechange = function() { // called when content is ready
        if (this.readyState === 4) {
            if (this.status === 200) {
                all.success(this.responseText, all.params);
            }
            this.abort();
        }
    };
    this.post = function() { // POST
        xhr.open("POST", all.where, true);
        xhr.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
        xhr.send(uri(all.queries));
    };
    this.get = function() { // GET
        xhr.open("GET", all.where + "?" + uri(all.queries), true);
        xhr.send();
    };
    if (this instanceof Ajax) {
        return this;
    } else {
        return new Ajax(all); // called without `new`: construct a proper instance
    }
};
This function works perfectly for a single request, but how can I get it to work when called so many times within a loop?
I think the problem might be related to the two-concurrent-connections-per-server limit that most web browsers implement.

It looks like the latency of your web service's responses is making your AJAX requests overlap, which in turn exceeds the two-connection limit.
You may want to check out these articles regarding this limitation:
The Dreaded 2 Connection Limit
The Two HTTP Connection Limit Issue
Circumventing browser connection limits for fun and profit
This limit is also suggested in the HTTP/1.1 spec (section 8.1.4, last paragraph), which is probably the main reason most browsers impose it.

To work around this problem, consider relaunching your AJAX request ONLY after a successful response from the previous call. This prevents the overlap. Consider the following example:
function autoUpdate() {
    var ajaxConnection = new Ext.data.Connection();
    ajaxConnection.request({
        method: 'GET',
        url: '/web-service/',
        success: function (response) {
            // Add your logic here for a successful AJAX response.
            // ...
            // ...
            // Relaunch the autoUpdate() function in 100ms (could be less or more).
            setTimeout(autoUpdate, 100);
        }
    });
}
This example uses ExtJS, but you could very easily use just XMLHttpRequest.
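A rough plain-XMLHttpRequest equivalent would be (a sketch; the URL and response handling are placeholders):

function autoUpdate() {
    var xhr = new XMLHttpRequest();
    xhr.onload = function () {
        if (xhr.status === 200) {
            // Handle the successful response here...
            // ...then relaunch only after this request has completed.
            setTimeout(autoUpdate, 100);
        }
    };
    xhr.open("GET", "/web-service/", true);
    xhr.send();
}

autoUpdate();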
Given that most browsers limit you to two concurrent connections per domain, launching more than two concurrent requests confers no speed advantage. Launch two, then dequeue and launch another each time one completes.
I'd suggest throttling your requests so you only have a few (4?) outstanding at any given time. You're probably seeing the result of multiple requests being queued and timing out before your code can handle them all. Just a guess, though. We have an ajax library with built-in throttling that queues requests so only 4 are outstanding at any one time, and we don't see any problems; we routinely queue lots per page.
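For illustration, a minimal throttle along those lines might look like this (a sketch, not our library; the limit of 4 matches the suggestion above):

var MAX_CONCURRENT = 4;
var activeCount = 0;
var queued = [];

function throttledGet(url, callback) {
    queued.push({ url: url, callback: callback });
    next();
}

function next() {
    if (activeCount >= MAX_CONCURRENT || queued.length === 0) {
        return; // at the limit, or nothing waiting
    }
    var job = queued.shift();
    activeCount++;
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
            activeCount--;
            job.callback(xhr);
            next(); // a slot freed up: launch the next queued request
        }
    };
    xhr.open("GET", job.url, true);
    xhr.send();
}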
Your code looks like it's put together using the constructor pattern. Are you invoking it with the new operator, like var foo = new Ajax(...), in your calling code? Or are you just calling it directly, like var foo = Ajax(...)?

If the latter, you're likely overwriting state on your later calls. The function looks designed to create an object, on which the get/post methods are then called. That could be your problem if you're "calling it within a loop" as you say.