I am trying to check whether there is an internet connection from within a Chrome plugin. While there is an API call available that suggests it does exactly that, it only checks whether an internet connection is theoretically possible.
To get this information, I try to load an image:
function checkConnection(url) {
    var newImg = new Image();
    // attach the handlers before setting src so no event is missed
    newImg.onload = function() { /* ... */ };
    newImg.onerror = function() { /* ... */ };
    newImg.src = url;
}
I use the Image object to avoid all the Same-Origin-Policy problems that occur when making GET requests with JavaScript (I am using the code from within a plugin, so there is no address that shares an origin with my code :/ ). In principle, my code above works. I have a setTimeout that calls checkConnection at regular intervals. However, once the image has loaded successfully (per plugin reload), it is stored in the cache and is loaded from there even when the connection fails.
Do you have an idea how to get around this problem? Or do you know of a smart way to check the internet connection from within a Chrome plugin without setting up a server that would tolerate the origin of my requests?
You have 3 options that I know of...
Checking the internet connection
After some quick Googling I found navigator.onLine. It looks like you can use this to check internet connectivity, but it is not always entirely accurate.
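A minimal sketch of that approach (keeping in mind that onLine only tells you whether the browser thinks it has a network interface up, not whether the internet is actually reachable):
// Log connectivity changes using navigator.onLine and the
// online/offline events.
function logStatus() {
    console.log(navigator.onLine ? 'online' : 'offline');
}

window.addEventListener('online', logStatus);
window.addEventListener('offline', logStatus);
logStatus(); // report the initial state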
Stopping the image from caching
If you have control of the domain where the image is hosted you can set headers telling the browser not to cache the image.
Cache-Control: no-cache, must-revalidate // HTTP/1.1
Expires: Sat, 26 Jul 1997 05:00:00 GMT // Date in the past
If you do NOT have control of the domain, when loading the image try appending a GET variable to the URL with a random number or the current timestamp.
E.g.
https://example.com/image.png?time=1496222683
https://example.com/image.png?time=1496223323
https://example.com/image.png?time=1496222313
This can trick the browser into thinking you are requesting a new resource.
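For instance, here is a sketch of the checkConnection function from the question with this cache-buster applied (the URL is a placeholder; point it at any image you expect to be reachable):
// Probe connectivity with an image; the timestamp parameter
// makes every request URL unique, so the cache is bypassed.
function checkConnection(url, onOnline, onOffline) {
    var img = new Image();
    img.onload = onOnline;
    img.onerror = onOffline;
    img.src = url + '?time=' + new Date().getTime();
}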
Related
For testing, I downloaded images from the net and uploaded them using Valum file upload in Chrome. Chrome is not sending the session cookie along with these request headers (I don't see it on the server side, though I do see it in the developer tools). Does Chrome know that these images are from a different domain? What is happening? Is there a workaround for this, to pass the session id (as a cookie)? It also happens in IE10, which makes me believe it is some standard and not just a Chrome issue. The problem is not there with Firefox/Safari/Opera.
It is fine when uploading to localhost; only when uploading to a different server with a domain name is there this problem, leading to a new session being created.
Update:
I have added xhr.withCredentials = true, still no use.
I also added the following on the server side for the upload URL...
res.setHeader 'Access-Control-Allow-Origin', '*'
res.setHeader 'Access-Control-Allow-Credentials', true
# note: browsers reject credentialed requests when the allowed origin is the wildcard '*'
I don't know how helpful this would be, because I would have already sent the uploaded file by then, and the response headers will not be of much help.
Basically, the problem is that I don't have access to the session variables on the server side, since the session id (sid) cookie is not coming back, so I am not able to save the upload details into the current session (because a new session is created).
Update:
I tried creating an image on the desktop using Paint. Even then Chrome would not send the cookies. Really drives me crazy...
First of all, to get the basics out of the way, this is unrelated to the origin of the image. Chrome or other browsers don't care where you get your images.
It's rather difficult to guess exactly what's going on; it would have helped to see a jsfiddle or some more explanation of the setup. Based on what I can guess, you might be using different domains for the page where the upload button is hosted and for the target URL where you're sending your files (even using SSL for one and HTTP for the other makes them different). Even different subdomains will not allow cookies to be passed if the cookies were not set with a base domain (yourdomain.com).
So, if subdomains are the problem, you know what to do: set a base domain so your cookies go with requests to any subdomain.
If it's HTTP vs. HTTPS, you need to use HTTPS (or HTTP) consistently, because you can't share cookies between the two.
If that's not it, or if you're using completely different domains, you can access your cookies locally via script (if they're not marked as http only) and add them to the upload request. Valum 2.0 (don't know about v1.0) lets you add parameters to the request like so:
var uploader = new qq.FileUploader({
    element: document.getElementById('file-uploader'),
    action: '/server-side.upload',
    // additional data to send, name-value pairs
    params: {
        param1: 'value1',
        param2: 'value2'
    }
});
You can't set cookies on a domain which is not the page's domain via script, so when using completely different domains your only choice is request params.
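As a sketch of that idea (the cookie name 'sid' is an assumption; substitute whatever your server's session cookie is called, and note this only works if the cookie is not marked HttpOnly):
// Read a cookie by name from document.cookie.
function getCookie(name) {
    var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
    return match ? decodeURIComponent(match[1]) : null;
}

var uploader = new qq.FileUploader({
    element: document.getElementById('file-uploader'),
    action: '/server-side.upload',
    // pass the session id explicitly, since the cookie won't be sent
    params: {
        sid: getCookie('sid')
    }
});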
It is possible that the uploader is using Flash under some circumstances to do the upload. There is a bug in Flash which prevents cookies being sent for these types of requests, which would explain the behaviour you are seeing. The workaround is to pass in the sessionId and transmit it in a different way, e.g. in the query string.
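A sketch of that workaround, reusing the hypothetical getCookie helper sketched above and putting the session id on the query string (the sid parameter name is an assumption; match whatever your server expects):
var uploader = new qq.FileUploader({
    element: document.getElementById('file-uploader'),
    // the session id travels in the URL, surviving even when cookies are dropped
    action: '/server-side.upload?sid=' + encodeURIComponent(getCookie('sid'))
});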
I don't know how to ask this question, but: I am developing a single-page application (SPA) using Node.js on the server side, and whenever the data gets updated the user is informed. However, if the user refreshes, wouldn't the JSON data and every script file just vanish and get requested from the server again?
How can I prevent the JavaScript files, and specifically the file that holds the JSON data, from being requested again on page refresh?
Is there a way to solve this problem?
JavaScript files are not special. Just like images, style sheets, and HTML files, they get re-requested as necessary by the browser.
And so the same techniques for minimizing re-retrieval of them apply. The browser can reuse its cached copy of the file if you configure your web server to set appropriate caching headers when responding with the file data (provided the browser still has a copy).
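Since the question mentions Node.js, here is a minimal sketch, assuming an Express server, of sending long-lived caching headers for static files:
// Serve files from ./public with a far-future lifetime so the
// browser can reuse them on refresh without re-requesting.
var express = require('express');
var app = express();

app.use(express.static('public', {
    maxAge: '365d' // sends Cache-Control: public, max-age=31536000
}));

app.listen(3000);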
You can see an example of this on the Google Libraries site. If you request a specific version of a library file (say, jQuery 1.10.1) with your web console open to the Network tab, you'll see that Google returns it with these headers (irrelevant ones omitted):
Age: 238894
Cache-Control: public, max-age=31536000
Date: Thu, 09 Jan 2014 20:47:08 GMT
Expires: Fri, 09 Jan 2015 20:47:08 GMT
Last-Modified: Tue, 09 Jul 2013 11:31:25 GMT
Note that the file is allowed to be cached, without revalidation, for a year. So if the user refreshes the page, the browser can reuse its cached copy (if it has one). (This is not what Google does if you use one of the wildcard "any version of jQuery 1.10" URLs, because of course the tip version changes...)
Some browsers may bypass their cache with a refresh (particularly a "force" refresh like Ctrl+F5). In that case, they may at least send an If-Modified-Since request.
If you want to prevent caching and reload the JavaScript on every request, make sure you use the correct header:
Cache-Control: max-age=0
Your browser will understand that it needs to revalidate all resources on every request.
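For example, a sketch assuming an Express route (the path and payload are placeholders):
// Force revalidation of this response on every request.
app.get('/data.json', function (req, res) {
    res.set('Cache-Control', 'max-age=0');
    res.json({ updated: Date.now() });
});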
For a better understanding of caching, please have a look at this Q&A.
As for the JSON data, you can save it in local storage (http://www.w3schools.com/html/html5_webstorage.asp).
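A small sketch of that (the 'appData' key is a placeholder; localStorage only stores strings, hence the JSON round-trip):
// Persist the JSON data across page refreshes.
function saveData(data) {
    localStorage.setItem('appData', JSON.stringify(data));
}

// Returns the saved data, or null if nothing was stored yet.
function loadData() {
    var raw = localStorage.getItem('appData');
    return raw ? JSON.parse(raw) : null;
}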
Perhaps you can try Application Cache and Local storage.
We use a heatmap at work (it is secure, so I can't post it here and I certainly cannot give the URL). Now the heatmap is a single image, but it is clickable on different segments - there are different icons on the map and each of them pops up a different window (so it is not just a single click event for the whole image)...
Now this map is generated by a program called WhatsUp Gold, and it is definitely not Flash...
The map constantly monitors line connection activity - when an internet line goes down, the green icon turns red. The map is refreshed every few minutes...
What I want to know: is there a way, using a browser plugin, JavaScript, or any other method, to be notified that the status of the map has changed (any change on the map) without having to open the browser window every time?
Here is the part of the markup that makes up the map... (the whole map):
<a href="mappickhost.asp?map=WhatsUpTL.wup">
<img border="0" src="WUG1.jpg" ismap="">
</a>
Update:
Request and response headers (as retrieved from Firebug's Network tab):
Request:
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.5
Authorization: Basic dGxpbms6dGxpbms=
Cache-Control: max-age=0
Connection: keep-alive
Host: ***************
Response:
Cache-Control: no-cache
Content-Type: image/jpeg
Date: Fri, 17 May 2013 07:06:30 GMT
Expires: Mon, 17 Jul 1980 20:00:00 GMT
Pragma: no-cache
Server: WhatsUp_Gold/8.0
I have added a screenshot of the Firebug console in Firefox (press F12 to open it). I executed an XMLHttpRequest (line at the bottom) and you can see (the black line) that it connected to stackoverflow. Firebug shows XHR connections in the console by default, but you can select what you want to see by clicking the little arrow button to the right of the word "Console" in the tab that activates the console.
Even having the console open on StackOverflow shows that the site connects every so often to update question status and comments.
If nothing shows up there, then it might show in the Net tab. The page has to make a connection to somewhere to see whether the network is still up. If it connects to a service, then XHR is likely; if it just downloads a text file occasionally or sets the src of an image, then it will show up only in the Net tab every time it checks.
We use a heatmap at work (it is secure so I can't post it here and I certainly cannot give the URL). Now the heatmap is a single image, but it is clickable on different segments
Surely it's not just the image that's causing interaction. Plain images can't do that. If it's not Flash, then it could be JavaScript powering it. There must be some library and it could have an API. I suggest you start reading the documentation (if any).
The map constantly monitors line connection activity
This is done in a lot of ways; the most common are polling via AJAX or Web Sockets. To check this out, you can take a look at the Network tab of the browser's debugger. If there's a supported API, check it out first and save yourself the hassle of debugging the implementation.
If no API exists, inspect the requests and find a way to replicate them. Once you can replicate them or use the API, you can build your plugin or a custom app for it.
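As a rough sketch of such a custom app (everything here is an assumption: the image URL, the Basic auth header copied from the request shown above, and hashing the bytes as a cheap way to detect "any change on the map"; the Notification API also requires permission to have been granted):
// Poll the map image and raise a desktop notification when it changes.
async function pollMap(url) {
    let lastHash = null;
    setInterval(async () => {
        // cache-busting timestamp, since the server marks the image no-cache anyway
        const res = await fetch(url + '?t=' + Date.now(), {
            headers: { 'Authorization': 'Basic dGxpbms6dGxpbms=' }
        });
        const bytes = await res.arrayBuffer();
        const digest = await crypto.subtle.digest('SHA-256', bytes);
        const hash = Array.from(new Uint8Array(digest)).join(',');
        if (lastHash !== null && hash !== lastHash) {
            new Notification('The heatmap has changed');
        }
        lastHash = hash;
    }, 60000); // check every minute
}

pollMap('WUG1.jpg');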
I make an Ajax request in which I set the response cacheability and last modified headers:
if (!String.IsNullOrEmpty(HttpContext.Current.Request.Headers["If-Modified-Since"]))
{
    HttpContext.Current.Response.StatusCode = 304;
    HttpContext.Current.Response.StatusDescription = "Not Modified";
    return null;
}

HttpContext.Current.Response.Cache.SetCacheability(HttpCacheability.Public);
HttpContext.Current.Response.Cache.SetLastModified(DateTime.UtcNow);
This works as expected. The first time I make the Ajax request, I get 200 OK. The second time I get 304 Not Modified.
When I hard refresh in Chrome (Ctrl+F5), I get 200 OK - fantastic!
When I hard refresh in Internet Explorer/Firefox, I get 304 Not Modified. However, every other resource (JS/CSS/HTML/PNG) returns 200 OK.
The reason is that the "If-Modified-Since" header is sent for XMLHttpRequests regardless of a hard refresh in those browsers. I believe Steve Souders documents it here.
I have tried setting an ETag and conditioning on "If-None-Match" to no avail (it was mentioned in the comments on Steve Souders page).
Has anyone got any gems of wisdom here?
Thanks,
Ben
Update
I could check the "If-Modified-Since" header against a stored last-modified date. However, hopefully this question will help other SO users who find the header to be set incorrectly.
Update 2
Whilst the request is sent with the "If-Modified-Since" header each time, Internet Explorer won't even make the request if an expiry isn't set or is set to a future date. Useless!
Update 3
This might as well be a live blog now. Internet Explorer doesn't bother making the second request when the host is localhost. Using a real IP or the loopback address will work.
Prior to IE10, IE does not apply the Refresh Flags (see http://blogs.msdn.com/b/ieinternals/archive/2010/07/08/technical-information-about-conditional-http-requests-and-the-refresh-button.aspx) to requests that are not made as part of loading the document.
If you want, you can adjust the target URL to contain a nonce to prevent the cached copy from satisfying a future request. Alternatively, you can send max-age=0 to force IE to conditionally revalidate the resource before each reuse.
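A sketch of the nonce variant on the client side (the endpoint name is a placeholder):
// A unique query parameter means IE's cached entry can never
// satisfy the next XMLHttpRequest.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/api/data?nonce=' + new Date().getTime(), true);
xhr.onload = function () {
    console.log(xhr.status); // 200 from the server, never from cache
};
xhr.send();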
As for why the browser reuses a cached resource that didn't specify a lifetime, please see http://blogs.msdn.com/b/ie/archive/2010/07/14/caching-improvements-in-internet-explorer-9.aspx
The solution I came upon for consistent control was managing the cache headers for all request types.
So I forced standard requests to behave the same as XMLHttpRequests, which meant telling IE to use the following cache policy: Cache-Control: private, max-age=0.
For some reason, IE was not honoring headers for various request types. For example, my cache policy for standard requests defaulted to the browser's, while for XMLHttpRequests it was set to the aforementioned policy. However, making a request to something like /url as a standard GET request rendered the result properly. Unfortunately, making the same request to /url as an XMLHttpRequest would not even hit the server, because the GET request had been cached and the XMLHttpRequest was hitting the same URL.
So, either force your cache policy on all fronts or make sure you're using different access points (URIs) for your request types. My solution was the former.
I am working with user data and cannot allow it to be cached.
I am testing with Firefox 4 and a Tomcat 6 server with Cache-Control set to no-cache, no-store, private, must-revalidate, max-age=0. I have also set the Expires header.
But Firefox is still generating a wyciwyg (what-you-cache-is-what-you-get) file in its cache. This might be considered a history mechanism rather than a caching mechanism, but it still stores user data.
An example is http://www.w3schools.com/Ajax/ajax_example.asp.
In Firefox 4.0, the about:cache page (specifically about:cache?device=disk) shows the wyciwyg:// file, with the expiration date set to "No expiration time". What is worse is that the cache persists after the browser is closed and restarted.
After investigating, I found that, for me, the wyciwyg entry is generated from the contents of a document.write() JavaScript call.
How do I prevent Firefox from caching this? Ideally without changing the document.write()?
Did you check with Firebug that the cache headers are actually sent to the browser?
But in any case, there's a simple solution to prevent caching: add a timestamp parameter to the request URL.
url = '/my/ajax/script?_=' + new Date().getTime();
This trick is also used by many JS libraries. If you're using jQuery, it suffices to pass cache: false to the AJAX request options.
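For example, a minimal sketch (jQuery appends a _={timestamp} parameter to the URL when cache is false, much like the manual trick above):
$.ajax({
    url: '/my/ajax/script',
    cache: false, // jQuery adds the timestamp parameter automatically
    success: function (data) {
        console.log(data);
    }
});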