Facebook Messenger's in-app browser keeps caching the page - javascript

I'm having a serious issue with Android's built-in browser (I assume that's what Messenger uses anyway).
The HTML content on our page is dynamic and has to be regenerated on every reload; however, the content is being cached by the browser.
Cache-busting the scripts, CSS and so on has no effect, since the URLs being loaded are cached as well.
Example:
echo time();
will output 1571044358.
However, every time I re-open the link (from Messenger), it outputs 1571044358 again instead of actually reloading the page.
Chrome and regular browsers work fine; it's just the in-app browser that's causing the issue.
Any ideas on how to solve this?
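No answer is included above, but one client-side workaround I can sketch (an assumption of mine, not something from the question) is to have the server embed its render time in the page and force a cache-busting reload when the copy shown by the in-app browser is stale. Here window.renderedAt, the 60-second window and the nocache parameter are all placeholders:

(function () {
    // window.renderedAt is assumed to be written by the server, e.g.
    // <script>window.renderedAt = <?php echo time(); ?>;</script>
    var maxAgeSeconds = 60; // hypothetical freshness window
    var ageSeconds = Date.now() / 1000 - window.renderedAt;
    if (ageSeconds > maxAgeSeconds && !/[?&]nocache=/.test(location.search)) {
        var sep = location.search ? "&" : "?";
        location.replace(location.pathname + location.search + sep + "nocache=" + Date.now());
    }
})();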

Related

Is there a way to check if the page was loaded from browser cache?

The issue this question originates from is the following: I'm using a TiddlyWiki (Classic) SPA on my Android device and usually use it with Firefox and its TiddlyFox extension for saving. For various reasons I'd like to be able to work with (and save) my TWs using other browsers, so I'm testing it with a PHP back-end (my fork of MicroTiddlyServer, whose code is not important here, I believe, plus this PHP server).
In my tests I've noticed that although saving works fine, sometimes (at least when the PHP server is unloaded from memory by Android's ugly, seemingly non-configurable "optimization") a TW is loaded from cache, so it appears as it did before the latest save, not after.
So, what I want is to detect whether the page was loaded in the ordinary way or from the browser cache. Is it possible to check this via JavaScript?
As a worse alternative I can inject a timestamp via MTS and check it in a TW on load, but I'd like to avoid this complication (which involves both the front-end and the back-end and adds more TW file manipulation).
With the new Resource Timing Level 2 spec you can use the transferSize property to check whether the page was loaded from cache:
var isCached = performance.getEntriesByType("navigation")[0].transferSize === 0;
Spec: https://www.w3.org/TR/resource-timing/#dom-performanceresourcetiming-transfersize
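A slightly expanded sketch of the same check (my own wording, not from the answer), guarding against browsers that don't expose a navigation entry before reading transferSize:

var navEntries = performance.getEntriesByType("navigation");
var isCached = navEntries.length > 0 && navEntries[0].transferSize === 0;
console.log(isCached ? "page appears to have come from cache" : "page was fetched over the network");

Note that transferSize is also 0 in some non-cache cases (e.g. responses served by a service worker), so treat it as a hint rather than a guarantee.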
If you use the remote debugger in Chrome, you can see the network requests and determine whether your item is cached or not. Firefox seems to have a remote debugger as well.

Force refresh doesn't work for Head JS

When scripts are loaded via Head JS I am unable to force the content to refresh using the Ctrl+F5 (or equivalent) keyboard shortcut.
The scripts cache correctly and the browser obeys the cache directives sent from the server (I'm using IIS 7.5). But unlike script tags included directly in the markup, I can't override the cache and force a refresh of the scripts loaded via Head JS.
I'm assuming this is a consequence of the way the scripts are loaded dynamically. I can live with this behaviour because forcing the refresh is only convenient during development, and I know of other ways I can force the content to be retrieved from the server.
I just wondered if anyone could explain why this is the case...
Update
This was never a problem for us in Live, because the cache directives for our static content were set appropriately. It was only ever a problem in Development and QA. The options left available to me were...
Configure all Dev and QA browsers to never cache content.
Configure the static content cache directives differently for Dev and QA environments - essentially setting MaxAge to something so small that the content would always be expired, and only setting the correct MaxAge value in Live.
I went with the second option.
Dynamic script loading is not a part of the page loading proper. When you force refresh, the browser reloads the page and all resources referenced in its HTML and in referenced CSS files, but the scripts you load with head.js are not referenced in the page content and the browser has no way to figure out that head.js is going to create references to additional resources. At the point where these references are created, the browser is no longer refreshing the page and thus normal cache rules apply.
You can force reload of your scripts by appending unique query strings to their URLs (e.g. jquery.js?random=437593486394), but this will disable caching for all loads of your page, not just when you force refresh.
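If you only need it in Dev and QA, a cache-busting sketch along those lines (assuming Head JS's head.load(url..., callback) signature; the file names and the v parameter are placeholders) could look like this:

// Swap Date.now() for a fixed build/version number in production so caching still works.
var bust = Date.now();
var scripts = ["/js/jquery.js", "/js/app.js"].map(function (src) {
    return src + (src.indexOf("?") === -1 ? "?" : "&") + "v=" + bust;
});
head.load.apply(head, scripts.concat(function () {
    // All scripts were requested with fresh query strings, so the cache was bypassed.
}));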
This is also a problem with require.js. Hopefully one of these workarounds will also apply to Head JS:
If using Chrome, open the developer tools panel on the Network tab, right-click and choose 'Clear Browser Cache'.
Do a bit of 'cache-busting' by appending a datetime stamp to the query string for JS resources.
If you're using IIS (which it looks like you are), go to the HTTP Response Headers panel of your website, click Set Common Headers and set Expire Web content to Immediately.
The last option is my preferred one for my development machine.
I wouldn't say it's a question of dynamic or not dynamic: when you inject a script, the browser still makes an HTTP request and applies whatever caching logic it normally applies.
As mentioned above, if you don't want scripts to be cached (dynamic or static, it doesn't matter), you will usually have to append a timestamp in the form of a query string.
If you just want to see whether your changes are working, do a force refresh in your browser, usually Ctrl+F5.

Browser loads duplicate copies of scripts

If I open my web page and then look at the developer tools in browsers like Chrome/Firefox/IE, I see multiple copies of the same view/JavaScript files loaded. This can potentially break how the JavaScript behaves with respect to its state.
Almost every time, doing a page refresh solves it.
It happens very randomly.
I am not at liberty to post my code, but the setup is nginx/Thin/Ruby/Haml.
What can lead to such behavior? Is the problem on the server side or the browser side?
Our initial hunch was that maybe they were multiple versions of the same document, but all of them are exact replicas, so I ruled out caching as a possible culprit.
More info about the page:
No dynamic loading of scripts - simple script tags
No frames on the page - simple body tag with a form in it
No advertisement scripts on the page
happens on all browsers
intermittent - frequency is like once in 500 page loads
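Not an answer to the root cause, but a small diagnostic sketch (my own assumption about how to confirm the duplicates in the wild) that logs any script URL appearing more than once in the document:

(function () {
    var counts = {};
    Array.prototype.forEach.call(document.scripts, function (s) {
        if (s.src) { counts[s.src] = (counts[s.src] || 0) + 1; }
    });
    Object.keys(counts).forEach(function (src) {
        if (counts[src] > 1) { console.warn("Duplicate script:", src, "loaded " + counts[src] + " times"); }
    });
})();

This only catches duplicate <script> elements in the DOM; a single tag fetched twice would only show up in the Network panel.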

How can I stop loading a web page if it is equipped with a frame-buster buster?

How can I stop loading a web page if it uses a frame-buster buster as mentioned in this question, or an even stronger X-Frame-Options: deny like stackoverflow.com? I am creating a web application that loads external web pages into an <iframe> via JavaScript, but if the user accidentally lands on a website like google.com or stackoverflow.com, which can bust a frame-buster, I just want to stop loading. stackoverflow.com shows a pop-up message asking whether to disable the frame and proceed, but I would rather stop loading the page. Google removes the frame without asking. I have absolutely no intent of clickjacking, and at the moment I only use this application myself. It is inconvenient that every time I land on such a site the frame is broken; I simply don't need to continue loading these pages.
Edit
Judging from the answers so far, it seems that I can't detect this before loading. Is it then possible to load the page in a different tab, check whether it has the frame-buster buster, and, if it doesn't, load it into the <iframe> in the original tab?
Edit 2
I can also retrieve the headers or the web page as an HTML string through the scripting language (Ruby) that I am using, so I think I do indeed have access to the information before loading it into an <iframe>.
There's no way to detect this before loading the page since the frame busting is done via a header or is triggered via JavaScript as the page is loading.
Without a server back-end you won't be able to, as you are pretty limited in the amount of tinkering you can do in JavaScript due to cross-domain policies.
You might want to consider creating some sort of a blacklist for URLs to stay away from...
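Building on Edit 2: since the headers can be fetched server-side, the pre-check belongs there. The question mentions Ruby; this is only a sketch of the same idea in Node.js, and it catches only the X-Frame-Options header, not JavaScript frame-busters or CSP frame-ancestors:

// Node.js 18+ (global fetch). Some servers reject HEAD requests, so a GET fallback may be needed.
async function canBeFramed(url) {
    const res = await fetch(url, { method: "HEAD", redirect: "follow" });
    const xfo = (res.headers.get("x-frame-options") || "").toUpperCase();
    return !xfo.includes("DENY") && !xfo.includes("SAMEORIGIN");
}

// Usage: only hand the URL to the <iframe> if the pre-check passes.
// canBeFramed("https://stackoverflow.com").then(function (ok) { console.log(ok); });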

Setting URL hash from Silverlight sometimes failing

In our Silverlight application we set the location hash of the browser window to bookmark the current control and the query parameters being requested. This is done through JavaScript via Silverlight like so:
var hashCode = "Example.ControlNamespace.ClassName?clientID=62189";
HtmlPage.Window.Eval(string.Format("window.location.hash='{0}'", hashCode));
This works well enough, but we get intermittent errors from production where this fails with a stack trace that ends at that line:
System.InvalidOperationException: Eval failed.
at System.Windows.Browser.HtmlWindow.Eval(String code)
This only happens occasionally, but I would like to know what is causing it. I've been able to replicate it once myself using IE8, so I don't think any obscure browsers are causing this. It seems that it is sometimes invalid to set the hash, but I don't know why. Also, if it matters, it's hosted on a secure connection, HTTPS.
Thanks in advance.
Edit: I was able to replicate it again. When debugging the JavaScript, the error was 'permission denied'. This seems to happen only on the first load of the page, so maybe the page isn't finished loading and the URL hash is not allowed to be changed until it is complete?
This may be associated with this particular issue here:
Suppress navigation when setting HtmlPage.Window.CurrentBookmark property in Silverlight.
The behavior I've seen is that when you set the hash in IE after a redirect, the page refreshes (rather than giving you a "permission denied"), but perhaps there are other scenarios where you're not allowed to do so, e.g., if you're running under HTTPS.
If it does turn out that this is the problem, the only real workaround I've seen is to detect if you're in that scenario (i.e., you've reached this page after a redirect, and you're running in IE), and refresh the page (using JavaScript) before you load your Silverlight application.
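A sketch of that workaround (my own illustration: cameFromRedirect is a hypothetical flag your page would set server-side, and the refreshed parameter only prevents a reload loop). It would need to run before the Silverlight <object> is created:

(function () {
    var isIE = /MSIE|Trident/.test(navigator.userAgent);
    var alreadyRefreshed = /[?&]refreshed=1/.test(window.location.search);
    if (isIE && window.cameFromRedirect && !alreadyRefreshed) {
        var sep = window.location.search ? "&" : "?";
        window.location.replace(window.location.pathname + window.location.search +
            sep + "refreshed=1" + window.location.hash);
    }
})();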
