I am working on an Angular 2 project and ran into an issue where Chrome caches the HTML templates. It is not a problem during development, since I can make Chrome ignore the cache while DevTools is open.
But it is a real pain for the customer and his users...
I tried adding the following to templateUrl:
templateUrl: './pages/add-financial-deal.html?v=201610070907',
The updated URL is added into the auto-generated JS, but Chrome still uses the cached JS even if I press Shift + F5. See the screenshot: http://screencast.com/t/bc2nf4zVcm
Is there any reliable way to get it working?
Thanks
Edit: December 2nd
I have finally figured out what is happening. Despite the fact that I added changes to the TS (and the auto-generated JS was also updated), Chrome is still loading the file from the in-memory cache - screencast.com/t/FGuaMXaKL . I waited for 30 minutes, but it still loads the in-memory cached file. Is this intended behavior? How can I avoid it?
I have finally fixed the issue, which was actually two issues:
HTML templates were cached. This was fixed by adding an extra parameter to the template URL, like "template.html?v=201612041641".
JS files were staying in Chrome's in-memory cache. Despite the fact that it is supposed to be a short-lived cache, the JS stayed cached for hours and hours, and Ctrl/Shift + F5 does not affect the in-memory cache. I also tried setting cache control to "no-store, no-cache" in the root HTML page. Finally, I added an extra "Cache-Control: no-cache" response header in IIS, and now it works: the JS files are stored in Chrome's on-disk cache, a 304 status is received, and changed files are properly updated.
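As a sketch of the first fix, a tiny helper (the function name is my own) that appends a version parameter to a template URL, so the URL changes whenever the version string does:

```javascript
// Append a cache-busting version parameter to a URL.
// The version can be a build timestamp like "201612041641".
function withVersion(url, version) {
  const sep = url.includes('?') ? '&' : '?';
  return url + sep + 'v=' + version;
}

// e.g. templateUrl: withVersion('./pages/add-financial-deal.html', '201610070907')
```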
Related
Afternoon All,
A bit of background - I'm building a custom calendar for a company where jobs can be scheduled and engineers can access it from their mobiles to know when and where they're going. They previously used Google Calendar but now want something bespoke.
All is fine until somebody loses phone signal, gets a horrible offline page in Chrome and can't access any information. What I want to do is have it save an offline version of the calendar but also update it when they re-visit with a better connection - as job times often change.
I've tried saving the page and enabling offline mode in Chrome, but the page doesn't update until you manually clear the cache, so that's no good.
I've tried adding some JavaScript to hard-refresh the page in the hope that it clears the browser cache, but again it doesn't update the page.
<script>location.reload(true);</script>
I read about cache manifests and have tried that too but although it feels like it wants to work it also doesn't update the page until I go to chrome://appcache-internals and remove the file.
CACHE MANIFEST
/calendar.php
/css/style.css
Neither PHP nor JavaScript headers work either, as they either don't update the file on a re-visit or simply don't save any files in the first place.
header("Cache-Control: no-cache, must-revalidate");
<meta http-equiv="Cache-Control" content="no-store" />
As far as I can tell, there's no way to manually delete a user's cached copy of the site and re-download it, and once the cache has been saved there's no way to force it to update. If you set it to expire, then it's not there to access offline, and you don't know when they will next have a connection to update it, so I'm going round in circles.
I've been trying for several hours now to find something that works and can't believe it's not a simple thing to do, so I'm now throwing myself on the mercy of you fine coders to point me in the right direction before my boss hangs me from the first-floor window.
Many Thanks
UPDATE
Using what Clarence said as a starting point, I came up with the following code in my appcache file:
CACHE MANIFEST
CACHE:
/css/bootstrap.css
/css/style.css
/calendar.php
NETWORK:
/calendar.php
# UPDATED: 03-04-2018 15:55:57
What this does is cache the calendar.php file, BUT if there is a connection the browser has another look at the files under the NETWORK heading, so I put calendar.php in there as well. If the appcache file hasn't changed, the browser doesn't bother looking, so I've used the following code to rewrite the file whenever a job has been altered:
$manifest = file_get_contents(__DIR__ . '/cache.appcache');
// Keep everything up to and including the '# UPDATED: ' marker (11 characters)
$newFile = substr($manifest, 0, strpos($manifest, '# UPDATED: ') + 11);
// Append the current time so the manifest's contents change
$newFile = $newFile . date('d-m-Y H:i:s');
file_put_contents(__DIR__ . '/cache.appcache', $newFile);
It basically just searches the file for "# UPDATED: " and inserts a new time, thus changing the file and forcing a re-check from returning users.
Somebody might point out this isn't the right way to do it, but it seems to work in my tests, so I'd like to thank those that contributed.
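The same timestamp-bumping logic can be written as a pure function; here is a sketch in JavaScript (the function name is mine):

```javascript
// Replace everything after the '# UPDATED: ' marker with a new timestamp,
// so the manifest's bytes change and browsers re-check the cached files.
function bumpManifest(manifest, stamp) {
  const marker = '# UPDATED: ';
  const i = manifest.indexOf(marker);
  if (i === -1) return manifest + '\n' + marker + stamp; // no marker yet: append one
  return manifest.slice(0, i + marker.length) + stamp;
}
```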
Have you tried changing the contents of your cache manifest whenever you change one of the files? The APPCACHE is a bit finicky when it comes to changes in the files and can be troublesome to handle. I usually include a comment with a timestamp and version number just to force it to update in the browser, like so:
CACHE MANIFEST
# 01-01-2001 v1.0 (Change whenever you need to force an update of the cache)
CACHE:
/css/file.css
/js/file.js
NETWORK:
*
FALLBACK:
Well, the best option would be creating a PWA. This includes a manifest file and service workers as well. It enables you to cache the content of the website and update it once the connection has been re-established. However, it is fairly new and would require a decent amount of research into service workers. If you need any help with PWA development, I'd be happy to help to an extent.
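A minimal sketch of what such a service worker could look like. The cache name and file list are placeholders, a network-first strategy is only one of several possible choices, and the `typeof self` guard simply keeps the snippet inert outside a worker context:

```javascript
// sw.js - network-first: try the network, fall back to the cached copy offline.
const CACHE_NAME = 'calendar-v1'; // bump this to force clients to refresh
const ASSETS = ['/calendar.php', '/css/style.css']; // placeholder file list

if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('install', (event) => {
    // Pre-cache the assets so they are available offline.
    event.waitUntil(caches.open(CACHE_NAME).then((c) => c.addAll(ASSETS)));
  });
  self.addEventListener('fetch', (event) => {
    event.respondWith(
      fetch(event.request)
        .then((res) => {
          // Online: store a fresh copy for later offline use.
          const copy = res.clone();
          caches.open(CACHE_NAME).then((c) => c.put(event.request, copy));
          return res;
        })
        .catch(() => caches.match(event.request)) // offline: serve the cache
    );
  });
}
```

Because the network is tried first, returning visitors always get the latest job times when they have a connection, and the cached copy only when they don't.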
The only way without HTTP headers is to rename the files continuously, and update the file name in the corresponding HTML tag, so the files are loaded afresh.
For the HTTP-header approach, look here:
How to control web page caching, across all browsers?
You can do this whenever a new change to the calendar is made.
I have been sitting here for almost an hour testing the website I'm building. Since I wanted to see the new changes from my code I reloaded, but it kept loading the old one. I opened the devtools to hard-reload and to empty cache and hard reload; both loaded my old code. I went into incognito mode and it did the same thing. I went to the devtools again to disable the cache from the settings and checked "Disable cache" in the Network tab; it still caches my old code. Add-ons to clear the cache didn't work either. Man, I haven't had this problem before; it only started last night and it's worse today.
I'm so lost now, since Chrome doesn't load the new changes from my JavaScript file. Is there a solution for this?
One solution for this problem is to force a reload of the resource in order to bypass the cache. You can do this by modifying the URL with HTTP GET parameters:
Change:
<script src="myscripts.js"></script>
to:
<script src="myscripts.js?newversion"></script>
Where newversion can be any string, as it will be ignored. A useful option is to use the date, or version, of your code.
I found this workaround particularly useful when I came across this same problem and wanted to ensure that all clients (not just my own browser!) would run the new version of the code.
I think there's an even better way:
You can use PHP to add the last modification date of your JavaScript file to the URI of that file.
<script src="js/my-script.js?<?php echo filemtime('js/my-script.js'); ?>">
</script>
The browser will receive:
<script src="js/my-script.js?1524155368"></script>
The URI of the file will automatically change if the file is updated.
This way the browser can still cache unchanged files while recognizing changes instantly.
Are you using any type of compilation tools (like gulp or grunt)? It's possible that there is an error in your code, and the tool is not compiling the updated code.
Otherwise, the solution #airos suggested should work. Appending any unique query string to the reference of your JS will always serve a fresh copy on first reload (since the browser will be caching a new URL).
I am using PhantomJS for testing purposes. I need PhantomJS to remove all the information related to the previous session after the following command:
phantom.exit()
Actually, it does remove all the cookies, the cache and the history. But it does not automatically remove the information saved via localStorage; I have to go to the default path where this information is stored and remove the file manually. I am wondering if there is any automatic way of removing this file. I tested the following approaches, but none of them worked for me.
First, I used the following command-line option to set a new path for this information, but PhantomJS ignored it and just used its previously saved information again:
--local-storage-path=path
Second, I used the page.open part of PhantomJS to clear the storage with the following command, which did not work for me either:
localStorage.clear();
It's always good to clear localStorage after you're done with testing. You have to keep in mind that you can have multiple pages open in PhantomJS at the same time, but localStorage is only bound to a specific domain.
The localStorage.clear() call has to be executed in the page context, not in the phantom context:
page.evaluate(function(){
localStorage.clear();
});
So every time before you exit the script, you should clear localStorage (maybe multiple times, depending on how many domains you visited). Alternatively, you can try to do this at the beginning of your script / at page load, but that is hard to do well: the page must already have a target URL, but must not be loaded yet, otherwise the clear may come after the page's JavaScript has already executed. It should also probably be done only once per domain; otherwise, navigating across different pages of a site will be broken.
Another simple solution would be to use the fs module to delete all localStorage files in the Ofi labs directory at the beginning of the script, but this might delete the localStorage of pages that you didn't want deleted.
If you're using PhantomJS 1.9.8 on Linux x64, you can test this binary, which has the localStoragePath fix included: https://github.com/PatrickHuetter/phantomjs/releases/tag/1.9.8-fixedStoragePath
If you're using PhantomJS on another operating system, you could check out my fork and compile it on your platform: https://github.com/PatrickHuetter/phantomjs/tree/localStoragePathFix
PhantomJS 2.0 already has this fix included, so you can go with the official binary if you're using the 2.0 version or newer.
We have an ASP.NET application in which we use YUI to generate popups for the user interface. When I test the locally installed site, the popups come up correctly without any errors and are displayed correctly in all browsers (including IE 7/8/9).
However, when the site is deployed on the server and I test it from an outside network, the YUI popups are not generated correctly, as if some JavaScript or CSS files are not being loaded or are cached. Generally Ctrl+F5 does the trick to flush the local cache, and to fix the issue we added the query-parameter trick, xyz.css?v=10. But it is not working. This issue shows up only in IE (6/7/8/9); other browsers work correctly. To check the issue I logged into the production box again and found that the popup appears correctly in IE there as well.
Now I have no clue how this could possibly happen. Has anyone come across anything like this? What could be the cause of the issue, and how do I fix it?
Thanks
As far as I know, IE caches GET responses.
The xyz.css?v=10 trick is used when you want IE to use the cached CSS, but only as long as it is the same version. Whenever you change something in the CSS, you need to change the URL (i.e. xyz.css?v=20).
If you want IE to NEVER use the cached CSS, you need the URL to look different every time. You can do that by adding a timestamp to the URL.
something like:
xyz.css?v=201201180600123
(201201180600123 is a timestamp)
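As a sketch of that never-cache variant (the function name is mine), appending the current time makes every generated URL unique:

```javascript
// Build a URL the browser will never serve from cache: the timestamp
// changes on every call, so the URL is different every time.
function neverCacheUrl(url) {
  const sep = url.includes('?') ? '&' : '?';
  return url + sep + 'v=' + Date.now();
}
```

Note the trade-off: this defeats caching entirely, so only use it for resources that must always be fresh.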
I'm trying to debug a JavaScript file written with the MooTools framework. Right now I am developing a web application on top of Rails, and my web server is rails s, which boots WEBrick.
When I modify a particular tree.js file that's loaded within a MooTools init script,
require: {
css: [MUI.path.plugins + 'tree/css/style.css'],
js: [MUI.path.plugins + 'tree/scripts/tree.js'],
onload: function(){
if (buildTree) buildTree('tree1');
}
},
the changes are not loaded, as the headers being sent to the client say Last-Modified: 10 July, 2010..., which is obviously not true since I just modified the file.
How do I get rid of this annoying caching? If I go directly to the script in my browser (Chrome), it doesn't show the changes until I hit refresh, but that doesn't fix my problem: when I go back to my application and hit refresh, it still loads the pre-modified script.
This has happened to me also in FF; I think it is a cache header sent by the server, or the browser itself.
Anyway, a simple way to avoid this problem while in development is adding a random param to the file name of the script:
instead of calling 'tree/scripts/tree.js', use 'tree/scripts/tree.js?' + random. That should invalidate all caches.
As frisco says, adding a random number in development does the trick, but you will likely find that the problem still affects you in production. You want to push new JavaScript changes to your users but can't until their browsers stop caching the file. To handle this, just get the file's mtime and add that as the "random" string. It will only change when the file is modified, so the JavaScript will be loaded from cache if it has not been changed, or from the server if it has.
PHP has the function filemtime, but as I'm not familiar with Ruby, I'm afraid I can't help you further in that direction (sorry!). However, this answer seems to accomplish what you want.
Try the Ctrl+F5 trick to avoid hitting the browser cache.
More info here:
What requests do browsers' "F5" and "Ctrl + F5" refreshes generate?