Updating site with git push, browser still loads old cached version

Git has made updating my site a hell of a lot easier and quicker. I love it.
However, when I update files with git, my browser seems to cling to old cached files much longer than it should.
I have no idea if this is just my browser, if it's a quirk of git, or if it's just a problem that affects only me for some other reason.
A couple days ago I found a bug on my site, so I fixed it and pushed a new version of the affected js file to my site.
When I do this, I find that if I don't hit F5, the browser loads the old js file. So I always hit F5 and think nothing of it.
But users of my site are probably having the same experience... which isn't good.
So I updated the js file 2 days ago and refreshed the home page, checked it was working and left it.
Just now, I checked another page on the site that loads the exact same js file, and it was still using the old cached version. I hit F5 and it loaded the new one.
Is there any way I can force all browsers to forget the cached version of old files? I figured this would just happen automatically once the cached copy's short lifetime expired.
Here are the headers from Chrome (screenshot not reproduced here); the Cache-Control max-age is stupidly high. My server runs nginx in front of Apache, with a backend system called Vesta Control Panel (VestaCP).
If I fix the cache control on the backend, how do I then tell all of my users' browsers to forget the seemingly unforgettable cached version?

This depends on your setup. If your index page is HTML, you'll want to set the server cache to expire very quickly on HTML files, so the browser reloads your index page frequently. (This isn't an issue with an index.php, as it should reload every time.) Then you'll want to filerev your resource files.
For instance, you can use grunt, gulp, or something similar that will append a unique string at the end of the filenames for all your resources, so script.js becomes script.1a34be4sde4.js and then the next update becomes script.3ezseasd4sad.js and so on. Or you could manually rename them, adding 1 each time (script-001.js, script-002.js, etc. - although with many files this would be a pain).
This way you can keep your max-age stupidly high and the user won't have to download that version of the file again. But when your index page points them to an updated version (new filename), they'll go grab the new version and cache it stupidly long.
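For the asker's nginx + VestaCP setup, that could look something like the sketch below. This is an illustration of the idea only: VestaCP typically expects overrides in a per-domain template rather than direct edits to nginx.conf, so the exact file to edit will vary.

location ~* \.html$ {
    # short lifetime: browsers re-check the index page within a minute
    expires 1m;
}

location ~* \.(js|css)$ {
    # fingerprinted assets can safely be cached "stupidly long"
    expires 1y;
}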
The problem comes with using git push to update the site. To keep your repo clean, there are a few things you could do. What I'd probably lean toward is a post-receive hook script on the server that would check out the pushed branch to a staging folder, run a build script, and move the final version to the deployment folder.
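As a rough illustration, such a hook could look like the following (written here in Node to match the rest of the stack; the paths, branch name, and build command are all assumptions):

#!/usr/bin/env node
// Hypothetical post-receive hook: check out the pushed code to a staging
// folder, build it there, then sync the build output into the live docroot.
const { execSync } = require('child_process');

const REPO = '/home/user/repos/site.git';   // bare repo that receives the push
const STAGING = '/home/user/staging';       // scratch build area (must exist)
const LIVE = '/home/user/web/public_html';  // folder nginx actually serves

const run = (cmd, opts = {}) => execSync(cmd, { stdio: 'inherit', ...opts });

run(`git --git-dir=${REPO} --work-tree=${STAGING} checkout -f master`);
run('npm ci && npm run build', { cwd: STAGING });   // assumes a build script exists
run(`rsync -a --delete ${STAGING}/dist/ ${LIVE}/`); // publish the built files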

There are a couple of cache-busting techniques that you can use, none of which are particularly linked to git (and some may be anathema to it).
In your index.html, you could use a query cachebuster:
<script src="/js/core.js?cache=12345"></script>
Which works in some cases (most browsers, AFAIK, won't re-use cached files if the query string is different). This means you have to change your index.html every time you update stuff. There are plenty of other ways to do cache busting; Google around and you'll probably find a dozen at least, each of which is the "best".
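If you go the query-string route, that index.html edit is easy to automate at deploy time. A tiny Node sketch (the regex assumes a script tag like the one shown above):

// stamp-cachebuster.js -- rewrite ?cache=... with a fresh value on each deploy
const fs = require('fs');

const html = fs.readFileSync('index.html', 'utf8')
  .replace(/core\.js\?cache=\w*/g, `core.js?cache=${Date.now()}`);

fs.writeFileSync('index.html', html);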
Personally, I use gulp-rev in combination with gulp-inject in my build process. gulp-rev creates a hash-based unique id for the filename (so it renames core.js to core-ad234af.js), and then gulp-inject rewrites index.html so it pulls that file in instead of core.js. I only do this for production builds (in dev, I have cache-control set to 0).
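A stripped-down gulpfile showing the shape of that setup (paths are assumptions; see the gulp-rev and gulp-inject docs for the full options):

// gulpfile.js -- minimal rev + inject sketch
const gulp = require('gulp');
const rev = require('gulp-rev');
const inject = require('gulp-inject');

function revAssets() {
  return gulp.src('src/js/*.js')
    .pipe(rev())                 // core.js -> core-ad234af.js
    .pipe(gulp.dest('dist/js'));
}

function injectAssets() {
  // Rewrites the <!-- inject:js --> ... <!-- endinject --> block in index.html
  return gulp.src('src/index.html')
    .pipe(inject(gulp.src('dist/js/*.js', { read: false }), { ignorePath: 'dist' }))
    .pipe(gulp.dest('dist'));
}

exports.build = gulp.series(revAssets, injectAssets);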
But this all works because I don't do git push to deploy - I use git to get the source to my production server, and then build on that server with the production flag set.

Related

How to deal with webpack chunk updated on server?

We have a React application with code splitting using React.lazy and Suspense. Every Tuesday we deploy a new version, so our chunks change too.
The problem we have right now is that if a user doesn't refresh after we deploy, their old main.js still points to the old chunk files with old hashes, and the app crashes when it tries to load chunk files that no longer exist.
We know that we can prefetch routes when our app is loaded, but there are a lot of routes to prefetch (around 20). This might affect our home page performance because we make a few API calls on the home page.
Are there any better ways of dealing with this?
Many thanks in advance.
What keeps you from keeping multiple versions on your server? Let's say v1.commons.js is currently deployed. Now when you build a new version, v2.commons.js gets created, and both files are served by the server. Old clients will still work with the old version, but depending on your caching settings (page expiry time) they will migrate soon to the new version. Then you can remove the old version from your server.
Use the [hash] placeholder in your webpack output configuration, e.g. filename: '[hash]/[name].js'. This way every compilation will yield a fresh set of filenames.
Ensure the page that refers to these chunks (be it generated with html-webpack-plugin or something else) is always served fresh, never from cache, via Cache-Control headers or similar techniques.
This way very stubborn clients (who disregard the cache-control headers) will most probably use their old version of your code, but as soon as they refresh (to get the new HTML page), they'll be guaranteed all of the new JavaScript too, since the URL has changed.
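In webpack terms, the two pieces together look roughly like this (a sketch using webpack 4 syntax; newer versions prefer [contenthash] per file over a build-wide [hash]):

// webpack.config.js -- hashed output names so every deploy gets fresh URLs
const path = require('path');
const HtmlWebpackPlugin = require('html-webpack-plugin');

module.exports = {
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[hash]/[name].js',       // as suggested above
    chunkFilename: '[hash]/[name].js',  // lazy-loaded chunks get the hash too
  },
  plugins: [
    // Regenerates index.html each build so it references the new hashed files;
    // serve that HTML with Cache-Control: no-cache
    new HtmlWebpackPlugin({ template: './src/index.html' }),
  ],
};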
We have decided to preload every route in background so our clients do not need to lazy load other chunks at a later point in time.
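For anyone after the same approach: the preloading amounts to calling the same dynamic imports ahead of time once the browser is idle. A hedged sketch (the route paths are made up):

// preload-routes.js -- warm every lazy chunk in the background so a mid-session
// deploy can't leave the client asking for chunk files that no longer exist
const lazyRoutes = [
  () => import('./routes/Dashboard'),
  () => import('./routes/Settings'),
  // ...one entry per React.lazy route (~20 in the asker's case)
];

window.addEventListener('load', () => {
  // Wait for idle time so the home page's own work (API calls etc.) runs first
  const whenIdle = 'requestIdleCallback' in window
    ? (cb) => window.requestIdleCallback(cb)
    : (cb) => setTimeout(cb, 2000);
  whenIdle(() => lazyRoutes.forEach((load) => load().catch(() => {})));
});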

Correct method for ensuring users get the latest version of a website after an update

Every time I deploy an update to our web application, customers ring in with issues where their browser hasn't picked up that index.html has changed, and since the name of the .js file has changed they run into errors. Presumably their index.html still points to the old JavaScript file, which no longer exists.
What is the correct way to ensure that users always get the latest version when the system is updated?
We have an HTML5 + AngularJS web application. It uses webpack to bundle the vendor and app JavaScript into two js files. The filenames contain a hash to ensure they differ between releases.
Some other information:
I can never replicate this issue locally (and by that I mean in debug, on our staging site or our production site)
We use CloudFlare but purge the entire cache after release
We have a mechanism in JS that checks on page load, and every 5 minutes thereafter, whether the version of our API has changed, and if so shows a "Please refresh your browser" message. Clicking this runs window.location.reload(true); (a rough sketch of this mechanism is below)
Our backend is IIS
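That version-check mechanism might look roughly like the following (the endpoint name and response shape are assumptions):

// version-poller.js -- hypothetical sketch of the "please refresh" mechanism
let knownVersion = null;

async function checkApiVersion() {
  const { version } = await fetch('/api/version').then((res) => res.json());
  if (knownVersion && knownVersion !== version) {
    // The real app shows a message banner; confirm() keeps the sketch short
    if (confirm('A new version is available. Refresh now?')) {
      window.location.reload(true); // the "true" is a legacy force-reload hint
    }
  }
  knownVersion = version;
}

checkApiVersion();                           // on page load
setInterval(checkApiVersion, 5 * 60 * 1000); // and every 5 minutes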
If you need users to pick up the latest index.html when they load your site immediately after you've updated the file, make index.html non-cacheable. That will mean the browser, CloudFlare, and any intermediate proxies aren't allowed to cache it, and that one file will always be served from your canonical server.
Naturally, that has a traffic and latency impact (for you and them), but if that's really your requirement, I don't see any other option.
There are spins on this. It might not be index.html itself that isn't cacheable, you could insert another resource (a tiny JavaScript file that writes out the correct script tags) if index.html is really big and it's important to cache it, etc. But if you need the change picked up immediately, you'll need a non-cacheable resource that identifies the change.
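That "tiny JavaScript file" variant might look like the sketch below: loader.js is served non-cacheable and is the only script index.html references directly, while the hashed bundle names (hypothetical here) are rewritten into it on each deploy.

// loader.js -- non-cacheable stub; rewritten on every deploy to point at the
// current hashed bundles, so index.html itself can stay cacheable
var currentBundles = ['vendor.8f3a2c.js', 'app.d41d8c.js']; // hypothetical names

currentBundles.forEach(function (name) {
  var script = document.createElement('script');
  script.src = '/js/' + name;
  script.async = false; // keep execution order: vendor before app
  document.head.appendChild(script);
});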

JS extension modification not showing up in Magento

So I have installed an extension in Magento for a popup. The extension worked fine, but I wanted to change the size of the popup, which is calculated in a js file (fancybox). When I finally managed to make the changes I wanted, I noticed that Magento was not serving me the new, modified JavaScript file.
This is what I have tried so far, from googling all around:
Refreshing the cache
Flushing the js/CSS cache
Flushing the store cache
Changing permissions on the file to 666, then back to 644
Toggling js merging on and off
Checking the cache in the database (all clear)
Checking that the unsecure and secure routes are spelled correctly (in the database and in the Magento admin panel; not very sure how this works, though)
I am going mad.
One more thing:
If I access the js file from the secure path:
https://mysite/jspath/jsfile.js (it shows the old file)
And if:
http://mysite/jspath/jsfile (it shows the new file)
http://mysite//jspath/jsfile (it shows the new file)
Since I am not the only admin on the site, I don't know if someone else made a change to some Magento configuration.
Any idea is welcome and appreciated, thanks
If compilation is on, re-run the compilation process.
Since it is still serving up the old file, it seems like there is still some caching in play. You say you flushed the js/css cache. Can you flush all caches and see if it's still a problem?

Magento js and css changes not reflecting

I have a problem: I added a custom JavaScript file and it is included, but when I change its contents the change doesn't take effect; the site runs the older JavaScript file. I have since cleared the cache and also deleted every folder under /var/, but it still runs the old JavaScript code. The uploaded file shows the updated code, but opening that JavaScript file's URL in the browser shows the old code.
I flushed the Magento cache storage, and flushed the css/javascript cache as well.
If anyone has a solution, please let me know.
Thanks in advance.
EDITED
Same problem with css as well: the changes don't show up. I have cleared the cache many times from the back-end and cleared the var folder too.
Your server probably sends headers asking browsers to cache static files like JS/CSS, so it is likely that your browser is still caching the old files. One way to check that it is indeed the browser (and not, say, accidentally editing the wrong CSS file) is to toggle CSS file merging on or off (you only need to go one way to check). Doing so forces the browser to fetch a whole new file, essentially bypassing the cache.
You may also want to take a look at our CSS/JS Versioning extension which includes automatic refresh of the file name hash based on CSS/JS file timestamps (sensitive to editing and changes) http://extensions.activo.com/css-and-javascript-versioning.html
Have you cleared your local browser cache on your workstation?
Often, CSS and JavaScript can stick mightily: no matter how much you flush Magento caching on the server, the workstation browser never requests and downloads the new script. These are static files; a change in file date doesn't trigger a browser reload, only complete removal from the browser cache does.
Usually CTRL-F5 about three times will do it; otherwise you have to go into the web browser settings and flush the browser cache there.
Also, if you're using JavaScript/CSS Merge, you need to click the button on the Cache Management page to Flush JavaScript/CSS Cache as well.
The only other place things can gum up is APC: if you're running the APC cache, you may need to flush it as well so the block caching for the page head can refresh. This only matters if you changed the script and CSS file names, which you probably haven't, so it likely doesn't matter.

Magento caching error

I'm looking for a solution to a strange problem I stumbled upon this afternoon.
My category pages began throwing a 404 not found error for a media/js file (one of the Magento-generated merged js files, I believe). Naturally this has caused the category pages to malfunction in a few places, including placing an unexplained number at the top of the page and breaking the nav. Here it is in action: http://www.vapetropolis.ca/herb-grinders.
I've tried disabling js and css merging in the backend. I've also tried flushing the Magento cache, flushing cache storage, flushing the javascript/css cache, and manually deleting all entries within var/cache. After all this flushing, the media/js directory is empty. However, the category pages are still looking for the same file (all other pages are still working fine).
Notably, the identical site on my local machine is working fine, and includes the file not found by the production site.
Edit: copying the js file from the local site to the production site hasn't helped - there are still multiple js errors thrown on the category pages.
I'm guessing this problem has something to do with Magento's messed up merge functionality.
Edit (2): The problem has to do with caching (thank you, runamok). When a querystring is added to the URL, the page works fine, so it must be that Magento's caching is serving up a faulty page somehow.
I've tried disabling all caches, as well as disabling precompiling in the backend, but the behaviour remains the same.
Edit (3): Still in need of help!
Looks like you may have fixed the issue, based on this URL existing:
http://www.vapetropolis.ca/media/js/a87bf7cc5dcd7a07e58a41c1063e1f4a.js
Generally speaking this is because the permissions for the media/js file are not correct.
Obviously, chmod 777 is the easiest way to set this, but ideally you should just make sure the directory is owned by the user running Apache.
Furthermore if you are using some sort of full page cache you will likely need to flush that too. The pages expect that the file is already there so it will not attempt to regenerate it.
Finally, are you using any sort of service like CloudFlare, Varnish, a CDN, or anything else that may cache the 404 for a short time?
