Changes do not show up on Google Cloud Compute Engine VM - JavaScript

When I upload changed JavaScript files to my site deployed on Google Cloud Compute Engine, the changes do not show up right away when I load the JS files in the browser. The changes are physically there, which I validated by RDPing to the server.
These changes sometimes take 6 to 8 hours to show up in the browser, and sometimes show up sporadically. What can I do to ensure that my changed JS files take effect immediately on load?
I don't face this problem when I upload to my test server, which is not on the cloud. It happens only on the Google Cloud server.

Try opening your built-in browser debugger (for example, right-click on your page in Chrome and click "Inspect"). Then select the Network tab and reload the page. Look at the sizes of the files: if they are very small, the browser probably loaded cached copies. You can verify this by checking the "Disable cache" option in the Network panel.
If the file does update right away when the browser cache is disabled, then you need to investigate your web server software to see how to make it invalidate caches. One mechanism you can look into is called "ETags."
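To see which cache headers and ETag a given file is actually served with, you can also run a quick check from the browser console (the URL below is a placeholder for one of your JS files):

// Run in the browser console; '/js/app.js' is a placeholder path.
fetch('/js/app.js', { method: 'HEAD', cache: 'no-store' })
  .then(function (res) {
    console.log('ETag:', res.headers.get('ETag'));
    console.log('Cache-Control:', res.headers.get('Cache-Control'));
    console.log('Expires:', res.headers.get('Expires'));
  });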

What worked was invalidating the cloud cache for the file. Lesson learnt: always add version numbers to your JS files.
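For example, a version number in the query string (an illustrative pattern; the filename and number are placeholders) makes every cache treat the updated file as a brand-new URL:

<!-- Bump the version on every deploy so caches fetch the new file. -->
<script src="/js/app.js?v=42"></script>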

Related

Correct method for ensuring users get the latest version of a website after an update

Every time I deploy an update to our web application, customers ring in with issues where their browser hasn't picked up that index.html has changed, and since the name of the .js file has changed they run into errors, presumably because their index.html still points to the old JavaScript file, which no longer exists.
What is the correct way to ensure that users always get the latest version when the system is updated?
We have an HTML5 + AngularJS web application. It uses webpack to bundle the vendor and app JavaScript into two .js files. The filenames contain a hash to ensure they are different once released.
Some other information:
I can never replicate this issue locally (and by that I mean in debug, on our staging site or our production site)
We use CloudFlare but purge the entire cache after release
We have a mechanism in JS that checks, on page load and every 5 minutes, whether the version of our API has changed, and if so shows a "Please refresh your browser" message; clicking it runs window.location.reload(true); (a rough sketch of this kind of check appears after this list)
Our backend is IIS
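(For context, a version check like the one described above might look roughly like this; the /api/version endpoint, its response shape, and showRefreshBanner are assumptions for illustration, not the asker's actual code.)

// Hypothetical sketch of the asker's version check; the endpoint name,
// response shape, and showRefreshBanner() are assumed for illustration.
let knownVersion = null;
async function checkApiVersion() {
  const res = await fetch('/api/version', { cache: 'no-store' });
  const { version } = await res.json();
  if (knownVersion === null) {
    knownVersion = version; // first run: remember the current version
  } else if (version !== knownVersion) {
    showRefreshBanner(); // clicking it runs window.location.reload(true)
  }
}
checkApiVersion(); // on page load
setInterval(checkApiVersion, 5 * 60 * 1000); // and every 5 minutes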
If you need users to pick up the latest index.html when they load your site immediately after you've updated the file, make index.html non-cacheable. That will mean the browser, CloudFlare, and any intermediate proxies aren't allowed to cache it, and that one file will always be served from your canonical server.
Naturally, that has a traffic and latency impact (for you and them), but if that's really your requirement, I don't see any other option.
There are spins on this. It might not be index.html itself that is non-cacheable; you could instead insert another resource (a tiny JavaScript file that writes out the correct script tags, sketched below) if index.html is really big and it's important to cache it, and so on. But if you need the change picked up immediately, you'll need a non-cacheable resource that identifies the change.
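A sketch of that "tiny loader" idea might look like the following, where loader.js is the only non-cacheable resource and the hashed bundle names are placeholders for whatever your webpack build emits on each release:

// loader.js - serve this one file with Cache-Control: no-store.
// The hashed names below are placeholders updated on each deploy.
document.write('<script src="/js/vendor.abc123.js"><\/script>');
document.write('<script src="/js/app.def456.js"><\/script>');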

Force Cache Refresh for Web Resources

I have several web resources that are displayed on forms in Microsoft Dynamics. The web resources are html files that include JavaScript/CSS files. When I update the JavaScript files, I am seeing that the latest changes are not getting pulled to end user computers on their next use of the form. I believe this is because the previous version of the web resource has been cached on their machine.
According to this SO question, the solution would be to add a version to the script tag. However, according to the comments on the question, this solution does not work on Chrome and is considered a hack. I have also read here that Dynamics should automatically handle caching when web resources are updated, but does not do so reliably (which is my experience).
How can I force end user computers to get the latest version of my code on their next use of the form when I push out updates?
If you are only changing the files during development (i.e. once they are finished they won't change), then most browsers will allow you to disable the cache. In Chrome, this can be done as long as Developer Tools is open and you tick the "Disable cache" checkbox in the Network tab.
If the files are going to change for the client with each request, then you can generate a random ID to be sent with the file (e.g. example.com/script.js?182hdh2). To allow this, just put some JS in your HTML file (not in an external script) to import all the other files, as sketched below.
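A minimal sketch of that inline snippet (the file list is illustrative):

// Inline in the HTML page, not in an external script: append a
// random ID to each script URL so every request bypasses the cache.
var bust = Math.random().toString(36).slice(2);
['script.js', 'other.js'].forEach(function (file) {
  var s = document.createElement('script');
  s.async = false; // preserve execution order for dynamically added scripts
  s.src = file + '?' + bust;
  document.head.appendChild(s);
});

Note that this defeats caching entirely, so it is best reserved for files that genuinely change on every request.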

Duplicated Mapped Javascript Source not Saving Changes on Chrome's Dev Tools

I'm making a local web page (.html) that loads a few .js files and am having trouble using Google Chrome's Developer Tools.
Definition
The problem I'm having has to do with the Sources panel: I have one source tab open with a specific file, and when I open this file by clicking in the console or on the file in the tree at the left, a duplicate tab is sometimes created instead of focusing the one already open.
Both will have the same file path.
Both will allow me to edit and save the file (even showing/hiding the asterisk correctly).
Only one of them will actually save the contents to disk.
I have to fix it by closing both tabs and opening the file again, but sometimes I don't notice that there is a duplicate, and I end up fixing a problem only to find out that the file was never actually saved, so every change I made to that file is reverted when I refresh the page.
Example of the problem
The most common appearance of the bug is when I'm doing the following:
I identify a console.warn / console.error / syntax error log in the console
I click the line that caused that log (at the far right of the console) and am redirected to the Sources panel, where a new source tab opens and I begin editing it.
Somewhere among the source tabs is another tab for that very same file (with the same file path). That tab is the one I should have been editing: only this original tab will correctly save changes made to the file.
I fix the JavaScript problem in the file that I was led to by clicking the console.
I hit Ctrl+S (or right-click > Save), and the asterisk that indicates an unsaved file disappears. At that point Chrome believes it has saved the file successfully, but it hasn't (I can check by opening the file in Notepad).
I hit F5 to refresh the page.
Chrome loads the old, unsaved file, erasing all the changes I made in the source file.
Sometimes the very same steps don't create a duplicate file, but all I have to do is refresh and try again until they do. Reproducing this bug is a matter of chance; I can't predict or pinpoint its causes.
When refreshing the page with the Dev Tools open, there's a small chance that it will create a broken mapping, where the mapped project becomes only partially mapped even though it was fine seconds before (only restarting Chrome fixes that).
This last paragraph may or may not have something to do with the problem, but I can clearly select and open the "fake" file and the "real" file even though they have the same file path.
I made this gif to show how the file paths are identical to each other in the Dev Tools: http://i.imgur.com/ULlbskO.gif
Details of the setup
I'm using the local file system (file:///) strictly, there is no localhost or server being used to host my application, it is pure HTML + Javascript.
I'm using Google Chrome 57 for Windows without any extensions, but I've been having this problem since December 2016.
My project was mapped by adding the folder to the workspace and mapping it to a local file, which used to work in the past.
Here's a picture of my configurations: http://i.imgur.com/IEmE3zG.png
Things I've tried
Clearing Chrome's Cache
Removing the project from the Source Panel workspace and adding it again
Reinstalling Chrome
Moving the project path to somewhere else
Searching on Google
Letting go / Accepting defeat (I've grown too dependent on the tool)
Waiting 2 months for someone to have this problem too and post it somewhere in the internet
Questions I need help with
Can I minimize/fix this problem in any way?
Has anyone dealt with this before?
Does someone know if this is a Chrome bug or am I doing something wrong with my workflow?
I found the answer myself after several months of working with web development.
The duplicated file has been fixed in Chrome since the time I asked this question, but files kept randomly losing their "connection" with the local file system (the green dot that marks them as synced with the local copy), which made me investigate, and I finally found the reason:
What's happening is that Dev Tools saves the file and, when it retrieves it again, loads it from cache (because my local web server was sending cache-specific headers), which makes the browser think the file is not actually the one it saved, so it stops synchronizing it!
To solve it, all I had to do was ensure my local web server disables any form of caching for my JavaScript files, which I can check from the Network panel.
My local web server was sending cache headers with a one-hour lifetime, which made Chrome open the cached file; that file differed from my edited file, which Chrome took as an indication that the file on disk was not the one it had saved.
After changing the server to serve static content without the cache headers, everything went smoothly and the files stayed synced correctly!
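For example, with a small Node/Express static server (one possible local setup; the answer doesn't say which server was actually used), disabling caching for everything it serves looks like this:

// Serve local files with caching fully disabled, so DevTools always
// gets back exactly the bytes it just saved to disk.
const express = require('express');
const app = express();

app.use(express.static('.', {
  etag: false,          // no ETag revalidation
  lastModified: false,  // no Last-Modified revalidation
  cacheControl: false,  // drop Express's default Cache-Control header
  setHeaders: function (res) {
    res.setHeader('Cache-Control', 'no-store');
  }
}));

app.listen(8080);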

Persistent Local JavaScript - Chrome Debugger

I remember that several years ago I was able to save a remote JavaScript file from a website onto my local machine in the Chrome debugger, make a few code adjustments, and refresh the page so Chrome would read the local copy of the JS file. I am wondering if this feature is still available, and if so, how I can use it.
I know that I can add breakpoints to achieve something similar. I have followed a few guides, but none of them achieved what I want:
https://developers.google.com/web/tools/setup/setup-workflow?hl=en
https://www.sitepoint.com/edit-source-files-in-chrome/
It looks as though you can only achieve this when you are using a local server. According to the Stage persisted changes section of the documentation:
If you are mapping files from a remote server instead of a local server, when you refresh the page, Chrome reloads the page from the remote server. Your changes still persist to disk and are reapplied if you continue editing in Workspaces.
It seems you could achieve what you want if you use Fiddler AutoResponder:
Fiddler's AutoResponder tab allows you to return files from your local disk instead of transmitting the request to the server.

JavaScript FullCalendar prod server vs dev machine issue

<div id='calendar'></div>
is the HTML tag that FullCalendar uses to insert a calendar and do its magic. It's a great tool, but something weird is happening.
My calendar is created with events from the DB and all that stuff works well. Here's the issue...
The calendar is on two pages - both work great on my dev workstation.
After deployment it works on one page, but not on the other. The calendar div gets populated with some complex tables etc. for rendering. Except this doesn't happen - ONLY on one page, ONLY on the production server - same browser. All the DB stuff is there; the pages coming back are identical other than the table markup, which gets inserted on my dev machine but not when served from production. But again, the same control works just fine from prod on another page - stumped! The web server is IIS 7.
Any thoughts or even wild speculations most welcome!!
Just to make it an official answer...
Commonly, when something doesn't render in one environment (development or production) but does in the other, you're dealing with a missing resource. This is usually something along the lines of requiring the following:
<script type="text/javascript" src="..path/to/script.js"></script>
Since one location has the file at that path and the other doesn't, you can run into scenarios where it works in one spot but not the other.
The easiest way to confirm this is to open the debugger in your favorite browser and use the Network section to determine whether all the resources are loading correctly (and are resolved). Otherwise, chances are the page that's not working is getting a 404 (or another error) when trying to retrieve a file it needs, and thus fails to operate.
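If you want to catch this programmatically as well (a generic technique, not part of the original answer), a capturing error listener on window logs every resource that fails to load:

// Resource load errors don't bubble, so listen in the capture phase.
window.addEventListener('error', function (e) {
  var url = e.target && (e.target.src || e.target.href);
  if (url) {
    console.error('Failed to load resource:', url);
  }
}, true);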
