JS extension modification not showing up in Magento - javascript

So I have installed an extension in Magento for a popup. The extension worked fine, but I wanted to change the size of the popup, which is calculated in a JS file (fancybox). When I finally managed to make the changes I wanted, I noticed that Magento is not serving me the new, modified JavaScript file.
This is what I have tried so far, from googling all around:
Refreshing the cache
Flushing the JS/CSS cache
Flushing the store cache
Changing permissions on the file to 666, then back to 644
Toggling JS merging on and off
Checking the cache in the database (all clear)
Checking that the unsecure and secure routes are spelled correctly (in the database and in the Magento admin panel; not very sure how this works, though)
I am going mad.
One more thing:
If I access the JS file from the secure path:
https://mysite/jspath/jsfile.js (it shows the old file)
And if:
http://mysite/jspath/jsfile (it shows the new file)
http://mysite//jspath/jsfile (it shows the new file)
Since I am not the only admin on the site, I don't know if someone else made changes to some Magento configuration.
Any idea is welcome and appreciated, thanks

If compilation is on, re-run the compilation process.
Since it is still serving up the old file, it seems like there is still some caching going on. You say you flushed the JS/CSS cache. Can you flush all caches and see if it's still a problem?
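For Magento 1.x, a full flush can also be done from the shell. This is a sketch, not the one true procedure: the paths are the stock cache locations, and `MAGENTO_ROOT` is a placeholder you must point at your install.

```shell
# Sketch for Magento 1.x: wipe the file caches and the merged JS/CSS,
# then recreate the empty directories so Magento can regenerate them.
# MAGENTO_ROOT is a placeholder for your install path.
MAGENTO_ROOT="${MAGENTO_ROOT:-.}"
rm -rf "$MAGENTO_ROOT/var/cache" "$MAGENTO_ROOT/var/full_page_cache" \
       "$MAGENTO_ROOT/media/js" "$MAGENTO_ROOT/media/css"
mkdir -p "$MAGENTO_ROOT/var/cache" "$MAGENTO_ROOT/media/js" "$MAGENTO_ROOT/media/css"
# If the compiler is enabled, re-run it afterwards with the stock shell script:
# php "$MAGENTO_ROOT/shell/compiler.php" -- compile
```

After this, the next page load rebuilds the merged files, so the first hit will be slow.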

Related

Template File won't update in Browser

I'm trying to change some behavior of a webpage. The page is built using Smarty, so the file that won't update is a .tpl file containing mostly JavaScript. The whole thing is weird because on my local drive (IDE) the file is the way I want it to be, and it is the correct version on the server when I check it with SFTP.
BUT if I open the page with a browser, even a newly installed one, the code inside that page is the old version.
I can't post any error, because there obviously is none.
So I'm looking at a problem I can't fix or even begin to untangle.
I tried the following things:
Restarting apache2
Deleting the file and uploading the correct version (I might add that the site broke when I deleted it and tried to view it)
This has been bugging me for a long time, so it seems time couldn't fix it either.
I hope somebody has an idea why this is happening!
Thanks
Clear (delete all the cache files in) the Smarty cache!
More details about the cache dir: https://www.smarty.net/docs/en/api.set.compile.dir.tpl
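If you have shell access, the same thing can be done by emptying Smarty's compile and cache directories by hand. A sketch, assuming the conventional directory names; yours are whatever your app passes to `setCompileDir()`/`setCacheDir()`:

```shell
# Sketch: remove Smarty's compiled templates and cached output.
# "templates_c" and "cache" are the conventional defaults; substitute
# whatever your application configures via setCompileDir()/setCacheDir().
SMARTY_DIR="${SMARTY_DIR:-.}"
rm -rf "$SMARTY_DIR/templates_c" "$SMARTY_DIR/cache"
mkdir -p "$SMARTY_DIR/templates_c" "$SMARTY_DIR/cache"
```

Smarty recompiles the .tpl files on the next request, so no restart is needed.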

Duplicated Mapped Javascript Source not Saving Changes on Chrome's Dev Tools

I'm making a local web page (.html) that loads a few .js files and am having trouble using Google Chrome's Developer Tools.
Definition
The problem I'm having has to do with the Sources panel: I have one source tab open with a specific file, and when I open this file by clicking in the console or on the source file at the left, a duplicate is randomly created instead of me being redirected to the one already open.
Both will have the same file path.
Both will allow me to write and save the file (even showing/hiding the asterisk correctly).
Only one of them will correctly save the contents to disk.
I have to fix it by closing both tabs and opening the file again, but sometimes I can't see that there is a duplicated file, which causes me to fix a problem only to find out that the file wasn't actually saved, completely reverting every change I made to that file when I refresh the page.
Example of the problem
The most common appearance of the bug is when I'm doing the following:
I identify a console.warn / console.error / syntax error log in the console
I click the line that caused that log (at the rightmost of the console), and I'm redirected to the Sources panel, where a new source file tab opens, and I begin editing it.
Somewhere in the source tabs is another tab for that same file (with the same file path). That tab should be the one I was editing: only this original tab will correctly save changes made to the file.
I fix the random JavaScript problem in the file that I was led to by clicking the console.
I hit Ctrl+S (or Right Click > Save), and the asterisk that indicates an unsaved file disappears. At that point Chrome believes it has saved the file successfully, but it didn't (I can check by opening the file in Notepad).
I hit F5 to refresh the page.
Chrome loads the old, unsaved file, erasing all the changes I made in the source file.
Sometimes the very same steps don't create a duplicate file, but all I have to do is refresh and try again until they do. Recreating this bug is a matter of chance; I can't predict or pinpoint its causes either.
When refreshing the page with the Dev Tools open, there's a small chance that it will create a broken mapping, where the mapped project becomes only partially mapped even though it was fine seconds before (only restarting Chrome fixes that).
This last paragraph may or may not have something to do with the problem, but I can clearly select and open the "fake" file and the "real" file even though they have the same file path.
I made this gif to show how the file paths are identical to each other in the Dev Tools: http://i.imgur.com/ULlbskO.gif
Details of the setup
I'm using the local file system (file:///) strictly, there is no localhost or server being used to host my application, it is pure HTML + Javascript.
I'm using Google Chrome 57 for Windows without any extensions, but I've been having this problem since December 2016.
My project was mapped by adding the folder to the workspace and mapping it to a local file, which used to work in the past.
Here's a picture of my configurations: http://i.imgur.com/IEmE3zG.png
Things I've tried
Clearing Chrome's Cache
Removing the project from the Source Panel workspace and adding it again
Reinstalling Chrome
Moving the project path to somewhere else
Searching on Google
Letting go / Accepting defeat (I've grown too dependent on the tool)
Waiting 2 months for someone to have this problem too and post it somewhere in the internet
Questions I need help with
Can I minimize/fix this problem in any way?
Has anyone dealt with this before?
Does someone know if this is a Chrome bug or am I doing something wrong with my workflow?
I found the answer myself after several months of working with web development.
The duplicated-file issue has been fixed in Chrome since the time I asked this question, but files kept randomly losing their "connection" with the local file system (the green dot that marks them as synced with local), which made me investigate, and I finally found the reason:
What's happening is that Dev Tools saves the file and then, when it retrieves it, loads it from cache (because my local web server was sending cache-specific headers). That makes the browser think the file is not actually the one it saved, so it stops synchronizing it!
To solve it, all I had to do was ensure my local web server disables any form of caching for my JavaScript files, which I can check from the Network panel:
My local web server was sending cache headers with a one-hour lifetime, which made Chrome open the cached file; that file differed from my edited one, which Chrome took as an indication that the file was not the one it saved.
After changing the server to serve static content without the cache headers, everything went smoothly and the files stayed synced correctly!
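As an illustration only (the original post doesn't say which server was used), if the local dev server were nginx, a location block along these lines would stop it from sending cacheable headers for JS and CSS:

```nginx
# Dev-only sketch: tell the browser never to cache JS/CSS responses
location ~* \.(js|css)$ {
    add_header Cache-Control "no-store, must-revalidate";
    expires off;
}
```

With `no-store`, Dev Tools always gets the on-disk file back, so the sync marker should survive a save-and-reload cycle.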

Updating site with git push, browser still loads old cached version

Git has made updating my site a hell of a lot easier and quicker. I love it.
However, it seems that when I update files with git, my browser clings to old cacheable files much longer than it should.
I have no idea if this is just my browser, if it's a quirk of git, or if it's just a problem that affects only me for some other reason.
A couple days ago I found a bug on my site, so I fixed it and pushed a new version of the affected js file to my site.
When I do this, I find that if I don't hit F5, the browser loads the old JS file. So I always hit F5 and think nothing of it.
But for users of my site, they are probably having the same experience... which isn't good.
So I updated the js file 2 days ago and refreshed the home page, checked it was working and left it.
Just now, I checked another page on the site that loads the exact same JS file, and it was still using the old cached version. I hit F5, and now it loads the new one.
Is there any way I can force all browsers to forget the cached versions of old files? I figured this would just happen automatically after the cache's short lifetime.
Here are the headers from Chrome:
As you can see, the cache-control max-age is stupidly high. My server runs nginx + Apache, plus a backend system called Vesta Control Panel (VestaCP).
If I fix the cache control on the backend, how do I then tell all of my users' browsers to forget the seemingly unforgettable cached version?
This depends on your setup. If your index page is HTML, you'll want to set the server cache to expire very quickly on HTML files, so it will reload your index page frequently. (This isn't an issue with an index.php, as it should reload every time.) Then you'll want to filerev your resource files.
For instance, you can use grunt, gulp, or something similar that will append a unique string to the filenames of all your resources, so script.js becomes script.1a34be4sde4.js, and the next update becomes script.3ezseasd4sad.js, and so on. Or you could manually rename them, incrementing by 1 each time (script-001.js, script-002.js, etc. - although with many files this would be a pain).
This way you can keep your max-age stupidly high, and the user won't have to download that version of the file again. But when your index page points them to an updated version (a new filename), they'll go grab the new version and cache it for a stupidly long time.
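The renaming itself is just a content hash embedded in the filename. A minimal sketch of the idea in shell (the filename and contents are made up for illustration):

```shell
# Sketch of filerev: embed a hash of the file's contents in its name,
# so any content change produces a new URL the browser has never cached.
echo 'console.log("v2");' > script.js
HASH=$(md5sum script.js | cut -c1-8)   # on macOS use: md5 -q script.js
cp script.js "script.$HASH.js"
```

A build tool then rewrites the reference in index.html to point at `script.$HASH.js`; unchanged files hash to the same name and stay cached.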
The problem comes with using git push to update the site. To keep your repo clean, there are a few things you could do. What I'd probably lean toward is a post-receive hook script on the server that would checkout the pushed branch to a staging folder, run a build script, and move the final version to the deployment folder.
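A sketch of that post-receive idea, demonstrated here on a throwaway repo since the real hook depends on your server layout; all paths and the build step are assumptions:

```shell
# Sketch of a post-receive deploy, demonstrated on a throwaway repo:
# check the pushed commit out into a staging dir, run the build there,
# then copy the result into the deploy dir. Paths are made up.
DEMO=$(mktemp -d)
STAGING="$DEMO/staging"; DEPLOY="$DEMO/deploy"
mkdir -p "$STAGING" "$DEPLOY"
git init -q "$DEMO/repo"
cd "$DEMO/repo"
echo 'console.log("hello");' > script.js
git add script.js
git -c user.email=demo@example.com -c user.name=demo commit -qm "init"
# In a real post-receive hook, this is the key line: check the commit
# out into the staging work tree without touching the bare repo.
git --work-tree="$STAGING" checkout -qf HEAD -- .
# ...the build step (e.g. a gulp/grunt filerev task) would run in $STAGING...
cp -R "$STAGING/." "$DEPLOY/"
```

This keeps generated, hash-named files out of the repo entirely; only the deploy folder ever sees them.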
There are a couple of cache-busting techniques that you can use, none of which are particularly linked to git (and some may be anathema to it).
In your index.html, you could use a query-string cachebuster:
<script src="/js/core.js?cache=12345">
This works in some cases (most browsers, AFAIK, won't reuse cached files if the query string is different). It means you have to change your index.html every time you update something. There are plenty of other ways to do cache busting; google around and you'll probably find a dozen at least, each of which is "the best".
Personally, I use gulp-rev in combination with gulp-inject in my build process. gulp-rev will create a hash-based unique ID for the filename (so it renames core.js to core-ad234af.js), and then gulp-inject changes index.html so it pulls in that file instead of core.js. I only do this for production builds (since in dev, I have cache-control set to 0).
But this all works because I don't do git push to deploy - I use git to get the source to my production server, and then build on that server with the production flag set.

Updated website through FTP but still showing old content

I am working on a project on my web host. Until today everything went fine, but now when I save the files, they update over FTP, yet when I refresh my webpage the old content shows up, not the updated one.
I checked via FTP: the files are updated with the new content and saved. I checked with the Chrome Dev Tools, and I keep getting the old content from the files I work on.
I tried deleting the cache a few times, but it only worked temporarily (once), and then the content stopped updating again.
I work with PHP, CSS and JavaScript files, and all have the same problem.
What is wrong?
This is because Cloudflare automatically caches some files (e.g. JavaScript, CSS...). When you run into that kind of problem, go to your site's settings in Cloudflare and then "Cache Purge"; purge the cache and it should work fine again.
SOLUTION:
It was because of a CDN I was using, called Cloudflare. I have no idea why, but it kept serving the old content instead of the updated version.
Be careful when using CDNs.
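If purging from the dashboard gets tedious, Cloudflare also exposes a purge endpoint in its v4 API. A sketch only: the endpoint and payload are Cloudflare's documented purge call, but the zone ID and API token are placeholders you must supply yourself.

```shell
# Sketch: purge an entire Cloudflare zone via the v4 API.
# The endpoint is Cloudflare's documented purge_cache call;
# the zone ID ($1) and API token ($2) are placeholders.
purge_cloudflare() {
  curl -sS -X POST \
    "https://api.cloudflare.com/client/v4/zones/$1/purge_cache" \
    -H "Authorization: Bearer $2" \
    -H "Content-Type: application/json" \
    --data '{"purge_everything":true}'
}
# Usage (with real credentials):
# purge_cloudflare "$ZONE_ID" "$CF_API_TOKEN"
```

Purging everything invalidates the whole zone at once; for routine deploys, purging just the changed URLs is gentler on your cache-hit rate.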

Magento caching error

I'm looking for a solution to a strange problem I stumbled upon this afternoon.
My category pages began throwing a 404 Not Found error for a media/js file (one of the Magento-generated merged JS files, I believe). Naturally, this has caused the category pages to malfunction in a few places, including placing an unexplained number at the top of the page and breaking the nav. Here it is in action: http://www.vapetropolis.ca/herb-grinders.
I've tried disabling JS and CSS merging in the backend. I've also tried flushing the Magento cache, flushing cache storage, and flushing the JavaScript/CSS cache, as well as manually deleting all entries within var/cache. After all this flushing, the media/js directory is empty. However, the category pages are still looking for this same file (all other pages are still working fine).
Notably, the identical site on my local machine is working fine and includes the file not found by the production site.
Edit: copying the JS file from the local to the production site hasn't helped - multiple JS errors are still thrown on the category pages.
I'm guessing this problem has something to do with Magento's messed-up merge functionality.
Edit (2): The problem has to do with caching (thank you runamok). When a query string is added to the URL, the page works fine. So it must be that Magento's caching is serving up a faulty page somehow.
I've tried disabling all caches, as well as disabling precompiling in the backend, but the behaviour remains the same.
Edit (3): Still in need of help!
Looks like you may have fixed the issue, based on this URL existing:
http://www.vapetropolis.ca/media/js/a87bf7cc5dcd7a07e58a41c1063e1f4a.js
Generally speaking, this happens because the permissions on the media/js directory are not correct.
Obviously, chmod 777 is the easiest way to set this, but ideally you should just make sure the directory is owned by the user running Apache.
Furthermore, if you are using some sort of full-page cache, you will likely need to flush that too. The pages expect the file to already be there, so they will not attempt to regenerate it.
Finally, are you using any sort of service like Cloudflare or Varnish or a CDN or anything else that may cache the 404 for a short time?
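A sketch of the ownership fix described above; "www-data" is an assumption for the Apache user on Debian/Ubuntu, and `MAGENTO_ROOT` is a placeholder, so substitute your own values:

```shell
# Sketch: make sure the web server user owns media/js so Magento can
# regenerate the merged files there. "www-data" and MAGENTO_ROOT are
# assumptions; substitute your Apache user and install path.
MAGENTO_ROOT="${MAGENTO_ROOT:-.}"
mkdir -p "$MAGENTO_ROOT/media/js"
chown -R www-data:www-data "$MAGENTO_ROOT/media/js" 2>/dev/null || true
chmod -R 775 "$MAGENTO_ROOT/media/js"
```

With correct ownership, 775 is enough; 777 works but lets any local user write into the web root.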
