Force reload remote JS loaded in a webpage through a bookmarklet - javascript

I have a little bookmarklet that simply loads a remote JS script from my website to add a button to page elements on Reddit. Pushing the button sends data to my site for storage, though that's not the important part.
The issue is that the script my bookmarklet loads is "helpfully" cached, which makes it REALLY hard to debug and update. Hitting CTRL+F5 doesn't seem to fix it, and neither does loading the JS manually in another tab. I don't know what the exact conditions are, but it EVENTUALLY works... it's just slowing me down pretty badly. How can I manually force it to retrieve a new copy of the script from my server instead of using the cache? This is in Chrome.
Here is my bookmarklet:
javascript: (function () {
    if (document.getElementById('darmokPending') == null) {
        var jsCode = document.createElement('script');
        jsCode.setAttribute('src', 'https://mypage/scripts/redditer.js');
        document.body.appendChild(jsCode);
    }
}());
The code is loaded properly from the remote location, but then it gets cached. Help!

The correct solution is to use a Cache-Control response header on the script:
Cache-Control: public, must-revalidate
But if you can’t change the response headers for whatever reason, you can fall back to cache busting by changing the URL on every load:
jsCode.src = 'https://mypage/scripts/redditer.js?_=' + Date.now();
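Applied to the bookmarklet from the question, that cache-busting version might look like this (same code as above, only the src line changes):
javascript: (function () {
    if (document.getElementById('darmokPending') == null) {
        var jsCode = document.createElement('script');
        // Date.now() makes the URL unique on every click, so the cached copy is never reused
        jsCode.setAttribute('src', 'https://mypage/scripts/redditer.js?_=' + Date.now());
        document.body.appendChild(jsCode);
    }
}());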

Related

How to redirect page after "Enable Unsafe Scripts" is enabled?

I have a website that requires "Load Unsafe Scripts" to be enabled in order to load. What I want is for the site to redirect to another page after the user enables the "Load Unsafe Scripts" option. I can work with HTML and JavaScript. Any help would be appreciated!
As already mentioned, you really should focus on fixing the unsafe script by serving everything (or nothing) from HTTPS.
If you absolutely can't for some reason, and for the sake of the exercise: there is no event you can directly react to when this occurs.
Your only real option would be to periodically try to add the script again if it hasn't been already. This wouldn't involve a redirect, but just pulling the script(s) in again.
Something like this:
function loadUnsafe() {
    if (!somethingInYourScript) {
        const script = document.createElement('script');
        script.src = 'https://path.to/my-script.js';
        document.body.appendChild(script); // append to the body, not to document itself
        setTimeout(loadUnsafe, 10000); // wait 10 seconds to try again
    }
}
loadUnsafe();
This will try to pull in your script every 10 seconds if somethingInYourScript doesn't exist. Once it does exist, it'll stop.
The somethingInYourScript would be something that the script pulls in (for example, if you were trying to bring in jQuery, you could check whether the jQuery variable exists, because it will once the script has loaded).
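For instance, with jQuery as the script being loaded, the check might look like this (just a sketch; swap window.jQuery for whatever global your own script defines, and the CDN URL is only an example):
function loadJQueryIfMissing() {
    if (!window.jQuery) { // jQuery defines this global once its script has run
        const script = document.createElement('script');
        script.src = 'https://code.jquery.com/jquery-3.7.1.min.js';
        document.body.appendChild(script);
        setTimeout(loadJQueryIfMissing, 10000); // re-check in 10 seconds
    }
}
loadJQueryIfMissing();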
You could try to pull in the main file you want (if your site can handle that), or you could try to pull in an unsafe script that would cause a redirect/refresh.
You can load a script from an unsafe source which redirects to the other page.
Something like:
var locationn = "https://google.com";
window.location = locationn;
But you really need to make your scripts safe...

Web Worker: How to prevent that file gets loaded from cache?

This is incredibly annoying... I was wondering why the heck my changes weren't reflected, and then noticed that the JavaScript file for my Web Worker always gets loaded from cache.
I have disabled the Cache and hitting Ctrl + F5 does not work either.
How can I make sure that this file does not get loaded from cache?
_worker = new Worker('js/toy-cpu.js');
You could add a kind of version number, for example like this:
_worker = new Worker('js/toy-cpu.js?v=' + new Date().getTime());
If this is just for development / the configuration of your personal machine, rather than something every user's browser needs to do, Chrome has an option to disable the cache.
Open the DevTools Network panel and tick the "Disable cache" checkbox; it takes effect while DevTools is open.
In the request list below, refresh the page and you'll see each URL listed, which tells you whether it was loaded from the web server: if Chrome shows a 200 status and the Size column gives an actual byte count (rather than "from cache"), it came from the server. Double-clicking a URL in the inspector shows its HTTP headers.

Eliminate: ISP Injects Pages with Iframe Script for Ads

So my ISP (Smartfren; Indonesia) has decided to start injecting all non-SSL pages with an iframing script that allows them to insert ads into pages. Here's what's happening:
My browser sends a request to the server; the ISP intercepts it and instead returns JavaScript that loads the requested page inside an iframe.
Aside from being annoying in principle, this injection also breaks all sorts of standard page functionality and presents possible security hazards.
What I've tried to do so far:
Using a GreaseMonkey script to nix away the injected code and redirect to the original URL. Result: Breaks some legitimate iframes. Also, the ISP's code gets executed, because GreaseMonkey only kicks in after the page is loaded.
Using Privoxy for a local proxy and setting up a filter to clean up the injection and replace it with a plain javascript redirect to the original URL. Result: Breaks some legitimate iframes. ISP's code never gets to the browser.
You can view the GreaseMonkey and Privoxy fixes I've been working on at the following paste: http://pastebin.com/sKQTvgY2 ... along with a sample of the ISP's injection.
Ideally I could configure Privoxy to immediately resend the request when the alteration is detected, instead of filtering out the injected JS and replacing it with a JS redirection to the original URL. (The ISP injection gets switched off when the same request is resent without delay.) I've yet to figure out how to accomplish that; I believe it would fix the iframe-breaking problem.
I know I could switch to a VPN or use the Tor browser. (Or change the ISP.) I'm hoping there's another way around. Any suggestions on how to eliminate this nuisance?
Actually now I have a solution:
The ISP proxy reacts to the Accept: header that the browser sends.
This is the default for Firefox:
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Now we are going to change this default and set it to:
Accept: */*
Here is how to set up Header Hacker for Google Chrome:
Set the title to anything you like, e.g. NO IFRAME.
Under Append/Replace, select Replace.
Set String to */* and Match string to .*, then click Add.
Under Permanent header switches, set the domain to .* and select the rule you just created.
PS: Changing it in the Firefox settings does not work 100%, because some requests (like AJAX) seem to bypass it, so an extension is the only way, as it literally intercepts every outgoing browser request.
That's it no more iframes!!!
Hope this helps!
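If the Header Hacker extension ever isn't an option, the same Accept-header rewrite could in principle be done with a tiny custom Chrome extension. This is only a rough sketch using the declarativeNetRequest API (Manifest V3), not part of the original answer, and it assumes the extension has the declarativeNetRequest permission plus host permissions for all URLs:
// background service worker (sketch): force Accept: */* on page and AJAX requests
chrome.declarativeNetRequest.updateDynamicRules({
  removeRuleIds: [1],
  addRules: [{
    id: 1,
    priority: 1,
    action: {
      type: 'modifyHeaders',
      requestHeaders: [{ header: 'Accept', operation: 'set', value: '*/*' }]
    },
    condition: { resourceTypes: ['main_frame', 'sub_frame', 'xmlhttprequest'] }
  }]
});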
UPDATE: Using DNSCrypt is the best solution 😁
OLD ANSWER
I'm using this method:
Find the resource that contains the iframe code (use the Chrome dev tools).
Block the URL with a proxy or hosts file.
I'm using Linux, so I edited my hosts file at
/etc/hosts
Example :
127.0.0.1 ibnads.xl.co.id

Browser used cache without revalidate

I am trying to learn the browser's (Chrome/Firefox) cache mechanism.
I set up a simple HTML:
<HTML><BODY>
Hello World
<script>
function loadJS(){
    var s = document.createElement('script');
    s.setAttribute('src', '/myscript');
    document.body.appendChild(s);
}
loadJS();
</script>
</BODY></HTML>
I output "Cache-Control: max-age=30" for "/myscript".
Every time I press F5, the browser revalidates /myscript with my server and gets a 304 back.
But if I use
setTimeout(loadJS, 1);
then every time I press F5, it looks like the browser checks the expiry time and, if it has not expired, uses the cache directly instead of going to the server for revalidation.
My question is:
Why? Is there a detailed explanation for this?
Does it mean that if I want the browser to use the cache and reduce network requests as much as possible, I should wait until the page has loaded and then request resources via JS later?
I've done a fair amount of experimentation with browser cache control, and I am surprised that no one has posted an answer.
Many people do not pay attention to this. As a result, websites make browsers perform useless round trips for a 304 Not Modified on images, JS, or CSS files that are unlikely to change in 5 years (who is going to change jquery.v-whatever?), for no reason at all.
So anyway, I have found that when you hard refresh the browser using F5 or Ctrl+R, Chrome will revalidate just about everything on the page, as it should. This is very helpful and is why you want to keep the ETags in the response headers.
When testing your max-age and expires headers, browse the site as a user naturally would by clicking the links on the page. Watch the web server's logfile (I use http://www.apacheviewer.com) and you'll get a good idea of how the browsers are caching.
Setting the headers works. I posted this a while back: Apache: set max-age or expires in .htaccess for directory
The easiest way for me to manage the web server is to create a /cache directory and instruct Apache to set a 1-year max-age and Expires header for everything in every subdirectory.
It works wonders. My pages make one round trip to the server, whereas they used to make 3-5 trips with each request, just to get a 304.
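The linked answer covers the Apache/.htaccess side; purely as an illustration of the same policy in JavaScript terms (a sketch assuming a Node/Express server, which is not what the answer describes), it might look like:
const express = require('express');
const app = express();

// Serve everything under /cache with a 1-year Cache-Control max-age,
// so browsers reuse the files without a 304 round trip.
app.use('/cache', express.static('cache', {
  maxAge: '365d',
  immutable: true // files never change in place; new versions get new names
}));

app.listen(3000);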
Write your html as you normally would. The browsers will obey the cache settings in the headers.
Just know that hard refreshing the browser causes it to ignore max-age and rely on ETags.

Javascript debugging difficult as browser doesn't refresh the scripts!

I'm trying to debug a JavaScript written with the MooTools framework. Right now I am developing a web application on top of Rails, and my web server is rails s, which boots WEBrick.
When I modify a particular tree.js file that's called within a MooTools init script,
require: {
    css: [MUI.path.plugins + 'tree/css/style.css'],
    js: [MUI.path.plugins + 'tree/scripts/tree.js'],
    onload: function(){
        if (buildTree) buildTree('tree1');
    }
},
the changes are not loaded, as the headers sent to the client say Last-Modified: 10 July 2010... which is obviously not true since I just modified the file.
How do I get rid of this annoying caching? If I go directly to the script in my browser (Chrome), it doesn't show the changes until I hit refresh, but this doesn't fix my problem: when I go back to my application and hit refresh, it still loads the pre-modified script.
This has happened to me in Firefox as well; I think it is caused by a cache header sent by the server, or by the browser itself.
Anyway, a simple way to avoid this problem while in development is adding a random param to the script's file name:
instead of calling 'tree/scripts/tree.js', use 'tree/scripts/tree.js?' + random. That should invalidate all caches.
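Applied to the require block from the question, that might look something like this (a sketch; Math.random() could just as well be a timestamp):
require: {
    css: [MUI.path.plugins + 'tree/css/style.css'],
    // the random query string makes the URL unique on each load, so the cached copy is skipped
    js: [MUI.path.plugins + 'tree/scripts/tree.js?' + Math.random()],
    onload: function(){
        if (buildTree) buildTree('tree1');
    }
},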
As frisco says, adding a random number in development does the trick, but you will likely find that the problem still affects you in production. You want to push new JavaScript changes to your users but can't until their browsers stop caching the file. In order to do this, just get the file's mtime and add that as the random string. This will only change when the file is modified, so the JavaScript will be loaded from cache if it has not been changed, or loaded from the server if it has.
PHP has the function filemtime but as I'm not familiar with Ruby, I'm afraid I can't help you further in that direction (sorry!). However, this answer seems to accomplish what you want.
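The same idea in JavaScript terms (only a sketch; the answer refers to PHP's filemtime, and the path below is made up): build the script URL on the server from the file's modification time, so it only changes when the file does.
// Node.js sketch: version the URL with the file's mtime (hypothetical path)
const fs = require('fs');
const mtime = Math.floor(fs.statSync('public/plugins/tree/scripts/tree.js').mtimeMs);
const src = '/plugins/tree/scripts/tree.js?v=' + mtime;
// the resulting src only changes when tree.js itself is modified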
Try the Ctrl+F5 trick to avoid hitting the browser cache.
More info here:
What requests do browsers' "F5" and "Ctrl + F5" refreshes generate?
