If I open my webpage and then look at the developer tools in browsers like Chrome/Firefox/IE, I see multiple copies of the same view/JavaScript files loaded. This can potentially mess up how the JavaScript behaves with respect to its state.
Almost every time, a page refresh solves it.
It happens seemingly at random.
I am not at liberty to post my code, but the setup is nginx/Thin/Ruby/Haml.
What can lead to such behavior? Is the problem on the server side or the browser side?
Our initial hunch was that maybe these were multiple versions of the same document, but all of them are exact replicas, so I ruled out caching as a possible culprit.
More info about the page:
No dynamic loading of scripts - simple script tags
No frames on the page - simple body tag with a form in it
No advertisement scripts on the page
Happens on all browsers
Intermittent - the frequency is roughly once in 500 page loads
I'm having a serious issue with Android's in-built browser (I assume that's what Messenger uses anyway).
The issue is that the HTML content on our page is dynamic and has to be updated on every reload; however, the content itself is being cached by the browser.
Cache-busting the scripts, CSS and other resources has no effect, since the URLs being loaded are cached as well.
Example:
echo time();
Will output 1571044358
However, every time I re-open the link (from Messenger) it always outputs 1571044358, instead of actually reloading the page.
Chrome and other regular browsers work fine; it's just the in-built browser that's causing the issue.
Any ideas on how to solve this?
When scripts are loaded via Head JS, I am unable to force the content to refresh using the Ctrl+F5 (or equivalent) keyboard shortcut.
The scripts cache correctly and the browser obeys the cache directives sent from the server (I'm using IIS 7.5). But unlike script tags included directly in the markup, I can't override the cache and force a refresh of the scripts loaded via Head JS.
I'm assuming this is a consequence of the way the scripts are loaded dynamically. I can live with this behaviour because forcing the refresh is only convenient during development, and I know of other ways I can force the content to be retrieved from the server.
I just wondered if anyone could explain why this is the case...
Update
This was never a problem for us in Live, because the cache directives for our static content were set appropriately. It was only ever a problem in Development and QA. The options available to me were...
Configure all Dev and QA browsers to never cache content.
Configure the static content cache directives differently for the Dev and QA environments - essentially setting MaxAge to something so small that the content would always be expired - and only setting the correct MaxAge value in Live.
I went with the second option.
Dynamic script loading is not a part of the page loading proper. When you force refresh, the browser reloads the page and all resources referenced in its HTML and in referenced CSS files, but the scripts you load with head.js are not referenced in the page content and the browser has no way to figure out that head.js is going to create references to additional resources. At the point where these references are created, the browser is no longer refreshing the page and thus normal cache rules apply.
You can force a reload of your scripts by appending unique query strings to their URLs (e.g. jquery.js?random=437593486394), but this will disable caching for all loads of your page, not just when you force refresh.
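For illustration, a minimal sketch of that query-string approach (assuming head.js's head.load API; the file name and parameter name are placeholders):
var bust = new Date().getTime(); // or a fixed build/version number
head.load("/scripts/app.js?v=" + bust, function () {
    // the script has been fetched and executed
});
Using a version or build number instead of a timestamp keeps normal caching for files that haven't changed.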
This is also a problem with require.js. Hopefully one of these workarounds will also apply to Head JS:
If using Chrome, open the developer tools panel on the Network tab, right-click and choose 'Clear Browser Cache'.
Do a bit of 'cache-busting' by appending a datetime stamp to the query string for JS resources.
If you're using IIS (which it looks like you are), go to the HTTP Response Headers panel of your website, click Set Common Headers and set Expire Web content to Immediately.
The latter is my preferred option on my development machine.
I wouldn't say it's a question of dynamic or not: when you inject a script it still causes the browser to make an HTTP request and apply whatever caching logic it normally applies.
As mentioned above, if you don't want scripts to be cached (dynamic or static, it doesn't matter), you will usually have to append a timestamp as a query string to the URL.
If you just want to see whether your changes are working, do a force refresh in your browser, usually Ctrl+F5.
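For example, a hand-rolled injection with a timestamp query string might look like this (the file name is just a placeholder; new Date().getTime() is used so it also works in older browsers):
var s = document.createElement('script');
// the timestamp makes the URL unique on every load, so the browser's cache is bypassed
s.src = '/js/app.js?ts=' + new Date().getTime();
document.getElementsByTagName('head')[0].appendChild(s);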
I would like a piece of JavaScript to run after a page is loaded, like in the example below, with a delay of 6 seconds. Right after the page loads, the rest of the JS is lost (obviously)...
Any idea how to change content after the page is loaded, without clicking a button?
javascript:window.location = "http://example.com";
setTimeout(function() {
    document.getElementById('lightbox').style.display = 'none';
}, 6000);
Once you set window.location the original page will be unloaded before the new page is loaded by the browser. This means your script will be gone before the new page starts loading and thus can't modify the new HTML anymore.
This behavior is inherent to the security model of the browser. Without it you could inject any JavaScript into any website of your choosing, which would be a huge security risk. What you are asking for is so-called XSS (cross-site scripting), which is prevented by the browser applying the so-called SOP (same-origin policy).
There are some common ways to work around this limitation in a safe way:
Set up a proxy to serve both your JavaScript and the original site. This way both your script and the original site come from the same domain and satisfy the browser's same-origin policy (SOP). You could run the original site in an iframe with your custom script occupying the top-level window. Alternatively you could inject your script into the HTML as it is being retrieved through your proxy.
Run your script as a browser add-on or user-script (see the sketch after this list). If you choose to do this, the user will have to specifically grant your script the right to run locally with elevated privileges. Greasemonkey popularized client-side scripts for Firefox a few years ago, but recently they seem to have lost momentum.
Ask the site owner to include your script. I doubt this is a valid option for your situation. But if it is a valid option it is definitely the simplest one.
Ask the user to run your script after the site has loaded. This one is probably also not valid for you, but if valid it would once again be a very simple solution.
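For the user-script route, a minimal Greasemonkey-style sketch could look like the following; the @include URL is a placeholder and the #lightbox id is taken from the question:
// ==UserScript==
// @name     Hide lightbox after load
// @include  http://example.com/*
// @grant    none
// ==/UserScript==
(function () {
    // Runs inside the loaded page itself, so the same-origin policy is not an issue.
    setTimeout(function () {
        var el = document.getElementById('lightbox');
        if (el) el.style.display = 'none';
    }, 6000);
})();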
Your example shows that you are first redirecting and then attempting to hide #lightbox. This script would not work, because you are redirecting the browser to another site before #lightbox gets hidden.
In short, you cannot have JavaScript from a previous page manipulate the DOM of the next page if you redirect the user to another URL (or even the same URL). Only JavaScript on the currently open page can manipulate that page, and no other pages.
I have not understood what you are saying. The JS is lost? Please be clearer.
I think what you are talking about is the jQuery ready function, which runs after the DOM is ready. Otherwise, try using the window.onload handler.
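For reference, a minimal illustration of those two hooks (the handler bodies are placeholders):
$(document).ready(function () {
    // fires as soon as the DOM has been parsed
});
window.onload = function () {
    // fires once images, stylesheets and other resources have also finished loading
};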
This should do the job:
$(window).bind('load', function() {
// your code here
});
Then simply add the delay to your code with .delay(6000);
The inserted code will only run when your page is completely loaded.
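Note that jQuery's .delay() only delays the effects queue, so for arbitrary code a plain setTimeout inside the load handler is the simpler route. A minimal sketch, reusing the #lightbox element from the question:
$(window).bind('load', function () {
    // wait 6 seconds after the page (including images) has finished loading
    setTimeout(function () {
        $('#lightbox').hide();
    }, 6000);
});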
How can I stop loading a web page if it uses a frame-buster buster as mentioned in this question, or an even stronger X-Frame-Options: deny like stackoverflow.com? I am creating a web application that loads external web pages into an <iframe> via JavaScript, but if the user accidentally lands on a website like google.com or stackoverflow.com, which have functionality to bust a frame-buster, I just want to stop loading. stackoverflow.com shows a pop-up message asking whether to disable the frame and proceed, but I would rather stop loading the page; Google removes the frame without asking. I have absolutely no intent of clickjacking, and at the moment I only use this application myself. It is inconvenient that every time I land on such a site the frames are broken; I just do not need to continue loading these pages.
Edit
Judging from the answers so far, it seems that I can't detect this before loading. Is it then possible to load the page in a different tab, check whether it has the frame-buster buster, and if it doesn't, load it into the <iframe> within the original tab?
Edit 2
I can also retrieve the headers or the web page as an HTML string through the scripting language (Ruby) that I am using. So I do indeed have access to the information before loading it into an <iframe>.
There's no way to detect this before loading the page since the frame busting is done via a header or is triggered via JavaScript as the page is loading.
Without a server backend you won't be able to, as you are pretty limited in the amount of tinkering you can do in JavaScript due to cross-domain policies.
You might want to consider creating some sort of blacklist of URLs to stay away from...
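Building on the server-backend idea and the asker's note about fetching headers (their backend is Ruby; this sketch uses Node-style JavaScript purely for illustration, and the function name is made up), an X-Frame-Options check could look roughly like this:
// Sketch only: assumes a Node runtime where fetch is available.
// A HEAD request is usually enough to inspect the response headers.
async function canBeFramed(url) {
    var res = await fetch(url, { method: 'HEAD', redirect: 'follow' });
    var xfo = (res.headers.get('x-frame-options') || '').toUpperCase();
    // DENY and SAMEORIGIN both mean the page will refuse to render in our frame.
    return xfo !== 'DENY' && xfo !== 'SAMEORIGIN';
}
This only catches the header case; JavaScript frame-buster busters and Content-Security-Policy frame-ancestors rules would still slip through, which matches the point above about not being able to detect everything up front.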
I'm making an ASP.NET Web Forms web app and have just started on the client-side scripts. I'm planning to put quite a lot of JavaScript code in a file that will be loaded on each page. I want some general guidelines on when to start worrying about the size of the file, with the users and their page load times in mind.
The users will mostly be using Internet Explorer 7 and 8, but I suppose the script will still be cached after the first visit? If not, is there any way to make IE cache the file?
It will be cached after the first visit, as you suppose, so you don't need to worry unless it actually becomes an issue.