So I've been switching between pages on my websites the old-fashioned way: I include my header on every page and use a-href links to go to another page. I only use ajax requests within the pages themselves, not to send visitors to another page; as I said, I do that with native hrefs.
I looked at history.pushState() to change the URL in the address bar every time I load a new page via jQuery ajax, and it does it pretty well. The thing is, when I refresh the page, it does keep the same URL and loads that URL/file again, but the file comes back nude, with no CSS and no header, since those are only included in my index.
I use PHP on the server side, though.
Any good advice?
Using hrefs is perfectly fine, and standard. However, there are some performance gains to be had by avoiding full page loads in favour of ajax.
If you do want to use pushState and ajax, you have to configure your server so that all HTTP requests hit your index page/router. You can do this in nginx, Apache, Express, or whatever you're using on the backend.
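For example, a minimal nginx location block that does this might look like the following (a sketch, assuming nginx and an index.html entry point; adapt the paths to your setup):

```nginx
# Serve real files (CSS, JS, images) directly; everything else
# falls through to index.html so the client-side router can handle it.
location / {
    try_files $uri $uri/ /index.html;
}
```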
Then, in your JS, you'll want to sniff window.location to see which page the user is currently on and immediately fire an ajax request to load that page.
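A minimal sketch of that bootstrapping step; the routeFromPath helper, the "/pages/" URL scheme, and the #content container are made up for illustration, not anything from the question:

```javascript
// Map the current path to the URL of the content to fetch.
// Hypothetical scheme: "/about" -> "/pages/about.html".
function routeFromPath(pathname) {
  var page = pathname === '/' ? 'home' : pathname.replace(/^\//, '');
  return '/pages/' + page + '.html';
}

// Browser-only: fire the initial ajax request as soon as the JS runs.
if (typeof window !== 'undefined' && typeof $ !== 'undefined') {
  $.get(routeFromPath(window.location.pathname), function (html) {
    $('#content').html(html); // assumes a #content container in the index
  });
}
```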
Alternatively, a faster but more involved approach is to not point all requests at your index, and instead serve the full HTML for the page they request (HTML, CSS, headers and footers) so that the page doesn't come back "nude". This makes the initial page load faster. After that, you can go back to using ajax requests as visitors navigate around the site.
How you do this depends on your technology stack. If you're using webpack like me, then at least in dev mode you can set historyApiFallback: true to have all requests "fall back" to your index, and then I use React Router to load the appropriate page. But again, you might be using something different.
Related
I have a component which lazy-loads images. The first time my page loads, the images are displayed using lazy loading, but if I refresh or reload, or close and then reopen the tab, my images are preloaded because they are now fetched from the cache. Is there any way I can stop the caching of my component in Angular 7?
The caching is not being done by Angular but by your browser. Once you load an image (and depending on the headers of the response), your browser will cache it to be able to load it faster the next time. This is usually a good approach.
Not sure why you don't want them to be cached, but you have different options. Here you have a good read about HTTP caching: https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/http-caching These cache configurations for static assets are usually done by your web server, and they depend on which web server you are using (nginx, Apache, IIS, Node, ...).
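As one illustration, in nginx (just one of the servers mentioned) you could make browsers revalidate images on every use; the extension list is illustrative and you would normally use a long max-age in production instead:

```nginx
# Ask browsers to revalidate cached images before reusing them.
location ~* \.(png|jpe?g|gif|webp)$ {
    add_header Cache-Control "no-cache";
}
```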
Another option is to append a random query string to your image URL. The HTTP cache system uses the image URL as the resource key that identifies it. For this reason, you can do something like:
<img src="./yourimagefolder/yourimage.jpg?r=putherearandomstring">
This way your image resource 'id' will be different on each request. (You will need to replace the 'putherearandomstring' string in the example with a different random string each time the page is loaded.)
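A small sketch of generating such a URL in plain JavaScript (the function name is made up; in an Angular template you would bind the img src to the result of something like this):

```javascript
// Append a random query string so the browser treats each request
// as a new resource and skips its cache.
function bustCache(url) {
  var r = Math.random().toString(36).slice(2); // random string
  return url + (url.indexOf('?') === -1 ? '?' : '&') + 'r=' + r;
}
```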
If this is just for development purposes, you can disable the cache in the developer tools. I don't see a reason you would want to do this on a live site, though, as you would be forcing the user to fetch the images every time they load the component, which will reduce performance.
The problem with caching in an environment where custom software is updated frequently and some users are less savvy is that they will not automatically get critical client-side changes unless they are specifically told to refresh their cache. With all of the decorations in index.html, I have not yet found a reliable solution.
I am building a sizable mobile application that is currently built on top of jQuery Mobile and KnockoutJS. My first approach made heavy use of a Single Page Application design, loading all dynamic content and data via Knockout and ajax calls. This has worked OK, but maintenance and development have become very complicated as jQuery Mobile loads more and more into the DOM.
I am considering moving to more traditional, individual HTML pages that are completely static, while still loading data via Knockout and ajax. This would allow browsers to cache the biggest parts of the app: the HTML pages.
Question:
How can I best pass parameters around from page to page without creating unique URLs that inhibit client-side browser caching? I want browsers to aggressively cache pages.
I realize that I can implement all kinds of server side caching but that is not my goal here. /Display/3 and /Display/5 are the same page. Will the browser cache these as one?
I wonder about passing parameters after the hash mark? /Display#3 and /Display#5? How about passing parameters via JavaScript in the global namespace?
Hoping for a standard approach here.
OK, sorry for the misunderstanding, but I think your approach goes the wrong way. You cannot use GET parameters that way, and jQuery Mobile is also a little confusing in its URL handling for ajax.
Normally, when using ajax to refresh content, you do not need to reload the page. So you need no caching, because the page is already there and only some content is reloaded via ajax. But jQM's single-page approach is not usable for dynamically created content that way. You can only dynamically create a page with all the content in it, and jQM shows content by switching visibility. The # can then be used to switch between the pages (the # does not force a reload, as it is used for on-page navigation).
You can write your own loading function and call it from buttons and links (instead of using URL GET parameters). By using jQuery's $.ajax method with dataType: "html" (instead of the default) you can refresh the content in its success handler.
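A sketch of such a loading function; the "/fragments/" URL layout, the #content container, and the storage key are illustrative names, and the id is handed to the view model via sessionStorage so the fragment URL stays constant and cacheable:

```javascript
// The fragment URL is the same for every id, so the browser can cache
// the HTML aggressively; the id travels separately (here: sessionStorage).
function fragmentUrl(page) {
  return '/fragments/' + page + '.html';
}

function loadItem(page, id) {
  if (typeof $ === 'undefined') return; // browser-only wiring below
  $.ajax({
    url: fragmentUrl(page),
    dataType: 'html',
    success: function (html) {
      $('#content').html(html);
      sessionStorage.setItem('displayId', String(id)); // id for the view model
    }
  });
}
```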
You could try HTML5 sessionStorage/localStorage. If HTML5 is an issue, then plain old cookies.
Just to clarify, if there are several HTML pages, each page must have its own URL.
When scripts are loaded via Head JS I am unable to force the content to refresh using the Ctrl+F5 (or equivalent) keyboard shortcut.
The scripts cache correctly and the browser obeys the cache directives sent from the server (I'm using IIS 7.5). But unlike script tags included directly in the markup, I can't override the cache and force a refresh of the scripts loaded via Head JS.
I'm assuming this is a consequence of the way the scripts are loaded dynamically. I can live with this behaviour because forcing the refresh is only convenient during development, and I know of other ways I can force the content to be retrieved from the server.
I just wondered if anyone could explain why this is the case...
Update
This was never a problem for us in Live, because the cache directives for our static content were set appropriately. It was only ever a problem in Development and QA. The options left available to me were...
Configure all Dev and QA browsers to never cache content.
Configure the static content cache directives differently for Dev and QA environments, essentially setting MaxAge to something so small that the content would always be expired, and only setting the correct MaxAge value in Live.
I went with the second option.
Dynamic script loading is not a part of the page loading proper. When you force refresh, the browser reloads the page and all resources referenced in its HTML and in referenced CSS files, but the scripts you load with head.js are not referenced in the page content and the browser has no way to figure out that head.js is going to create references to additional resources. At the point where these references are created, the browser is no longer refreshing the page and thus normal cache rules apply.
You can force reload of your scripts by appending unique query strings to their URLs (e.g. jquery.js?random=437593486394), but this will disable caching for all loads of your page, not just when you force refresh.
This is also a problem with require.js. Hopefully one of these workarounds will also apply to Head JS:
If using Chrome, open the developer tools panel on the Network tab, right click and choose 'Clear Browser Cache'
Do a bit of 'Cache-busting' by appending a datetime stamp to the query string for js resources
If you're using IIS (which it looks like you are), go to the HTTP Response Headers panel of your website, click Set Common Headers, and set Expire Web content to immediately.
The last option is my preferred one for my development machine.
I wouldn't say it's a question of dynamic or not dynamic: when you inject a script, it still causes the browser to make an HTTP request and apply whatever caching logic it normally applies.
As mentioned above, if you don't want scripts to be cached (dynamic or static, it doesn't matter), you will usually have to append a timestamp in the form of a query string.
If you just want to see whether your changes are working, do a force refresh in your browser, usually Ctrl+F5.
How can I make my pages show like Grooveshark's pages?
http://grooveshark.com/#!/popular
Is there a tutorial or something that explains how to show pages this way with jQuery or JavaScript?
The hash and exclamation mark in a URL are called a hashbang, and are usually used in web applications where JavaScript is responsible for actually loading the page. Content after the hash is never sent to the server. So, for example, if you have the URL example.com/#!recipes/bread, the page at example.com would be fetched from the server, and this could contain a piece of JavaScript. This script can then read location.hash and load the page at /recipes/bread.
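A tiny sketch of that read-the-hash step in plain JavaScript; the route names are made up, and what "fetch and render" means depends on your app:

```javascript
// Turn a hashbang fragment like "#!/popular" or "#!recipes/bread"
// into a path the script can fetch, e.g. "/popular".
function routeFromHashbang(hash) {
  if (hash.indexOf('#!') !== 0) return '/';  // no hashbang: home page
  var route = hash.slice(2);                 // drop the "#!"
  return route.charAt(0) === '/' ? route : '/' + route;
}

// In the browser you would call it like:
//   var path = routeFromHashbang(window.location.hash);
//   ...then fetch and render the content for `path` with ajax.
```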
Google also recognizes this URL scheme as an ajax URL and will try to fetch the content from the server as it would be rendered by your JavaScript. If you're planning to make a site using this technique, take a look at Google's AJAX crawling documentation for webmasters. Also keep in mind that you should not rely on JavaScript being enabled, as Gawker learned the hard way.
The hashbang is going out of use on a lot of sites, even when JavaScript does the routing. This is possible because all major browsers support the History API. To do this, sites make every path return the same JavaScript, which then looks at the actual URL to load the content. When the user clicks a link, JavaScript intercepts the click event, uses the History API to push a new page onto the browser history, and then loads the new content.
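The intercept-and-pushState flow can be sketched as follows; loadContent is a hypothetical stand-in for whatever actually fetches and renders the page, and the browser-only wiring is guarded:

```javascript
// Decide whether a link points at our own site; only those should be
// intercepted, external links must navigate normally.
function isInternalLink(href, origin) {
  try {
    return new URL(href, origin).origin === origin;
  } catch (e) {
    return false;
  }
}

// Browser-only: intercept clicks, push a history entry, load content.
if (typeof document !== 'undefined') {
  document.addEventListener('click', function (e) {
    var a = e.target.closest('a');
    if (!a || !isInternalLink(a.href, window.location.origin)) return;
    e.preventDefault();
    history.pushState({}, '', a.href); // update the address bar
    loadContent(a.pathname);           // hypothetical content loader
  });
  // Handle the back/forward buttons too.
  window.addEventListener('popstate', function () {
    loadContent(window.location.pathname);
  });
}
```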
I want to pre-cache the next web page as a thumbnail. Is it possible to pre-render an HTML page (with CSS) into an image on the fly with JavaScript/jQuery? And how can I persist that temporary image on the client?
You could do an ajax request that asks a script for an image, or for a link to an image.
This script needs to request the data it needs from the website and render it using some rendering mechanism.
The returned information could be a link to the generated image on the server.
Performance could be pretty low, depending on the data to be retrieved and rendered.
This question shows you a solution that renders a website and produces a PDF.
You could use this approach and then convert the PDF into an image using ImageMagick (which needs to be installed on your server).
AFAIK, that's not possible on the client side, because it raises security concerns. Even the <canvas> element cannot render HTML elements (only browser plugins are allowed to use the methods provided for that purpose).
What is the site written in? If you have server-side capabilities, you could probably do it there and send the image back to be cached. It is not possible from jQuery or JavaScript as far as I know.
Unless your page is absurdly complex, it's more likely your bottleneck is the network rather than rendering. You can easily preload the HTML page and all its important resources (e.g. images, multimedia, etc.), so that when the user goes to the next page you don't need to hit the network anymore and it will load from the local cache.
There are a few techniques you can use to preload HTML files; an invisible iframe is probably the easiest (though I have never tried it myself).
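Two simple ways to do that preloading (both standard techniques; the URL is a placeholder):

```html
<!-- Declarative: hint the browser to fetch the next page into its cache -->
<link rel="prefetch" href="/next-page.html">

<!-- Invisible iframe: forces the page and its resources to load now -->
<iframe src="/next-page.html" style="display:none" aria-hidden="true"></iframe>
```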