JavaScript Asset Download Progress From CDN

I have a web-based app which requires a lot of resources (audio/images/video). Previously I have been hosting everything on the same server and using PreloadJS to grab all the resources and download them (while showing a nice progress bar).
I am now moving to a Content Delivery Network (CDN) to host all these assets, but need to keep the base web application on my server.
So I have my app on webapp.com and all my resources on cdn.webapp.com - my question is how do I load all these resources from another domain and view the progress at the same time? Are there libraries that handle this or am I going to need to write up some code to subscribe to the onload() function of every asset and only continue when everything is downloaded?
(Because of cross-domain AJAX restrictions I can no longer use PreloadJS to download everything.)

In the end (and because no one came to my rescue) I achieved it by doing this:
Images/assets are on cdn.webapp.com and were included as usual in the HTML of the page. JavaScript would then run and set the div containing these images to zero height. This still allowed the images to load, but they were not rendered on the page.
I then found a JS library (https://github.com/desandro/imagesloaded) to register callbacks on these images so it would notify me when the images were completely downloaded.
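As a rough sketch (assuming the imagesLoaded API from that repo; '#preload' and updateProgressBar() are placeholders for whatever container and progress bar the app actually uses), the progress callback can drive the same kind of progress bar PreloadJS provided:
//minimal sketch using imagesLoaded; '#preload' and updateProgressBar are placeholders
var container = document.querySelector('#preload');
var imgLoad = imagesLoaded(container);
imgLoad.on('progress', function (instance, image) {
    updateProgressBar(instance.progressedCount / instance.images.length);
});
imgLoad.on('always', function () {
    container.style.height = ''; //everything is downloaded: restore the hidden div
});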

Related

iOS HTML5 FileSystem API alternative

I have an application built that utilizes the HTML5 FileSystem API, but it only works in Chrome.
Does anyone know of an existing plugin or a technique for replicating this functionality in iOS?
The catch is that I am rendering "mini-sites" for offline use. So I would need to be able to:
Download the files for the micro-site
Store them locally
Access them later. Right now, I'm using an iframe to render the page
My solution (right now) is to do the following.
Because I am caching microsite files that I am pulling from a 3rd party, I set up a folder on a webserver and built out a PHP-based "caching" service that routinely compares the content stored on the 3rd party site to the same content stored locally on my server. It updates the content where necessary.
The app, when it is run from iOS, will asynchronously load each of the microsites in an iframe (create a frame of size 1 x 1px with the appropriate src). The iframe self-destructs after loading is completed.
Step 2 allows my service worker to cache all of the micro-sites locally, along with the main site.
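A rough sketch of that iframe step (the function name and hidden styling are placeholders, not Wayne's actual code):
//minimal sketch: load a microsite in a throwaway 1 x 1px iframe so it gets cached
function warmMicrosite(url) {
    var frame = document.createElement('iframe');
    frame.width = 1;
    frame.height = 1;
    frame.style.visibility = 'hidden';
    frame.onload = function () {
        document.body.removeChild(frame); //self-destruct once the microsite has loaded
    };
    frame.src = url;
    document.body.appendChild(frame);
}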
I have other code in place to keep the local iOS cache "fresh".
This works, but it is nowhere near as ideal as the Chrome File System API, so any alternative suggestions would be great!
Thanks,
Wayne

how to cache images with angularjs

I am working with an AngularJS web app. I have a sidebar with images. On my localhost these images only load once because they are cached. When I push the web app to the web, the images are not cached, and every time I switch to another state the images reload. Is there a way to cache the images in AngularJS without adding extra headers to the server?
You could load the images from a CDN; if the expires headers are set up properly on the CDN, your images will be cached, which meets your question's requirement of not setting up your own expires headers. This should dramatically increase the speed at which your images load anyway.
If you still aren't getting results, even after you've set up a CDN to serve static content, make sure the filenames being served don't have some kind of cache-busting URL that forces a new image to download each time (the source would look something like "../path/to/image.png?23someRand0mString"). Also, most browsers' dev tools disable caching while they are open (or at least have a setting for it), so make sure that isn't why the images appear uncached; your images should be cached by default in most managed server configurations.
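As a quick sanity check (plain DOM code, nothing AngularJS-specific), you can scan the page for image URLs that carry a query string and might be defeating the cache:
//rough sketch: flag image URLs that look cache-busted
var imgs = document.getElementsByTagName('img');
for (var i = 0; i < imgs.length; i++) {
    if (imgs[i].src.indexOf('?') !== -1) {
        console.log('possible cache-buster:', imgs[i].src);
    }
}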

How to locate all images in a directory using jQuery

I'm developing an application which runs on a localhost server. In this application I do AJAX calls and get items from a local H2 DB. Using the response I create dynamic elements using jQuery. The elements use an item image as background, and the requirement is that I should get the images from a local folder. (The folder is created when the server is first started and the images are synchronized from a main server over the intranet.) The folder hierarchy is shown below.
c:/
  zharaimages/
    [item id]/
      [image].jpg
The image can have any name but will be a .jpg. How can I read the file system using jQuery to get the necessary image file when the item is dynamically loaded? I thought of this method, but with it I can only read a file with a static name. I want to write a method where the image name can be anything.
clone.css('background-image', 'url(c:/zharaimages/' + items[i].itemId + '/image.jpg)');
Any ideas or plugins are welcome.
Thank you.
update
This is a deployable application which uses an embedded Jetty server. The folders are on the same computer as the application!
Unfortunately a big NOOOOOO...
JavaScript cannot read or write files on users' computers (the File API, not currently supported by many browsers, allows files to be read by scripts only if the user specifically chooses to allow it),
though they can cause the browser to load remote pages and resources like scripts or images, which the browser may choose to cache locally.
They cannot create files on the server (except by communicating with a server side script that creates files for them).
You have to make a server request (there are many ways to do this) for the resources.
I'm not sure whether it's possible with HTML5 or not.
jQuery runs on the browser.
The files are on the server.
The only way that jQuery can read the files on the server is if it makes an AJAX call to the server, and your web server enumerates them.
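For illustration only (the endpoint name, the response shape, and serving zharaimages as a static resource from Jetty are all assumptions): have a small server-side handler return the image's file name for an item, then request it with jQuery and apply it:
//rough sketch: ask a hypothetical /listImage endpoint for the file name, then use it
$.getJSON('/listImage', { itemId: items[i].itemId }, function (data) {
    //data.fileName is assumed to be produced by the server-side handler
    clone.css('background-image', 'url(/zharaimages/' + items[i].itemId + '/' + data.fileName + ')');
});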

AJAX Application Single or Multiple JavaScript Files

This is a best practice type of question. I am developing a complete AJAX application. The user navigates to the main page of the application and everything from there on out is loaded via AJAX into the content section of the main page. Is it better to take all the javascript files I have and merge them into one file that is loaded on the main page or to split them up into just what is needed for each page that is loaded?
Putting it all in one file obviously has the benefit that only one HTTP request is made to load the JavaScript needed for the site, and any request for a page thereafter will only need to fetch the HTML. But this requires that every event that is wired up (using jQuery) be attached to the document using the live or on function. So everything will look like:
$(document).on('click', '#SomeButton', function () { });
Doing it this way will cause there to be many hundreds and possibly over a thousand events being tied to a single element, the document.
Putting them in separate files requires multiple HTTP requests to be made to load the various pages of the site but limits the number of events that are attached to the document.
Any thoughts on what is best here?
I would vote for separate JS files for each page in bigger projects, especially if your project uses a JS library like jQuery and its plugins (a grid plugin, etc.). If you have one big JavaScript file, your first page will obviously load slowly, giving your user a bad first impression. What we do is create separate JS files for each page, especially when there are AJAX calls to load data for the pages, plus separate files for each pluggable component like a custom drop-down or date counter. This way it's easy to manage the code and customize it later.
Before deploying the app we can merge related files and create a single file per page. For example, if a page called editProfile.php uses a date picker, a jQuery validation plugin and custom JS to load user data, we can combine them into a single file so that only that file is loaded for that page.
So I would vote for separate files for each page, with these optimizations applied before deploying.
Honestly, I'm not really an expert in this domain, but this is my piece of advice on the subject for a production environment.
I would use CDNs for libraries (like jquery). They offer maximum cacheability, and there is a very big chance it is already cached in your client's browsers from visiting other websites. This saves some requests already.
Group and minify your common JavaScript code - plugins, utilities, things used throughout your site. It will be requested once and for all and will then be available.
Have a separate, minified, script file for each "page" you load dynamically that you will load along with your content.
Loading script for content pages:
Using the .load() method from jQuery to load fragments of pages will unfortunately remove any <script> tag present in the fragment. As noted in the jQuery .load() documentation, this is to avoid "Permission denied" errors in IE.
What you can do is to have a <script id="contentScript"></script> tag in your base page and load the script along with the content by replacing the src.
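A minimal sketch of that idea (the '#content' container, the scripts/ path and the naming convention are assumptions; a fresh script element replaces the placeholder each time, since appending a new element is what triggers the download and execution):
//rough sketch: load a page fragment, then load its matching script
function loadPage(name) {
    $('#content').load(name + '.html', function () {
        var old = document.getElementById('contentScript');
        var fresh = document.createElement('script');
        fresh.id = 'contentScript';
        fresh.src = 'scripts/' + name + '.min.js'; //assumed naming convention
        old.parentNode.replaceChild(fresh, old);
    });
}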
I don't know if it is a good practice but it makes sense to me :-)

Start loading next page while browser is idle

I have product website. On one page I show thumbnails and a brief description of all the products. When you click on the photos, you get to a detailed product page.
Is there a way to get the browser to start loading and caching the javascript and CSS for the "detailed product" page while the user is just looking at the "all the products" page and trying to make a choice?
I want this preloading and caching to start only once the page has fully loaded as to not slow it down.
Any suggestions on how to implement this?
If you're using a JavaScript framework (like jQuery, Prototype, etc.) then you can use a simple method to do an AJAX call. If not, you'll have to write one, which might be a bit confusing for someone who isn't familiar with JavaScript. A basic example is here.
You can use JavaScript to add script tags to your HTML page and it will include the JS. Remember that if the JS auto-executes any code, that code will run as soon as it loads. For CSS, your only option is probably using JavaScript to send a request to grab the file (see above). You could include the CSS, but it would override any styles from your original CSS file.
Websites that precache:
Websites including sites as big as Google and Yahoo use precaching to help performance. Google, for instance, loads a CSS sprite http://www.google.com/images/nav_logo7.png on their main page along with other CSS and JS files that are not completely used on the main page alone. Most people already do something similar to this by just combining their CSS and JS files into one file in production. HTTP requests take more time than downloading the actual content. An example of Yahoo precaching is here.
Yahoo talks about this on YSlow's help here.
Taken from one part of the guidelines here:
80% of the end-user response time is spent on the front-end. Most of this time is tied up in downloading all the components in the page: images, stylesheets, scripts, Flash, etc. Reducing the number of components in turn reduces the number of HTTP requests required to render the page. This is the key to faster pages.
Organization in development, speed in production:
What I usually try to do is, in development, split up my JS files if needed (hardly ever my CSS though). When it's time to push this data to production servers, I run a compiler (a simple script that combines all the files and minifies them) and then put them online.
Minifying/compressing:
Remember, HTTP requests are evil. A compressed JavaScript file and a compressed CSS file are so small that I'm almost 100% sure there is an image on your main page that is smaller. Therefore it's pointless to worry about splitting them up per page. It's actually more of a performance hog to split them up across multiple pages.
CSS Sprites
The point of CSS sprites is that a website probably has 40+ images on its page used via CSS. Well, that's 40+ HTTP requests on a user's page load - that's A LOT of requests. Not only is that bad for the user, it's also a lot of requests your web server has to handle. If you aren't using a static content server and are just using the Apache instance on your main host, your poor Apache server is getting loaded with requests it could instead be serving for your web application. You can reduce this by combining your images into one file, or at least into fewer files. Using CSS's background-position property, you can do wonders.
I highly recommend reading the YSlow guidelines by Yahoo here: http://developer.yahoo.com/yslow/help/#guidelines
Theoretically you can start accessing resources from subsequent pages so that they are later available in the cache.
However, this is not good practice - especially if you are loading resources for all detail pages they may select. In doing so, you make the assumption that you should determine how the user's bandwidth is used, not them. If they are browsing multiple things at the same time, or doing other things with their bandwidth besides viewing your website, you are using their bandwidth in a manner they do not intend.
If their connection is slow enough that the load time for your detail pages needs to be optimized, chances are their connection is slow enough that they will feel the loss if they are doing other things at the same time.
Use setTimeout in the load event of the page with a delay of a few seconds; after that, insert a script tag and a CSS link tag into the page (the ones from the next page).
Something like this (where url is the URL of the thing you want to cache):
//cache a script
var scriptTag = document.createElement("script");
scriptTag.setAttribute("type", "text/javascript");
scriptTag.setAttribute("src", url);
document.getElementsByTagName("head")[0].appendChild(scriptTag);
//cache an image:
var img = new Image(); img.src = url;
//cache a css file
var cssTag = document.createElement("link");
cssTag.setAttribute("rel", "stylesheet");
cssTag.setAttribute("type", "text/css");
cssTag.setAttribute("href", url);
document.getElementsByTagName("head")[0].appendChild(cssTag);
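Put together, the wiring described above might look something like this (the three-second delay and the asset URLs are placeholders, not values from the question):
//rough sketch: wait for the page load event, then precache next-page assets after a short delay
function precache(url) {
    if (/\.js$/.test(url)) {
        var scriptTag = document.createElement("script");
        scriptTag.src = url;
        document.getElementsByTagName("head")[0].appendChild(scriptTag);
    } else if (/\.css$/.test(url)) {
        var cssTag = document.createElement("link");
        cssTag.rel = "stylesheet";
        cssTag.href = url;
        document.getElementsByTagName("head")[0].appendChild(cssTag);
    } else {
        new Image().src = url; //images and anything else fetchable with a simple GET
    }
}
window.addEventListener("load", function () {
    setTimeout(function () {
        //placeholder URLs for the detail page's assets
        ["detail.js", "detail.css", "product-photo.jpg"].forEach(precache);
    }, 3000); //assumed delay; tune as needed
});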
