I'm trying to bridge the gap between loading/preloading images and other items in a manifest and making them accessible to CSS and JS in conventional DIV tags.
I understand the preloading process itself. After it runs I have an array of images, objects, etc. that are viewable in a window/stage object, but I want them inside div tags so I can reference them.
How do people go about loading/preloading images and then referencing them in a standard HTML5 markup page?
Imagine I have a site with clean markup and links to scripts/styles, and now I want to preload it. I can't find any documentation that makes this clear.
Does anybody know what I mean, or can someone explain how to load a page of content and then access it with getElementById?
Thanks in advance for any feedback.
Preloading images is not directly related to being able to access elements in the DOM.
Preloading tells the browser to download a resource in advance, at a point where it would otherwise not have started downloading it yet.
The ability to access an element in the DOM, on the other hand, depends on how far the page itself has been downloaded and parsed.
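As a minimal sketch of how the two fit together (the element id, image paths, and script placement are assumptions, not taken from the question): preload the images with Image objects and attach them to an ordinary div, which can then be referenced with getElementById or querySelector like any other element. This assumes the script runs after the markup, e.g. at the end of the body.

var urls = ['img/one.jpg', 'img/two.jpg'];           // assumed image paths
var stage = document.getElementById('stage');        // assumed <div id="stage"></div> in the markup
var remaining = urls.length;

urls.forEach(function (url) {
  var img = new Image();                             // setting src starts the download immediately
  img.onload = function () {
    remaining -= 1;
    if (remaining === 0) {
      console.log('all images preloaded');           // safe to show or measure them now
    }
  };
  img.src = url;
  stage.appendChild(img);                            // an <img> inside a normal div, reachable via the DOM
});

The point is simply that the preloaded files sit in the browser cache; once the DOM is parsed, any img inside a div that uses the same URL is served from that cache.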
I have a blog website whose images load slowly. I want to know how to make them load faster, and:
- I am using the same image for the thumbnail and the story. The thumbnail is small, but does it still load the full image? If so, how do I use a proper thumbnail of an image?
- Where should I store the images? What is the best location to store images for websites and blogs? Can I save them in OneDrive and use that as the source?
- How do I optimise images? What is a placeholder? I have seen many websites such as Facebook use a kind of placeholder that is displayed before the image and content; how do I do that?
- How do I preload images, or is there a better way?
Here are some pointers.
Thumbnail images have to be separate from original (large) images. When the user uploads the images, you have to use some script to resize the images. If you are using a standard CMS like Drupal or Wordpress, there should be an option somewhere to do the resizing (without you having to write code).
Assuming your blog is public, the images as well should be public (usually). You can create a directory named files and you can store the images inside that directory. If you are using a standard CMS, these options should be there in some form.
To avoid having all files in one directory in the long run, use folder naming schemes like files/[YEAR]/[MONTH] or anything else you think would serve your purpose.
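As a tiny illustration of that naming scheme (just a JavaScript sketch; the names are illustrative and not tied to any particular CMS):

var now = new Date();
var month = ('0' + (now.getMonth() + 1)).slice(-2);            // e.g. "03"
var uploadDir = 'files/' + now.getFullYear() + '/' + month;    // e.g. "files/2024/03"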
Make sure the uploads directory and your upload mechanism are well protected using .htaccess (or equivalent). Otherwise, someone might upload malicious scripts and execute them on your server.
A placeholder is anything that holds the place of something while the real thing is absent (or still loading). So a placeholder image is a standard image with a generic design - it is effectively a "loading" indicator. You can use JavaScript or CSS (background-image) to achieve such a placeholder, as in the sketch below.
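One simple way to do it (a sketch only; the class name and file names are made up): show a small generic placeholder as a CSS background on the image's container, then swap the real image in once it has finished loading.

var box = document.querySelector('.story-image');                         // assumed container element
box.style.background = "url('placeholder.png') center / cover no-repeat"; // small generic placeholder

var full = new Image();
full.onload = function () {
  box.style.background = 'none';    // placeholder no longer needed
  box.appendChild(full);            // swap in the real image once it has arrived
};
full.src = 'story-photo.jpg';       // assumed real image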
Preloading should not be necessary as far as I see from your question. A better opinion / answer could be given if you share the link to your site.
Next time, please try to ask focused questions - one question per problem, if possible. Also, don't be afraid to Google for a solution. I learnt programming (PHP, JS, Drupal, CodeIgniter and more) just by Googling! Hope this helps!
Jigar has done a fairly good job of answering the question, though I thought I'd add that if you want to optimise images, there are plenty of websites that will do it for you for free.
My favourite is https://tinyjpg.com/ however there are plenty of others. A quick Google search will get you plenty of different sites all doing basically the same thing.
This post might also help: Load a low-res background image first, then a high-res one.
SVGs have been around for years thanks to their scalability, and it is well known that the benefit of inline SVG is that it can be manipulated with CSS and JS; when we want to repeat the same SVG across an HTML document, we can use the <use> tag to reference the original element. Furthermore, inline SVGs also reduce the number of HTTP requests.
However, many articles suggest (without explaining the details) that while inline SVG saves HTTP requests, it is no longer cacheable by the browser as a separate resource, which means it is not reusable across pages.
As I happen to use inline SVGs extensively in a project, I would like to know exactly whether and how inline SVG (a well-known HTML5 element and a W3C recommendation) can be cached by browsers, given that SVGs used via the <img> tag or as background-image are cacheable.
If the DOM is cacheable, then why isn't the SVG DOM?
(which builds upon and is compatible with DOM Level 2. Ref: https://www.w3.org/TR/SVG/svgdom.html)
So far, the solution I have come up with for cacheability is to use the data URI scheme
(also ref: Optimizing svgs in data uris).
But by doing so, I lose the ability to style and manipulate the SVG with CSS and JS.
A few examples around the web suggest using JS to load the cacheable resource or to replace placeholder elements such as the <object> tag, as well as using localStorage, CacheStorage and Service Workers. But I still need some guidelines to get started towards an ideal solution.
Could someone shed some light on this, please?
Ref: Caching SVG Sprite in localStorage
Ref: Inline SVG and caching
Ref: SVG ON THE WEB
Ref: Do Inline SVGs Weigh Down Websites?
Basic HTTP caching works based on URLs, and it is “all or nothing” - you can instruct the client to either take the whole resource from cache, or to reload it completely.
Now, by “inlining” your SVGs, you are making them part of the HTML document - they are no longer external resources that could individually be checked for whether they can be taken from the cache or need to be reloaded.
So, if you have three HTML documents that all have the same SVG image inlined, the code of the image will be loaded three times - because it is part of the three HTML documents.
Whereas, if the image was embedded as an external resource (as img, background-image, object, …), it would be loaded only once, on the first of those three HTML pages the browser loads. On the other pages, it will recognize, “hey, that external resource with this particular URL is in my cache already - no need to load it again.”
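One workaround the question hints at, as a rough sketch (the file name and element id are assumptions): keep the SVG as a separate, cacheable file and inject its markup with JavaScript, so repeat visits reuse the cached file while the injected nodes remain styleable with CSS and scriptable with JS.

fetch('/img/logo.svg')                             // a normal, cacheable HTTP resource
  .then(function (response) { return response.text(); })
  .then(function (svgMarkup) {
    // the injected markup now behaves like inline SVG: styleable with CSS, scriptable with JS
    document.getElementById('logo-slot').innerHTML = svgMarkup;
  });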
I'm building a website for a gallery owner that has a lot of images per webpage.
Therefore I want to lazy load the images on the webpage, making the initial load
less heavy. However, I would like to implement this in a "progressive enhancement" way.
I've found a lot of lazy loading methods, but they all require fiddling with the HTML code
in such a way that the webpage would be useless with javascript turned off (e.g. the src attribute of the img tags remains unset until the image is lazy loaded).
To implement a lazy loading method progressively, I think one would need the following:
1. prevent the browser from fetching the images even though they are on the page, but only do this when javascript is on (so on non-javascript browsers the images still load as normal); this should be done without altering the html
2. save the src attribute in a data-src attribute
3. sequentially load the images when scrolling down
Of these three steps, the first seems the hardest. Even this Stack Overflow discussion did not provide an answer that doesn't ruin progressive enhancement.
Has anyone got any ideas?
Since no one has come up with an answer, I'll post what I found to be a reasonable solution.
This problem boils down to the following: while we want to prevent the browser from downloading the images when javascript is turned on, we must be sure the images are downloaded
when javascript is turned off or not available.
It is hard to consistently use javascript to stop loading images on a page when they are
in the "normal" format:
<img src="path/to/image.jpg">
To stop the images from downloading we'd have to remove their src attributes, but in order
to do this, the DOM should be loaded already. With the optimisations a lot of browsers have nowadays it is hard to guarantee that the images aren't downloading already.
On top of that, we certainly want to prevent interrupting images that are already downloading,
because this would simply be a waste.
Therefore, I choose to use the following solution:
<img data-src="path/to/image.jpg" class="lazy">
<noscript>
<img src="path/to/image.jpg">
</noscript>
Notice how the images outside of the noscript tag have no src but a data-src attribute instead. This can be used by a lazy-loading script to load the images one by one, for instance (see the sketch after this paragraph). Only when javascript is not available will the images inside the noscript block be visible, so there's no need to load the .lazy images (and no way to do so, since javascript is unavailable).
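For reference, a minimal sketch of what such a lazy-loading script could look like, assuming the .lazy/data-src markup above and a browser with IntersectionObserver support (which is only one of several ways to detect scrolling into view):

var lazyImages = document.querySelectorAll('img.lazy[data-src]');
var observer = new IntersectionObserver(function (entries) {
  entries.forEach(function (entry) {
    if (entry.isIntersecting) {
      var img = entry.target;
      img.src = img.getAttribute('data-src');   // start the real download
      img.removeAttribute('data-src');
      observer.unobserve(img);
    }
  });
}, { rootMargin: '200px' });                     // begin loading a bit before the image scrolls into view

lazyImages.forEach(function (img) { observer.observe(img); });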
We do need to hide the images though:
<noscript>
<style>
.lazy {
display: none;
}
</style>
</noscript>
Like the img tags inside the noscript block, this style block will only be applied by the browser when javascript is unavailable.
On a related note: I thought I could reduce the html size by not putting a src or data-src attribute on the lazy images at all. This would be nice because it eliminates
the redundant url from the page, saving us some bandwidth.
I thought I could pluck the src attribute out of the noscript block using javascript anyway. However, this is not practical: when javascript is enabled, the contents of a noscript block are only exposed as raw text, not as parsed elements. The above scheme is therefore the most efficient I could come up with.
Not specifying a src attribute is invalid HTML, which is unfortunately how most lazy image loaders work.
I am working on a lazy loader that uses valid html markup; github link:
https://github.com/tvler/lazy-progressive-enhancement
A lazyloaded image would be declared by wrapping it in a noscript element:
<noscript><img alt="hello!" src="..."></noscript>
and the final outputted html would be
<img alt="hello!" src="...">.
You can view the whole project on github, which deals with batch loading, event hooking & more, but here's the basic functionality at the scope of a single noscript image:
// grab the (first) noscript element; with scripting enabled its contents are plain text
var noscript = document.querySelector('noscript'), img;
// parse that text into real DOM nodes by assigning it as innerHTML of a throwaway div
(img = document.createElement('div')).innerHTML = noscript.textContent;
// swap the parsed <img> into the document, replacing the noscript element itself
noscript.parentElement.replaceChild(img.firstChild, noscript);
I'm putting the finishing touches on my site and am struggling to make the page load look less jumpy. The best way to show you what I mean is to show the site:
http://marckremers.com/2011 (Still not complete, in alpha phase)
As you can see, the content sticks to the left, loads a ton of jQuery and images into the page, and only then clicks into place (centre).
I'm wondering if there is a way I can make it click into place first and then load the elements?
I tried putting the reposition script just after , but not even that works.
Any ideas? Thanks
With all of the images you have, your page is 1.5 MB, coupled with 70 HTTP requests. No wonder your site behaves the way it does.
You should be using sprites for the smaller images to reduce HTTP requests, and as far as the large images go, you are loading all of the pictures at once, even the ones that aren't displayed right away. The images that aren't displayed right away should be pulled in via AJAX after the page loads.
If you want to go further into optimization I would also:
- Use one external javascript file. Yes, it increases size, but I favor that over extra HTTP requests.
- Minify your html/javascript/css.
- Don't host jQuery on your site; use a CDN such as Google APIs.
- Check out a service similar to Amazon S3.
I could reinvent the website best-practices wheel here, or I could send you to Yahoo's best practices for website optimization. There is a ton of very important information there; read it and reference it.
You loaded jQuery twice, once from your own site and another time from Google's CDN. For starters, you should probably move all the JavaScript to the bottom of your HTML. Then you need to optimize your if ... else that handles how many columns to display and your Google Maps iframe.
To speed up the initial rendering, instead of using jQuery you should probably have some vanilla DOM scripting that dynamically creates the CSS styles for the projects and tb_tweets classes, so the page doesn't have to wait for all your JavaScript to load before sizing your projects and tb_tweets.
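As a rough illustration of that idea (the class names come from the answer above, but the breakpoints and column counts are made up): a tiny inline script can inject a style rule based on the window width before jQuery has even loaded, so the columns are positioned on first paint.

(function () {
  // pick a column count from the window width (breakpoints here are assumptions)
  var columns = window.innerWidth > 1200 ? 4 : (window.innerWidth > 800 ? 3 : 2);
  var style = document.createElement('style');
  style.textContent = '.projects, .tb_tweets { width: ' + (100 / columns) + '%; float: left; }';
  document.head.appendChild(style);
})();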
Use http://mir.aculo.us/dom-monster/ and do everything it tells you to do. If you want tools to figure out what is going on during page load, the Chrome developer tools are hands down the best out there for client-side optimization.
One thing you could do is put your javascript functions inside $(document).ready(); this way the functions will run AFTER the page has loaded. I guess you don't need the functions for loading the site, just to interact with it?
Generally you only want to trigger your events after the page has rendered, i.e., using
$(document).ready(function() {
  //your javascript goes here
});
Then, in your HTML, you have placeholders (with some kind of indication that the element is still loading) so the page doesn't "expand" or "jump" as you put it. When your ajax requests complete, simply populate the placeholders with the ajax response, as in the sketch below.
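A rough sketch of that pattern (the placeholder markup, URL and selector are made up, not taken from your site):

$(document).ready(function () {
  // placeholder markup assumed: <div id="tweets" class="placeholder">Loading...</div>
  $.get('/ajax/tweets.html', function (html) {
    $('#tweets')
      .removeClass('placeholder')   // drop the "still loading" styling
      .html(html);                  // fill the reserved space with the real content
  });
});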
Hiho,
There's an existing website that I need to include into another site. It lives at:
a.mysite.com
and I need to fetch content from this site in my
www.mysite.com
website...
As I need to access the content of the iframe, the same-origin policy is a problem here.
What I did was configure mod_proxy on Apache to proxy all requests from
www.mysite.com/a
to
a.mysite.com
This works fine, but my problem is that I'm not sure what the best way would be to include those pages.
1. Idea
As the content of the iframe is a full-featured site with a top navigation, left navigation, etc., I would need to change the page template so that it only shows the content box, in order to integrate that page in the iframe.
2. Idea
I could just load the div where the content lies via jQuery.load() and integrate it into my site.
What is the best way to accomplish such a task? And how bad are both ideas from an SEO point of view?
Unless it involves significant rework, the best solution is to combine the two into a single HTML page on the server side (using server-side includes).
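For example, assuming Apache's mod_include (SSI) is enabled and the proxied fragment is reachable at a path like the one below (the path is made up), the content could be merged on the server before the response is sent, so the client sees one page:

<!-- server-side include: resolved by Apache before the page leaves the server -->
<!--#include virtual="/a/content-fragment.html" -->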
Advantages:
No problems with SEO, as it's delivered as a single page. Content in iframes and content loaded via AJAX (with an associated link in the HTML) is traversed, but only the link, not the content itself, is associated with the main page. See: http://www.straightupsearch.com/search-marketing/best-practices/seo_iframes_a_g/
Faster page load - either of your suggestions will cause the main page to be loaded first before the other content is loaded.
No reliance on Javascript - your second method will fail completely if javascript is not supported / turned on.
Include all JS and CSS only once - your first method will require these to be duplicated in the <head> of each page. This is more of a long term advantage if you wish to achieve full integration of site "a". However, it can be a disadvantage initially, see below.
Disadvantage:
May cause conflicts with scripts and CSS between the two pages. However, this same problem exists with your second method.
If you must choose between the two options you proposed, I would not select the second, which others have suggested. Significant amounts of static content should never be loaded via Ajax, and in this scenario it gives you no additional benefit. At least iframes guarantee no JS and CSS conflicts.
Use the 2nd approach (jQuery.load), and if you're working with HTML5, then for browsers that support the History API you can change the URL to match the content loaded into that div.
Check out https://github.com/blog/760-the-tree-slider for an example of how github did it for their tree slider.
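A bare-bones sketch of that combination (the selector and URLs are assumptions):

// load just the content block from the proxied page, then update the address bar
$('#content').load('/a/some-page.html #content', function () {
  if (window.history && history.pushState) {
    history.pushState(null, '', '/a/some-page.html');   // keeps the URL meaningful and bookmarkable
  }
});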
EDIT:
I am not sure how using an iframe whose src points to your own domain affects search rankings, but at best it's a grey area. I would assume that some PageRank would trickle from the parent to the child, but I have no clue how it would work if, for instance, a blogger linked to your page containing the iframe that pointed to another page. This would be a pretty good question to ask at the Webmaster Help Forum.
Always say no to iframes. jQuery+Ajax all the way.