Is it possible to load images from bottom to top?
Assume that I have a long, very long image that needs 60 seconds to load, and its content is readable from bottom to top. Can I do anything to make browsers load my image from bottom to top, so users can read it while the image is loading?
Thank you,
This is a funny "AAAAAND ITS GONE" meme related to this question.
Thanks to all of you who have answered my question; here is a summary of the solutions:
Solution 1: (Derek's answer)
Flip the image server-side and then display it with -webkit-transform: scaleY(-1); (a minimal sketch follows the demos below)
Demo: http://jsfiddle.net/Nk6VP/
Solution 2 (aloisdg's answer)
Use the BMP format; this way the browser will load the image "upwards". BMP is not a good file type for saving big images, but it's still a good option since you don't need to flip the image server-side.
Demo: http://jsfiddle.net/dfbPz/
(Please disable cache to see the loading in the demo)
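For reference, a minimal sketch of Solution 1, assuming the image file has already been flipped vertically on the server (the file name is just a placeholder):

<!-- long-flipped.jpg is the vertically flipped copy of the original image -->
<img src="long-flipped.jpg" class="bottom-to-top">

.bottom-to-top {
    -webkit-transform: scaleY(-1); /* older WebKit */
    transform: scaleY(-1);         /* flip the image back so it reads normally */
}

The top of the flipped file (which is the readable bottom of the original) arrives first, and the CSS simply mirrors it back into place.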
In fact, you can. Just use the BMP format. This format is stored from the bottom up.
You can find a sample of an image loading upward here. (You have to click the "Bitmap and .rle Images" button to display the sample.)
From the Bitmap file format at fileformat.info:
[Regarding file header structure] If Height is a positive number, then the image is a "bottom-up" bitmap with the origin in the lower-left corner. If Height is a negative number, then the image is a "top-down" bitmap with the origin in the upper-left corner.
You can find more info on this topic in this SO question or this one.
You can chop it up into slices in the server and load them separately. This is probably the only way to do this since you don't really have that much control over how contents are sent.
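A rough sketch of the slicing idea, assuming the server has already cut the image into slice_01.jpg (bottom) through slice_NN.jpg (top); the bottom slice appears first in the markup so it is requested first, and flexbox stacks the slices back into visual order:

<div class="slices">
  <img src="slice_01.jpg">
  <img src="slice_02.jpg">
  <!-- ... up to slice_NN.jpg -->
</div>

.slices {
  display: flex;
  flex-direction: column-reverse; /* first child (the bottom slice) renders at the bottom */
}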
OR, just rotate the image on the server, load it normally, and display it with transform: rotate(-180deg).
I think a technology like SPDY (which is a replacement for HTTP) makes such things possible.
However, browsers like IE/Safari don't support it, because it's a Google technology.
Look at this demo:
https://www.youtube.com/watch?v=zN5MYf8FtN0&feature=youtube_gdata_player
around minute 38
And yes, you would also need to split your image up into multiple parts for this, as suggested in another answer here.
LIVE DEMO
<img src="perfect.jpg" style="background-image:url(imperfect.jpg);">
The huge image will slowly appear over the placeholder by magic!
A bit of CSS for that img, like background-size, might also come in handy (sketched below). You get the idea.
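For example, something along these lines (file names and dimensions are placeholders):

img.progressive {
    background-image: url(imperfect.jpg);  /* tiny, low-quality placeholder */
    background-size: 100% 100%;            /* stretch it to fill the final image's box */
    width: 800px;                          /* reserve the full dimensions up front */
    height: 12000px;
}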
If your goal is to make your content readable by the user while the image loads, you can use the jQuery Lazy Load image plugin; here is a demo: http://www.appelsiini.net/projects/lazyload/enabled.html
And you can still use JPEG images, which are smaller than BMP.
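The usual pattern with that plugin looks roughly like this (check the plugin docs for the exact attribute names your version expects):

<img class="lazy" data-original="section-1.jpg" width="640" height="480">

// load each image only when it scrolls into view
$("img.lazy").lazyload();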
Here is an alternative way that is backwards-compatible with older browsers. The flip side (aha) is increased page size and more prep work. You decide if the benefits outweigh the cons for your specific needs.
Step 1: Open the image in an image editor, flip it vertically, and upload it as a new file on your site.
Step 2: Code! Details below...
JSFiddle: http://jsfiddle.net/5uLZD/1/
Libraries: Modernizr, jQuery
HTML:
<!--Source below is the 'unflipped' image-->
<img src="http://i62.tinypic.com/wratsj.jpg" id="bottomtotop">
CSS:
.flipped {
    -webkit-transform: scaleY(-1);
    transform: scaleY(-1);
}
JS:
if (Modernizr.csstransforms) {
// Source below is the 'flipped' image.
$("#bottomtotop").attr('src', 'http://i60.tinypic.com/2qajqqr.jpg').addClass('flipped');
}
You can alleviate the size issue by simplifying the JavaScript and getting away from the frameworks (i.e., use document.getElementById('bottomtotop') in place of $('#bottomtotop') and so forth), but that will take a significant amount of time and work.
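A rough framework-free sketch of the same idea (the feature detection here is a simplified stand-in for Modernizr's test, not its actual implementation):

// run this after the DOM is ready, e.g. just before </body>
var img = document.getElementById('bottomtotop');
var probe = document.createElement('div').style;

// crude check for CSS transform support
if ('transform' in probe || 'webkitTransform' in probe || 'msTransform' in probe) {
    img.src = 'http://i60.tinypic.com/2qajqqr.jpg'; // the pre-flipped image
    img.className += ' flipped';
}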
Related
OK, so after some searching and not finding the real answer I was looking for, I came up with the following question in this situation:
I have a trading website that loads about 2300 PNG images of 37x50 pixels twice, once in a left column and once in a right column. The images, and all the information that comes with them, are inserted into the document using jQuery on the onLoad event. However, loading 2300 images (even when the HTML came straight from the server) takes just TOO much time and even hangs new versions of Chrome. So the quick solution was to remove the images and show them in a dynamic tooltip instead. That works, but it got me angry website users and it is indeed ugly.
So... I thought of some possible solutions but I have no idea what is good/bad practice here:
Make all images JPEG and reduce quality.
With or without the above: add all images to one very large image, load it, and draw 4600 canvases based on locations in an array like imageArray["someimageID"] = { x: 0, y: 40 }
Convert all images to base64, add them to an array like imageArray["someimageID"] = "base64..." and draw 4600 canvases.
On top of that, for each of those 2300 images I also have a small, medium and large version (of which only the small ones, 37x40, are shown all together on a page).
Hope to get some nice insights on how to correctly solve such a problem!
Greets
If your images are static (not generated for every request) I think you should use CSS sprites. (similar to your own suggestion of lots of canvases).
Basically you create a div for each image you want to show (or some other container element) and set a background on it that takes a small portion of the big image that contains all images.
Example css:
div.icon1
{
    width: 50px;
    height: 50px;
    background: url(spritesheet.png) -50px 0;
}
In this example, -50px and 0 are the background-position offsets that shift the spritesheet so your 50x50 icon shows through the element.
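Usage is then just one element per icon, for example (class names are placeholders, each class pointing at a different offset in the sheet):

<div class="icon1"></div>
<div class="icon2"></div>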
Update: Here http://css-tricks.com/css-sprites/ is an explanation that goes a bit further than my simple example.
First off, consider whether or not you actually need this many images, and loaded on the page all at once. Assuming you do...
Make all images JPEG and reduce quality.
Use the right format for what you're doing. JPEG is for photos. My guess is that since you have 37x50 pixel images, you're not showing photos; PNG is likely smaller from a file-size perspective in this case. It doesn't matter a whole lot though, because the speed issue you're having is probably 80% browser time and 20% network time.
With or without the above: add all images to one very large image, load it, and draw 4600 canvases based on locations in an array like imageArray["someimageID"] = { x: 0, y: 40 }
Try it and see. I don't think this is going to help you. A canvas will have more overhead than a simple image.
Convert all images to base64, add them to an array like imageArray["someimageID"] = "base64..." and draw 4600 canvases.
Don't do this. You're adding 33% overhead to the file size, and again the load problem is mostly in your browser.
What you can do
Really question again whether or not you need this many images in the first place.
Spread the network requests over several hostnames: image1.example.com, image2.example.com, image3.example.com, etc. This allows more requests to run in parallel (see the sketch after this list).
Use your developer tools to verify where the problem actually is. Again, I suspect it's mostly client-side. Once you know the real problem, you can solve it directly.
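A hedged sketch of that hostname sharding (the hostnames, the data-file attribute and the /icons/ path are all placeholders for illustration):

// round-robin the image requests across three image hosts
function shardedUrl(filename, index) {
    var host = 'image' + ((index % 3) + 1) + '.example.com';
    return 'http://' + host + '/icons/' + filename;
}

$('img[data-file]').each(function (i) {
    this.src = shardedUrl($(this).attr('data-file'), i);
});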
I would advise, if you can, creating a very low resolution sprite of the images that can be placed to make it look like everything has loaded, and then replacing it with the proper images. Without better code/samples, knowing what your images contain, or whether they are dynamic, I am unable to give you a real answer with a solution, but at least it can lead you in the correct direction.
If your images are static, this will work fine; if they are dynamic, there is not much else that can be done. I think some examples of the webpage you are creating would be great.
The reason you're having problems is simply a massive amount of HTTP requests - something you should always be trying to minimize.
Like others are saying here, you're going to want to use a spritesheet technique if possible (it sounds like it is). A spritesheet will condense all of your images into one, removing 2299 of your HTTP requests.
I've been trying to find a good approach to solve a very common problem in the retina era.
Let's say the following is given:
Create a website with responsive images
No CSS background images
The website's basic functionality must work without JS
The website's images must be optimized for retina displays.
An easy way to solve this could be something like this:
<img src="img.jpg" data-highres="img#2x.jpg" />
and write some kind of JS to swap out img.jpg for img#2x.jpg if a retina device is detected. This would work, but if I visit the website on a retina device, both img.jpg and img#2x.jpg get loaded. Not very bandwidth friendly :(
Is it possible somehow to intercept and change the src of the image before the original src is loaded?
Or do any of you have another approach in solving the problem?
In the future, you might be able to use the picture element. In the meantime, the only approach I've seen that might work is to:
Put a div or span where you want the image. Style it to have the dimensions and layout of the image. Add some kind of identifying mark to it so you can find it (e.g. class="retina-image")
Store information about the different sizes of images you have on or in the element (e.g. using data-something attributes)
Put a <noscript><img src="..." alt="..."></noscript> inside the div
On DOM ready:
use JS to find all the elements with the identifier from step 1
detect the type of image you want
find the attribute that tells you what URL to use for that image
add the image to the DOM inside the container from step 1
This is the approach used by the picturefill library.
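A rough sketch of those steps (the attribute names and class are my own for illustration; picturefill's actual markup differs):

<span class="retina-image"
      data-src="img.jpg"
      data-src-2x="img#2x.jpg">
  <noscript><img src="img.jpg" alt="..."></noscript>
</span>

$(function () {
    var isRetina = window.devicePixelRatio > 1;
    $('.retina-image').each(function () {
        // pick the URL for the detected display and only then create the img,
        // so the low-res version is never requested on retina devices
        var src = isRetina ? $(this).attr('data-src-2x') : $(this).attr('data-src');
        $(this).append($('<img>', { src: src, alt: '' }));
    });
});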
This will probably be flame-bait, but here's my two cents:
The concern over bandwidth is largely a mobile platform issue. Considering that most modern mobile platforms now adopt high pixel density displays, the demand for high-res images from these devices is going to be pretty high. Why serve 3 times the image resolution 90% of the time (a low-res and a high-res image) when you can serve 2 times the resolution 100% of the time?
Hence, in some scenarios, it may be easier to just serve up the higher resolution images (halving their width/height styles) and leave it at that - thereby saving (expensive) time and energy elsewhere. In an industry where everything seems to be a compromise, this sometimes makes the most sense as an approach.
I think Quentin essentially answered your question.
What I can add is that while having an img element without a src is technically not per spec, omitting it won't break your code and you'll avoid pre-loading. Then you can just have <img data-src="" data-src2=""> and swap images in/out using JavaScript. The <noscript> element is probably the "correct" way of doing this, but it makes the code really verbose, and I've heard some browsers will pre-load <noscript> images as well.
One service I've come across which makes images responsive (including Retina displays) is Pixtulate. They also provide the JavaScript to correctly size and replace your img.src url before the page has finished loading. Should be pretty simple to drop in.
You could do something with window.devicePixelRatio and loop through the images that have a data-highres attribute.
This is quite rough, but you get my drift.
$(document).ready(function() {
    if (window.devicePixelRatio > 1) {
        // retina: swap in the high-res version for every tagged image
        $('img[data-highres]').each(function() {
            this.src = $(this).attr('data-highres');
        });
    }
    // default: leave the normal src alone
});
Use a blank/clear 1px gif
<img src="clear.gif" data-basesrc="img1" />
Then use the data-basesrc attribute and append the appropriate filename in your JS, along these lines:
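A hedged example of what that swap might look like (the @2x naming convention and .jpg extension are just assumptions):

$(function () {
    var suffix = window.devicePixelRatio > 1 ? '@2x.jpg' : '.jpg';
    $('img[data-basesrc]').each(function () {
        // the clear.gif placeholder is replaced with the real file for this display
        this.src = $(this).attr('data-basesrc') + suffix;
    });
});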
The new layout of YouTube added a random-noise background which I like very much. Having seen almost exactly the same effect on other sites, I plan to use the same technique in my webpage prototypes, or at least have this "trick" in my toolbox for future use.
The image is like this (taken from http://g.raphaeljs.com/barchart.html):
Now YouTube accomplishes the (embarrassingly identical) effect by embedding the image in the source code:
(on the YouTube main page, right-click the background to display it, then right-click the image and choose "view image properties" [Firefox]):
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAJUAAACVCAAAAAB0....lotsofdata
I tried to discover where this line of code is in the source code, but due to the dynamic creation, I couldn't.
So, my question is:
"Is there a way to apply a tiled background to a page, using a png image generated algorithmically CLIENT-SIDE?" (preferrably with javascript)
I am very beginner in webdev and javascript, but I like to base my learning around defined problems to be solved, so this would be a nice way to learn something
Thanks for reading!
UPDATE:
For anyone interested in tile texture generation using javascript, I found this, which seems very interesting:
http://somethinghitme.com/projects/canvasterrain/
http://somethinghitme.com/projects/canvasterrain/js/canvasTerrain.js
To generate an image client-side, I suggest you have a look at the HTML5 canvas element.
You can draw on a canvas with Javascript (even if the canvas element is hidden), and so generate anything you want (including a simple noise tile).
Resource to learn Canvas drawing : https://developer.mozilla.org/en/Drawing_Graphics_with_Canvas
After that, you can export your canvas as a URL with the toDataURL method (a string like "data:image/png;base64....") which browsers interpret like a traditional URL for an image, so you can set it as the CSS background for your body element.
Warning 1: Canvas is supported by all modern browsers and you can emulate it in IE with ExplorerCanvas - but I don't know if ExplorerCanvas supports .toDataURL().
Warning 2: Canvas is resolution-dependent, so I suggest you generate a small tile (32×32 or 64×64) and repeat it.
Edit: An example of a tiled background: http://jsfiddle.net/SfzPc/12/
Edit 2: A complete example with a noisy background: http://jsfiddle.net/SfzPc/14/
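A minimal sketch of the whole idea (tile size and noise strength are arbitrary choices):

// draw a small random-noise tile on an off-screen canvas
var canvas = document.createElement('canvas');
canvas.width = canvas.height = 64;
var ctx = canvas.getContext('2d');
var tile = ctx.createImageData(64, 64);

for (var i = 0; i < tile.data.length; i += 4) {
    var shade = 200 + Math.floor(Math.random() * 30); // light grey noise
    tile.data[i] = tile.data[i + 1] = tile.data[i + 2] = shade;
    tile.data[i + 3] = 255; // fully opaque
}
ctx.putImageData(tile, 0, 0);

// use the generated tile as a repeating page background
document.body.style.background = 'url(' + canvas.toDataURL('image/png') + ') repeat';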
You can use CSS to display this image:
#someimageselector {
background: white url('data:image/png;base64,iVBOR...lots of data') repeat scroll left top;
}
You can change the initial color of your background by editing the value white.
To set this CSS with JavaScript, set the element's style.background property:
document.getElementById("someimageselector").style.background = 'white url(data:image/png....)';
There are two jQuery plugin libraries that do exactly what you are looking for: NoiseGen and Noisy. Haven't used either yet but they both look pretty good.
NoiseGen
Project: http://primegap.net/2011/10/20/noisegen-generate-background-noise-with-jquery/
Demo: http://www.lucaongaro.eu/demos/noisegen/
Noisy
Project: https://github.com/DanielRapp/Noisy
Demo: http://rappdaniel.com/other/noisy-sample/
FYI: Base64 is binary data represented as a string.
Most likely the original image still came out of Photoshop and was later encoded into Base64.
This technique helps reduce the number of HTTP requests per page view, as the actual image data can be saved and cached inside the CSS or HTML document.
My demo is here.
Basically, I have a HUGE image (19160px × 512px to be exact, just under 2MB) whose background-x position I transition using JavaScript to make it appear as if a transformation were happening.
I cannot compress the image much more without dramatically ruining its quality. Is there another way I can achieve this with the same level of cross-browser support, without relying on plugins like Flash, but have it load faster?
Have you considered making this a video?
It might improve loading time somewhat.
Also, another idea: have you tried using only the first and last image, putting the last one on top of the first, giving it opacity: 0 and fading it in using JavaScript (e.g. jQuery)?
The effect won't be 100% identical to what you have now, but it might look good enough to please the client, and it would reduce loading time to a bare minimum.
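A quick sketch of that fade (IDs, file names and timing are placeholders; the two images are stacked inside a position: relative container):

<div style="position: relative;">
  <img id="before" src="first-frame.jpg">
  <img id="after"  src="last-frame.jpg"
       style="position: absolute; left: 0; top: 0; opacity: 0;">
</div>

// fade the final frame in over the first one
$('#after').animate({ opacity: 1 }, 2000);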
If neither idea works for you, I think the first 10-12 frames could be compressed more effectively as GIF images. (It's an estimate, I haven't tried.) You would have to split the image into multiple divs to do that, change the method you use to switch the images, and you would have more requests, but it could be worth it.
If it is a JPEG, you can always use progressive encoding. It will become clearer as it is downloaded.
There is also an interlaced "Progressive JPEG" format, in which data is compressed in multiple passes of progressively higher detail. This is ideal for large images that will be displayed while downloading over a slow connection, allowing a reasonable preview after receiving only a portion of the data. - Wikipedia
Slice it like Google Maps.
If you want to change that many pixels on the screen at once, you'll have to get them to the client somehow. You could chunk it into multiple images and use something other than background-x, but then you expose yourself to other potential network interruptions along the way.
The only alternative I can think of to precomputed images like this one is to do the computation on the client - start with the full-colour image and manipulate it using the client's CPU. Your options here involve canvas or CSS3 or a plugin.
I'm not a big fan of Flash but in this case it seems like the right tool for the job (unless you need it work on the iPhone). If you don't have the Flash authoring tool you can use the free Flex compiler.
See http://www.insideria.com/2008/03/image-manipulation-in-flex.html
Make it into an animated gif? Break it up into individual parts to remove all the area that is obscured by content.
I'm showing images from other websites as thumbnails. To do this I display them in a smaller img tag so the browser does the downscaling.
The problem is that the quality of these images (which I have no control of) is diminished.
Also they look much better in FF and Safari than in IE.
Is there a way to make these images look better without caching them on my server? (e.g a javascript library that does the resize with better quality)? Any idea is highly appreciated.
IE's default image resizing algorithm is not that nice - it can be changed by tweaking a registry entry, but of course that is outside of your control.
However, apparently it can also be triggered to do a better image resize through css
img { -ms-interpolation-mode: bicubic; }
source: http://edmondcho.com/blog/2009/03/17/internet-explorer-image-resizing-tip/
A quick Google search shows that in IE7 you can fix the image quality problem:
http://devthought.com/tumble/2009/03/tip-high-quality-css-thumbnails-in-ie7/
The only way to have control is to do the resizing yourself. Various browsers will use different algorithms, some with unsharp masking, some without. The filters used after resizing control most of this. Specific CSS tagging can control this to some extent.
Javascript can't really handle this, but using Flash or similar would allow this. You would have better control of the image. However, you would lose the "imageness" as far as HTML.
One thing I didn't see mentioned by the others - you aren't really resizing the image, you are just displaying it in a smaller space. Let's say you are pulling down an extremely large image file (5MB) and displaying it at 1 x 1 - it's still 5MB!
Writing a caching solution for these images wouldn't be very difficult at all - and it will save you from the legal ramifications and embarrassment. If I saw your site in my log files and realized you were pulling down my images, you would be Goatse'd - hard.
If you are working with a source image and simply re-sizing on the client, there isn't going to be a good way to do this.
Now, aside from the potential legal ramifications of using other sites' images, you could look at a simple caching process: do a quick resize of the image on your server, keeping the aspect ratio, so that the display looks good. This also helps reduce the bandwidth you are using from the other sites.