jQuery media browser - javascript

I am attempting to build a visual jQuery-based browser for thumbnailed assets grouped by the upload date of the asset. The backend part is fine, but I'm having a really hard time finding a workable visual solution that can handle (potentially) hundreds to thousands of assets smoothly. The display of the content is not an issue, as it is being handled by a lightbox; I just need to figure out a way to actually display the thumbnails.
I've been trying to interface with this plugin but have been running into a lot of problems once it gets over 100 records; everything just becomes horribly unresponsive. Ideally I want to be able to build AJAX into this for loading media as needed rather than a bulk get on page load. Does anyone know of a good plugin that can be leveraged to achieve this effect, or at least provide a good user interface for browsing large amounts of content?
To clarify: I have properly generated thumbnails being made when an asset is uploaded; these are what are displayed on the page, and the full-size image is only loaded in the lightbox when the thumbnail is clicked. I'm just trying to determine a good way to browse a large quantity of thumbnails sorted by upload date.

It sounds like you're scaling the full-size images down, which will ruin performance. Do the images have a small version counterpart you can use for the thumbnails?

It turns out there really wasn't a good way to use a prebuilt plugin for this situation. The best approach for me was to implement a jQuery UI slider; on its stop event I do an AJAX GET to load the appropriate data into a div with vertical scrolling.
Maybe not the most elegant or prettiest solution, but it works for the situation and looks nice enough.
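A minimal sketch of the slider-plus-AJAX approach described above. The endpoint (/media/browse), its offset parameter, and the element IDs are assumptions for illustration, not details from the original post:

    // jQuery UI slider; on stop, fetch only the thumbnails for the chosen range
    $(function () {
        $("#date-slider").slider({
            min: 0,
            max: 52,               // e.g. weeks back from today (assumption)
            stop: function (event, ui) {
                $.get("/media/browse", { offset: ui.value }, function (html) {
                    // #thumbs is a fixed-height div with overflow-y: scroll
                    $("#thumbs").html(html);
                });
            }
        });
    });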

Related

Responsive Javascript Files?

I've built my mobile site using the jQuery Mobile UI but I now realize that I need some functionality to be different between it and my desktop site (datepicker dates should be longer on the desktop site, etc).
I've looked into Modernizr and matchMedia to help me load separate js files but I can't figure out a nice way for it to work responsively. Modernizr only works (unless I missed something in the doc) on the initial load and I'm having problems with matchMedia executing multiple times as it crosses the min/max-width threshold. It works sometimes but the trouble is in adding/removing the separate js files. On top of that (but not a huge issue - I don't think), Chrome fires off an error when loading scripts into the DOM from another script.
Would the best thing be to write one js file and then add a listener such as $.mobile.media("screen and (min-width: XXXpx)") to the body's width, changing my needed values?
I may be making this too hard for myself, or missing something obvious because I'm trying to keep HTTP requests and site size as small as possible, but I can't figure out a good solution for this.
Thank you!
Lightning Round Bonus Question: Is it good practice/proper to keep all of the jQuery Mobile styling (data-role data-id data-theme) after switching to the desktop site? It looks awfully .. awful for someone only viewing it on a larger screen.
IMHO, the best approach would be to introduce a couple of flags in your JavaScript, like "isMobile", "isTablet" or "isDesktop", that would be set within a method attached to the "pageinit" event of your webapp. Later on, you could check those flags and act accordingly with the proper version in the specific parts of your code.
The way you determine those flags depends on your architecture. In the project I am working on right now, I extract that information from a class attached by the back-end to the body of the page, but that's because we have that info. You could try to use a library like Harvey to observe the media queries as they are triggered, and set the flags accordingly. I don't think your options end there, but I'm afraid I can't help you more!
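A rough sketch of the flag idea, assuming window.matchMedia is available; the 768px breakpoint and the flag names are illustrative assumptions, not part of the answer above:

    var isMobile = false,
        isDesktop = false;

    // jQuery Mobile fires "pageinit" on each page as it is initialised
    $(document).on("pageinit", function () {
        if (window.matchMedia) {
            isDesktop = window.matchMedia("screen and (min-width: 768px)").matches;
        }
        isMobile = !isDesktop;
    });

    // Later, e.g. when configuring a datepicker:
    // var dateFormat = isDesktop ? "DD, d MM, yy" : "d M y";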

Caching of HTML Output

I am pretty sure that I know the answer to this already, but I am interested to see if anyone has other ideas. We are working on a website that includes a major redesign with mega menus. One of my top priorities in a redesign is to reduce the page download time as much as possible. All my images, CSS and JavaScript are cached, and that's good. However, the part that I am trying to work through is the HTML for the menu, and whether there is a way to cache that locally within reason.
As a side note, I like to do things as pure CSS as possible (for SEO), and so that would include outputting the mega menus directly onto the HTML page. But at the same time, I know that if it takes a number of seconds for the page to download the html content at the top, well, then, we are probably going to be running some customers off there too. Maybe the best then would be to have JavaScript output the menus, but then you run into the couple of customers that don't have Javascript enabled.
Right now the pages are about 30K for the menus, and I anticipate that doubling and maybe more when we do the redesign.
Do you have some thoughts for this issue? What would you see as the best way to tackle this?
Thank you!!
JMax
Honestly, 30K for an element is nothing in this day and age, with high-speed connections common and browsers effectively caching as necessary. People don't leave because of a second or two. It's when you have Flash movies that preload or crazy auto-starting videos that people get annoyed in a hurry.
I've got a similar application with a menu that's likely double that right now...let me add, it's not by choice, it's something I inherited and have to maintain for the time being. The menu is output simply in an unordered list and then I use Superfish and CSS to do the styling as necessary. There's an initial hit, but after that, caching kicks in and we're good to go. Even as crazy as it is, the load isn't prohibitive. Navigating it, however, is a mess. I'd strongly recommend against confusing the heck out of your user with so many choices, especially on a mega menu that can be a UI hurdle for disabled and older users. When you boil it down, the whole basis behind the "Web 2.0 movement" (I hate that term) is to minimize or cloak complexity.
If you're REALLY concerned about performance, start off with what you're loading. Limit your JavaScript by combining files, especially those small jQuery files that tend to stack up; HTTP requests can severely impact a site, especially since they monopolize the loads initially. Similarly, combine small CSS files and optimize the rest via an online tool. To reduce image loads, create sprites for your graphics so you're loading one file instead of many; here's a tut on Sprites, and a simple Google search will give you dozens of sites that will build the sprite and CSS automatically. Load anything you can from a CDN, such as jQuery, Prototype, etc. (hopefully only one framework per site, because two or more is unnecessary).
If you're still out of hand, look at your graphics one more time. Could you take advantage of pure CSS or image repeating via CSS to reduce loads further? Have you optimized all the graphics? Could you tweak the design to take advantage of those tricks?
After all that, if you simply can't change the menu to be more friendly, start investigating options. However, I suspect you'll find better gains in the first couple of steps than you would from taking extreme measures on the menu.
You could either set HTTP caching for a javascript code file that generates the menu, or use ajax to insert a pre-generated HTML menu from another file (again with a long expiry date set on cache).
Both those solutions require JavaScript, though. I can't think of another way to remove the menus from the HTTP traffic apart from an IFRAME (yuck).
30k is massive for plain HTML though - do you REALLY need such a huge navigation structure?
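A minimal sketch of the second option mentioned above (AJAX-inserting a pre-generated menu fragment that the browser can cache under a long expiry). The fragment URL, container ID and the commented-out Superfish call are assumptions for illustration:

    $(function () {
        // menu.html is served with a long Cache-Control/Expires header,
        // so after the first visit it comes straight from the browser cache
        $("#main-nav").load("/fragments/menu.html", function () {
            // initialise the menu plugin once the markup is in place, e.g.
            // $("#main-nav ul").superfish();
        });
    });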

How do you show pictures as fast as Facebook?

Can any of you help me show pictures as fast as Facebook does?
Facebook is incredibly fast for viewing pictures; the pictures seem to be preloaded somehow, I think.
Often when you view galleries on other sites, it is a pain in the a**, because it is so slow every time you change picture.
I think you need JavaScript to do it!?
Depending on your implementation, you could do this with some ajax and hidden dom elements.
Suppose you have a gallery with a slideshow. You could insert a hidden DOM element containing the next picture of the slideshow on each load. This would cause the image to be loaded. If you then use JS to insert that same image tag later, the browser relies on its cache rather than fetching it from the server, since it already has that photo.
This is kind of a broad question, but I think this approach would work. You would probably be better off not reinventing the wheel and seeing what image-prefetch libraries based on jQuery (or whatever you use) are available to you.
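A rough sketch of that prefetch idea. nextImageUrl() is a hypothetical helper that returns the URL of the upcoming slide; once fetched, the browser serves the image from cache when it is actually shown:

    function prefetch(url) {
        var img = new Image();
        img.src = url;   // request starts immediately; nothing is displayed
    }

    // e.g. whenever the current slide changes:
    // prefetch(nextImageUrl());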
Facebook compresses images to extremes. Try it yourself: take an image you are having trouble with and upload it to Facebook, then check the size of the image and you will know why. Once I did a small test by uploading a 17429-byte image and it came out at 18757 bytes, a full 7% increase over the original size!
At that compressed size, you can implement some sort of prefetching of the next image for display. On top of that, I think they have extremely good infrastructure.
Facebook uses BigPipe; there is an open implementation in the works called openpipe.
BigPipe pushes content to the browser as soon as the server has finished processing each part, so the user perceives it as faster.
It basically loads pagelets as they become ready for the user; on the browser side the implementation is JavaScript-based, and you must push the info to the client with your preferred server language.
First of all, Facebook heavily compresses images. Many other websites don't. Facebook also has a faster network than most other websites.
With the small image size, the client can prefetch the next image.
Preloaded would mean loading when the page is loaded, which is what happens with an <img> tag. No, it's simply because the file size is smaller.
If you want images to be viewed quicker on your site, first make sure the images are decently compressed and aren't any bigger than they have to be. The number of times I have seen websites use an extremely large image scaled down to fit an element five times smaller is just ridiculous.
You can check out these sites, which have many implementations of and links about how to preload / prefetch images (CSS, JavaScript, AJAX):
http://perishablepress.com/press/2009/12/28/3-ways-preload-images-css-javascript-ajax/
Since your question was tagged with 'jquery', here is one just for that:
http://engineeredweb.com/blog/09/12/preloading-images-jquery-and-javascript

What is the most efficient way to store an image? HTML/CSS/JS

I am going to have a lot of images and am trying to find the most efficient way of storing these images to keep the page snappy.
So far I have thought of just two ways: load with JavaScript, e.g. picture = new Image(); picture.src = "file.jpg"; and append/remove from the page as necessary, or load into an <img> and set display:none.
Are there other options? What is considered the best way to do this?
The best way for a photo gallery (if that's what you are building) is usually to have several sizes of the images, at least two:
a smallish size that is highly compressed and thus has a small footprint: this is the image you load into grids and display on a page where there are multiple images
a larger image with lower compression and higher image quality - this is the one you show when people want to see details.
Since people most often come to the detailed image from a page where the small, fast-loading version has already been shown, and thus is already in the browser's cache, you can do a little trick and have instant photos, without preloading anything.
It goes like this:
On the details page you show the highly compressed small image in an image tag that has the dimensions of the larger detailed version. You then load the larger detailed version in the background using new Image(), with an onload handler attached that swaps the image tag's source from the small compressed version to the large detailed version.
It looks great, works fast and users will love you ;)
PS: the best way to store images is the browser's cache, not JS or the DOM, so if you truly wish to preload images, which is generally a bad practice (though it can be necessary sometimes), make the browser fetch them for you in the background by including a CSS file that references them in styles that aren't applied to visible areas of your site.
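A minimal sketch of the swap trick described above: show the small, already-cached image stretched to the detail dimensions, then replace it once the large version has finished downloading. The element ID and file names are assumptions for illustration:

    var detailImg = document.getElementById("detail-photo"); // currently shows the small image
    var large = new Image();
    large.onload = function () {
        detailImg.src = large.src;   // instant swap; the large file is now local
    };
    large.src = "photo-large.jpg";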
I'm not sure about "efficient", but the most logical way would be not to use JavaScript to load an image (useless if you have JavaScript disabled) or to hide the image via the display property (likewise, and the browser will probably just load the image anyway).
As such, a sensible suggestion would be to use boring old paging and display 'n' images per page. However, to bring this up to date, you could use "lazy" (a.k.a. "deferred") loading and load additional page content via Ajax as the user scrolls. However, it's key that this gracefully degrades into the standard "paged" behaviour if JavaScript is disabled, etc.
The perfect example of this in operation is Google's image search, and if you search here on StackOverflow you see a discussion of possible implementations, etc.
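A rough sketch of the "load more as the user scrolls" idea, which degrades to the plain paged links when JavaScript is off. The /gallery endpoint, its page parameter and the element IDs are assumptions for illustration:

    var page = 1, loading = false;
    $(window).on("scroll", function () {
        var nearBottom = $(window).scrollTop() + $(window).height()
                         > $(document).height() - 200;
        if (nearBottom && !loading) {
            loading = true;
            // fetch the next page of images and append it to the gallery
            $.get("/gallery", { page: ++page }, function (html) {
                $("#gallery").append(html);
                loading = false;
            });
        }
    });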
It's better to use JavaScript the way you have it and add images to the DOM as you need them, as opposed to first adding them to the DOM and then hiding them, because DOM manipulation is much slower and you may never use some of the images.

Jquery Best Case Scenario

I am loading a bunch of images for a GUI which will pick attributes for my client's product.
Is it best to pre-load all the images at the start (which I'm guessing would take some time) and then have the GUI be fully functional, or is it better to load images on the fly?
Essentially i am working on a picture/poster framing application. I have about 20+ frames that will be able to be selected by the user and when a frame is clicked I change the images for the frame on the display in the GUI.
I am using Jquery. Any insight would be helpful as to the best case scenario for this.
And if I will be pre-loading all the images, how do I put one of those loading bars on the screen like you see in Flash, or a loading gif like I've seen with AJAX?
Thanks
Mike
Why not do both?
You can load images lazily, but also hook $(document).ready() to pre-load the images. That way, if the user accesses an image before it's preloaded, it comes in then; if the user waits long enough it will be instantaneous.
This technique is common with things like menubar roll-overs.
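A hedged sketch of "doing both": serve images on demand as usual, and once the page is ready, quietly warm the cache with the frame images the user is likely to click. frameUrls is a hypothetical array of those image URLs:

    var frameUrls = ["frame-oak.jpg", "frame-black.jpg", "frame-gold.jpg"];

    $(document).ready(function () {
        $.each(frameUrls, function (i, url) {
            new Image().src = url;   // browser fetches and caches in the background
        });
    });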
Depends on the frame images' size... if they are small, like 1-2K, I'd load the images dynamically; otherwise you can preload, but be sure to set the caching headers right so they are read only once and fetched from cache next time.
As for the progress bar, I suggest you check this article on Ajaxian (it talks about preloading images in jQuery and includes a progress bar).
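For the progress indicator, a rough sketch under stated assumptions: count images as their load events fire and update a simple percentage. The #progress element and the urls array are illustrative, not from the linked article:

    function preloadWithProgress(urls, done) {
        var loaded = 0;
        $.each(urls, function (i, url) {
            // bind the handler first, then set src to start the request
            $("<img>").on("load error", function () {
                loaded++;
                $("#progress").text(Math.round(loaded / urls.length * 100) + "%");
                if (loaded === urls.length) { done(); }
            }).attr("src", url);
        });
    }

    // preloadWithProgress(frameUrls, function () { $("#progress").hide(); });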
The correct answer depends on many factors. How large are the images and how many are there? Will loading all images at the start cause severe lag? As Jeff Atwood said, performance is a feature.
I would err on the side of a better performing app, rather than loading everything up front.
