Is there any reason NOT to have a webpage retrieve its main content on the fly?
For example, I have a page that has a header and a footer, and in the middle of this page is an empty div. When you click on one of the buttons in the header, an HTTP GET is done behind the scenes and the innerHTML of the empty div is replaced with the result.
I can't think of any reason why this might be a bad idea, but I can't seem to find any pages out there that do it. Please advise!
It's not unheard of, but there are issues.
The obvious one is that some users have javascript turned off for security reasons, and they will not be able to use your site at all.
It can also negatively impact handicapped users that are using assistive technology such as a screen reader.
It can make it harder for the browser to effectively cache your static content, slowing down the browsing experience.
It can make it harder for search engines to index your content.
It can cause the back and forward buttons to stop working unless you take special steps to make them work.
It's also fairly annoying to debug problems, although certainly not impossible if you use a tool such as Firebug.
I wouldn't use it for static content (a plain web page) but it's certainly a reasonable approach for content that is dynamically updated anyway.
Without extra work on your part it kills the back and forward history buttons, and it makes it difficult to link to the pages each button loads. You'd have to implement some sort of URL changing mechanism, for example by encoding the last clicked page in the URL's hash (e.g. when you click a button you redirect to #page-2 or whatever).
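To illustrate, here is a minimal sketch of that hash-based approach; the fragment URL scheme and the content element ID are hypothetical:

// Fetch a page fragment and put it in the empty div
function loadPage(page) {
    var xhr = new XMLHttpRequest();
    xhr.onload = function () {
        document.getElementById('content').innerHTML = xhr.responseText;
    };
    xhr.open('GET', '/fragments/' + page + '.html');  // hypothetical fragment URL
    xhr.send();
}

// Back/forward and direct links to #page-2 etc. all fire hashchange
window.onhashchange = function () {
    loadPage(location.hash.replace('#', '') || 'home');
};

// A header button then just sets the hash, e.g. <a href="#page-2">Page 2</a>

Because each state has its own hash URL, the history buttons and bookmarks work again.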
It also makes your site inaccessible to users with JavaScript disabled. One of the principles of good web design is "graceful degradation": enhance your site with advanced features like JavaScript, Flash, or CSS, but make sure it still works when they are disabled.
Two considerations: Search engine optimization (SEO) and bookmarks.
Is there a direct URL to access your header links? If so, you're (almost) fine. For example, a link along these lines is both SEO friendly and populates your page as you desire (loadContent stands in for whatever function does the GET and fills the div):

<a href="/page2.html" onclick="loadContent('/page2.html'); return false;">Header Link</a>
The catch occurs when people attempt to bookmark the page they've loaded via JavaScript... it won't happen. You can throw most of those potential tweets, email referrals, and front page Digg/Reddit articles out the window. The average user won't know how to link to your content.
Where did you read that it is a bad idea? Whether content should be populated on the fly depends purely on your requirements. In most cases the content is loaded along with the page rather than on the fly, but if you need your content on the fly, it isn't inherently a bad idea.
That said, if your content is loaded via JavaScript and JavaScript is disabled in the user's browser, then it definitely is a bad idea.
I can't think of a bad reason for this either (other than possibly SEO). One thing that would probably be a good idea is to load the data only once, i.e.
Show Div1 - do the ajax call only if div1's innerHTML is blank
Show Div2 - do the ajax call only if div2's innerHTML is blank
<div id="div1"></div>
<div id="div2"></div>
This should keep the server load down, since each div's content is only loaded once.
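A minimal jQuery sketch of that load-once pattern; the element IDs and fragment URL are made up for illustration:

// Fill a div via ajax only the first time it is requested
function showOnce(divId, url) {
    var $div = $('#' + divId);
    if ($.trim($div.html()) === '') {  // only fetch if the div is still empty
        $div.load(url);                // hypothetical fragment URL
    }
}

// e.g. <a href="#" onclick="showOnce('div1', '/fragments/one.html'); return false;">Show Div1</a>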
Cheers
This is pretty standard behavior in ajax enabled sites.
Keep in mind however that extra effort will be needed to:
ensure the back button works (see the sketch after this list)
link to (and bookmark) specific content
support browsers with javascript disabled.
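For the back button specifically, here is a rough sketch using the HTML5 history API; the URLs and the #content id are placeholders:

// Fetch just the content area of another page and record a real URL
function navigate(url) {
    $('#content').load(url + ' #content > *');  // jQuery: load only that part of the page
    history.pushState({ url: url }, '', url);   // update the address bar without reloading
}

// Back/forward fire popstate; reload whatever state was recorded
window.onpopstate = function (event) {
    if (event.state && event.state.url) {
        $('#content').load(event.state.url + ' #content > *');
    }
};

Because each state is recorded against a real URL, links and bookmarks to specific content work as well.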
I have a site that is similar in layout and function to the app of any streaming music service (Spotify/Rhapsody/iTunes). I've got a persistent play control at the bottom, persistent navigation on the left, and the center/bulk of the page is used to pick what you want to play, read more about what you are going to play, etc.
I've implemented it in the most logical (for a programmer) way, using an iframe for the center content. But is there a better way, one more conformant with SEO?
I suspect the current approach is terrible for SEO (even with sitemaps), and it might violate some cardinal rule: I would need to add code on each page to check whether it is being viewed through the proper iframed interface, and if it is not, the page would have to redirect to load the full interface with the desired content in the center iframe (that's how I have solved similar problems ten years ago).
Rather than redirect on landing, I could simply add the interface elements, but unless some unknown magic happens, the page will reload when visitors explore content, and anything they were playing would stop as the page unloads. I do not want to interrupt playback, even to resume it at the right spot.
Is the old and reliable reloading mechanism the only real solution? Can you do it without being SEO penalized?
Any ideas?
In your case the UX is the most important thing to consider and work on; the other things, like SEO, come after that.
Focus only on the most important pages, such as singer pages, genres, and playlists; the other pages don't need to be indexed. You can keep them out of the index with canonical links, with robots.txt, or with noindex meta tags (examples below).
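For example (the paths and URLs here are made up):

<!-- canonical link in the <head>: points duplicate pages at the main URL -->
<link rel="canonical" href="https://example.com/singer/some-singer" />

<!-- noindex meta tag: keeps this page out of the index entirely -->
<meta name="robots" content="noindex" />

# robots.txt: stop crawlers from fetching utility pages at all
User-agent: *
Disallow: /player/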
The other thing is the URLs. With advanced techniques, when the user clicks a link you fetch the results with JavaScript without refreshing the page, but then Google will not be able to crawl those pages.
Here you need to use a technique like "progressive enhancement", and the best example to look at is Tumblr.
All of its pages are built with this technique, which gives a great user experience while still letting Google index every page.
Example for the links (the URL and handler name here are only illustrative):

<a href="/singer/some-singer" onclick="loadSinger(this.href); return false;">Singer Page</a>

The real href stays crawlable, while the JavaScript handler loads the content in place.
You should read more about it; the older technique of "graceful degradation" for old browsers can also help you a lot.
Okay, so this may sound confusing, so let me explain. I'm working with a WordPress theme that has a single-page layout plus standalone page layouts. In the single-page layout, every navbar link you click scrolls you to a section of the page. On the standalone pages, when you click a navbar link whose content lives on the home page, it takes you to a standalone page of that content rather than going to the home page and scrolling to it.
Now, before I get many answers saying just to use url/#content: it doesn't work, because the theme creator decided to use multiple ids all named content. Horrible, I know. I've tried a lot of things, actually. So the idea I have now is to store a cookie in the browser when the user clicks a link in the #header navbar. When the user reaches the homepage, the homepage checks that cookie and scrolls to the proper area.
I've never worked with cookies enough to know how to write the code; I just understand how they work from PHP, and I figure JavaScript is somewhat similar. If something is unclear, please ask.
There are 2 types of cookies, HttpOnly and regular (there are more, but for this question the others are not relevant). Since you are talking about creating cookies in JavaScript, HttpOnly cookies are out of the picture, because JavaScript cannot read or write them.
This basic JavaScript library will give you the tools to do what you want.
Now, from my own view of this problem, I would recommend using local storage instead, but only if your viewers are using newer browsers (old IE won't work). This JavaScript library will explain how to use it.
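A minimal vanilla-JavaScript sketch of the cookie approach; the link selector and section ids are hypothetical:

// On the standalone pages: remember which section was requested
document.querySelectorAll('#header a[data-section]').forEach(function (link) {
    link.addEventListener('click', function () {
        document.cookie = 'scrollTo=' + this.getAttribute('data-section') + '; path=/';
    });
});

// On the homepage: read the cookie, scroll to the matching section, then clear it
var match = document.cookie.match(/(?:^|; )scrollTo=([^;]+)/);
if (match) {
    var target = document.getElementById(match[1]);  // hypothetical section id
    if (target) target.scrollIntoView();
    document.cookie = 'scrollTo=; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT';
}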
Hope I helped, Cheers!
What I am trying to accomplish is that, when a visitor is redirected to an external page, that page carries a frame of some sort so they can quickly come back to my site. I have tried framesets (ugly) and iframes, but some sites block their pages from being loaded into a frame, and I cannot get it to work properly. I am trying to figure out whether it is better to load the external site in a div, or something else. So I am pleading for help with this issue.
Thanks!
Rob
The simplest method to use for this is to just open external links in new tabs/windows. Some users will find this irritating, though, so be aware. Otherwise, let people go. They know how to use the back button, and building some way to put the destination site into a frame and offer them a return to your site is far more disruptive and unexpected behavior than most users want. Even opening new windows or tabs breaks the standard user experience, but it's a lot less unexpected than using frames.
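If you do go the new-tab route, it is just a target attribute (the URL is a placeholder):

<!-- rel="noopener" stops the new page from scripting against your window -->
<a href="https://example.com/some-page" target="_blank" rel="noopener">External site</a>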
Again, let them go. They know where the back button is. If you want users to come back, having content that draws them back is a lot more compelling than some gadget that lets them come back.
I want to have a popup div with iframe content.
Can search engines read this when I'm using jquery to create it?
Alternatively, is there a way to detect a search engine on the server side and remove the option of this popup?
The best way would be to degrade gracefully, e.g. by using a standard
<a id='mylink' href='xyz.html'>
link that points to the resource that is opened in the popup.
You would then add the jQuery code to the link, causing it to open in the pop-up.
That way, even users who do not have JavaScript turned on can still reach the content.
Most Lightbox clones like Thickbox work that way.
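The jQuery hookup might look like this; the popup markup and ids are made up:

// Intercept the click and show the linked page in an iframe popup instead;
// non-JS users simply follow the plain href
$('#mylink').click(function (e) {
    e.preventDefault();
    $('#popup-frame').attr('src', this.href);  // hypothetical <iframe id="popup-frame">
    $('#popup').show();                        // hypothetical popup <div id="popup">
});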
Generally speaking, search engines do not execute JavaScript, so there's no way for them to index anything contained in a popup div.
You can, however, inspect the User-Agent header to see whether the page is being requested by a web spider, but this is not considered best practice.
Search engines do not play well with JavaScript, but you can see how Googlebot would fetch your page using Google's Webmaster Tools.
Rendering a different page to bots is not considered best practice either. The best you can do is graceful degradation.
Just to say it at once: DON'T use the iframe. It is a poor fit here, and search engines won't index the iframe'd page.
First question:
No, they cannot, if the data is loaded at the moment the popup "pops up" (as search engines, as already noted, generally don't execute JavaScript). If the data is already loaded and the popup div is merely hidden at page load, search engines will index its content.
Second question:
Don't do that. That's called cloaking, and it will be punished by search engines if detected: they don't like content customized just for them. Then you'd be back at square one.
Hi.
I'm new to JavaScript/AJAX etc.
I have a page with links down the left and a DIV on the right.
I want content (other pages) to load in the DIV when users click on the links on the left... beginner AJAX stuff I guess.
I played around with a few jQuery plugins and found one that allows pages to load with a fade effect, which is perfect. I have a problem though:
The plugin works fine when I click links on the parent page, but once I'm one link deep, clicking a link in the loaded content breaks out of the div and replaces my parent page. (This issue was described on the plugin page, supposedly solved, but is still cropping up on my page.) I suspect it has something to do with the "bind" variable.
I've uploaded a stripped down example of my site here:
This is the plugin website: www.thecreativeoutfit.com/index.php?view=Simple-AJAX-Content-Changer-with-EZJax (Because I'm a new user I can't add any more links, sorry for the long-hand).
For those who are willing to look at my site or the plugin, I'd appreciate your insight.
If that's a hassle maybe you could recommend a similar simple ajax plugin that allows the loading of content with a fade-in effect, but also allows links within the loaded content to stay contained within the original div.
Many thanks!
Max
I was going to post a comment but it got too long so, what the heck..
Your website worked just fine for me (except for the pages that were not available) in Firefox running on Windows XP.
However, I would strongly recommend against that type of design: it will be a pain for you to maintain in the long run, and it is generally considered bad design because it works against the principles of the web. Different pages of your website should be represented by distinct URLs that users of your site can link to. It also breaks the browser's back button, which is a big usability issue (at least for me).
Plus, it will not be SEO friendly - which means that search engines like Google won't think highly of your website - which means that you won't show up in searches.
Your nested pages are breaking because the JavaScript click events are not being reattached to the links in the loaded content, so after the first page the normal href attribute takes over.
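One common fix (not specific to this plugin) is event delegation, so links inside newly loaded content are handled automatically; a sketch, assuming the loaded links carry a hypothetical class ajax-link and the content lives in a hypothetical #content div:

// One delegated handler on the document covers links added later by ajax
$(document).on('click', 'a.ajax-link', function (e) {
    e.preventDefault();
    var url = this.href;          // the clicked link
    var $box = $('#content');     // hypothetical content div
    $box.fadeOut(200, function () {
        $box.load(url, function () {
            $box.fadeIn(200);     // fade the new content back in
        });
    });
});

Because the handler lives on the document rather than on each link, nothing needs to be rebound after content is swapped in.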