I know that, historically, it has been frowned upon to build sites that don't degrade gracefully for visitors with JavaScript disabled, but how relevant is that now?
I'm asking since an application I'm building is going to make heavy use of Javascript for charting and user interaction with the charts.
Many of the most popular sites on the web are rendered more or less useless without it enabled. I can't even log in to Twitter without it, core bits of Facebook stop working, and trying to visit Google Maps just redirects to their search home page.
The kind of interaction I'm planning would be next to impossible to do any other way, and even if there were a way, there aren't the resources to develop two versions of everything.
The problem I have is that the site also needs to be as accessible as possible. It's an inherently visual site which makes this problematic to say the least. I'm not even sure there is a solution to this particular problem. If it's visual it's not going to be suitable for blind people.
This depends entirely on your needs and who you want your demographic to be. A lot of sites use exclusively Flash or Silverlight; does that make them poorly designed? No. All it means is that the site caters to a very specific audience.
As long as you accept that there may be a (albeit small) portion of the populace that won't use the site (or can't, due to missing support), then it's fine.
Side note: there are other means of conveying graphics besides JavaScript. Using server-side image generators is one way. You may also decide to use Flash/Silverlight plugins or ActiveX controls/applets. My advice would be to design the site as you see fit.
I would bet that browsers with JavaScript support are now in the majority, so creating the site first and working on the discrepancies later wouldn't be a bad avenue. You're still going to receive a high volume of visitors just on the assumption that most clients support it.
Heck, look at sites using HTML5: it's not yet a finalized standard, and no browser has to support it, but this still doesn't stop people from taking advantage of the article, header, and footer elements. Granted, there is html5shiv to make those elements work in older browsers, but even that depends on JS being present.
JavaScript has become so thoroughly integrated with web development that it is now the users who browse without JavaScript who are the novelty.
What you should do is put up a friendly message that informs the user that the specific service your page provides (charts etc.) cannot be provided without JS enabled, and leave it at that.
What I would worry about is thoroughly detecting feature compatibility, because while people with JS disabled aren't that common, people with ancient browsers are.
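For example, a minimal feature-detection sketch might look like the following (initCharts is a made-up function name, and the canvas fallback is just illustrative):

<script type="text/javascript">
// A sketch of feature detection: test for the capability itself
// instead of sniffing browser names or versions.
function initCharts() {
    // Only draw with <canvas> if the browser actually supports it.
    var canvas = document.createElement("canvas");
    if (canvas.getContext && canvas.getContext("2d")) {
        // ...render the interactive charts...
    } else {
        // ...fall back to a static image or a data table...
    }
}

if (document.addEventListener) {
    document.addEventListener("DOMContentLoaded", initCharts, false);
} else if (window.attachEvent) {
    window.attachEvent("onload", initCharts); // older IE
}
</script>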
Since you tagged this with accessibility, I will cover that perspective. As others have said, JavaScript is now part of the web we live with. While it isn't expected that the application can be made fully accessible to people with visual impairments, you have to keep in mind other disabilities as well, such as mobility impairments. WebAIM has an article about what to keep in mind when you are coding JS events to make them as accessible as possible.
Depending on what these charts do, you should at the very least put the resulting data in an accessible table, either somewhere on the page or as a link to the data/table in proximity to the graph, so that it is easy to find for those who need it.
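A minimal sketch of that idea (the IDs and figures here are invented):

<div id="sales-chart">
    <!-- JavaScript draws the chart in here -->
</div>
<p><a href="#sales-data">View this chart's data as a table</a></p>
<table id="sales-data">
    <caption>Monthly sales</caption>
    <tr><th scope="col">Month</th><th scope="col">Sales</th></tr>
    <tr><td>January</td><td>1,200</td></tr>
    <tr><td>February</td><td>1,450</td></tr>
</table>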
You didn't specify who you are making this for, so I will give some more information. Gez Lemon wrote an article about not using <noscript>, his summary is:
I'm surprised that the noscript element is conforming in HTML5, as there are much better techniques for ensuring that pages work with or without JavaScript. Despite early accessibility advice advocating use of the noscript element, best practice is to use unobtrusive JavaScript for progressive enhancement, rather than relying on fallback content.
This sort of echoes what I began my answer with. However, if you are doing this for a US or state government agency, you must still use <noscript> tags because it is a requirement of Section 508. The information within the <noscript> tags should be useful. For example:
<script type="text/javascript">
// Do something cool
</script>
<noscript>
Please perform your search again using the
<!-- the href below is a placeholder for your non-JavaScript fallback page -->
<a href="/search-nojs">javascript-free form</a>
to get the results in a table.
</noscript>
I'm currently enhancing a user interface to check off various accessibility points, and then I read somewhere in a spec deep down in the fine print that what I'm doing must be JAWS compatible.
Normally that's fine: keep things simple and intuitive, and abide by the rules. But what about when SCORM is involved? I'm dealing with a frameset, reams of JavaScript, and an aged Learning Management System, and it's pretty daunting.
Is compliance with JAWS in the context of pages delivered through framesets and using loads of JavaScript possible? Should I be freaking out even more and demanding more money?
Providing code at this point would involve completing a test module and packaging it, and unfortunately my budget doesn't allow for that level of question-asking, so I am hoping someone can offer some help/guidance based on this information alone. If you need more, please ask!
You can do it, you just have to be careful. Frames and JavaScript can be bad for accessibility, but they will only cause problems if improperly used.
SCORM requires frames (a frameset or iframe), and many people rightly say frames are bad for accessibility. The truth is, they can certainly make sites less accessible if poorly constructed (or deeply nested), but assistive browsing technology like JAWS can handle them if they're created with best practices in mind, such as providing clear titles and structure. WebAIM has a good tutorial on frame accessibility.
As for JavaScript, JS is generally available in most assistive technology (98.4% according to a recent survey), so it isn't a showstopper. However, JavaScript can be very dangerous for accessibility if you're using it to dynamically modify the DOM or introduce interactivity (create new markup, animations, make static elements like DIVs clickable, etc.). Assistive technology sometimes doesn't know that JavaScript has modified the content of a page after the page initially loaded, so the visitor has no way of knowing there's new content right in front of him/her. If you use JavaScript to dynamically alter your page's content, be sure to use WAI-ARIA techniques.
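For example, one such technique is an ARIA live region, which tells assistive technology to announce content that JavaScript writes in later. This is only a sketch, with a made-up ID and message:

<div id="status" role="status" aria-live="polite">
    <!-- JavaScript writes updates here -->
</div>
<script type="text/javascript">
// Because the region is marked aria-live, screen readers such as JAWS
// can announce this text when it changes.
document.getElementById("status").innerHTML = "3 new results loaded.";
</script>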
SCORM's JavaScript is a different topic altogether. As far as the browser is concerned, SCORM's JavaScript simply handles the course-to-LMS communication, and doesn't change any of the page content on-the-fly. This means SCORM's JS should have no bearing on JAWS, because it doesn't affect your page's markup, and doesn't affect the DOM in any way.
How important is it to gracefully degrade, or inversely to progressively enhance, the UI experience? I mean, am I going to lose a LOT of business if I don't? Do you practice this concept? Are there any web 1.0 users still left out there?
Please could you also include whether you practice this personally and how much time you've spent on it relative to the entire project. I realize every project is different; I want to get a sense of how much time, as a general rule, I should be allocating toward this goal.
EDIT
Firstly, I'm looking for guidance on how much time I should be devoting to making my applications run without JavaScript.
Secondly, the BS term "web 1.0" (...lol... I don't really like it either) works because we all understand that as the iteration before ajax and all its goodness.
Thirdly, the kind of applications I'm describing are the ones we are all building: not Facebook, not Twitter (unless you're from Facebook or Twitter), but service or utility programs like a web calendar, an online todo list, or [INSERT YOUR APP HERE].
Progressive enhancement is more a mindset than a particular task that you need to allocate time to. If you're doing it right (and if it's important to you), you should be enhancing the user experience with JavaScript, but not relying on it.
For example, a link will point to a new page, but with JavaScript you'll disable the link and load new content into the current page with Ajax. Start off without JavaScript and progressive enhancement will follow naturally.
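As a sketch of that pattern (the IDs and URL are made up, and it assumes a #content element on the page):

<!-- Without JavaScript the link simply navigates as normal. -->
<a id="more-link" href="/news/page2.html">More news</a>
<div id="content"></div>
<script type="text/javascript">
// With JavaScript, hijack the click and load the content in-page instead.
var link = document.getElementById("more-link");
if (link && window.XMLHttpRequest) {
    link.onclick = function () {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", link.href, true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                document.getElementById("content").innerHTML = xhr.responseText;
            }
        };
        xhr.send(null);
        return false; // cancel the normal navigation
    };
}
</script>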
Progressive enhancement is not only smart; it is also faster and easier to develop.
And at each stage, you almost always have a working fall-back.
Here's what it looks like in a nutshell:
Boss/client approves mock ups.
We code to valid HTML output. At this point, the boss/client can start using the site; barring any boss/client changes, the HTML is mostly done.
We start tweaking the CSS to make it match the boss/client's graphic expectations. Changes to the HTML are minor, if any.
In parallel, JavaScript is added to do non-critical, but nice, things: sorting tables, fancy CSS help, replacing some links with AJAX calls, warning the user client-side of input problems (see the sketch below).
If any one of these things breaks, the site still works.
Also, little, or usually no, html changes are needed.
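For instance, the client-side input warning mentioned above might be layered on like this (the form fields and action are invented, and the server must validate regardless):

<form id="signup" action="/signup" method="post">
    <label for="email">Email</label>
    <input type="text" name="email" id="email">
    <input type="submit" value="Sign up">
</form>
<script type="text/javascript">
// Enhancement only: without JavaScript the form still submits normally.
document.getElementById("signup").onsubmit = function () {
    var email = document.getElementById("email").value;
    if (email.indexOf("@") === -1) {
        alert("Please enter a valid email address.");
        return false; // block submission client-side
    }
    return true;
};
</script>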
First of all, let's not start using bullshit terms like "web 1.0" and "web 2.0"; the fact is the web is forever progressing, and new websites are using JavaScript to enhance the user experience.
I don't know anyone who doesn't allow their site to degrade gracefully when JavaScript isn't available. This is for the same reason we use semantic markup, so screen readers can correctly interpret our websites for users with visual impairments: whilst the vast majority of your visitors/users won't fall into these categories, it's still important to think about the minority.
Will you lose a LOT of business? Well, that depends on how successful you are now and how badly your site degrades. Chances are you probably won't lose any business, but that should not be the measure you use to decide whether to gracefully degrade a website.
So unless you can come up with a pretty good reason to require it, you should probably use JavaScript only for progressive enhancement; don't depend too much on it.
:-)
This greatly depends on the nature of your application and its data. If it's something that you know will mostly be used via computers, then degrading to a non-script UI wouldn't bring any obvious benefit (it might even lose money, because it will take considerable time to develop). You can always tell people to enable JavaScript in their browsers (similar to what's done here on Stack Overflow: try disabling scripts and reloading the page). Your app/site should display at least something when JavaScript isn't a possibility.
But if your application has simple data to display and users need to access it frequently wherever they are, then degrading for lesser browsers (those without full script engines, like Opera Mini) is a must. Creating a separate UI with less functionality, but keeping in everything that users need to access, is probably your best option: UIs like separate iPhone applications, for example.
JavaScript is disabled by only a small proportion of web users, but when you start to talk big volumes, this can make a difference. For example, out of 1 million visitors, you can expect more than 10,000 to be unable to use your site.
You should decide how much lost business is worth the additional cost of having a non-JavaScript version of your site.
You can have an approach where the entire site may not work without JavaScript but that some of the core features are there.
It's fall of 2008, and I still hear developers say that you should not design a site that requires JavaScript.
I understand that you should develop sites that degrade gracefully when JS is not present or turned off. But at what point do you forgo functionality that can only be powered by JS?
I guess the question comes down to demographics. Are there numbers out there of how many folks are browsing without JS?
Just as long as you're aware of the accessibility limitations you might be introducing, e.g. for users of screen-reading software.
It's one thing to exclude people because they choose to turn off JS or use a browser which doesn't support it, it's entirely another to exclude them because of a disability.
Two simple questions to help you decide...
Does using JavaScript provide some core functionality of your site?
Are you prepared to limit your potential users to those who have JS? (i.e. most people)
If you answer yes to both of those, go for it!
Websites are moving (have moved?) from static pages of information to interactive web applications. Without something like Javascript or Flash, making compelling user interactions is sometimes not possible.
Designing to degrade gracefully is the most that should be done. We are moving (or have moved) past the point of simple web "sites" to web "applications". The only option besides client-side scripting is to add round trips to the server.
I think (personal opinion) that the "don't use JavaScript" comes more from a lack of understanding of what JavaScript is/does than any actual market data that shows a significant number of people are browsing without it.
It's reasonable to design sites that use JavaScript, but it is not safe to assume that all clients have support for it, so it is important that you provide a satisfactory experience even when JavaScript is not available.
Search Engines don't support JavaScript. They're also blind and don't support CSS. So my suggestion to you is to make sure that the part of your product that needs to be indexable by search engines works without JavaScript and CSS. After that, it really depends on the needs of your users.
If you have a very limited subset of users, then you can actually survey them. But remember that 10% of the population has some form of impairment, ranging from vision issues (low vision, color-blindness, etc.) to motor issues (low hand dexterity). These problems tend to be more prominent in the elderly and those with known disabilities.
If your site will target the general audience of Internet users, then please make it degrade gracefully; but if you can't do that, then make a no-JavaScript version (like Gmail has).
I think the days of content-only sites are gone. What we see now is the WWW emerging as the platform of web applications, and the latest developments on the browser front (speeding up JS in particular) are an indication of this.
There can be no yes/no answer to your question: you should decide where on the content site <---> web application continuum your site sits, and how essential the experience provided by JavaScript is. In my opinion, yes, it is acceptable to have web applications which require JavaScript to function.
Degrading gracefully is a must. At a minimum, you should make use of the NOSCRIPT tag to inform potential customers, first, that your site requires JavaScript, and second, why you require it.
If it's for flashy menus and presentations that I honestly couldn't care less about, then I probably won't bother coming back. If there's a real reason you're requiring JavaScript (client-side validation on forms, or a real situation that requires AJAX for performance reasons), then say so and your visitors will respond accordingly.
I install extensions that limit both Javascript and Cookies. Websites that don't prominently state their requirements of both usually don't get a second visit unless there's a real need for it.
You should never design a public site to rely on ANY technology/platform. The user agent may not display colour (think screen readers), display graphics (again, think screen readers or text only browsers such as links), etc.
Design your site for the lowest common denominator and then progressively enhance it to add support for specific technologies.
To answer the question directly: No, you cannot assume your users have Javascript, so your site should work without it. Once it does, enhance it with Javascript.
It's not about browser capability; it's about user control. People who install the NoScript plugin for Firefox so they don't have to put up with punch-the-monkey garbage (the same problem that inspired Stack Overflow) will not allow your web site to do anything non-static until they trust you.
In terms of client software consider users/customers who are using a browser that supports some but not all Javascript. For example, most mobile phone browsers support a bit of Javascript but nothing very complicated. The browsers on devices such as the Playstation 3 are similar.
Then there are browsers such as Opera Mini, which support a lot of Javascript but are operating in an environment where the scripts are running on a server that then sends the results to a mobile device.
You should design websites with JavaScript in mind, but not implemented. That is, build the site so that every click and every action performs a round trip to the server. That's the default functionality for older browsers and for those without JS turned on.
Then, after it's all built and everything is working properly, add in JavaScript which hijacks the link, button, and other events, and overlays their standard functionality with the JavaScript functionality you want.
Building the application like this means that it will ALWAYS work, which ultimately is what you're wanting.
The received wisdom answer is that you can use JavaScript (or any other technology) providing that it 'degrades gracefully'...
I have experience with disability organisations, so accessibility is important to me. But equally, I'm in the business of building attractive, usable websites, so JavaScript can be a powerful ally. It's a difficult call, but if you can build a rich, JavaScript-aided site without completely alienating non-JS visitors, then do so. If not, you will have to look at the context of the site and decide which way to jump.
Regardless, there are no rights and wrongs with this question. However, in some countries there is a requirement to build 'public' sites to be accessible, so this may be yet another factor in your decision. (In the UK, it is the Disability Discrimination Act, though to my knowledge no company has been prosecuted for failure to comply.)
JavaScript is great for extending the browser to do things like google maps. But it's a pointy instrument, so use it with care.
My bank web site uses JavaScript for basic navigation between pages. Sigh. As a result, it's not usable from my mobile device.
Make sure you're familiar with the Rule of Least Power when considering JavaScript:
When designing computer systems, one is often faced with a choice between using a more or less powerful language for publishing information, for expressing constraints, or for solving some problem. This finding explores tradeoffs relating the choice of language to reusability of information. The "Rule of Least Power" suggests choosing the least powerful language suitable for a given purpose.
5% according to these statistics: http://www.w3schools.com/browsers/browsers_stats.asp
As you said, demographics. The web is expanding onto devices that don't have very much power, for instance cellphones. If your site is usable without JavaScript, Opera Mini will likely show your site without any problems.
I think Javascript implementations in most modern browsers have now reached a reasonable level of maturity and there are a bunch of Javascript UI frameworks which let you build very attractive Javascript based web applications using web-services and such (regardless of the back-end server platform).
An example is ExtJS - they have got a very extensive AJAX + UI widget framework which I recently used to build a full fledged internal web-app for a client with an ASP.NET backend (for webservices).
I think it comes down to what you're about to do. Are you writing a web APPLICATION? Then I think you're bound to use JavaScript and/or something like GWT. Just have a look at all the social sites, and Google applications like Gmail. If you're writing a web page with product descriptions and hardly any interactivity, then you can make the JavaScript optional.
I agree with the majority of the stackoverflow respondents. JavaScript has matured and offers an "extra" level of functionality to a webpage, especially for forms. Those who turn off cookies and JS have likely been bitten while surfing in dangerous waters. For the corporate power users that pay my way either in B2B or retail sites, JS is a proven and trusty tool. Until something better comes along (and it will) I'm sticking with JS.
There's an add-on for Firefox called NoScript which has 27,501,701 downloads. If your site won't work without JavaScript, most of those folks won't want to use it.
Why would you install that add-on? Ever wanted to get rid of a popup that covers most of the useful text you want to read? Or disable a Flash animation? Or be sure that an evil site won't steal your cookies?
Some corporate environments won't allow Javascript, by policy or by firewall. It closes the door to one avenue of virus infection.
Whether you think this is a good idea or not, realize that not everyone has full control over their browser and it might not be their choice.
There is a gradient between web sites and web applications. However, you should always be able to say "we are building a web site" or "we are building a web application".
Web sites should be readable down to plain HTML (no CSS, no images, no JavaScript).
Web applications, of course, could just say "Sorry, JavaScript is needed" (which also assumes CSS for layout). Applications should still be able to work without images.
The accessibility issue is the only important technical issue; all the others can be socially engineered. When one person says that JavaScript reduces accessibility, and another says that web applications can use JavaScript, can we take the two together to imply that all blind people are unemployed? There has to be some kind of momentum toward making JavaScript accessible: maybe a Screenreader object on the JavaScript side that can detect the presence of a screen reader and send hints to it, screen readers that can hook into the browser, perhaps all glued together with a screen-reader toolbar.
If you want your site viewable by the top 100 companies in the US, I would write it without JavaScript.
Independence from Javascript and graceful degradation are important to an application despite the actual demographics -- because such an application probably has better software design.
The "human user without Javascript" may be purely hypothetical (for example, if you're trying to make money with your product). But designing for that hypothetical user encourages modular software design which will pay off as you continue to develop your app.
Javascript provides functionality. HTML provides data (on the page itself, and via links that point to more data). As a general rule that reaches well beyond browser apps: A well-designed software product will separate data from functionality. All data should be available, and the functionality should be a separate layer that consumes the data.
If your Javascript is creating data at runtime, then it's time to get specific and figure out whether your webpage really is a piece of software (e.g. a mortgage calculator) or whether it's a document containing data (e.g. a list of mortgage interest rates). This should tell you whether it makes sense to rely on Javascript.
As a final note/example, demographics can be misleading. Relatively few humans browse your site without JavaScript, but lots of machines (search bots, data miners, screen readers for the disabled, etc.) are browsing your site without JavaScript. Again, the distinction between data and functionality is important: the bots are just making requests and looking for data in the responses. They don't need functionality. But if your user needs to invoke functions just to make your data accessible, the bots are getting no value from your site.
One side point about screen readers and other accessibility considerations for the disabled: this is an important niche demographic, a mind that navigates data in a human way but can only get data from your site in the same way machines get it. By providing data cleanly and semantically on your page, you make it available to the largest possible set of accessibility tools.
Note this doesn't exclude Javascript from consideration. Our mortgage calculator example can still work: accept input from the user, invoke Javascript, and write the output back into the clean semantic data layer of the page. Screen readers can then read it! And if they can't, you're encouraging the development of better screen readers that can.
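Sticking with the hypothetical mortgage calculator, here is a sketch of that flow (the IDs and the fixed 30-year term are invented for illustration):

<form id="calc" action="">
    <label for="principal">Loan amount</label>
    <input type="text" id="principal" value="200000">
    <label for="rate">Annual rate (%)</label>
    <input type="text" id="rate" value="5">
    <input type="button" value="Calculate" onclick="calculate()">
</form>
<!-- The result lives in plain semantic markup a screen reader can read. -->
<p id="result" aria-live="polite"></p>
<script type="text/javascript">
function calculate() {
    var p = parseFloat(document.getElementById("principal").value);
    var r = parseFloat(document.getElementById("rate").value) / 100 / 12;
    var n = 30 * 12; // fixed 30-year term for this sketch
    var payment = p * r / (1 - Math.pow(1 + r, -n));
    document.getElementById("result").innerHTML =
        "Monthly payment: $" + payment.toFixed(2);
}
</script>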
If you expect your app to work for everyone, you'll need a backup for all your javascript functionality. If it's form validation, you should also check the data on the server before saving it. So the answer is Yes, it's okay, but have a backup. Do not rely on it.
As many people are saying, it's important to consider your user base, but whoever your users are, there's a strong possibility that some of them (stats say 10%) will have some sort of disability, and screen readers don't like JavaScript. If you're only adding simple things, such as a JavaScript menu, then just make it degrade (or don't do it). If the site depends on JavaScript to work properly, make two versions: one with JavaScript and one without.
I generally find that anything too JavaScript-heavy is very difficult to make degrade well, short of having the JavaScript rewrite the page into a JavaScript version when the user's browser can take it. Given that, it's well worth writing two pages from square one for complicated stuff.
I would say there are very, very few web sites that should run without some support for users without JavaScript. You'd need a very dynamic application that completely didn't make sense as static pages, or an audience you could guarantee were OK with it (on an office intranet, say).
Well, it depends on your userbase. If you know that people will be using your site from mobile devices, it's good to have unobtrusive JavaScript. However, if you're trying to appeal to a tech-savvy crowd, don't bother with it.
However, if you're appealing to a crowd that may be using screen readers (blind people), I'd highly suggest using WAI-ARIA standards. Dojo's widget system has full support for this, and would be a great and easy way to do it.
Anyway, in most cases, you don't need unobtrusive JavaScript. Most people who have JavaScript disabled are either using a smartphone, using Lynx, or have NoScript installed. It's enabled by default in all the major browsers, so you shouldn't have to worry.
Lastly, it's good to at least have some unobtrusive JavaScript. <noscript> tags are your best friend. For example, one may want to replace a widget that draws rating stars with plain text. An example using Dojo:
<div dojoType="dojox.Rating" stars="5" value="4"></div>
<noscript>4/5</noscript>
It's the 21st century. People not permitting JavaScript need to exit the last millennium, posthaste. It's a mature, widely used, and very useful technology that is one of the foundations of the recent expansion in useful web services.
You should be tying the functionality of your website to your audience. That being said, every modern browser (save for some on the mobile platform) includes JavaScript, so unless your audience includes luddites with decade-old computers, you can assume they have it.
The people you need to worry about, then, are those that specifically turn it off. This includes:
Corporate networks with tough security (not common, but some financial and defense institutions)
Paranoid web-heads
So, first: who is your audience? Are there other websites comparable to your target? Look at their sites and their success: do they degrade gracefully, and would you be satisfied with their level of success?
If you are targeting mobile applications, though, you can't guarantee javascript.
-Adam
I would say that you should look at your target audience. If you can reasonably expect that they will have JS enabled, and making everything work without any JS is too much of a pain, then by all means go ahead and ignore the non-JS crowd. If, on the other hand, you have to create a site that will be used by a very large audience, or you are perhaps building a government web site, then you must make sure that everything works. In those cases it is easier to first build the site so that it works without any JS, and add all the nice time-saving Ajaxy bits later.
In general though, almost everyone has js enabled by default.
Though you should be aware that server-side validation of user posted data is a must in either case.
I'm building a widget, and I've been using iframes to present content within it. At some point, I might start serving third party HTML and JS, so I thought iframes would be a good idea.
It does make the widget javascript a little more complicated, and I'm concerned that this might not be the best implementation.
Do you have any advice? It would be a huge help to hear what other people think about iframes.
No, nothing wrong with iframes. Iframes are probably a better idea if you're going to start serving third party content.
The upcoming HTML5 spec also plans to build more security features into iframes for situations like this, so I would consider it good practice to use them now as well.
Before XMLHTTPRequest became widely used, people were using a combination of JavaScript and iframes to serve up content in a dynamic fashion without doing full page refreshes.
There's lots of information about developing sites this way, so you should have a relatively easy time finding workarounds for the snags that you are likely to hit.
The one thing that I have found to be a pain is cross-domain use of JavaScript in iframes. If the page you embed in the iframe is from a different domain than the "parent" page, browsers have security restrictions against letting you access one from the other. The trick is for both pages to declare
document.domain = 'somedomain.com';
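In full, both pages would run something like this (the hostnames and frame id are invented for illustration):

<script type="text/javascript">
// Run this in BOTH the outer page (e.g. www.somedomain.com) and the
// framed page (e.g. widgets.somedomain.com).
document.domain = "somedomain.com";

// Once both pages share the same document.domain, the parent page
// can reach into the frame:
var frameDoc = document.getElementById("myframe").contentWindow.document;
</script>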
There's plenty of stuff on the Web about this kind of workaround.
Good luck!
One thing I discovered recently is that .aspx pages embedded inside iframes sometimes have problems with losing cookies, which led to lost session state in an application I was involved with.
For me, it arose in a scenario where a different development shop was consuming one of my .aspx pages in their own page. This means we were on separate servers, which may or may not be salient.
Apparently this was caused by the parent page rejecting cookies for the child page... As goes the session cookie, so goes the session.
The specific mechanics of how this works are a little involved: More Details
This problem did not impact Firefox, but it did show up in IE7, and it was a real mystery for a few hours.
Also, I have to contradict the article I linked to above on one point. The article says that you don't get this if the containing page is also an .aspx page; in this case that was not true, because both pages were .aspx pages.
That casts some doubt on everything else the article says about this situation, but it did lead to a resolution, so that's something as well.
As the article suggested, I put in the following code, which injects a P3P (Platform for Privacy Preferences Project; I had never heard of it) header in the page's Init event:
HttpContext.Current.Response.AddHeader("p3p", "CP=""IDC DSP COR ADM DEVi TAIi PSA PSD IVAi IVDi CONi HIS OUR IND CNT""")
...And that fixed the problem.
I'm going to disagree with the majority and say that yes, iframes are an absolutely terrible idea. Anyone who has worked within the web design community for a while will agree that iframes are pure evil and should be avoided unless ABSOLUTELY essential.
My reason for believing they are bad is that they break the navigational pattern of a web page. By using an iframe, you can effectively break the back and forward buttons on browsers and confuse your users. It breaks the entire idea behind the HTTP protocol: that a URL will always lead to a unique location. If the iframe were a horse, it would have been retired long ago. There are other ways to serve content dynamically, and these should be used instead.
If you're creating a widget, then the immediate concerns with using iframes disappear (bad for search engines, bad for bookmarking, etc.), but either way the content would be better served dynamically, or even in a new window, rather than in an iframe.
There is only one "really bad" thing with them that I'm aware of.
If your 3rd party runs some JavaScript that attempts to modify the DOM a bit too early, IE6 and IE7 will throw the oh-so-unhelpful "Operation Aborted" error, then blank out not only the iframe but the entire surrounding page (i.e. your site appears down).
It isn't fixed in IE8, but the crash is better handled.
Personally, I'd avoid them if you can do so without too much hassle. Using JavaScript (or AJAX if you need to load content dynamically), you can quite easily just use a div and change its contents as necessary; in some cases this will give you much more flexibility and will simplify your JS, especially if there's a lot of interaction between your widget and the rest of the page.
That said, I'd investigate both options, and if the JS path seems too tricky or complicated, just go with iframes.
In my experience, iframes are either hacks or time-savers, so make sure that if you're using them, it's necessary for one of those reasons. If you have control over the content (or can gain control through mirroring or scraping), you should consider using AJAX or server-side includes to pull external data onto the page and push it off again; it'll end up being more flexible, more robust, and easier to manage in the end.
Depends what the widget does. Iframes have their place, but they do cause a few layout headaches (not to mention making your JS more complicated), so most people tend to avoid them unless absolutely necessary.
Iframes, like frames, are just controls to use for the task at hand. As such, they are neither good nor bad in themselves, but could be good or bad depending on the task and the client's requirements. As far as I know, all modern browsers (and non-Linux users) will be able to "see and consume" iframes without a problem.
A good option is to use the overflow CSS property. The default value is visible, but you can set it to hidden, scroll, or auto. I would use auto in your case. If your content gets too big, it will look as if you have an iframe, but the content is still right on the page.
see: overflow property
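For example (the dimensions are arbitrary):

<!-- Scrolls like an iframe, but the content is part of the page itself. -->
<div style="overflow: auto; width: 300px; height: 200px;">
    ...widget content...
</div>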
Iframes are not evil; they are just another tool, like anything else, and to determine their merits you have to consider the context in which they will be used. Google Image Search and several other high-profile sites use iframes for limited purposes.
In general I find they are used for branding or to enable a user to return to a site that redirects the user off site.
Note: if you are using cross-domain iframes (an iframe that refers to a domain other than the one the page is served from), you are limited by design, for security reasons, and cannot access through JavaScript the internals of a DOM outside the domain it is associated with.
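For example, this access attempt (the frame id and URL are made up) works for a same-domain frame but throws a security error for a cross-domain one:

<iframe id="partner" src="http://other-domain.example/widget.html"></iframe>
<script type="text/javascript">
try {
    // Same-origin frames allow this; cross-domain frames do not.
    var doc = document.getElementById("partner").contentWindow.document;
} catch (e) {
    // Blocked by the browser's same-origin policy.
}
</script>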
Also, please note that many sites prevent themselves from being embedded and will strip the iframe off (redirecting the top URL to their own domain).
Not necessarily, as long as the content within the iframe is predictable.
Technically, there is nothing more wrong with iframes than with the alternatives. But semantically, they are evil.
The Web is based on HTTP, a protocol that says a given URL will always lead to a unique resource.
Using iframes, you serve several resources, melted together into one web page, behind a single URL. If you have concerns about how the Web should grow, that's troublesome. What's more, it's tricky for search engine robots.
There are several usability and accessibility issues with iframes. Some browsers and screen readers cannot display iframes, so you should provide alternative content:
<iframe src="content.html">
<p>
This content will only be displayed by browsers that do not support
iframes. You should provide a link to the content, or in your
case an alternative way to use your widget.
</p>
</iframe>
If you start serving third-party content, you should watch out for the iframe grabbing focus after it has finished loading. While a minor annoyance for regular users, it can be very confusing for users browsing with screen readers.
Re: "the entire idea behind the HTTP protocol; that a URL will always lead to a unique location"
I serve my entire CMS from the same URL for security and scalability (using mostly POST instead of GET parameters). I don't want secure content visible without authentication, and a dispatch system makes development easier for me, as I don't have to worry about authentication for every new page.
Also, for some applications SEO is not applicable (such as for web-based ERP).
I use an iFrame for serving content from a PHP generated assembly tree. I don't want the tree (and node visibilities) refreshed whenever the user wants to view details for a part/assembly.
There is a significant issue with iframes that hardly gets a mention but which bugs the stuffing out of me.
Our colleague has a lifetime of work invested in a dynamically changing database which we have loaded into Google Docs spreadsheets which we then display on our site alongside a lot of supporting material.
There is absolutely nothing to stop someone grabbing the iframe code out of my page source and shoving it onto their own page. Now they are getting all our data, refreshed up to a few minutes ago, served to their page for absolutely nothing at all.
If a google iframe could be tied to a specific domain, that would stop that in its tracks.
Any ideas, bright sparks?