I am just wondering if Google or other search engines execute JavaScript on your web page. For example, if you set the title tag using JavaScript, does the Google search engine see that?
There have been some experiments performed for SEO purposes which indicate that at least the big players (Google, for example) can and do follow some simple JavaScript. They avoid sneaky redirects and such, but some basic content manipulation does seem to get through. (I don't have a link handy for Google themselves confirming or denying this, it's just various posts I've come across when dealing with this before.)
However, this is generally considered unreliable. If SEO is being done for any important purpose, don't rely on the spiders indexing much dynamic content.
There's actually a very good (in my opinion, anyway) answer here to a very similar question. What I like about that answer is how it breaks down the steps for generating good, indexable, and best of all maintainable web pages with concerns properly separated. Adhering as much as possible to this process generally results in good SEO, good accessibility, and good design skills in general.
Yes, Google executes Javascript. How much is a moving target.
Google executed some Javascript as early as 2011: http://searchengineland.com/google-can-now-execute-ajax-javascript-for-indexing-99518
This article from 2012 documents some experiments on what Javascript Google did and did not run at the time: http://moz.com/ugc/can-google-really-access-content-in-javascript-really
In May 2014, Google said publicly that they execute Javascript: http://googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html Although that post says that Google has been getting better, there are no publicly available details on what Javascript Google does and does not execute -- but presumably they are at least as good at it as they were in 2012.
I'm pretty sure they don't. However, you can see for yourself: Google has a tool which will show you your page as it sees it, at http://www.google.com/webmasters/
If the text is in the on-page JavaScript, Google will see the text, but it will not be seen as the text of the title element.
But hey, this is quite easy to test: just do it. Wait two days, then Google your site with site:... and look at what's in the headline. If it's in there, the answer is yes: Google sees it. If not: no, Google doesn't. It's easily testable.
(p.s.: my money is on: no)
We need to remember that JavaScript is a client-side language and always executes on the client. If all of the titles or content are generated via JavaScript, then they only exist client-side, and I doubt they'll show up in Google search (whereas if they're output in the .html, then yes).
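To make the scenario concrete, here is a minimal sketch (the title text is just an example) of the kind of client-side title change being discussed; the static HTML ships one title, and the script overwrites it only where JavaScript actually runs:

// The page's HTML contains, say, <title>Original title</title>.
// This replacement only takes effect in browsers (or crawlers) that execute JavaScript.
document.addEventListener('DOMContentLoaded', function () {
    document.title = 'Title set by JavaScript';
});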
If I'm up to date, meta tags are "fuel for search engines" and have ties to SEO; it's common for robots to be scripted to crawl your site for them.
I'm writing a UI to my R script, which asks the user some names of organisms and the location of a folder, using javascript/html that will be local (not hosted, ever).
At the moment, I have just that: a couple of text boxes that take input and pass it to an executable R script. Originally this UI was being written as a very user-friendly option, but slowly I've realized that some nifty tricks can be added, such as a text box that completes the word for the user (so if the user misspells the name of the organism, the UI will correct the input based on the files uploaded; this would come from a list-of-organisms text file that R would generate as soon as the files have been added).
Is there a way to make this more efficient? For example, retrieving plots from R (as .pngs) and updating my local webpage, and being able to share a log file between R and the UI (mind you, I am aware of the potential file I/O errors)... but for the sake of brainstorming.
I'm aware of Shiny, but what I would like is a simple local UI, as I will be dealing with big data (average ~ 1 gigabyte worth of files that my script will process).
Another way to ask my question that is more to the point:
Here's an example of integrating PHP and R: http://www.r-bloggers.com/integrating-php-and-r/
I am looking to create something similar with javascript/css/html/jquery etc.
Thanks
You could definitely use Node.js (nodejs.org) for that. Take a look at https://github.com/elijah/r-node and r-node. Confusingly enough, these are two different projects with the same name. More info on the latter here: squirelove.net/r-node/doku.php
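As a rough sketch of the kind of glue this enables (the script name and arguments are made up for illustration), Node can simply shell out to Rscript and hand the output back to your local UI:

// Run a hypothetical organisms.R script and capture whatever it prints.
var execFile = require('child_process').execFile;

execFile('Rscript', ['organisms.R', '/path/to/folder'], function (err, stdout, stderr) {
    if (err) {
        console.error('R failed:', stderr);
        return;
    }
    console.log('R output:', stdout); // e.g. a list of organism names to feed the autocomplete box
});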
In recent years JavaScript has become one of the fastest programming languages. In one case I know of, JavaScript is faster than C++. See: benchmarksgame.alioth.debian.org/u32/performance.php?test=regexdna
Bear in mind, though, that memory is very difficult to manage in JavaScript, so you should run some sort of memory leak detection program on your code, if you plan to create long running processes.
E.g. memwatch (npmjs.org/package/memwatch) or nodeheap.
Good luck with your endeavors!
PS. sorry for the lack of real links. I'm apparently not allowed to post more than 2 links.
Why wouldn't you be able to use Shiny locally? You design your app on your computer and run it locally with runApp('myapp') from an R prompt. Unless you are experienced with JavaScript, I would give Shiny another look: http://www.rstudio.com/shiny/
The example you linked to can be very easily implemented using Shiny. See link below for a tutorial on how to write the app:
http://rstudio.github.com/shiny/tutorial/#hello-shiny
To run that example locally:
install.packages('shiny')
shiny::runExample('01_hello')
I have a similar case, and Shiny looked like a good idea to me. However, after I took the first few steps, I am no longer sure about this. Note that most of the examples use Shiny to display results; when you get into editing some fields and using a database, things can become messy, and the reactive-ness gets in the way once fields can be changed both by the program and by the user.
As an example see https://gist.github.com/dmenne/4721235/edit. The main problem with the current state of Shiny is that you must use the dynamic UI for this type of work, which kills any separation of UI and server because you have to create the UI elements in the server.
Shiny is a great idea, but for anything larger with interaction it is too early right now. Knowing that the amazing RStudio team is behind it, I am sure the stress should be on "now".
What else is there around to make user interfaces for R? TclTk makes me shudder. I work in C# a lot, and I had been using R(D)COM for interfacing some years ago, but gave up after installation and licensing problems. There is R.DOTNet, which works better now; it is the most hassle-free installation-wise, but it is not a very active project and it tends to crash. Interfacing via RServe/RServeCLI is stable, but it is too difficult to install on Windows, for example on hospital computers with their strict security policies.
And there is Qt. With the active RInside community, it would be a good choice, and the interface is great. I wish, however, that my programming skills were at the level of the RStudio guys'. The fact that even Dirk is only at the proof-of-concept level (using RInside with Qt on Windows) is not encouraging.
How important is it to gracefully degrade, or inversely to progressively enhance, the UI experience? I mean, am I going to lose a LOT of business if I don't? Do you practice this concept? Are there any web 1.0 users still left out there?
Please could you also say whether you practice this personally and how much time you've spent on it relative to the entire project. I realize every project is different; I want to get a sense of how much time, as a general rule, I should be allocating toward this goal.
EDIT
Firstly, I'm looking for guidance on how much time I should be devoting to making my applications run without JavaScript.
Secondly, the BS term "web 1.0" (...lol... I don't really like it either) works because we all understand it as the iteration before Ajax and all its goodness.
Thirdly, the kind of applications I'm describing are the ones we are all building: not Facebook, not Twitter (unless you're from Facebook or Twitter), but service or utility programs like a web calendar, an online todo list, or [INSERT YOUR APP HERE].
Progressive enhancement is more a mindset than a particular task that you need to allocate time to. If you're doing it right (and if it's important to you), you should be enhancing the user experience with JavaScript, but not relying on it.
For example, a link will point to a new page, but with JavaScript you'll disable the link and load new content into the current page with Ajax. Start off without JavaScript and progressive enhancement will follow naturally.
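A minimal sketch of that pattern (the selector, container id, and use of fetch are my own choices for illustration, not a prescription): the link works as a normal page load everywhere, and the script, where it runs, intercepts the click and loads the content with Ajax instead.

// Without this script the link simply navigates; with it, the content is loaded in place.
document.addEventListener('click', function (event) {
    var link = event.target.closest('a.ajax'); // only enhance links explicitly marked for it
    if (!link) return;
    event.preventDefault();
    fetch(link.href)
        .then(function (response) { return response.text(); })
        .then(function (html) {
            document.querySelector('#content').innerHTML = html; // assumes a #content container
        });
});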
Progressive enhancement is not only smart but it is faster and easier to develop.
And at each stage, you almost always have a working fall-back.
Here's what it looks like in a nutshell:
Boss/client approves mock ups.
We code to valid HTML output. At this point, the boss/client can start using the site; barring any boss/client changes, the HTML is mostly done.
We start tweaking the CSS to make it match the boss/client's graphic expectations. Changes to the HTML are minor, if any.
In parallel, JavaScript is added to do non-critical, but nice, things: sorting tables, fancy CSS helpers, replacing some links with AJAX calls, warning the user (client-side) of input problems. (A sketch of that last one follows below.)
If any one of these things breaks, the site still works.
Also, few or usually no HTML changes are needed.
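As a small illustration of that last kind of enhancement (the field names and markup are invented for the example), the warning is purely advisory; the form still posts normally and the server still validates:

// Warn the user about a suspicious email address without blocking submission.
var form = document.querySelector('#signup-form');  // hypothetical form
var email = form.querySelector('input[name=email]');
var warning = form.querySelector('.email-warning'); // an empty element reserved for the message

email.addEventListener('blur', function () {
    if (email.value && email.value.indexOf('@') === -1) {
        warning.textContent = 'That does not look like an email address.';
    } else {
        warning.textContent = '';
    }
});

If the script never runs, nothing is lost: the form submits and the server reports the same problem.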
First of all, let's not start using bullshit terms like "web 1.0" and "web 2.0"; the fact is the web is forever progressing, and new websites are starting to use JavaScript to enhance the user experience.
I don't know anyone that doesn't allow their site to gracefully degrade when JavaScript isn't available. This is for the same reason we use semantic markup, so screen readers can correctly interpret our websites for users with visual impairments; and whilst the vast majority of your visitors/users won't fall into these categories, it's still important to think about the minority.
Will you lose a LOT of business? Well, that depends on how successful you are now and how badly your site degrades. Chances are you probably won't lose any business... but that should not be the measure you use to decide whether to gracefully degrade a website.
So unless you can come up with a pretty good reason not to, you should probably use JavaScript for the purposes of progressive enhancement and not depend too much on it.
:-)
This greatly depends on the nature of your application and its data. If it's something that you know will be mostly used via computers, then degrading to a non-script UI wouldn't bring any obvious benefit (it could even lose money, because it will take considerable time to develop). You can always tell people to enable JavaScript in their browsers (similar to what's done here on Stack Overflow; try disabling scripts and reloading the page). Your app/site should display at least something when JavaScript isn't available.
But if your application has simple data to display and users should access it frequently wherever they are, then degrading for lesser browsers (those without script engines, like Opera Mini) is a must. Creating a separate UI with less functionality, but keeping everything in it that users need to access, is probably your best option (UIs like separate iPhone applications, for example).
JavaScript is disabled by a small proportion of web users, but when you start to talk big volumes this can make a difference. For example, if just over 1% of visitors have it disabled, then out of 1 million visitors you can expect more than 10,000 not to be able to use your site.
You should decide how much lost business is worth the additional cost of having a non-JavaScript version of your site.
You can also take an approach where the entire site may not work without JavaScript, but the core features are still there.
I'm starting to give a little more attention to making my javascript and ajax degrade gracefully. Which is more recommended:
working on incorporating the graceful degradation into your existing code (can be tricky)
or
developing a different sets of pages for the non-js users.
I'm leaning towards the different sets of pages, because I feel it's easier and I get to deliver the best possible results for each user type (js-enabled or js-disabled). Do you agree with me, and if not, why do you disagree?
I'm also worrying about hacking attempts. For example hacker gets to the js-enabled version, then disables his js. Any thoughts on this point? I don't know much about hacking, but can this be a security concern if I go with the separate versions?
Thanks in advance
Though it doesn't work well for existing sites, often it's more useful to use the Progressive Enhancement paradigm: build the site so it works with no special add-ons, then start layering your awesomeness on top of that.
This way you can be sure it works from the ground up and everyone (including those who use screen readers, those who turn off images or stylesheets, and those who don't use javascript) can all access your site.
For an existing site, however, it will depend on what functionality the Ajax is delivering. In general you should strive to mirror all the Ajax functionality with JS disabled. If you have security holes in your JS version, then you probably will in your non-JS version too; AJAX can't get to anything that can't be accessed via an ordinary URL.
Developing two separate sets of pages, one for JS-enabled and one for non-JS, is obviously a lot of work, not only initially, but also as your application keeps evolving. If that doesn't bother you too much, I think that's the way to go. I think you are right about same-page graceful degradation being very tricky sometimes. Sometimes this is just because of the layout: with JS enabled, you can simply hide and show elements, whereas without JS, where do you put everything? Separate sets of pages can help keep the page structure cleaner.
About hacking attempts: You can never, never, never rely on client-side JavaScript validation. Everything has to be checked (or re-checked) server-side, and your server-side code may make no assumptions whatsoever about the user input. Therefore, I think the scenario of someone de-activating JS while using the application is irrelevant. Try to keep the expected user input uniform for the non-JS and the JS versions, validate it properly, and you're good.
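A bare-bones sketch of that rule (plain Node, with an invented field name), just to show where the check belongs: the server re-validates no matter what the browser-side script did or didn't do.

var http = require('http');
var querystring = require('querystring');

http.createServer(function (req, res) {
    var body = '';
    req.on('data', function (chunk) { body += chunk; });
    req.on('end', function () {
        var fields = querystring.parse(body);
        // Never trust the client: repeat the validation here even if JS already did it in the browser.
        if (!fields.email || fields.email.indexOf('@') === -1) {
            res.writeHead(400);
            res.end('Invalid email address');
            return;
        }
        res.end('OK');
    });
}).listen(8080);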
You'll probably want to check out jQuery Ajaxy. It lets you gracefully upgrade your website into a full featured ajax one without any server side modifications, so everything still works for javascript disabled users and search engines. It also supports hashes so your back and forward buttons still work.
It's been implemented on these two sites (which I know of) http://wbhomes.com.au and http://www.balupton.com
It's fall of 2008, and I still hear developers say that you should not design a site that requires JavaScript.
I understand that you should develop sites that degrade gracefully when JS is not present or turned on. But at what point do you decide not to include functionality that can only be powered by JS?
I guess the question comes down to demographics. Are there numbers out there of how many folks are browsing without JS?
Just as long as you're aware of the accessibility limitations you might be introducing, ie for users of screen-reading software, etc.
It's one thing to exclude people because they choose to turn off JS or use a browser which doesn't support it, it's entirely another to exclude them because of a disability.
Two simple questions to help you decide...
1. Does using JavaScript provide some core functionality of your site?
2. Are you prepared to limit your potential users to those who have JS? (i.e. most people)
If you answer yes to both of those, go for it!
Websites are moving (have moved?) from static pages of information to interactive web applications. Without something like Javascript or Flash, making compelling user interactions is sometimes not possible.
Designing to degrade gracefully is the most that should be done. We are moving/have moved past the point of simple web "sites" to web "applications". The only option besides client-side scripting is to add round trips to the server.
I think (personal opinion) that the "don't use JavaScript" comes more from a lack of understanding of what JavaScript is/does than any actual market data that shows a significant number of people are browsing without it.
It's reasonable to design sites that use JavaScript, but it is not safe to assume that all clients have support for JavaScript. Therefore it is important that you provide a satisfactory experience even when JavaScript is not available.
Search Engines don't support JavaScript. They're also blind and don't support CSS. So my suggestion to you is to make sure that the part of your product that needs to be indexable by search engines works without JavaScript and CSS. After that, it really depends on the needs of your users.
If you have a very limited subset of users, then you can actually query them. But remember that 10% of the population has some form of impairment, ranging from vision issues (low vision, color-blindness, etc.) to problems with motor functions (low hand dexterity). These problems tend to be more prominent in the elderly and the knowingly disabled.
If your site will target the general audience of Internet users then please make it degrade gracefully, but if you can't do that, then make a no-JavaScript version (like G-mail has).
I think the days of "content sites only" are gone. What we see now is the WWW emerging as the platform for web applications, and the latest developments on the browser front (speeding up JS in particular) are an indication of this.
There can be no yes/no answer to your question: you should decide where on the content site <---> web application continuum your site sits, and how essential the experience provided by JavaScript is. In my opinion, yes, it is acceptable to have web applications which require JavaScript to function.
Degrading gracefully is a must. At a minimum, you should make use of the NOSCRIPT tag in order to inform potential customers, first, that your site requires JavaScript, and second, why you require it.
If it's for flashy menus and presentations that I honestly couldn't care less about, then I probably won't bother coming back. If there's a real reason you're requiring JavaScript (client-side validation on forms, or a real situation that requires AJAX for performance reasons), then say so, and your visitors will respond accordingly.
I install extensions that limit both Javascript and Cookies. Websites that don't prominently state their requirements of both usually don't get a second visit unless there's a real need for it.
You should never design a public site to rely on ANY technology/platform. The user agent may not display colour (think screen readers), display graphics (again, think screen readers or text only browsers such as links), etc.
Design your site for the lowest common denominator and then progressively enhance it to add support for specific technologies.
To answer the question directly: No, you cannot assume your users have Javascript, so your site should work without it. Once it does, enhance it with Javascript.
It's not about browser capability, it's about user control. People who install the NoScript plugin for Firefox so they don't have to put up with punch-the-monkey garbage (the same problem that inspired Stack Overflow) will not allow your web site to do anything non-static until they trust you.
In terms of client software consider users/customers who are using a browser that supports some but not all Javascript. For example, most mobile phone browsers support a bit of Javascript but nothing very complicated. The browsers on devices such as the Playstation 3 are similar.
Then there are browsers such as Opera Mini, which support a lot of Javascript but are operating in an environment where the scripts are running on a server that then sends the results to a mobile device.
You should design websites with JavaScript in mind, but not yet implemented. That is, build the site so that every click, every action, performs a round trip to the server. That's the default functionality for older browsers and for those without JS turned on.
Then, after it's all built, and everything is working properly, add in JavaScript which hijacks the link, button and other events, and overlay their standard functionality with the Javascript functionality you're wanting.
Building the application like this means that it will ALWAYS work, which ultimately is what you're wanting.
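As a rough illustration of that layering (the ids, the URL handling, and the use of fetch are assumptions for the sketch): the form performs an ordinary POST round trip on its own, and the script, where it runs, takes over the submit event.

var form = document.querySelector('#comment-form'); // works as a plain POST without this script
if (form) {
    form.addEventListener('submit', function (event) {
        event.preventDefault(); // only reached when JavaScript is running
        fetch(form.action, { method: 'POST', body: new FormData(form) })
            .then(function (response) { return response.text(); })
            .then(function (html) {
                document.querySelector('#comments').innerHTML = html; // assumed container for the result
            });
    });
}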
The received wisdom answer is that you can use JavaScript (or any other technology) providing that it 'degrades gracefully'...
I have experience with disability organisations, so accessibility is important to me. But equally, I'm in the business of building attractive, usable websites, so javascript can be a powerful ally. It's a difficult call, but if you can build a rich, javascript-aided site, without completely alienating non-js vistors, then do so. If not, you will have to look at the context of the site and decide which way to jump.
Regardless, there are no rights and wrongs with this question. However, in some countries, there is a requirement to build 'public' sites to be accessible, so this may be yet another factor in your decision. [In the UK, it is the Disability Discrimination Act.. though to my knowledge, no company has been prosecuted for failure to comply]
JavaScript is great for extending the browser to do things like google maps. But it's a pointy instrument, so use it with care.
My bank web site uses JavaScript for basic navigation between pages. Sigh. As a result, it's not usable from my mobile device.
Make sure you're familiar with the Rule of Least Power when considering JavaScript:
When designing computer systems, one is often faced with a choice between using a more or less powerful language for publishing information, for expressing constraints, or for solving some problem. This finding explores tradeoffs relating the choice of language to reusability of information. The "Rule of Least Power" suggests choosing the least powerful language suitable for a given purpose.
5% according to these statistics: http://www.w3schools.com/browsers/browsers_stats.asp
As you said, demographics. The web is expanding onto devices that don't have very much power, for instance cellphones. If your site is usable without JavaScript, Opera Mini will likely show your site without any problems.
I think Javascript implementations in most modern browsers have now reached a reasonable level of maturity and there are a bunch of Javascript UI frameworks which let you build very attractive Javascript based web applications using web-services and such (regardless of the back-end server platform).
An example is ExtJS - they have got a very extensive AJAX + UI widget framework which I recently used to build a full fledged internal web-app for a client with an ASP.NET backend (for webservices).
I think it comes down to what you're about to do. Are you writing a web APPLICATION? Then I think you're bound to use JavaScript and/or something like GWT. Just have a look at all the social sites and Google applications like Gmail. If you're writing a webpage with product descriptions and hardly any interactivity, then you can make the JavaScript optional.
I agree with the majority of the stackoverflow respondents. JavaScript has matured and offers an "extra" level of functionality to a webpage, especially for forms. Those who turn off cookies and JS have likely been bitten while surfing in dangerous waters. For the corporate power users that pay my way either in B2B or retail sites, JS is a proven and trusty tool. Until something better comes along (and it will) I'm sticking with JS.
There's an addon for Firefox called NoScript which has 27,501,701 downloads. If your site won't work without JavaScript, most of those people won't want to use it.
Why would you install that addon? Ever wanted to get rid of the popup on a site that covers most of the useful text you want to read? Or to disable a Flash animation? Or to be sure that an evil site won't steal your cookies?
Some corporate environments won't allow Javascript, by policy or by firewall. It closes the door to one avenue of virus infection.
Whether you think this is a good idea or not, realize that not everyone has full control over their browser and it might not be their choice.
There is a gradient between web sites and web applications. However, you should always be able to say "we are building a web site" or "we are building a web application".
Web sites should be readable down to plain HTML (no CSS, no images, no JavaScript).
Web applications, of course, could just say "Sorry, JavaScript is needed" (which also assumes CSS for layout). The application should still be able to work without images.
The accessibility issue is the only important technical issue; all the other issues can be socially engineered. When one person says that JavaScript reduces accessibility, and another says that web applications can use JavaScript, can we take the two together to imply that all blind people are unemployed? There has to be some momentum behind making JavaScript accessible: maybe a Screenreader object on the JavaScript side which can detect the presence of a screen reader and send hints to it, screen readers which can hook into the browser, and maybe it all gets glued together with a screen reader toolbar.
If you want your site viewable by the top 100 companies in the US, I would write it without JavaScript.
Independence from Javascript and graceful degradation are important to an application despite the actual demographics -- because such an application probably has better software design.
The "human user without Javascript" may be purely hypothetical (for example, if you're trying to make money with your product). But designing for that hypothetical user encourages modular software design which will pay off as you continue to develop your app.
Javascript provides functionality. HTML provides data (on the page itself, and via links that point to more data). As a general rule that reaches well beyond browser apps: A well-designed software product will separate data from functionality. All data should be available, and the functionality should be a separate layer that consumes the data.
If your Javascript is creating data at runtime, then it's time to get specific and figure out whether your webpage really is a piece of software (e.g. a mortgage calculator) or whether it's a document containing data (e.g. a list of mortgage interest rates). This should tell you whether it makes sense to rely on Javascript.
As a final note/example, demographics can be misleading. Relatively few humans browse your site without JavaScript, but lots of machines (search bots, data miners, screen readers for the disabled, etc.) are browsing your site without JavaScript. Again, the distinction between data and functionality is important: the bots are just making requests and looking for data in the responses. They don't need functionality. But if your user needs to invoke functions just to make your data accessible, the bots are getting no value from your site.
One side point about the screen readers and other accessibility considerations for the disabled. This is an important niche demographic: a mind that navigates data in a human way, but who can only get data from your site in the same way machines get it. By providing data cleanly and semantically on your page, you make it available to the largest possible set of accessibility tools.
Note this doesn't exclude Javascript from consideration. Our mortgage calculator example can still work: accept input from the user, invoke Javascript, and write the output back into the clean semantic data layer of the page. Screen readers can then read it! And if they can't, you're encouraging the development of better screen readers that can.
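To make the calculator example concrete (the element ids and the standard amortization formula are only illustrative): the inputs and the result live in ordinary markup, and the script merely fills the result element in, so anything that reads the page sees the same data.

// Read plain form fields, compute a monthly payment, write it back into a normal element.
function monthlyPayment(principal, annualRate, years) {
    var r = annualRate / 12;  // monthly interest rate
    var n = years * 12;       // number of payments
    return principal * r / (1 - Math.pow(1 + r, -n));
}

document.querySelector('#calculate').addEventListener('click', function () {
    var p = parseFloat(document.querySelector('#principal').value);
    var rate = parseFloat(document.querySelector('#rate').value) / 100;
    var years = parseFloat(document.querySelector('#years').value);
    document.querySelector('#payment').textContent = monthlyPayment(p, rate, years).toFixed(2);
});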
If you expect your app to work for everyone, you'll need a backup for all your javascript functionality. If it's form validation, you should also check the data on the server before saving it. So the answer is Yes, it's okay, but have a backup. Do not rely on it.
As many people are saying, it's important to consider your user base, but whoever your users are there's a strong possibility that some (stats say 10%) of them will have some sort of disabilities, and screen readers don't like javascript. If you're only adding simple things, a javascript menu or something, then just make it degrade (or don't do it). If the site depends on javascript to work properly, make two versions, one for javascript and one without.
I generally find that anything too JavaScript-heavy is very difficult to make degrade well, short of just having JavaScript rewrite the page into a JavaScript version when the user can take it. Given this, it's well worth writing two sets of pages from square one for the complicated stuff.
I would say that there are very, very, very few web sites that should be running without some support for users without JavaScript. You'd need to have a very dynamic application that completely didn't make sense as static pages, or you'd need to have an audience you could guarantee were OK with it (like on an office intranet, say).
Well, it depends on your userbase. If you know that people will be using your site from mobile devices, it's good to have unobtrusive JavaScript. However, if you're trying to appeal to a tech-savvy crowd, don't bother with it.
However, if you're appealing to a crowd that may be using screen readers (blind people), I'd highly suggest using WAI-ARIA standards. Dojo's widget system has full support for this, and would be a great and easy way to do it.
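For instance (a small hand-rolled sketch rather than Dojo's own API), an ARIA live region lets a screen reader announce updates that a script makes to the page:

// Create a live region; role="status" is announced politely when its content changes.
var liveRegion = document.createElement('div');
liveRegion.setAttribute('role', 'status');
document.body.appendChild(liveRegion);

function announce(message) {
    liveRegion.textContent = message; // assistive technology reads this when it changes
}

announce('Rated 4 out of 5 stars'); // e.g. after a widget updates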
Anyway, in most cases, you don't need unobtrusive JavaScript. Most people who have JavaScript disabled are either using a smartphone, using Lynx, or have NoScript installed. It's enabled by default in all the major browsers, so you shouldn't have to worry.
Lastly, it's good to at least have some unobtrusive JavaScript. <noscript> tags are your best friend. For example, one may want to replace a widget that draws rating stars with text. Example using dojo:
<div dojoType="dojox.Rating" stars="5" value="4"></div>
<noscript>4/5</noscript>
It's the 21st century. People not permitting JavaScript need to exit the last millennium, posthaste. It's a mature, widely used, and very useful technology that is one of the foundations of the recent expansion in useful web services.
You should be tying the functionality of your website to your audience. That being said, every modern browser (save for the mobile platform) includes javascript, and so unless your audience includes luddites with decade old computers, you can assume they have javascript.
The people you need to worry about, then, are those that specifically turn it off. This includes:
Corporate networks with tough security (not common, but some financial and defense institutions)
Paranoid web-heads
So, first, who is your audience? Are there other websites that are comparable to your target? Look at their sites and their success: do they degrade gracefully, and would you be satisfied with their level of success?
If you are targeting mobile applications, though, you can't guarantee javascript.
-Adam
I would say that you should look at your target audience. If you can reasonably expect that they will have JS enabled, and making everything work without any JS is too much of a pain, then by all means go ahead and ignore the non-JS crowd. If, on the other hand, you have to create a site that will be used by a very large audience, or you are perhaps building a government web site, then you must make sure that everything works; in those cases it is easier to first build the site so that it works without any JS and add all the nice time-saving Ajaxy bits later.
In general though, almost everyone has js enabled by default.
Though you should be aware that server-side validation of user posted data is a must in either case.