How to write JavaScript code that is only runnable in web browsers? - javascript

I am developing a system with JavaScript which I want to work only in common web browsers (like IE9, Firefox, Chrome, Safari, Opera, ...).
First, I compressed my code using the Closure Library + Closure Compiler with the ADVANCED_OPTIMIZATIONS option, generating code which looks somewhat difficult to understand. Unfortunately, the code can easily be converted back into something beautiful (and readable) by using tools like this.
Second, I chose algorithms which are easy to read but difficult to understand. For example, a script that decodes Reed-Solomon codes may be hard to follow for those who have never implemented such an algorithm before. Of course this solution is not perfect, because someone with deep knowledge of Reed-Solomon codes may figure out what's written inside, even if the code is compressed and has no comments.
But the major problem is that my complicated code can be run easily just by copy-pasting it into non-browser JavaScript environments like Rhino+env.js, PhantomJS, and so on.
Please teach me any usable techniques for making my code refuse to run outside a web browser, if such techniques exist.

I don't really understand the point of this question, but it sounds like your worry is that you don't want people to steal the JS that you serve as part of your site.
If that's the case, there is only one real solution: do the work on the server.

You can make it hard to run the code but in the end, it's source code that you give to the world. Any kind of obfuscation and encryption must be reversed before the browser can execute it which means that anyone can eventually reverse engineer the code.
If you don't want this / can't have it, then the browser isn't a suitable tool for you. You can try to write a desktop application or build an appliance (= code which is protected by hardware), but that just raises the bar for reverse engineering. People do grind the covers off chips to find out how they work.
From my experience, you can make it somewhat hard to "steal" your valuable data but you can't prevent it. Other downsides that you should take into account:
Paying customers will be offended by bugs and limitations that you impose on them (pirates will simply remove your copy protection).
Hollywood spent millions of dollars on DVD's CSS and on HDCP (the protection behind HDMI). Both protection systems were circumvented in a relatively short time. How much money do you plan to spend?
Maybe your time and energy is spent better on providing a better service to customers so they don't feel any need to "steal" from you.

Add this to the beginning of your script:
if (typeof window === 'undefined')
    throw new Error('This script is meant to run in a web browser');
Obfuscate it as you see fit.
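Note that headless environments can define a fake window object (env.js does exactly that), so you may want to probe for several browser objects at once. Here is a minimal sketch of a stricter check; keep in mind that every one of these objects can still be faked by a determined host environment:
// Probe for several browser globals instead of just `window`.
// All of these can be faked, so this only raises the bar slightly.
(function () {
    var looksLikeBrowser =
        typeof window !== 'undefined' &&
        typeof document !== 'undefined' &&
        typeof navigator !== 'undefined' &&
        typeof document.createElement === 'function';
    if (!looksLikeBrowser) {
        throw new Error('This script is meant to run in a web browser');
    }
}());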
If you are doing this for security purposes it is a pointless endeavour. Everything you do in Javascript running on the users' browsers can be read and modified by the users. Not only that, but it is very easy for them to do so. Any data you don't want the users to see should not be sent to their browser at all. All processing should be done server-side.

Related

Alternatives to JavaScript

At the moment, the only fully supported language, and the de facto standard for DOM tree manipulation in the browser, is JavaScript. It looks like it has deep design issues that make it a minefield of bugs and security holes for the novice.
Do you know of any existing or planned initiative to introduce a better (redesigned) language of any kind (not only JavaScript) for DOM tree manipulation and HTTP requests in next-generation browsers? If yes, what is the roadmap for its integration into, say, Firefox, and if not, for what reasons (apart from interoperability) should JavaScript be the only supported language on the browser platform?
I have already used jQuery and I have also read "JavaScript: The Good Parts". Indeed the suggestions are good, but what I am not able to understand is: why only JavaScript? On the server side (your-favourite-OS platform), we can manipulate a DOM tree in any language, even Fortran. Why does the client side (the browser platform) support only JavaScript?
The problem with JavaScript is not the language itself - it's a perfectly good prototype-based, dynamic language. If you come from an OO background there's a bit of a learning curve, but it's not the language's fault.
Most people assume that JavaScript is like Java because it has similar syntax and a similar name, but actually it's a lot more like Lisp. It's actually pretty well suited to DOM manipulation.
The real problem is that it's interpreted by each browser individually, and that means it works in a very different way depending on the client.
Not only is the actual DOM different depending on the browser, but there's a massive difference in performance and layout.
Edit following clarification in question
Suppose multiple interpreted languages were supported - you still have the same problems. The various browsers would still be buggy and have different DOMs.
In addition you would have to have an interpreter built into the browser, or somehow installed as a plug-in (that you could check for before you served up the page), for each language. It took ages to get JavaScript consistent.
You can't use compiled languages in the same way - then you're introducing an executable that can't easily be scrutinised for what it does. Lots of users would choose not to let it run.
OK, so what about some sort of sandbox for the compiled code? Sounds like Java Applets to me. Or ActionScript in Flash. Or C# in Silverlight.
What about some kind of IL standard? That has more potential. Develop in whatever language you want and then compile it to IL, which the browser then JITs.
Except, Javascript is kind of already that IL - just look at GWT. It lets you write programs in Java, but distribute them as HTML and JS.
Edit following further clarification in question
Javascript isn't, or rather wasn't, the only language supported by browsers: back in the Internet Explorer dark ages you could choose between JavaScript and VBScript to run in IE. Technically IE didn't even run JavaScript - it ran JScript (mainly to avoid having to pay Sun for the word Java; Oracle now owns the JavaScript trademark).
The problem was that VBScript was proprietary to Microsoft, but also that it just wasn't very good. While JavaScript was adding functionality and getting first-rate debugging tools in other browsers (like Firebug), VBScript remained IE-only and pretty much un-debuggable (dev tools in IE4/5/6 were non-existent). Meanwhile VBScript also expanded to become a pretty powerful scripting tool in the OS, but none of those features were available in the browser (and when they were, they became massive security holes).
There are still some corporate internal applications out there that use VBScript (and some rely on those security holes), and they're still running IE7 (they only stopped IE6 because MS finally killed it off).
Getting JavaScript to its current state has been a nightmare and has taken 20 years. It still doesn't have consistent support, with language features (specified in 1999) still missing from some browsers, and lots of shims being required.
Adding an alternate language for interpreting in browsers faces two major problems:
Getting all the browser vendors to implement the new language standard - something they still haven't managed for Javascript in 20 years.
A second language potentially dilutes the support you already have, allowing (for instance) IE to have second-rate JavaScript support but great VBScript (again). I really don't want to be writing code in different languages for different browsers.
It should be noted that JavaScript isn't 'finished' - it's still evolving to become better in new browsers. The latest version is years ahead of the browsers' implementations, and they're working on the next one.
Compile to Javascript
For now, using a language which compiles to Javascript seems to be the only realistic way to reach all the platforms while writing smarter code, and this will likely remain the case for a long time. With any new offering, there will always be some reason why one or more vendors will not rush to ship it.
(But I don't really think this is a problem. Javascript has been nicely optimized by now. Machine code is also unsafe if written by hand, but works fine as a compile target and execution language.)
So many options
There is an ever growing pool of languages that compile to Javascript. A fairly comprehensive list can be found here:
List of languages that compile to JS on the Coffeescript Wiki
Noteworthy
I will mention a few I think are noteworthy (while no doubt neglecting some gems which I am unaware of):
Spider appeared in 2016. It claims to take the best ideas of Go, Swift, Python, C# and CoffeeScript. It isn't typesafe, but it does have some minor safety features.
Elm: Haskell may be the smartest language of them all, and Elm is a variant of Haskell for Javascript. It is highly type-aware and concise, and offers Functional Reactive Programming as a neat alternative to reactive templates or MVC spaghetti. But it may be quite a shock for procedural programmers.
Google's Go is aimed at conciseness, simplicity, and safety. Go code can be compiled into Javascript by GopherJS.
Dart was Google's later attempt to replace Javascript. It offers interfaces and abstract classes through a C/Java-like syntax with optional typing.
Haxe is like Flash's ActionScript, but it can target multiple languages so your code can be re-used in Java, C, Flash, PHP and Javascript programs. It offers type-safe and dynamic objects.
Opalang adds syntactic sugar to Javascript to provide direct database access, smart continuations, type-checking and assist with client/server separation. (Tied to NodeJS and MongoDB.)
GorillaScript, "a compile-to-JavaScript language designed to empower the user while attempting to prevent some common errors," is akin to Coffeescript but more comprehensive, providing a bunch of extra features to increase safety and reduce repetitive boilerplate patterns.
LiteScript falls somewhere in between Coffeescript and GorillaScript. It offers async/yield syntax for "inline" callbacks, and checking for variable typos.
Microsoft's TypeScript is a small superset of Javascript that lets you place type-restrictions on function arguments, which might catch a few bugs. Similarly BetterJS allows you to apply restrictions, but in pure Javascript, either by adding extra calls or by specifying types in JSDoc comments (a small sketch of the JSDoc style follows this list). And now Facebook has offered Flow, which additionally performs type inference.
LiveScript is a spin-off from Coffeescript that was popular for its brevity but does not look very readable to me. Probably not the best for teams.
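To make the JSDoc approach mentioned above concrete, here is a minimal sketch; tools such as the Closure Compiler can check annotations written in this style (the function itself is a made-up example):
// Type annotations live entirely in comments, so this is plain JavaScript.
/**
 * @param {string} name
 * @param {number} times
 * @return {string}
 */
function repeatGreeting(name, times) {
    var out = '';
    for (var i = 0; i < times; i++) {
        out += 'Hello, ' + name + '! ';
    }
    return out;
}
// A checker can now flag repeatGreeting(42, 'oops') as a type error.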
How to choose?
When choosing an alternative language, there are some factors to consider:
If other developers join your project in future, how long will it take them to get up to speed and learn this language, or what are the chances they know it already?
Does the language have too few features (code will still be full of boilerplate) or too many features (it will take a long time to master, and until then some valid code may be undecipherable)?
Does it have the features you need for your project? (Does your project need type-checking and interfaces? Does it need smart continuations to avoid nested callback hell? Is there a lot of reactivity? Might it need to target other environments in future?)
The future...
Jeff Walker has written a thought-provoking series of blog posts about "the Javascript problem", including why he thinks neither TypeScript, nor Dart nor Coffeescript offer adequate solutions. He suggests some desirable features for an improved language in the conclusion.
Should JavaScript be the only supported language on the browser platform?
Yes and no. There is an alternative out there called Dart, by Google, which does compile to JavaScript, and just like jQuery it tries to make DOM manipulation a bit easier. It may be fun to experiment with; check it out.
From Google, see the Dart language.
From Microsoft, see the TypeScript language.
See also
Elm
Kal
It is true that Javascript was at one point notoriously hard to deal with but the web development community has come a long way since. Instead, I would encourage you to have a look at jQuery. It's easy and abstracts away all the various problems.
And there really are no alternatives that work across the board. Flash comes to mind, but that too is ECMAScript, and it's probably overkill for most things.
Short term, I'd use things like jQuery to hide the browser incompatibilities. Long term, technologies like Silverlight or Adobe AIR may make this a very different minefield (but still a minefield) in the future.
Doug Crockford gave a talk at Google detailing the bad and good parts of JavaScript and its future. It actually hasn't changed much at all since 1999 - which can be said to be a good thing (pretty much all browsers can run the same code as long as you're aware of their limitations), and Doug shows how the good parts were mostly misunderstandings that turn out to be very powerful.
For DOM manipulation, look at jQuery, a client-side library that replaces most of the awful DOM API, turning operations that are a pain to write into pretty elegant bits of code.
If you're thinking that JavaScript has deep issues, I recommend Doug Crockford's book, JavaScript: The Good Parts. (Or Google for "Crockford JavaScript" to find several video presentations he's done.) Crockford sketches out a safe subset and set of practices, and specifically lists some parts of the language to avoid.
I'm unaware of plans to replace JavaScript as the de facto means of manipulating the DOM. So best learn to use it safely and well.
On the client side, JavaScript is the only way to manipulate the DOM. On the server side there are a multitude of ways.
Internet Explorer supports pluggable scripting languages, although the only one reliably included with IE besides JScript is VBScript.
As far as I have seen, there seems to be a general sort of bias toward dynamic languages in the browser, and JavaScript seems to fill this need adequately enough that network effects make any other language a non-starter. The language is actually quite powerful, though its implementation in browsers leaves much to be desired.
If you're willing to restrict your customers/visitors to specific browsers, and possibly willing to require them to install a plug-in, you could look at MS Silverlight -- a readable overview is on wikipedia. With Silverlight 2, you can run, client-side, code you've written in C#, IronPython, IronRuby, VB.NET, etc; the free Moonlight clone of Silverlight, from the Mono project, promises to bring the same functionality to Linux.
In practice, most developers of web apps and sites prefer to reach wider audiences than Silverlight (and eventually Moonlight) can currently deliver -- which means sticking with Javascript, or possibly Flash (which uses a similar programming language, Actionscript).
So, gaining substantial mindshare, adoption and traction for anything else is proving to be an uphill fight, even for Microsoft with its large groups of engineers, its marketing budgets, and a free-software project on the side (to possibly ease worries about proprietary lock-in) - which may help explain why there's very little interest, e.g. on the part of the Mozilla Foundation, in pushing towards such a goal. "Apart from interoperability", you say: but clearly the issue of interoperability is THE biggie here, given what we observe with Silverlight's progress...
As already said, you have Flash (ActionScript, a language derived from the same ECMAScript roots as JavaScript) and Silverlight/Moonlight (IronPython, IronRuby, JScript, VBScript, C#) that can run in the browser via plugins (the first one being much more ubiquitous).
There is also another alternative if you like Ruby: HotRuby, it's a ruby implementation in javascript that will run in the browser. It is not very mature yet, but you can have a look at it.
One thing I haven't seen mentioned (oh, I see Alcides mentioned HotRuby while I was writing and Nosredna mentioned GWT and Script#) and would like to throw out there is there are a number of implementations of [insert language]-on-JavaScript (eg. translators that allow you to convert Ruby, Python, C#, Java, Obj-J/Cappuccino [similar to Obj-C/Cocoa] or Processing [for Canvas] to JavaScript either on the client or before deployment [and some of which also feature various abstraction libraries]). Of course there's a performance overhead if it is being translated on the client, but if you are more comfortable with another language it will allow you some flexibility.
Personally, though, I recommend learning to love JavaScript. It's an excellent, powerful language, and pretty elegant once you get to know it. I'm facing the opposite dilemma, chomping at the bit to have a capable server-side JavaScript/DOM solution that meets all of my needs. /unsolicited opinion
JavaScript is the English language of the web. English historically spread because it had a strong navy conquering various countries. This is comparable to the big companies that conquered the web with JavaScript. It's a language cobbled together from multiple European sources (Greek, Latin, Germanic languages, French, even some Chinese and Indian words). JavaScript borrowed a lot of concepts throughout the years from other languages (structural, OO, functional). English is spoken in different places with slight variations in dialect and accent that can render understanding difficult, just like JavaScript has different browsers interpreting it a bit differently.
Even though English is easy to learn initially, it has very inconsistent pronunciation and more exceptions than rules. Just like JavaScript it's always there to offer a surprise.
Despite the different accents, JavaScript is the lingua franca of the web. Just like you might not be English and write here in English, every web browser has a certain degree of English understanding. IE6 is like the guy who says on his resume that he's fluent, but only went to a two week course on English as a foreign language.
There have been attempts to supplant English as the world's main language, e.g. Esperanto. But all of them failed, because most people on earth speak some English. In the same way it will be difficult to introduce better alternatives to JavaScript.
jQuery (still JavaScript, but) will really help you: it has support for almost all the browsers, and it isn't really that hard to learn :)
No. JavaScript is it, but it will evolve. The next version is "JavaScript Harmony," and you can learn more if you Google that.
Now and then someone suggests putting a byte code interpreter into the browsers alongside JavaScript. Probably won't happen, at least for a while.
I happen to love JavaScript. But there are other solutions, including GWT, which compiles Java to JavaScript and Script#, which compiles C# to JavaScript.
I don't think Javascript will be replaced any time soon. For a completely different approach to rich clients, you might want to investigate Flex, which is a Flash-based technology.
Maybe something like haxe (see haxe.org) could help you. It is a language which seems cleaner than JavaScript and can be compiled down to JavaScript, so it can be run inside a browser.
I know that this is not a direct answer to your question, but I thought it might be interesting for you, nevertheless.
Many people understand that JavaScript isn't the best and prettiest language ever. However, it is currently supported by browsers, and thus it will be extremely hard to introduce a different language. We simply don't need another browser war.
This explains why I know of no plans of switching to a different client-side language.
But I think JavaScript isn't so bad if you start thinking about the DOM model and how one would work with it. Many things that are messy with JS are the result of the way the DOM model works.

Is it worth it to code different functionality for users with javascript disabled?

I'm currently building a project and I would like to make use of some simple javascript - I know some people have it disabled to prevent XSS and other things. Should I...
a) Use the simple javascript, those users with it disabled are missing out
b) Don't use the simple javascript, users with it enabled have to click a little more
c) Code both javascript-enabled and javascript-disabled functionality
I'm not really sure, as the web is always changing - what do you recommend?
Degrade gracefully - make sure the site works without JavaScript, then add bells and whistles for those with JavaScript enabled.
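As a minimal sketch of that approach (the /search endpoint and the markup are illustrative), the form below works as plain HTML, and JavaScript upgrades it to an in-page AJAX search when available:
<!-- Works as a normal form submission when JavaScript is off -->
<form id="search" action="/search" method="get">
    <input type="text" name="q">
    <input type="submit" value="Search">
</form>
<div id="results"></div>

<script>
// Bell-and-whistle layer: fetch results in-page instead of navigating.
document.getElementById('search').onsubmit = function () {
    var xhr = new XMLHttpRequest();
    var q = encodeURIComponent(this.elements.q.value);
    xhr.open('GET', '/search?q=' + q, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            document.getElementById('results').innerHTML = xhr.responseText;
        }
    };
    xhr.send();
    return false; // suppress the full-page submit
};
</script>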
Everyone else has contributed good comments, but there are a few other considerations to make.
Sometimes the JavaScript will be hosted on a different domain, and be prone to timeouts.
Sometimes that domain may become inaccessible while your site remains accessible. It's not good to have your site completely fall over in this scenario.
For this reason, "blocking" scripts (i.e. inline document.write) like the one in Google's tracker should be avoided, or at the very least should go as late in the page as possible, so the page renders whether or not the domain is timing out requests.
If you happen to be serving JS from a broken/malicious server, by intent or by accident, page rendering can be halted simply by having the script that serves that JavaScript call "sleep(forever)" once it has sent all the headers.
Some People Use NoScript
Like the above problem, sometimes the client's environment may block certain script sources, be it of the user's choosing or for other reasons (e.g. browser security policies, odd antivirus/anti-malware apps). The most popular and controllable instance of this is NoScript, and I myself paranoidly block some of the popular tracking/advertising services with it (some proxy servers will do this too).
However, if a site is not well designed, when one script fails to load, code that was dependent on that script being present still executes, which yields errors and stops everything working.
My recommendation is:
Use Firebug
Use NoScript and block out everything --> See site still works
Enable the core site scripts that you can't do without --> See site still works and Firebug doesn't whine
Enable 3rd-party stuff --> See site still works and Firebug doesn't whine
There are a lot of other complications that can crop up, but satisfying the above two should solve most of them. Just assume that, for whatever reason, one or more resources that comprise a page are liable to spontaneously disappear (they do, all the time), and you want the page to "survive" this problem as gracefully as possible. For problems that persist for less than 10 seconds, it's not so bad - refresh the page and it's fixed - but a problem that can occur and severely hamper usability for an hour or more at a time is another matter.
In essence, instead of thinking "oh, there's the edge case of users that don't have javascript", try thinking more along the lines of "it's really easy to have something go wrong and have ALL of our users with broken javascript. Ouch! Let's try to make it so we don't really hose ourselves when that does happen."
(I've seen IE updates get rolled out and hose javascript for that entire browser until the people who wrote the scripts found a workaround. Losing all your IE customers is not a good thing.)
:set sarcasm
:set ignoreSpelling
:set iq=76
Don't worry, its only a 5% Niché Market
Nobody cares about targeting Niché markets right? All those funny propeller heads running lynx in their geeky stupid linoox cpus, spending all their time on the intarwebs surfing because they have nothing better to do with their life or money? the crazy security paranoid nerds disabling javascript left and right because they don't like it?
Nobody wants them as your primary customer now do they?
Niché markets. Pfft. Who cares!
:set nosarcasm
Consider your audience
"Degrade gracefully" is generally the best answer. But lots of sites now depend on JS - especially AJAX.
Consider your audience. If your site is aimed at extremely tech-savvy people, the chances of them not having javascript are small, and you can notify them to turn it on if necessary.
If your audience may access your site with mobile devices, don't assume they have JavaScript, and don't even assume they support CSS properly. Aim to degrade gracefully all the way down to bare HTML.
I've learned a lot from my question: What's With Those Do-Not-Use Javascript People
Go with Ajax and Web 2.0. It's the way the web is going and it's wonderful. Isn't Stackoverflow great to be on? It's not quite as nice with your Javascript turned off.
Once you have your site ready, but before you let it go live, test it with Javascript off, and just add whatever you feel you need to make your site appear and function for those users. You only need to add what you feel is essential.
Remember, except for visually impaired people using screen readers, the others have chosen to turn javascript off. They can also choose to trust your site and turn javascript on for your site if they want to use all the functionality you have. It really is their choice.
As others have said, it should "degrade gracefully".
In other words, it must work without Javascript (period). It doesn't have to work well. The folks who've disabled Javascript know the limitations that causes and have accepted them. But if you are trying to sell them something, it's important that they can still buy it.
On the site I'm designing, there's a javascript-based fly-out menu. With Javascript off, all the flyouts are always open. It doesn't look as cool as it would with JS, but it can still be used to navigate the site.
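A sketch of how such a menu can be wired up (the class names are illustrative): the flyouts are plain nested lists that are visible by default, and JavaScript, when present, collapses them and adds the hover behaviour:
<ul class="menu">
    <li>Products
        <ul class="flyout">  <!-- visible by default, so it works without JS -->
            <li><a href="/widgets">Widgets</a></li>
            <li><a href="/gadgets">Gadgets</a></li>
        </ul>
    </li>
</ul>
<script>
// With JavaScript available, collapse the flyouts and show them on hover.
// (Simplified: a production menu would debounce mouseout to avoid flicker.)
var items = document.querySelectorAll('.menu > li');
for (var i = 0; i < items.length; i++) {
    (function (li) {
        var flyout = li.querySelector('.flyout');
        if (!flyout) return;
        flyout.style.display = 'none';
        li.onmouseover = function () { flyout.style.display = 'block'; };
        li.onmouseout  = function () { flyout.style.display = 'none'; };
    }(items[i]));
}
</script>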
It depends on how much time you have to develop and maintain both solutions, and how much the non-javascript users are worth to you.
My e-commerce site relies heavily on javascript, and in over a year and a half, I've not received a single complaint.
In fact, I don't think I've seen a single visitor with javascript disabled in any of my logs since I started.
That doesn't mean they're not out there. It just means that either (a) they're a tiny percentage, (b) they're not interested in what I'm selling, or (c) both of the above.
Code your web site with support for the bare minimum kind of browser. Then more people can use your site without frustration even if they don't have all the bells and whistles--like Flash, Javascript, and Java--enabled. It may not be practical to continue support for ancient browsers, say Netscape Navigator 4, because a user can be reasonably expected to keep their computer up-to-date. However, features like Javascript, Flash, and Java can be security holes in old or modern browsers, as well as being an annoyance.
Neither of my parents keeps Javascript or Flash enabled, because they've had too many experiences with them slowing down their already slow connection, crashing their browsers, or being more of an annoyance on sites that use them stupidly (which is a lot of them...) than a useful feature. It's just bad design if, for example, your form requires an AJAX call to be made and you can't actually hit a submit button to send the form when Javascript is disabled.
My mother was recently quite frustrated to discover that she is now unable to click through eBay results pages because each one requires Javascript. The only way she can see the next page of results is to turn on Javascript or to show more results per page. Now what reason would there be for page links to require Javascript while the 'results per page' links are just plain links? They should all be plain old HTML links. Maybe Javascript could be used to add some whiz-bang to the navigation, but a user should not be punished with a bad interface for having Javascript disabled. It's stupid on eBay's part, and it causes undue hassle for their users.
I am one of those that use NoScript, and I can tell you that sites which use javascript and don't work without it enabled are extremely annoying - Stack Overflow included. No, we don't expect it to be very fancy; if I upvote, load a new page that says "Thank you."
We expect to be able to use the site with reasonable limitations. Don't ever display a page that says JS must be enabled, though, even if the site is crap without it. And yes, if your site convinces us to stay, we will enable it. A function that isn't in common use on the site can also require javascript.
Please note that your site should also look good with no JS or CSS; if nothing else, it is good for bots.
As others have pointed out, some phones don't have JS. This is changing, but it's another good reason to have a reasonable non-JS version. I suggest coding without JS first and adding JS after the former works; there are good ways for JS to work with the non-JS layout.
It helps me in my implementations to think about it as "progressive enhancement" rather than graceful degradation. Degradation often leads you to figure out how to make it work without JS after it is implemented, instead of making a baseline and enhancing it with JS.
It is essential to at least test your website is functional when JavaScript is turned off.
As orip says, degrading gracefully is very important. It should be vital that your page both looks nice and functions when JavaScript is disabled.
For a standard web site that is primarily intended for conveying information, degrade gracefully always.
For web applications:
When building a web application for a standard internet audience, I would keep the three following facts in mind:
95%-97% of potential users will have JavaScript enabled.
At times established users will need to access functionality when JavaScript is not available.
3%-5% of potential users will have JavaScript intentionally disabled.
Given fact one, if you believe that building a JavaScript reliant web application will deliver a superior user experience, then by all means do it. Doing so may help you accumulate users.
However, given fact two, you should always provide a means by which your users can access core functionality without JavaScript. Do you need to offer every single feature? Probably not. But a user should be able to get his or her work done. This will keep your users happy when they find themselves temporarily without JavaScript.
Given fact three, I would also provide an in-depth tour as an attempt to entice these users to enable JavaScript.
As an aside, one of my favorite web applications, Remember The Milk, follows this approach. Also, Google's Calendar application is unusable without JavaScript. So JavaScript-reliant web apps are on the rise, and that trend is probably unstoppable. In my opinion this is a good thing.
(Do keep in mind that JavaScript makes accessibility an even bigger problem than it already is. Please do make an effort to make your apps usable by those with disabilities.)
As said before, it depends on your target audience.
If I'm part of it, you want to make sure that your site works (if not ideally) on my phone, and that it gives me reason to turn Javascript on when I surf there with it off. Nobody expects full functionality with Javascript disabled, and anybody who uses their phone to access websites expects some issues, but you need to at least provide teasers. For a web store, make sure customers can see at least some merchandise anyway, even if they can't buy without Javascript.

Is it reasonable to assume my visitors have javascript enabled?

I understand that server-side validation is an absolute must to prevent malicious users (or simply users who choose to disable javascript) from bypassing client-side validation. But that's mainly to protect your application, not to provide value for those who are running browsers with javascript disabled. Is it reasonable to assume visitors have javascript enabled and simply have an unusable site for those who don't?
I browse with NoScript in Firefox, and it always annoys me when I get pages that don't work. That said - know your audience. If you're trying to cater to paranoid computer security professionals - assume they might not have JavaScript enabled. If you're going for a general audience, JavaScript is probably on.
Totally depends on who you're aiming at.
If your site or app is for an intranet, you can make lots of assumptions. If your target audience is bleeding-edge social-networking types, you can assume JavaScript will work. If you anticipate a lot of paranoid sysadmin types, you can assume a bunch of them will be trying to access your site in Lynx or will have JS turned off for "security reasons."
A good example of this is Amazon -- their approach is driven by their business goals. They are a mass-market site, but for them, locking out users in old/incapable browsers means potential lost sales, so they work hard on non-script fallbacks.
So like lots of these kinds of questions, the answer is not just regurgitating what you've read somewhere about accessibility or progressive enhancement. The real answer is "it depends."
I think there is another reason to support at least the main functionality without JS: lots of us now browse from mobiles and PDAs, which don't have the same level of JavaScript support.
http://www.w3schools.com/browsers/browsers_stats.asp
They claim 95% of users have Javascript on.
There's at least one category where the answer is definitely "no". If you work for the government, you must make sure the site is accessible to those using screen readers.
Is it reasonable to assume visitors have javascript enabled and simply have an unusable site for those who don't?
There are actually two questions, and the answers are: Yes, it is reasonable to assume visitors have javascript enabled. And, No this does not mean others should be left with unusable site.
Progressive enhancement is the way to go. Have your site usable without javascript and then add bells and whistles.
As for client-side validation, it is no more than a convenience for the user, avoiding unnecessary round trips to the server (where the real validation should be performed).
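For instance, a check like the following saves a round trip for the common case, while the server re-runs the same check on every submission (the form id and the pattern are illustrative):
// Convenience-only validation: the server must repeat this check.
document.getElementById('signup').onsubmit = function () {
    var email = this.elements.email.value;
    if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(email)) {
        alert('Please enter a valid email address.');
        return false; // save the user a round trip
    }
    return true; // normal submit proceeds; real validation happens server-side
};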
I browse with the NoScript plugin in Firefox and I'm surprised at the number of developers who haven't even considered making their site degradable.
Never assume the user has JavaScript enabled - especially since it may not even be their fault when it's off. Many enterprises have firewalls which block JavaScript/ActiveX etc. In this instance the <noscript> element won't help either (the browser still has scripting enabled; the script simply never arrives), so I would NOT recommend relying on that!
Unless you're creating a full-on web application which is going to be 90% Ajax, you must make sure to abide by standards and progressively enhance your site through various layers of interactivity.
Also don't forget the importance of object detection, especially with the rise of mobile phone web browsing. One of the most popular mobile web browsers (Opera Mini 4.0) doesn't allow all "background JavaScript" to work, and Ajax calls rarely execute correctly... Just something to be aware of.
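Object detection here means testing for the capability rather than sniffing the browser's name; a minimal sketch:
// Test for the capability, not the browser's user-agent string.
if (window.XMLHttpRequest) {
    // Safe to enhance the page with Ajax here.
} else {
    // Fall back to plain links and full-page form submits.
}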
To be honest I am sick and tired of developers that think everyone will have JS enabled! What ignorance!!
Yes it is. But expose as much of it as possible through regular HTML and URLs, if for nothing else than for Google.
Accessible, yes... functional? Not really.
This is really a customer requirement question more than developer-answerable, but if your customer tries to enforce a requirement that non-JS browsers work, you should argue heavily against it and really hammer them on the "cool" factor they'll be missing.
Given the heavy reliance by GWT, RichFaces, etc. on Javascript, it's just not feasible to make an app with any kind of user-friendly UI without it.
You should certainly warn non-JS enabled users that the site they're trying to visit relies heavily on JS, though. No point in being rude about it.
It is ok in these days to assume your visitors have JS enabled. With that said, you should strive for the best possible degradation of your site with JS disabled. It is ideal if your site falls back to a state that is still usable without JS.
No! Some environments will have it disabled as a matter of policy, with nothing you can do to enable it. And even if it's enabled, it might be crippled.
This question has been asked before.
One interesting point to consider is that as a web developer you have a social responsibility to push technology forward - and by using things like AJAX, you increase exposure and potentially rate of adoption along with it. The only thing that should stop you from using the tech to its fullest extent is money - if you won't make the money that you need because people will have trouble viewing the material, you've got to reconsider.
Never ever assume Javascript will be available for form validation, as your question implies. Someone will eventually realise this and turn Javascript off.
Instead, code the app in a fairly regular HTML manner and use Javascript for what it is: an optional perk for your users.
Even for an entirely AJAX app like Gmail, the complete set of form validations is still required on the server side.
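A minimal sketch of what that looks like on the server, here using Node.js with Express (one server stack among many; the route and field names are illustrative):
// Server-side validation: runs regardless of what client-side JS did.
var express = require('express');
var app = express();
app.use(express.urlencoded({ extended: false })); // parse form bodies

app.post('/signup', function (req, res) {
    var email = (req.body.email || '').trim();
    // Re-run every check here; the client-side version was only a convenience.
    if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(email)) {
        res.status(400).send('Invalid email address');
        return;
    }
    res.send('OK');
});

app.listen(3000);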
Yes it is. JavaScript is as old as CSS, and no one tries to build around browsers that don't support CSS. Cross-site scripting is the reason people are afraid of JavaScript, but believe me, if a developer wants to screw you over he doesn't need JavaScript to do it. As for mobile browsers, most of them now have JavaScript, and the others shouldn't be considered browsers. My advice is not to open yourself to hackers by making your site vulnerable to those who choose to turn off their JavaScript, but at the same time don't go out of your way to support those who are living in the stone age. You aren't going to support IE 4 or Netscape, right? Then why support those who sabotage their own browsers out of blatant fear or paranoia?
I think it's fair to assume that the majority of visitors to your site will have JavaScript enabled. Some of the more trafficked sites out there have a dependency on JavaScript. For example, I was surprised to learn that you can't authenticate through a Passport-enabled site without a JS-enabled browser.
Nearly all (but not quite all!) users will have javascript enabled. (I believe the figure quoted above of about 5% is accurate.)
Given the vast improvement in usability you can make with the judicious use of javascript, my opinion is that most of the time, it is reasonable to assume it is enabled.
There will of course be some instances where that is not the case (e.g. a site designed for mobile devices, or one with a high percentage of disabled users, etc.), and an effort should always be made to make your site as accessible as possible to as large a percentage of the population as possible.
That said, if you only have a low-traffic site, 5% of a small number is a very small number. It may not be worth bending over backwards to make your site accessible to these people when it may only gain you one or two extra users.
I guess the short answer is (as always): there is no correct answer - it will depend entirely on the target use, and target users, of the site in question.
According to this little page Javascript is enabled in 95% of browsers and it keeps raising.
The W3C Browser Statistics page (scroll down) has some information on this; they say that 95% of visitors have JavaScript on as of January 2008.
It's reasonable to assume your visitors have javascript enabled !-)
-- but of course it depends on who you're trying to reach ...
Several times above w3schools has been mentioned and, as Dan stated, it's their own visitors being counted, which makes it somewhat quirky to draw conclusions from.
However, if you look at theCounter.com, it seems that their audience has the same habits in general on this point ...
A twist that hasn't been mentioned yet is the growing number of crawlers, mail harvesters and so on; they definitely do not have javascript turned on, and how good are counters at detecting them ?-)
My guess would be that this sort of machine-browser fills up a lot of those 5-6% !o]
-- that said, if it's at all possible, make your app degrade gracefully (as a wise man said)
Your question seems to suggest form-based input for an application. If it's an intranet application, then you'd be guided by the in-house security experts. If it's a public app, then as other posters have suggested, fail gracefully.
I will argue that it is more than reasonable to expect them to have javascript so long as you provide suitable means to replace javascript should it not be enabled. One of the reasons that I like the Yahoo UI Library is that it degrades gracefully.
I always try to code my sites as static ones first, THEN I add JS/Ajax functionality. This way I can be fairly sure they will work in non-JS browsers :)
But javascript is like Flash: nearly all users have it, yet the developer still has to worry about WHAT IF...? :D
No it's not - period, full stop, end of story. It's just naive and wrong at an ethical level, not to mention you miss out on around 50% of Internet users worldwide (believe it or not, 70% of web access worldwide is from mobile devices).
Add extra nifty stuff that requires Javascript, that's fine. Don't make your site unusable without Javascript unless you have a really, really, really good reason to do so.
Someone rightly pointed out that I don't have evidence to back up my claim of 70% mobile web users. Unfortunately I can't find the source I got it from, but I remember it being authoritative, so I have no reason to doubt it. It does make sense when you consider worldwide usage: many developing countries have more mobile phones than landlines and broadband. A statistic quoted in my not-to-be-found source was that one African country in particular has 300,000 landlines but 1.5 million mobile phones!
This is totally a "it depends" question, as many people have pointed out.
This is why metrics are valuable on sites, to show whether you can really run with the claim that "major sites say the majority of people have JS on" - you could have a site where it's 99%.
I won't dig into what's been said above, as it's been answered very well :)
Not that everyone else hasn't chimed in, but I disagree with the "look at your audience" position to some extent.
It really should be "look at your app". If you are just displaying some information and your JS is for bell-and-whistle purposes, then certainly look at nice degradation, if you want to.
However, if you're building something like Google Docs, it's really asinine that someone would think they could use your site without js, so perhaps let them know that via a nice sarcastic message inside <noscript> tags.
From a purely philosophical point of view, if users want to access your site, they will flip the js switch, or upgrade to a decent browser, etc. And you should force them to do this, because evolution is important for the survival of the species.
According to this site, 95% of browsers use JavaScript.
That said, there are a LOT of bots that don't use JavaScript: scrapers, search bots, etc. I'd say closer to 100% of actual human users use JavaScript. But your guess is just as good as mine.

When writing a web backend for a system, is it important the code can still work without Javascript?

Is it alright to expect that the user using the back end will have Javascript enabled?
I guess the answer I'll get is 'it depends on your target users'. I am developing a system for fun that will hopefully be used by other people. I would like to hear from other people developing back end systems, and what did they decide to do and why?
SEO I'm not concerned with, and semantics aren't of as much importance.
Personally I would expect a fallback, but there are circumstances (particularly low-profile sites, intranets, e-learning content) where you can assume JS.
Mostly you can even go with a simple "You require JS / This works better with JS" notice, and I would consider that good enough, but there are a couple of instances where I would demand a real fallback:
.gov or other public service sites (legal requirements)
sites for web-tech companies (you need to demonstrate your ability to do this)
very high traffic sites (where the 3% of non-JS users becomes a high absolute number)
sites (or pages) for mobile devices (most of these haven't got JS reliably)
In general, it's reasonably easy to provide some kind of noscript, so why not do it anyway?
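The simplest form of "some kind of noscript" is literally the <noscript> element (the URLs here are illustrative), keeping in mind the caveat mentioned earlier that it doesn't fire when a firewall strips the scripts but the browser itself has scripting enabled:
<div id="app">
    <noscript>
        <!-- Shown only when the browser itself has scripting disabled -->
        <p>This tool needs JavaScript to run interactively.
           You can still <a href="/report.csv">download the raw data</a>
           or use the <a href="/basic">basic HTML version</a>.</p>
    </noscript>
</div>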
If it's for fun, please go ahead and require javascript.
Considering the three points:
a back end means only a few people will be accessing it (and all of them probably have some knowledge about the web too, e.g. they know what javascript is and how to get it enabled)
SEO isn't important
it's for fun
I'd say that it's alright. :)
annakata provided a pretty good insight as well.
It really depends on your application and its target audiences. Do you care about accessibility (can disabled people use the site)? Do you want your site to work in various mobile browsers with limited JavaScript support? I would try to build the site so that it degrades gracefully without CSS or JavaScript - unless your site is very dynamic, like, say, a word processor, which can't possibly work at all without JavaScript.
Yes, it mostly depends on your target users.
Whatever the front end is, the back end must be bulletproof.
At the least, it should ensure that nobody can hack it or make a mess by disabling javascript.
Server-side filtering/validation is important for security, while client-side validation and interactivity are important for usability.
I don't think it's unreasonable to require Javascript for a web based backend/CMS, where your target users are likely to be a fairly small and pretty specific group.
All the CMS systems that I've worked on so far have required it.
I refer you to this post by Jeff Atwood. The important assertion in it is that you can expect javascript to work as expected across browsers. The security risks are also lower today. So I would say that it is now safe to ignore clients that do not enable javascript. If you want to attract users, javascript gives you a clear advantage.
The only exception I can think of is mobile sites. Although mobile browsers have gotten better and do support javascript, the extra download bandwidth and the small screen make JS less suitable.
As long as the function your application serves is general, I'd say it is safe to rely on Javascript. One of the sites I manage receives ~35,000 UVs on a good day, so I think it is fair to say we come into contact with quite a variety of browser and operating system combinations. According to our stats, roughly 97% of our users have Javascript enabled.
If it can fail elegantly without Javascript, I'd opt for that solution, but I wouldn't lose sleep over the fact that you might be losing a few people everyday.

Why JavaScript rather than a standard browser virtual machine?

Would it not make sense to support a set of languages (Java, Python, Ruby, etc.) by way of a standardized virtual machine hosted in the browser rather than requiring the use of a specialized language -- really, a specialized paradigm -- for client scripting only?
To clarify the suggestion, a web page would contain byte code instead of any higher-level language like JavaScript.
I understand the pragmatic reality that JavaScript is simply what we have to work with now due to evolutionary reasons, but I'm thinking more about the long term. With regard to backward compatibility, there's no reason that inline JavaScript could not be simultaneously supported for a period of time and of course JavaScript could be one of the languages supported by the browser virtual machine.
Well, yes. Certainly if we had a time machine, going back and ensuring a lot of the Javascript features were designed differently would be a major pastime (that, and ensuring the people who designed IE's CSS engine never went into IT). But it's not going to happen, and we're stuck with it now.
I suspect, in time, it will become the "Machine language" for the web, with other better designed languages and APIs compile down to it (and cater for different runtime engine foibles).
I don't think, however, any of these "better designed languages" will be Java, Python or Ruby. Javascript is, despite the ability to be used elsewhere, a Web application scripting language. Given that use case, we can do better than any of those languages.
I think JavaScript is a good language, but I would love to have a choice when developing client-side web applications. For legacy reasons we're stuck with JavaScript, but there are projects and ideas looking for changing that scenario:
Google Native Client: technology for running native code in the browser.
Emscripten: LLVM bytecode compiler to javascript. Allows LLVM languages to run in the browser.
Idea: .NET CLI in the browser, by the creator of Mono: http://tirania.org/blog/archive/2010/May-03.html
I think we will have JavaScript for a long time, but that will change sooner or later. There are so many developers willing to use other languages in the browser.
Answering the question - No, it would not make sense.
Currently the closest things we have to a multi-language VM are the JVM and the CLR. These aren't exactly lightweight beasts, and it would not make sense to try and embed something of this size and complexity in a browser.
Let's examine the idea that you could write a new, multilanguage VM that would be better than the existing solution.
You're behind on stability.
You're behind on complexity (way, way, behind because you're trying to generalize over multiple languages)
You're behind on adoption
So, no, it doesn't make sense.
Remember, in order to support these languages you're going to have to strip down their APIs something fierce, chopping out any parts that don't make sense in the context of a browser script. There are a huge number of design decisions to be made here, and a huge opportunity for error.
In terms of functionality, we're probably only really working with the DOM anyway, so this is really an issue of syntax and language idiom, at which point it does make sense to ask, "Is this really worth it?"
Bear in mind that the only thing we're talking about is client-side scripting, because server-side scripting is already available in whatever language you like. It's a relatively small programming arena, so the benefit of bringing multiple languages in is questionable.
What languages would it make sense to bring in? (Warning, subjective material follows)
Bringing in a language like C doesn't make sense because it's made for working with metal, and in a browser there isn't much metal really available.
Bringing in a language like Java doesn't make sense because the best thing about it is the APIs anyway.
Bringing in a language like Ruby or Lisp doesn't make sense because JavaScript is a powerful dynamic language very close to Scheme.
Finally, what browser maker really wants to support DOM integration for multiple languages? Each implementation will have its own specific bugs. We've already walked through fire dealing with differences between MS Javascript and Mozilla Javascript and now we want to multiply that pain five or six-fold?
It doesn't make sense.
On Windows, you can register other languages with the Scripting Host and have them available to IE. For example VBScript is supported out of the box (though it has never gained much popularity as it is for most purposes even worse than JavaScript).
The Python win32 extensions allowed one to add Python to IE like this quite easily, but it wasn't really a good idea as Python is quite difficult to sandbox: many language features expose enough implementation hooks to allow a supposedly-restricted application to break out.
It is a problem in general that the more complexity you add to a net-facing application like the browser, the greater likelihood of security problems. A bunch of new languages would certainly fit that description, and these are new languages that are also still developing fast.
JavaScript is an ugly language, but through careful use of a selective subset of features, and support from suitable object libraries, it can generally be made fairly tolerable. It seems incremental, practical additions to JavaScript are the only way web scripting is likely to move on.
I would definitely welcome a standard language independent VM in browsers (I would prefer to code in a statically typed language).
(Technically) it's quite doable gradually: first one major browser supports it, and the server can then either send bytecode (if the current request comes from a compatible browser) or translate the code to JavaScript and send that as plain text.
There already exist some experimental languages that compile to JavaScript, but having a defined VM would (maybe) allow for better performance.
I admit that the "standard" part would be quite tricky, though. Also, there would be conflicts between language features (e.g. static vs. dynamic typing) concerning the library (assuming the new thing would use the same library). Therefore I don't think it's gonna happen (soon).
If you feel like you are getting your hands dirty, then you have either been brainwashed or are still feeling the after-effects of the "DHTML years". JavaScript is very powerful and is well suited to its purpose, which is scripting interactivity on the client side. This is why JavaScript 2.0 got such a bad rap: why add packages, interfaces, classes, and the like, when those are clearly aspects of server-side languages? JavaScript is just fine as a prototype-based language, without being full-blown object-oriented.
If there is a lack of seamlessness to your applications because the server-side and client-side are not communicating well, then you might want to reconsider how you architect your applications. I have worked with extremely robust Web sites and Web applications, and I have never once said, "Hmm, I really wish JavaScript could do (xyz)." If it could do that, then it wouldn't be JavaScript -- it would be ActionScript or AIR or Silverlight. I don't need that, and neither do most developers. Those are nice technologies, but they try to solve a problem with a technology, not a... well, a solution.
I don't think that a standard web VM is that inconceivable. There are a number of ways you could introduce a new web VM standard gracefully and with full legacy support, as long as you ensure that any VM bytecode format you use can be quickly decompiled into javascript, and that the resulting output will be reasonably efficient (I would even go so far as to guess that a smart decompiler would probably generate better javascript than any javascript a human could produce themselves).
With this property, any web VM format could be easily decompiled either on the server (fast), on the client (slow, but possible in cases where you have limited control of the server), or could be pre-generated and loaded dynamically by either the client or the server (fastest) for browsers that don’t natively support the new standard.
Those browsers that DO natively support the new standard would benefit from increased speed of the runtime for web vm based apps. On top of that, if browsers base their legacy javascript engines on the web vm standard (i.e. parsing javascript into the web vm standard and then running it), then they don’t have to manage two runtimes, but that’s up to the browser vendor.
While Javascript is the only well-supported scripting language you can control the page directly from, Flash has some very nice features for bigger programs. Lately it has a JIT and can also generate bytecode on the fly (check out runtime expression evaluation for an example where they use flash to compile user-input math expressions all the way to native binary). The Haxe language gives you static typing with inference and with the bytecode generation abilities you could implement almost any runtime system of your choice.
Quick update on this old question.
To everyone who affirmed that a "web page containing byte code instead of any higher-level language like JavaScript" "won't happen":
in June 2015 the W3C announced WebAssembly, which is
a new portable, size- and load-time-efficient format suitable for compilation to the web.
This is still experimental, but there are already prototype implementations in Firefox Nightly and Chrome Canary, and there are already working demonstrations.
Currently, WebAssembly is mostly designed to be produced from C/C++; however,
as WebAssembly evolves it will support more languages than C/C++, and we hope that other compilers will support it as well.
I'll let you take a closer look at the official page of the project - it is truly exciting!
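For the curious, this is roughly how such a module is driven from JavaScript in the current prototypes (the file name and the exported add function are made-up examples):
// Fetch a compiled module and call one of its exports from JavaScript.
// 'demo.wasm' and its exported add() function are hypothetical.
fetch('demo.wasm')
    .then(function (response) { return response.arrayBuffer(); })
    .then(function (bytes) { return WebAssembly.instantiate(bytes); })
    .then(function (result) {
        console.log(result.instance.exports.add(2, 3)); // e.g. 5
    });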
This question resurfaces regularly. My stance on this is:
A) won't happen, and B) is already here.
Pardon, what? Let me explain:
Ad A
A VM is not just some sort of universal magical device. Most VMs are optimized for a certain language and certain language features. Take the JRE/Java (or LLVM): optimized for static typing, and there are definitely problems and downsides when implementing dynamic typing or other things Java didn't support in the first place.
So, the "general multipurpose VM" that supports lots of language features (tail call optimization, static & dynamic typing, foo bar boo, ...) would be colossal, hard to implement, and probably even harder to optimize for good performance. But I'm no language designer or VM guru; maybe I'm wrong and it's actually pretty easy, only nobody has had the idea yet? Hrm, hrm.
ad B
already here: there may not be a bytecode compiler/vm, but you don't actually need one. afaik javascript is turing complete, so it should be possible to either:
create a translator from language X to javascript (e.g. coffeescript)
create an interpreter in javascript that interprets language X (e.g. brainfuck). yes, performance would be abysmal, but hey, can't have everything. (a minimal sketch follows.)
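to illustrate the second option, here's a minimal, deliberately naive brainfuck interpreter in javascript. no input (,) support, no error handling, just a proof of concept:

function brainfuck(src) {
  var tape = {}, ptr = 0, out = '', pc, depth;
  for (pc = 0; pc < src.length; pc++) {
    switch (src.charAt(pc)) {
      case '>': ptr++; break;
      case '<': ptr--; break;
      case '+': tape[ptr] = ((tape[ptr] || 0) + 1) & 255; break;
      case '-': tape[ptr] = ((tape[ptr] || 0) - 1) & 255; break;
      case '.': out += String.fromCharCode(tape[ptr] || 0); break;
      case '[': // on a zero cell, jump forward past the matching ]
        if (!tape[ptr]) {
          for (depth = 1; depth > 0; ) {
            pc++;
            if (src.charAt(pc) === '[') depth++;
            if (src.charAt(pc) === ']') depth--;
          }
        }
        break;
      case ']': // on a non-zero cell, jump back to the matching [
        if (tape[ptr]) {
          for (depth = 1; depth > 0; ) {
            pc--;
            if (src.charAt(pc) === ']') depth++;
            if (src.charAt(pc) === '[') depth--;
          }
        }
        break;
    }
  }
  return out;
}

// the classic "Hello World!" program:
brainfuck('++++++++[>++++[>++>+++>+++>+<<<<-]>+>+>->>+[<]<-]>>.>---.+++++++..+++.>>.<-.<.+++.------.--------.>>+.>++.');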
ad C
what? there wasn't a point C in the first place!? because there isn't ... yet. google NaCl. if anyone can do it, it's google. as soon as google gets it working, your problems are solved. only, uh, it may never work, i don't know. the last time i read about it there were some unsolved security problems of the really tricky kind.
apart from that:
javascript's been there since ~1995 = 15 years. still, browser implementations differ today (although at least it's not insufferable anymore). so, if you start something new now, you might have a version working cross-browser around 2035. at least a working subset. that only differs subtly. and needs compatibility libs and layers. no point in not trying to improve things though.
also, what about readable source code? i know a lot of companies would prefer not to serve their code as "kind-of" open source. personally, i'm pretty happy i'm able to read the source if i suspect something fishy or want to learn from it. hooray for source code!
Indeed. Silverlight is effectively just that - a client-side .NET-based VM.
There are some errors in your reasoning.
A standard virtual machine in a standard browser will never really be standard. We have 4 browsers, and IE has conflicting interests with regard to 'standards'. The other three are evolving fast, but the adoption rate of new technologies is slow. And what about browsers on phones, small devices, ...?
The integration of JS in the different browsers and its past history lead you to underestimate the power of JS. You plead for a standard, but dismiss JS because standardization didn't work out in its early years.
As told by others, JS is not the same as AIR/.NET/... and the like. JS in its current incarnation perfectly fits its goals.
In the long term, Perl and Ruby could well serve as replacements for JavaScript. Yet the adoption of those languages in the browser is slow, and it is widely accepted that they will never take over from JS.
How do you define best? Best for the browser, or best for the developer? (Plus, ECMAScript is different from JavaScript, but that is a technicality.)
I find that JavaScript can be powerful and elegant at the same time. Unfortunately most developers I have met treat it like a necessary evil instead of a real programming language.
Some of the features I enjoy are (a quick sketch follows the list):
treating functions as first-class citizens
being able to add and remove functions on any object at any time (not often useful, but mind-blowing when it is)
it is a dynamic language.
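A quick sketch of the first two points (all names here are just for illustration):

// Functions are first-class values: they can be passed around and returned.
function twice(f) {
  return function (x) { return f(f(x)); };
}
function inc(x) { return x + 1; }
twice(inc)(5); // 7

// Objects are open: methods can be attached (and removed) at runtime.
var cat = { name: 'Felix' };
cat.speak = function () { return this.name + ' says meow'; };
cat.speak();      // "Felix says meow"
delete cat.speak; // and it's gone again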
It's fun to deal with and it is established. Enjoy it while it is around because while it may not be the "best" for client scripting it is certainly pleasant.
I do agree it is frustrating to make dynamic pages because of browser incompatibilities, but that can be mitigated by UI libraries. That should not be held against JavaScript itself any more than Swing should be held against Java.
JavaScript is the browser's standard virtual machine. For instance, OCaml and Haskell now both have compilers that can output JavaScript. The limitation is not JavaScript the language, the limitation is the browser objects accessible via JavaScript, and the access control model used to ensure you can safely run JavaScript without compromising your machine. The current access controls are so poor, that JavaScript is only allowed very limited access to browser objects for safety reasons. The Harmony project is looking to fix that.
It's a cool idea. Why not take it a step further?
Write the HTML parser and layout engine (all the complicated bits in the browser, really) in the same VM language
Publish the engine to the web
Serve the page with a declaration of which layout engine to use, and its URL
Then we can add features to browsers without having to push new browsers out to every client - the relevant new bits would be loaded dynamically from the web. We could also publish new versions of HTML without all the ridiculous complexity of maintaining backwards compatibility with everything that's ever worked in a browser - compatibility is the responsibility of the page author. We also get to experiment with markup languages other than HTML. And, of course, we can write fancy JIT compilers into the engines, so that you can script your webpages in any language you want.
I would welcome any language besides JavaScript as a possible scripting language.
What would be cool is to use other languages than JavaScript. Java would probably not be a great fit between the tags, but languages like Haskell, Clojure, Scala, Ruby, or Groovy would be beneficial.
I came across RubyScript a while ago ...
http://almaer.com/blog/running-ruby-in-the-browser-via-script-typetextruby and http://code.google.com/p/ruby-in-browser/
Still experimental and in progress, but looks promising.
For .NET I just found http://www.silverlight.net/learn/dynamic-languages/ - it looks interesting too. It even works from my Mac.
I don't know how well the above work as alternatives to JavaScript, but they look pretty cool at first glance. Potentially, they would allow one to use any Java or .NET framework natively from the browser - within the browser's sandbox.
As for safety, if the language runs inside the JVM (or the .NET engine, for that matter), the VM will take care of security, so we don't have to worry about that - at least no more than we should for anything that runs inside the browser.
Probably, but to do so we'd need to get the major browsers to support them. IE support would be the hardest to get. JavaScript is used because it is the only thing you can count on being available.
The vast majority of the devs I've spoken to about ECMAScript et al. end up admitting that the problem isn't the scripting language, it's the ridiculous HTML DOM that it exposes. Conflating the DOM and the scripting language is a common source of pain and frustration with ECMAScript. Also, don't forget, IIS can use JScript for server-side scripting, and things like Rhino let you build free-standing apps in ECMAScript. Try working in one of these environments with ECMAScript for a while, and see if your opinion changes.
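For instance, this is plain ECMAScript with no DOM in sight; it should run as-is in the Rhino shell (assuming js.jar, the usual name of the Rhino jar, is on your classpath; the file name is made up):

// Run with: java -cp js.jar org.mozilla.javascript.tools.shell.Main hosts.js
var hosts = ['a browser', 'Rhino', 'IIS (JScript)'];
hosts.forEach(function (host) {
  print('ECMAScript runs happily in ' + host); // print() is a Rhino shell builtin
});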
This kind of despair has been going around for some time. I'd suggest you edit this to include, or repost with, specific issues. You may be pleasantly surprised by some of the relief you get.
An old site, but still a great place to start: Douglas Crockford's site.
Well, we already have VBScript, don't we? Wait, only IE supports it!
The same goes for your nice idea of a VM. What if I script my page in Lua, and your browser doesn't have the parser to convert it to bytecode? Of course, we could imagine a script tag accepting a file of bytecode, which would even be quite efficient.
But experience shows it is hard to bring something new to the Web: it would take years to adopt a radical new change like this. How many browsers support SVG or CSS3?
Besides, I don't see what you find "dirty" in JS. It can be ugly when coded by amateurs propagating bad practice copied from elsewhere, but masters have shown it can be an elegant language too. A bit like Perl: it often looks like an obfuscated language, but it can be made perfectly readable.
Check this out: http://www.visitmix.com/Labs/Gestalt/ - it lets you use Python or Ruby, as long as the user has Silverlight installed.
This is a very good question.
The problem is not only JS itself; it is the lack of good free IDEs for developing larger programs in JS. I know only one that is free: Eclipse. The other good one is Microsoft's Visual Studio, but it is not free.
Why should they be free? If web browser vendors want to replace desktop apps with online apps (and they do), then they have to give us, the programmers, good dev tools. You can't maintain 50,000 lines of JavaScript with a simple text editor, JSLint and the built-in Google Chrome debugger. Unless you're a masochist.
When Borland made an IDE for Turbo Pascal 4.0 in 1987, it was a revolution in programming. 24 years have passed since. Shamefully, in the year 2011 many programmers still don't use code completion, syntax checking and proper debuggers. Probably because there are so few good IDEs.
It's in the interest of web browser vendors to make proper (FREE) tools for programmers if they want us to build applications with which they can fight Windows, Linux, MacOS, iOS, Symbian, etc.
Realistically, Javascript is the only language that any browsers will use for a long time, so while it would be very nice to use other languages, I can't see it happening.
This "standardised VM" you talk of would be very large and would need to be adopted by all major browsers, and most sites would just continue using Javascript anyway since it's more suited to websites than many other browsers.
You would have to sandbox each programming language in this VM and reduce the amount of access each language has to the system, requiring a lot of changes in the languages and the removal or reimplementation of many features. Whereas JavaScript already has this in mind, and has done for a long time.
Maybe you're looking for Google's Native Client.
In a sense, having a more expressive language like Javascript in the browser instead of something more general like Java bytecode has meant a more open web.
I don't think this is such an easy issue. We can say that we're stuck with JS, but is it really so bad with jQuery, Prototype, script.aculo.us, MooTools, and all the other fantastic libraries?
Remember, JS is lightweight, even more so with V8, TraceMonkey, SquirrelFish - new Javascript engines used in modern browsers.
It is also proven - yeah, we know it has problems, but lots of them have been sorted out, like the early security problems. Imagine allowing your browser to run Ruby code, or anything else: a security sandbox would have to be built from scratch. And you know what? The Python folks have already failed at that twice.
I think Javascript is going to be revised and improved over time, just like HTML and CSS is. The process may be long, but not everything is possible in this world.
I don't agree that "JavaScript is simply what we have to work with now" is merely a pragmatic concession. It is actually a very powerful language. You had Java applets in the browser for years, and where are they now?
Anyhow, you don't need to "get dirty" to work on the client. For example, try GWT.
... you mean...
Java and Java applet
Flash and Adobe AIR
etc..
In general, any RIA framework can fill your needs; but for every one there's a price to pay for using it (e.g. a runtime that must be available in the browser, and/or proprietary technology, and/or fewer options than a pure desktop app).
http://en.wikipedia.org/wiki/List_of_rich_internet_application_frameworks
For developing for the Web in a non-web language, you have GWT: develop in Java, compile to JavaScript.
Because they all already have VMs with bytecode interpreters, and the bytecode is different in every one: IE (Chakra), Firefox (SpiderMonkey), Safari (SquirrelFish), Opera (Carakan).
Sorry, I think Chrome (V8) compiles straight down to IA-32 machine code.
well, considering all browsers already use a VM, I don't think it will be that difficult to make a VM language for the web.
I think it would greatly help for a few reasons:
1. since the server compiles the code, the amount of data sent is smaller and the client doesn't waste time compiling it.
2. since the server can compile the code in advance and store it, unlike the client, which tries to waste as little time as possible quickly compiling the JS, it can make better code optimizations.
3. compiling a language to bytecode is way easier than transpiling it to JS.
as a final note (as someone already said in another comment), browsers already compile HTML and CSS down to simpler internal structures. not sure if that counts as bytecode, but you could also imagine sending pre-compiled HTML and CSS from the server to the client, which would reduce parse and fetch times
IMO, JavaScript, the language, is not the problem. JavaScript is actually quite an expressive and powerful language. I think it gets a bad rap because it lacks classical OO features, but for me, the more I go with the prototypal groove, the more I like it.
The problem as I see it is the flaky and inconsistent implementations across the many browsers we are forced to support on the web. JavaScript libraries like jQuery go a long way towards mitigating that dirty feeling.
