Why is javascript backwards compatible to a fault? - javascript

In Coders at work, Douglas Crockford discusses how bugs in browsers cause Javascript to be a complex, clunky language and fixing it is a catch-22. In Beginning JavaScript with DOM scripting and Ajax Christian Heilmann says something similar "[The] large variety of user agents, of different technical finesse [...] is a great danger to JavaScript."
Why doesn't JS have a breaking new version? Is there something inherent in the language design that makes backwards compatibility a must?
Update
Why can't javascript run with multiple engines in parallel? Similar to how .NET runs versions 2, 3 and 4 on the same machine.

Lazy copypasta at OP's request:
JavaScript is just a programming language: syntax and semantics. It has no built-in support for browsers (read: the browser DOM). You could create a JS program that runs outside of a browser. You (should) know what an API is - the DOM is just a JavaScript API for manipulating an HTML page. There are other DOM APIs in other languages (C#, Java, etc.), though they are used more for things like XML. Does that make sense?
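For a concrete flavour of that distinction, here is a minimal sketch (the element id "greeting" is a made-up example): the language parts would run in any JavaScript engine, while document.getElementById and friends come from the browser's DOM API.

    // Plain JavaScript: variables, functions and strings work everywhere.
    function greet(name) {
      return "Hello, " + name + "!";
    }

    // Browser DOM API: only available where a document exists.
    var el = document.getElementById("greeting");   // hypothetical element id
    if (el) {
      el.textContent = greet("world");
    }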
Perhaps this MDC article can clarify further.

Well a breaking change would break a lot of existing websites, which would make a lot of people very angry :)

Backwards compatibility is important because of the large number of browsers deployed and the wide variety of versions of those browsers.
If you serve a new, incompatible kind of Javascript to old browsers, they all break.
If you invent a new language that is not considered to be Javascript by existing browsers, then it doesn't work with the majority of browsers. Very few users will be willing to download a new browser just to work with your new language. So web developers have to keep writing compatible Javascript to support the majority of the users, no matter how great the new language is.
A lot of people would like to see something better than current Javascript be supported by browsers, but it just isn't going to happen any time soon. All the makers of browsers and development tools would have to support the new thing, and continue to support the old Javascript stuff too. Many interested parties just wouldn't consider the benefit to be worth the cost. Slow evolution of Javascript seems to be the only viable solution.

As a matter of fact, ECMAScript 5 is not fully backwards-compatible for the very reasons you mentioned.

Inertia.
Making a breaking change would break too many sites, no browser vendor would want to deal with all the bug reports.
And PHBs would be against targeting a new version: why should they have their developers write JavaScript for both the broken and the fixed language? Their developers will have to write for the broken version anyway, so why bother with two implementations (which sucks from a developer perspective too, since they now have to update, support and debug two separate trees)?

ECMAScript 5 has a "strict" mode. I think this strict mode is intended to combat the problem you mention. Eventually you'd mark the scripts you want to use the new engine as "strict"; all others get run in an old, crufty VM, or with unoptimized code paths, or whatever.
This is kind of like IE and Mozilla browsers having multiple "modes" for rendering websites (IE even swaps out rendering engines).
See this question about it
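For a rough illustration of what the strict switch changes (a minimal sketch; the exact set of errors raised depends on the engine):

    // Without strict mode, assigning to an undeclared variable silently
    // creates a global instead of raising an error.
    function sloppyCounter() {
      hits = 1;            // oops: no "var", so this quietly becomes a global
      return hits;
    }

    // With "use strict", the same mistake throws at runtime instead.
    function strictCounter() {
      "use strict";
      count = 1;           // ReferenceError: count is not defined
      return count;
    }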

JavaScript has subtle differences across different browsers. This is because each browser manufacturer has a different set of responsibilities to its users to support backwards compatibility (if any). If I had to pick, I'd say the biggest barrier to the advancement of JavaScript is older versions of Internet Explorer. Due to service agreements with their users, Microsoft is contractually obliged to support older browsers. Even if other browsers cut off backwards compatibility, Microsoft will not. To be fair, Microsoft does realize how terrible their browsers are and will hopefully push IE 9.0 very hard. Despite the inconsistencies of JavaScript across different browsers, they are subtle enough to make cross-browser programming more than feasible. Abruptly cutting off backwards compatibility would make web development a nightmare. Incrementally cutting off backwards compatibility for specific aspects of JavaScript is feasible.

There is much else wrong with JavaScript. You can't be fully backwards-compatible with things that were never fully compatible when they were fresh... Say, the length of the array [1,] is reported as 2 by at least older versions of Internet Explorer.
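That particular quirk, as a quick sketch (modern, standards-following engines report 1):

    var a = [1,];           // trailing comma in an array literal
    console.log(a.length);  // 1 per the spec (the trailing comma is not an element),
                            // but 2 in old versions of Internet Explorer, which
                            // treated it as a trailing elision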
The biggest fault of JavaScript is that it comes with a tiny, incomplete and pretty much unusable standard library. That is why everyone retreats to using jQuery, Dojo, MochiKit etc. - these mostly offer functionality that should be part of some standard library included with the browsers, instead of floating around in thousands of copies and versions. It's actually what makes .NET and Java so popular: the language comes with a reasonable standard library. With C and C++, you have to dig out the nice libraries (e.g. Boost) yourself.
But other than that, the ECMAScript standard is occasionally updated.
Google is also trying to take a bold step forward and redo JavaScript in a slightly saner way. The effort is known as Dart: http://www.dartlang.org/
As far as I can tell, Dart largely uses the syntax of JavaScript minus a couple of its quirks. Apart from that, it is also nicer to the virtual machine and will thus likely run faster (unless of course you compile Dart to JavaScript and use a JavaScript VM, which is offered as a compatibility option). But of course any hardcore JavaScript enthusiast will not like anything that claims to be better than JavaScript. Whereas for me, they don't go far enough. In particular, they still don't provide enough "classpath".

Related

Cross-browser compatibility issues

I see that many people have problems with cross-browser compatibility issues.
My question is why do browsers render the html, css or js differently?
Is it due to the DOM? The box model?
Why are there cross-browser compatibility issues when there are standards from W3C etc?
Are there any differences in the way the major Internet browsers display HTML content? Why is it that Internet Explorer, Firefox (Mozilla), Opera may display the same content differently?
What should I keep in mind when building a cross-browser compatible web site?
There are a lot of reasons for incompatibility:
Specs are often written in response to the development of proprietary features by specific vendors.
Specs can sometimes be poorly written, contain ambiguity, or fail to anticipate specific end-use cases.
Browser vendors occasionally ignore the specification for their own reasons.
Additional factors:
A lot of this stuff is hard to implement, let alone implement correctly.
It also has to be implemented to handle poorly formed HTML, backwards compatibility, etc. Browser vendors sometimes sacrifice 'correctness' for 'interoperability'.
History, politics & personalities.
I'm sure someone will answer this much better, but here's a start:
Yes, there are standards that are supposed to be adhered to with respect to CSS rendering. The problem is, some browser vendors (cough Microsoft cough) don't consider it a priority to implement the specifications correctly (or even fully, for that matter). Even when the layout engine is being worked on to attempt to ensure compatibility, it can get quite nasty to figure out how things should be rendered when mixing various CSS properties and units (see, for example, the Acid3 test: http://en.wikipedia.org/wiki/Acid3).
To have a cross-browser website, you'll basically have to check that all of your website's pages render correctly in the browsers your visitors use (or that you support). Various tools such as Selenium (seleniumhq.org) can help with this.
You basically have to decide what you're going to do:
design for the lowest common denominator (if it's IE6, there isn't much you'll be able to do)
design using validating CSS and using hacks (e.g. clearfix) to correct erroneous behavior in certain browsers
decide not to support certain browsers (IE6 being a prime candidate)
sniff browser and adapt display accordingly (NOT the preferred way to do it)
With respect to differences in manipulating the DOM, libraries such as jQuery help a lot as they hide the implementation differences from you.
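As a rough sketch of the kind of difference such libraries paper over (old IE exposed attachEvent, standards browsers addEventListener), greatly simplified and without the event-object normalization real libraries also do:

    // A minimal cross-browser event helper, similar in spirit to what
    // jQuery and friends do internally.
    function addEvent(element, type, handler) {
      if (element.addEventListener) {            // W3C standard browsers
        element.addEventListener(type, handler, false);
      } else if (element.attachEvent) {          // old Internet Explorer
        element.attachEvent("on" + type, handler);
      } else {                                   // last-resort fallback
        element["on" + type] = handler;
      }
    }

    // Usage (the element id "save-button" is a made-up example):
    // addEvent(document.getElementById("save-button"), "click", function () {
    //   alert("clicked");
    // });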
For reference, it's a good idea to test your website in at least the following:
WebKit-based browser (Chrome, Safari)
Gecko-based browser (Firefox)
IE
Opera
It is an aftermath of the Great Browser War. Now Netscape Communications lies in ruins, but the quirks the opponents introduced to outperform each other still remain in the browsers' codebases, and some of the same people are still on the development teams. Consider watching Crockford's lecture; he gives some insight into the subject. (You will want to save the file instead of streaming it.)
Everything that Hamish said, plus another major problem that he alluded to is how browsers handle incorrect HTML. For example, back in the days of IE4/NS4, the table element was very problematic. If you didn't close a tag, IE4 would close it for you. Netscape 4 would silently disregard the table completely.
This is still true today where one browser will fix incorrect markup differently than another. Yes, the markup should be corrected, but browsers will try their best to render something.
The standard specifies how HTML/CSS markup should be displayed as visual elements. It does not specify how the rendering should work internally. Many different people and companies have created different ways of rendering markup visually. Since HTML rendering is an extremely complex task, of course they didn't all converge on the exact same solution. All rendering engines are aiming for the same goal, but often the spec is vague enough to allow for small differences (be it just at the pixel level), and bugs are inevitable as well.
Add to that the fact that back in the day browser vendors cared less about standards and more about gaining market share quickly, and that certain companies have been very slow in turning around to embrace standards (you know who you are).
In the end, specs, which are quite complex, and browsers, which are even more complex, are written by many different people; you can't expect absolute perfection to arise from this process. And perfection is not the goal of HTML either; it's meant to be a simple, vendor and platform independent markup language to present information on a variety of devices, something it does remarkably well. If you need pixel perfect results, you need to go with a technology that was meant to provide this, like Adobe Flash (which is neither platform nor vendor independent nor simple).
Try to look at it from the glass-half-full perspective: thousands of different people have written millions of lines of code doing vastly different things on many different platforms with many different goals and focuses, and yet in the end they all render HTML markup almost identically, with often only tiny, virtually irrelevant differences. There are of course weak spots in every engine, and if you happen to hit them all at once, your site will break in every browser in different ways. But this is possible to avoid with a bit of experience.

Is it possible to write a JavaScript library that makes all browsers standards compliant?

I'm not a JavaScript wiz, but is it possible to create a single embeddable JavaScript file that makes all browsers standards compliant? Like a collection of all known JavaScript hacks that force each browser to interpret the code properly?
For example, IE6 does not recognize the :hover pseudo-class in CSS for anything except links, but there exists a JavaScript file that finds all references to :hover and applies a hack that forces IE6 to do it right, allowing me to use the hover command as I should.
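Such a hack typically works by watching mouse events and toggling a real class that IE6 can match; a simplified sketch (the class name "hover" and the menu markup are made up, and real fixes are more involved):

    // Emulate :hover in IE6 by toggling a real class (e.g. "li.hover" in the
    // stylesheet), since IE6 only honours :hover on links.
    function emulateHover(element) {
      element.onmouseover = function () {
        element.className += " hover";
      };
      element.onmouseout = function () {
        element.className = element.className.replace(/\s*\bhover\b/g, "");
      };
    }

    // Apply it to every item in a hypothetical menu:
    // var items = document.getElementById("menu").getElementsByTagName("li");
    // for (var i = 0; i < items.length; i++) emulateHover(items[i]);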
There is an unbelievable amount of time (and thus money) that every webmaster has to spend on learning all these hacks. Imagine if there was an open source project where all one has to do is add one line to the header embedding the code and then they'd be free to code their site per accepted web standards (XHTML Strict, CSS3).
Plus, this would give an incentive for web browsers to follow the standards or be stuck with a slower browser due to all the JavaScript code being executed.
So, is this possible?
"Plus, this would give an incentive for web browsers to follow the standards or be stuck with a slower browser due to all the JavaScript code being executed."
Well... That's kind of the issue. Not every incompatibility can be smoothed out using JS tricks, and others will become too slow to be usable, or retain subtle incompatibilities. A classic example is the many scripts to fake support for translucency in PNG files on IE6: they worked for simple situations, but fell apart or became prohibitively slow for pages that used such images creatively and extensively.
There's no free lunch.
Others have pointed out specific situations where you can use a script to fake features that aren't supported, or a library to abstract away differences. My advice is to approach this problem piecemeal: write your code for a decent browser, restricting yourself as much as possible to the common set of supported functionality for critical features. Then bring in the hacks to patch up the browsers that fail, allowing yourself to drop functionality or degrade gracefully when possible on older / lesser browsers.
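One common way to implement that piecemeal approach is plain feature detection: test for the standard API, and only fall back where it is missing. A minimal sketch:

    // Feature detection: use the native, standard API where it exists, and
    // fall back (or degrade gracefully) only where it is missing.
    function parseJsonSafely(text) {
      if (typeof JSON !== "undefined" && JSON.parse) {
        return JSON.parse(text);       // modern browsers: native and fast
      }
      // Older browsers: load a JSON polyfill instead, or fall back to the
      // classic (and riskier) eval approach:
      return eval("(" + text + ")");
    }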
Don't expect it to be too easy. If it was that simple, you wouldn't be getting paid for it... ;-)
Check out jQuery; it does a good job of standardizing browser JavaScript.
Along those same lines, explorercanvas brings support for the HTML5 canvas tag to IE browsers.
You can't get full standards compliance, but you can use a framework that smooths over some of the worst breaches. You can also use something called a reset style sheet.
There's a library for IE to make it act more like a standards-compliant browser: Dean Edwards' IE7.
"Like a collection of all known javascript hacks that force each browser to interpret the code properly"
You have two choices: read browser compatibility tables, learn each exception a browser has and create one yourself, or use available libraries.
If you want a javascript correction abstraction, you can use jQuery.
If you want a css correction abstraction, you can check /IE7/.
I usually don't like to use CSS corrections made by JavaScript. It adds complexity to my code: another library that can introduce bugs into already buggy browsers. I prefer creating conditional statements for IE6, IE7 and such, and creating separate stylesheets for each of them. This approach works and doesn't generate a lot of overhead.
EDIT: (I know that we have problems in other browsers, but since IE is the major browser out there and we usually need really strange hacks to make it work, CSS conditional statements are a good approach IMO).
Actually you can; there are lots of libraries to handle this issue. Since the beginning, JavaScript compliance has always been a problem for developers, and thankfully innovative ones have developed libraries to get over this problem...
One of them, and my favorite, is jQuery.
Before JavaScript 1.4 there was no global arguments Array, and it is impossible to implement the arguments array yourself without a highly advanced source filter. This means it is going to be impossible for the language to maintain backwards-compatibility with Netscape 4.0 and Internet Explorer 4.0. So right out I can say that no, you cannot make all browser standards compliant.
Post-netscape, you can implement nearly all of the features in the core of the language in JavaScript itself. For example, I coded all methods of the Array object in 100% JavaScript code.
http://openjsan.org/doc/j/jh/jhuni/StandardLibrary/1.81/index.html
You can see my implementation of Array here if you go to the link and then go down to Array and then "source."
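To give a flavour of what such pure-JavaScript implementations look like, here is a minimal sketch of an Array.prototype.indexOf fallback for engines that lack it (simplified, and not the author's linked implementation; real polyfills handle more edge cases):

    // Define Array.prototype.indexOf only where the engine doesn't already,
    // implemented entirely in JavaScript.
    if (!Array.prototype.indexOf) {
      Array.prototype.indexOf = function (searchElement, fromIndex) {
        var i = fromIndex || 0;
        if (i < 0) i = Math.max(0, this.length + i);
        for (; i < this.length; i++) {
          if (this[i] === searchElement) return i;
        }
        return -1;
      };
    }

    // [3, 1, 4].indexOf(4)  -> 2
    // [3, 1, 4].indexOf(9)  -> -1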
What most of you are probably referring to is implementing the DOM objects yourself, which is much more problematic. Using VML you can implement the Canvas tag across all the modern browsers, however, you will get a buggy/barely-working performance in Internet Explorer because VML is markup which is not a good format for implementing the Canvas tag...
http://code.google.com/p/explorercanvas/
Flash/Silverlight: Using either of these you can implement the Canvas tag and it will work quite well, you can also implement sound. However, if the user doesn't have any browser plugins there is nothing you can do.
http://www.schillmania.com/projects/soundmanager2/
DOM Abstractions: On the issue of the DOM, you can abstract away from the DOM by implementing your own Event object, such as in the case of QEvent, or even implementing your own Node object, like in the case of YAHOO.util.Element. However, these usually involve some subtle changes to the standard API, so people are usually just abstracting away from the standard, and there are hundreds of libraries that abstract away like this.
http://code.google.com/p/qevent/
This is probably the best answer to your question. It makes browsers as standards-compliant as possible.
http://dean.edwards.name/weblog/2007/03/yet-another/

Alternatives to JavaScript

At the moment, the only fully supported language, and the de-facto standard for DOM tree manipulation in the browser is JavaScript. It looks like it has deep design issues that make it a minefield of bugs and security holes for the novice.
Do you know of any existing or planned initiative to introduce a better (redesigned) language of any kind (not only JavaScript) for DOM tree manipulation and HTTP requests in next-generation browsers? If yes, what is the roadmap for its integration into, say, Firefox, and if no, for what reasons (apart from interoperability) should JavaScript be the only supported language on the browser platform?
I have already used jQuery and I also read "JavaScript: The Good Parts". Indeed the suggestions are good, but what I am not able to understand is: why only JavaScript? On the server side (your-favourite-OS platform), we can manipulate a DOM tree with any language, even Fortran. Why does the client side (the browser platform) support only JavaScript?
The problem with javascript is not the language itself - it's a perfectly good prototype-based, dynamic language. If you come from an OO background there's a bit of a learning curve, but it's not the language's fault.
Most people assume that Javascript is like Java because it has similar syntax and a similar name, but actually it's a lot more like lisp. It's actually pretty well suited to DOM manipulation.
The real problem is that it's compiled by the browser, and that means it works in a very different way depending on the client.
Not only is the actual DOM different depending on the browser, but there's a massive difference in performance and layout.
Edit following clarification in question
Suppose multiple interpreted languages were supported - you still have the same problems. The various browsers would still be buggy and have different DOMs.
In addition you would have to have an interpreter built into the browser or somehow installed as a plug in (that you could check for before you served up the page) for each language. It took ages to get Javascript consistent.
You can't use compiled languages in the same way - then you're introducing an executable that can't easily be scrutinised for what it does. Lots of users would choose not to let it run.
OK, so what about some sort of sandbox for the compiled code? Sounds like Java Applets to me. Or ActionScript in Flash. Or C# in Silverlight.
What about some kind of IL standard? That has more potential. Develop in whatever language you want and then compile it to IL, which the browser then JITs.
Except, Javascript is kind of already that IL - just look at GWT. It lets you write programs in Java, but distribute them as HTML and JS.
Edit following further clarification in question
Javascript isn't, or rather wasn't, the only language supported by browsers: back in the Internet Explorer dark ages you could choose between Javascript or VBScript to run in IE. Technically IE didn't even run Javascript - it ran JScript (mainly to avoid having to pay Sun for the word Java; Oracle now owns the name JavaScript).
The problem was that VBScript was proprietary to Microsoft, but also that it just wasn't very good. While Javascript was adding functionality and getting top-rate debugging tools in other browsers (like FireBug), VBScript remained IE-only and pretty much un-debuggable (dev tools in IE4/5/6 were non-existent). Meanwhile VBScript also expanded to become a pretty powerful scripting tool in the OS, but none of those features were available in the browser (and when they were, they became massive security holes).
There are still some corporate internal applications out there that use VBScript (and some rely on those security holes), and they're still running IE7 (they only stopped using IE6 because MS finally killed it off).
Getting Javascript to its current state has been a nightmare and has taken 20 years. It still doesn't have consistent support, with language features (specified in 1999) still missing from some browsers and lots of shims being required.
Adding an alternate language for interpreting in browsers faces two major problems:
Getting all the browser vendors to implement the new language standard - something they still haven't managed for Javascript in 20 years.
A second language potentially dilutes the support you already have, allowing (for instance) IE to have second rate Javascript support but great VBScript (again). I really don't want to be writing code in different languages for different browsers.
It should be noted that Javascript isn't 'finished' - it's still evolving to become better in new browsers. The latest version is years ahead of the browsers' implementations and they're working on the next one.
Compile to Javascript
For now, using a language which compiles to Javascript seems to be the only realistic way to reach all the platforms while writing smarter code, and this will likely remain the case for a long time. With any new offering, there will always be some reason why one or more vendors will not rush to ship it.
(But I don't really think this is a problem. Javascript has been nicely optimized by now. Machine code is also unsafe if written by hand, but works fine as a compile target and execution language.)
So many options
There is an ever growing pool of languages that compile to Javascript. A fairly comprehensive list can be found here:
List of languages that compile to JS on the Coffeescript Wiki
Noteworthy
I will mention a few I think are noteworthy (while no doubt neglecting some gems which I am unaware of):
Spider appeared in 2016. It claims to take the best ideas of Go, Swift, Python, C# and CoffeeScript. It isn't typesafe, but it does have some minor safety features.
Elm: Haskell may be the smartest language of them all, and Elm is a variant of Haskell for Javascript. It is highly type-aware and concise, and offers Functional Reactive Programming as a neat alternative to reactive templates or MVC spaghetti. But it may be quite a shock for procedural programmers.
Google's Go is aimed at conciseness, simplicity, and safety. Go code can be compiled into Javascript by GopherJS.
Dart was Google's later attempt to replace Javascript. It offers interfaces and abstract classes through a C/Java-like syntax with optional typing.
Haxe is like Flash's ActionScript, but it can target multiple languages so your code can be re-used in Java, C, Flash, PHP and Javascript programs. It offers type-safe and dynamic objects.
Opalang adds syntactic sugar to Javascript to provide direct database access, smart continuations, type-checking and assist with client/server separation. (Tied to NodeJS and MongoDB.)
GorillaScript, "a compile-to-JavaScript language designed to empower the user while attempting to prevent some common errors." is akin to Coffeescript but more comprehensive, providing a bunch of extra features to increase safety and reduce repetitive boilerplate patterns.
LiteScript falls somewhere inbetween Coffeescript and GorillaScript. It offers async/yield syntax for "inline" callbacks, and checking for variable typos.
Microsoft's TypeScript is a small superset of Javascript that lets you place type-restrictions on function arguments, which might catch a few bugs. Similarly BetterJS allows you to apply restrictions, but in pure Javascript, either by adding extra calls or by specifying types in JSDoc comments. And now Facebook has offered Flow which additionally performs type inference.
LiveScript is a spin-off from Coffeescript that was popular for its brevity but does not look very readable to me. Probably not the best for teams.
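A small sketch of the JSDoc-comment style mentioned above (plain JavaScript; the types live only in comments for tools that read them, and the function itself is a made-up example):

    /**
     * Compute an order total. The tags below are ordinary JSDoc annotations:
     * ignored at runtime, but type-aware checkers can verify callers against them.
     *
     * @param {number} unitPrice  price of one item, in cents
     * @param {number} quantity   how many items were ordered
     * @returns {number}          total in cents
     */
    function orderTotal(unitPrice, quantity) {
      return unitPrice * quantity;
    }

    // orderTotal(250, 4)   -> 1000
    // orderTotal("250", 4) also yields 1000 thanks to coercion by "*",
    // but a JSDoc-aware checker would flag the string argument.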
How to choose?
When choosing an alternative language, there are some factors to consider:
If other developers join your project in future, how long will it take them to get up to speed and learn this language, or what are the chances they know it already?
Does the language have too few features (code will still be full of boilerplate) or too many features (it will take a long time to master, and until then some valid code may be undecipherable)?
Does it have the features you need for your project? (Does your project need type-checking and interfaces? Does it need smart continuations to avoid nested callback hell? Is there a lot of reactivity? Might it need to target other environments in future?)
The future...
Jeff Walker has written a thought-provoking series of blog posts about "the Javascript problem", including why he thinks neither TypeScript, nor Dart nor Coffeescript offer adequate solutions. He suggests some desirable features for an improved language in the conclusion.
should JavaScript be the only supported language on the browser platform?
Yes and no. There is an alternative out there called Dart, by Google, which compiles to JavaScript and, just like jQuery, tries to make DOM manipulation a bit easier. It may be fun to experiment with; check it out.
From Google, see the Dart language
From Microsoft, see the TypeScript language
See also
Elm
Kal
It is true that Javascript was at one point notoriously hard to deal with but the web development community has come a long way since. Instead, I would encourage you to have a look at jQuery. It's easy and abstracts away all the various problems.
And there really are no alternatives that work across the board. Flash comes to mind, but that too is ECMAScript, and it's probably overkill for most things.
Short term, I'd use things like jQuery to hide the browser incompatibilities. Long term, technologies like Silverlight or Adobe AIR may make this a very different minefield (but still a minefield) in the future.
Doug Crockford gave a talk to Google detailing the bad and good parts of JavaScript and its future. It actually hasn't changed much at all since 1999--which can be said to be a good thing (pretty much all browsers can run the same code as long as you're aware of their limitations) and Doug shows where the good parts were mostly misunderstandings that turn out to be very powerful.
For DOM manipulation, look at jQuery as a client-side library that replaces most of the awful DOM API operations that are a pain to write with pretty elegant bits of code that are easier to write.
If you're thinking that JavaScript has deep issues, I recommend Doug Crockford's book, JavaScript: The Good Parts. (Or Google for "Crockford JavaScript" to find several video presentations he's done.) Crockford sketches out a safe subset and set of practices, and specifically lists some parts of the language to avoid.
I'm unaware of plans to replace JavaScript as the de facto means of manipulating the DOM. So best learn to use it safely and well.
On the client side, JavaScript is the only way to manipulate the DOM. On the server side there are a multitude of ways.
Internet Explorer supports pluggable scripting languages, although the only one reliably included with IE besides JScript is VBScript.
As far as I have seen, there seems to be a general sort of bias toward dynamic languages in the browser, and JavaScript seems to fill this need adequately enough that network effects make any other language a non-starter. The language is actually quite powerful, though its implementation in browsers leaves much to be desired.
If you're willing to restrict your customers/visitors to specific browsers, and possibly willing to require them to install a plug-in, you could look at MS Silverlight -- a readable overview is on wikipedia. With Silverlight 2, you can run, client-side, code you've written in C#, IronPython, IronRuby, VB.NET, etc; the free Moonlight clone of Silverlight, from the Mono project, promises to bring the same functionality to Linux.
In practice, most developers of web apps and sites prefer to reach wider audiences than Silverlight (and eventually Moonlight) can currently deliver -- which means sticking with Javascript, or possibly Flash (which uses a similar programming language, Actionscript).
So, gaining substantial mindshare, adoption and traction for anything else is proving to be an uphill fight even for Microsoft with its large groups of engineers and marketing budgets and a free-software project on the side (to possibly ease worries about proprietary lock-in) -- which may help explain why there's very little interest, e.g. on the part of the Mozilla Foundation, in pushing towards such a goal. "Apart from interoperability", you say: but clearly the issue of interoperability is THE biggie here, given what we observe wrt Silverlight's progress...
As already said, you have Flash (ActionScript, which is a derived language from Javascript) and Silverlight/Moonlight (IronPython, IronRuby, JScript, VBScript, C#) that can run in the browser via plugins (the first one being much more ubiquitous).
There is also another alternative if you like Ruby: HotRuby, it's a ruby implementation in javascript that will run in the browser. It is not very mature yet, but you can have a look at it.
One thing I haven't seen mentioned (oh, I see Alcides mentioned HotRuby while I was writing and Nosredna mentioned GWT and Script#) and would like to throw out there is there are a number of implementations of [insert language]-on-JavaScript (eg. translators that allow you to convert Ruby, Python, C#, Java, Obj-J/Cappuccino [similar to Obj-C/Cocoa] or Processing [for Canvas] to JavaScript either on the client or before deployment [and some of which also feature various abstraction libraries]). Of course there's a performance overhead if it is being translated on the client, but if you are more comfortable with another language it will allow you some flexibility.
Personally, though, I recommend learning to love JavaScript. It's an excellent, powerful language, and pretty elegant once you get to know it. I'm facing the opposite dilemma, chomping at the bit to have a capable server-side JavaScript/DOM solution that meets all of my needs. /unsolicited opinion
JavaScript is the English language of the web. English historically spread because it had a strong navy conquering various countries. This is comparable to the big companies that conquered the web with JavaScript. It's a language cobbled together from multiple European sources (Greek, Latin, Germanic languages, French, even some Chinese and Indian words). JavaScript borrowed a lot of concepts throughout the years from other languages (structural, OO, functional). English is spoken in different places with slight variations in dialect and accent that can make understanding difficult, just like JavaScript has different browsers interpreting it a bit differently.
Even though English is easy to learn initially, it has very inconsistent pronunciation and more exceptions than rules. Just like JavaScript it's always there to offer a surprise.
Despite the different accents, JavaScript is the lingua franca of the web. Just like you might not be English and write here in English, every web browser has a certain degree of English understanding. IE6 is like the guy who says on his resume that he's fluent, but only went to a two week course on English as a foreign language.
There have been attempts to supplant English as the world's main language, e.g. Esperanto. But all of them failed, because most people on earth speak some English. In the same way it will be difficult to introduce better alternatives to JavaScript.
jQuery (still JavaScript, but) will really help you; it has support for almost all browsers and it isn't really that hard to learn :)
No. JavaScript is it, but it will evolve. The next version is "JavaScript Harmony," and you can learn more if you Google that.
Now and then someone suggests putting a byte code interpreter into the browsers alongside JavaScript. Probably won't happen, at least for awhile.
I happen to love JavaScript. But there are other solutions, including GWT, which compiles Java to JavaScript and Script#, which compiles C# to JavaScript.
I don't think Javascript will be replaced any time soon. For a completely different approach to rich clients, you might want to investigate Flex, which is a Flash-based technology.
Maybe something like haxe (see haxe.org) could help you. It is a language which seems cleaner than JavaScript and can be compiled down to JavaScript, so it can be run inside a browser.
I know that this is not a direct answer to your question, but I thought it might be interesting for you, nevertheless.
Many people understand that JavaScript isn't the best and prettiest language ever. However, it is currently supported by browsers, and thus it will be extremely hard to introduce a different language. We simply don't need another browser war.
This explains why I know of no plans of switching to a different client-side language.
But I think JavaScript isn't so bad if you start thinking about the DOM model and how one would work with it. Many things that are messy with JS are the result of the way the DOM model works.

Why are there no real competitors to Javascript?

Perhaps I'm just unaware of the competitors, but it seems when it comes to client-side scripting in the browser and DOM, Javascript is it. I know there's VBScript, but it really isn't even in the same ballpark as JS (not even cross platform, for starters).
I'm just curious about how this came to be. Surely there would be a general desire for a new language to replace Javascript: built from scratch to do all the things Javascript has been bent and moulded into these days (look at the reliance on JS Libraries).
Momentum. JavaScript has been around for 15 or so years, and browser manufacturers have worked for 15 or so years to make it work in their browsers.
If a competitor came along, it would need to really bring something new to the table in order to convince everyone to a) adopt it, b) live with locking out all the users of older browsers like IE7, Firefox 3.0, Chrome 1.0 etc. and c) find replacements for all existing libraries like jQuery, prototype, extJS etc.
In short: we don't need another standard; let's rather improve JavaScript and build on the rich foundation that already exists instead of starting back from the stone age again.
There are! Ones that spring to mind are Flash, ActiveX, and Java... But these all have their drawbacks. Mainly security and integration with the browser/DOM.
Flash and Java live in their own little world, by design (and to address security issues). They can't alter the HTML around them. ActiveX has access to the DOM, but also everything else on your computer.
JavaScript seems to have found a nice balance between flexibility and security: it can trivially interact with and alter the page's HTML/CSS, do "safe" networking, and has a decent standard library (with things like JSON, XMLHttpRequest-style networking, DOM manipulation, and so on). Most importantly, it's available in basically all vaguely-modern browsers, on all platforms, in a consistent manner (compared to CSS).
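A minimal sketch of that "decent standard library" in action, combining XMLHttpRequest and the built-in JSON object (the URL "/api/items.json" is a made-up placeholder):

    // Fetch some JSON with the built-in XMLHttpRequest and JSON objects.
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/api/items.json", true);
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        var items = JSON.parse(xhr.responseText);   // JSON parsing is built in (ES5+)
        console.log(items.length + " items loaded");
      }
    };
    xhr.send();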
There are problems with JavaScript, but nothing major. The biggest is the performance. Load a comment page on Digg and watch your CPU usage. Chances are it will be 100% of one CPU core. There are projects to improve this, like SquirrelFish, TraceMonkey, and other strangely named things. But the performance is adequate to do some extremely impressive things (Google Spreadsheets, for example).
Basically, JavaScript is great, and its drawbacks aren't nearly as big as those of the other competitors.
JavaScript won because it was introduced by Netscape in the period when they had above 90% market share. IE and other browsers had no choice but to support it as well.
If a new language should be introduced, it would have to be either by agreement between all major browser vendors, or in a period where a single browser have enough market share to push it through.
Microsoft could probably have pulled it off some years ago when IE had an extremely large market share (before the rise of Firefox and Safari), but they chose instead (for strategic reasons) to let the browser stagnate.
Today, a new language would require agreement between at least Mozilla, IE and Safari to gain traction, and I think that is highly unlikely. The browser vendors have invested a lot of resources in optimization, compatibility testing and so on for JavaScript - why should they want to start from scratch with a totally new language and have to support two languages in parallel for decades to come? The cost greatly outweighs any benefits.
Anyway, it is quite unrealistic to believe that a new language designed from scratch could be significantly better than JavaScript.
Show me another language that isn't reliant on libraries?
C, C++, C#, VBs, ... all rely on libraries. The only difference is that they often come with a standard set of libraries.
So is what we really want a standard set of libraries? What we're currently getting is a range of library sets (jQuery, Prototype, Ext JS, MooTools, etc). This is a good thing, since we the developers get to choose one that suits our needs. In addition, these libraries can be included and evolved without changes to the client-side component.
I can think of no compelling language feature missing from Javascript. By compelling I mean so earth shatteringly important that I'd be willing to alienate those browsers that do not support it.
Standardized (ECMA-262)
Common syntax and relatively simple to master
Good browser support
Extendable
Still being developed
Relatively quick, given how much data it sometimes needs to process.
If a good competitor had arrived before 1999 (ECMAScript 3), it would probably be a tie between those two.
There are other languages for client side scripting, but AFAIK, none are integrated into a browser.
Both Flash and Silverlight have their own Languages. Flash has ActionScript, while Silverlight has many and all that work under the DLR including Python and Ruby.
To your second point as to why: more specifically, you mention reliance on JS libraries as a flaw in the language. Libraries are popular not because the language is broken, but because the standard API is broken. The existence of such great libs builds on the power of the language.
There is nothing particularly wrong with JavaScript; it has some features that until very recently would have been considered esoteric or academic, first-class functions for example.
Also, ubiquity / mass existing runtime deployment is a very compelling feature. ;)
I recommend you to view Douglas Crockford’s presentation about “The JavaScript Programming Language” to learn about the history of JavaScript.
I think Javascript (ECMAScript) with its C-like syntax is so popular for some of the reasons C is:
Relatively small number of language keywords (easier to learn).
Concise and efficient syntax (quick to write).
Easily extendable through external libraries and APIs that do not pollute the basic language (e.g. the browser DOM, FileSystemObject, etc.).
Creating a new language that will provide many of the current libraries "built-in" is always risky as it starts to limit future applicability of the language and makes learning the base language harder.
This would be even more problematic for a client side browser language because the language designers can't possibly know how the language may be used in the future.
I think Javascript the language is fine in its current role as the "glue" that links so many other client side technologies.
There are no other competitors because while Javascript is not perfect, it does the job.
I guess because the demand for it would have to be huge for browsers to implement it.
After all, it's the browsers that process and run the JavaScript and you'd have to have a large amount of sites using the language in order to make the browsers interested in implementing it. Then again no-one would use it if there was no browser support in the first place.
i'd say that this is because client side web development is still a very young branch of programming.
if you look at it, it has only now become more widespread, since we moved to faster "intertubes" :) and we're not using modems anymore.
and the problem for client based web development is that it's not up to the developer what platform he'll use, it depends on the browser manufacturers.
and they change slowly.
VBScript's demise was in my opinion its VB-ness: too much unnecessary stuff.
As for javascript, it will mature, but it's a start.
Browser support. If it's not an MS tech then it most likely will not go into IE. If it's not in IE then no one can use it. If it is an MS tech, then most likely only IE will have the right to use it, as it will be closed source and proprietary. If only IE supports it then only some developers will use it.
To challenge JavaScript it must work just as easily and more consistently across all of the major web browsers. Without browser support any new client side web tech is destined to fail.
I think it has to do with standardization, because during the last browser war (IE vs. Netscape) there were two: Netscape's ECMAScript (+1 geek point for you if you knew this was the real name for JavaScript) and Microsoft's JScript. Obviously ECMAScript (JavaScript) won out and became the de facto standard.
Now, we have another browser war in progress and each of the 2 (3 if you count Chrome, 12% FFS) major competitors fully (with a few edge cases) supports JavaScript.
My guess is that it's because of its ubiquity and ease of integration into any new user agent project. It comes built into almost all browsers, so you don't have to download/install/configure anything to have it running. Once you look at user agents off the desktop (Wii, iPhone, Windows Mobile, N95 etc.), the availability of any of the contenders dries up quickly - so you code for HTML and JavaScript because it will work most of the time.
I agree with Michael, we should improve Javascript, not worry about competitors because there aren't going to be any, in fact even Javascript 2.0 seems so far from reaching reality.
Since Javascript is such an ambiguous language, we're able to create libraries (jQuery) and even abstractions (Objective-J) and not worry about all the problems that Javascript has at its heart.
After so many years we don't even have CSS implemented the same in all browsers; the same goes for JavaScript. IE has one model and the rest of the browsers have another (I mean things like event handlers and attaching events).
If a new competitor comes along, it has no chance; nor does it have as much time as CSS and JavaScript had.

Why JavaScript rather than a standard browser virtual machine?

Would it not make sense to support a set of languages (Java, Python, Ruby, etc.) by way of a standardized virtual machine hosted in the browser rather than requiring the use of a specialized language -- really, a specialized paradigm -- for client scripting only?
To clarify the suggestion, a web page would contain byte code instead of any higher-level language like JavaScript.
I understand the pragmatic reality that JavaScript is simply what we have to work with now due to evolutionary reasons, but I'm thinking more about the long term. With regard to backward compatibility, there's no reason that inline JavaScript could not be simultaneously supported for a period of time and of course JavaScript could be one of the languages supported by the browser virtual machine.
Well, yes. Certainly if we had a time machine, going back and ensuring a lot of the Javascript features were designed differently would be a major pastime (that, and ensuring the people who designed IE's CSS engine never went into IT). But it's not going to happen, and we're stuck with it now.
I suspect, in time, it will become the "Machine language" for the web, with other better designed languages and APIs compile down to it (and cater for different runtime engine foibles).
I don't think, however, any of these "better designed languages" will be Java, Python or Ruby. Javascript is, despite the ability to be used elsewhere, a Web application scripting language. Given that use case, we can do better than any of those languages.
I think JavaScript is a good language, but I would love to have a choice when developing client-side web applications. For legacy reasons we're stuck with JavaScript, but there are projects and ideas looking for changing that scenario:
Google Native Client: technology for running native code in the browser.
Emscripten: LLVM bytecode compiler to javascript. Allows LLVM languages to run in the browser.
Idea: .NET CLI in the browser, by the creator of Mono: http://tirania.org/blog/archive/2010/May-03.html
I think we will have JavaScript for a long time, but that will change sooner or later. There are so many developers willing to use other languages in the browser.
Answering the question - No, it would not make sense.
Currently the closest things we have to a multi-language VM are the JVM and the CLR. These aren't exactly lightweight beasts, and it would not make sense to try and embed something of this size and complexity in a browser.
Let's examine the idea that you could write a new, multilanguage VM that would be better than the existing solution.
You're behind on stability.
You're behind on complexity (way, way, behind because you're trying to generalize over multiple languages)
You're behind on adoption
So, no, it doesn't make sense.
Remember, in order to support these languages you're going to have to strip down their APIs something fierce, chopping out any parts that don't make sense in the context of a browser script. There are a huge number of design decisions to be made here, and a huge opportunity for error.
In terms of functionality, we're probably only really working with the DOM anyway, so this is really an issue of syntax and language idiom, at which point it does make sense to ask, "Is this really worth it?"
Bearing in mind, the only thing we're talking about is client side scripting, because server side scripting is already available in whatever language you like. It's a relatively small programming arena and so the benefit of bringing multiple languages in is questionable.
What languages would it make sense to bring in? (Warning, subjective material follows)
Bringing in a language like C doesn't make sense because it's made for working with metal, and in a browser there isn't much metal really available.
Bringing in a language like Java doesn't make sense because the best thing about it is the APIs anyway.
Bringing in a language like Ruby or Lisp doesn't make sense because JavaScript is a powerful dynamic language very close to Scheme.
Finally, what browser maker really wants to support DOM integration for multiple languages? Each implementation will have its own specific bugs. We've already walked through fire dealing with differences between MS Javascript and Mozilla Javascript and now we want to multiply that pain five or six-fold?
It doesn't make sense.
On Windows, you can register other languages with the Scripting Host and have them available to IE. For example VBScript is supported out of the box (though it has never gained much popularity as it is for most purposes even worse than JavaScript).
The Python win32 extensions allowed one to add Python to IE like this quite easily, but it wasn't really a good idea as Python is quite difficult to sandbox: many language features expose enough implementation hooks to allow a supposedly-restricted application to break out.
It is a problem in general that the more complexity you add to a net-facing application like the browser, the greater likelihood of security problems. A bunch of new languages would certainly fit that description, and these are new languages that are also still developing fast.
JavaScript is an ugly language, but through careful use of a selective subset of features, and support from suitable object libraries, it can generally be made fairly tolerable. It seems incremental, practical additions to JavaScript are the only way web scripting is likely to move on.
I would definitely welcome a standard language independent VM in browsers (I would prefer to code in a statically typed language).
(Technically) it's quite doable gradually: first one major browser supports it, and the server then has the option either to send bytecode if the current request is from a compatible browser, or to translate the code to JavaScript and send plain-text JavaScript.
There already exist some experimental languages that compile to JavaScript, but having a defined VM would (maybe) allow for better performance.
I admit that the "standard" part would be quite tricky, though. Also there would be conflicts between language features (eg. static vs. dynamic typing) concerning the library (assuming the new thing would use same library). Therefore I don't think it's gonna happen (soon).
If you feel like you are getting your hands dirty, then you have either been brainwashed, or are still feeling the after-effects of the "DHTML years". JavaScript is very powerful, and is well suited to its purpose, which is to script interactivity on the client side. This is why JavaScript 2.0 got such a bad rap. I mean, why packages, interfaces, classes, and the like, when those are clearly aspects of server-side languages? JavaScript is just fine as a prototype-based language, without being full-blown object oriented.
If there is a lack of seamlessness to your applications because the server-side and client-side are not communicating well, then you might want to reconsider how you architect your applications. I have worked with extremely robust Web sites and Web applications, and I have never once said, "Hmm, I really wish JavaScript could do (xyz)." If it could do that, then it wouldn't be JavaScript -- it would be ActionScript or AIR or Silverlight. I don't need that, and neither do most developers. Those are nice technologies, but they try to solve a problem with a technology, not a... well, a solution.
I don't think that a standard web VM is that inconceivable. There are a number of ways you could introduce a new web VM standard gracefully and with full legacy support, as long as you ensure that any VM bytecode format you use can be quickly decompiled into javascript, and that the resulting output will be reasonably efficient (I would even go so far as to guess that a smart decompiler would probably generate better javascript than any javascript a human could produce themselves).
With this property, any web VM format could be easily decompiled either on the server (fast), on the client (slow, but possible in cases where you have limited control of the server), or could be pre-generated and loaded dynamically by either the client or the server (fastest) for browsers that don’t natively support the new standard.
Those browsers that DO natively support the new standard would benefit from increased speed of the runtime for web vm based apps. On top of that, if browsers base their legacy javascript engines on the web vm standard (i.e. parsing javascript into the web vm standard and then running it), then they don’t have to manage two runtimes, but that’s up to the browser vendor.
While Javascript is the only well-supported scripting language you can control the page directly from, Flash has some very nice features for bigger programs. Lately it has a JIT and can also generate bytecode on the fly (check out runtime expression evaluation for an example where they use flash to compile user-input math expressions all the way to native binary). The Haxe language gives you static typing with inference and with the bytecode generation abilities you could implement almost any runtime system of your choice.
Quick update on this old question.
To everyone who affirmed that a web page containing "byte code instead of any higher-level language like JavaScript" "won't happen":
In June 2015 the W3C announced WebAssembly, which is "a new portable, size- and load-time-efficient format suitable for compilation to the web."
This is still experimental, but there are already prototype implementations in Firefox Nightly and Chrome Canary, and there are already some working demonstrations.
Currently, WebAssembly is mostly designed to be produced from C/C++; however, "as WebAssembly evolves it will support more languages than C/C++, and we hope that other compilers will support it as well."
I'll let you take a closer look at the official page of the project; it is truly exciting!
this question resurfaces regularly. my stance on this is:
A) won't happen and B) is already here.
pardon, what? let me explain:
ad A
a VM is not just some sort of universal magical device. most VMs are optimized for a certain language and certain language features. take the JRE/Java (or LLVM): optimized for static typing, and there are definitely problems and downsides when implementing dynamic typing or other things java didn't support in the first place.
so, the "general multipurpose VM" that supports lots of language features (tail call optimization, static & dynamic typing, foo bar boo, ...) would be colossal, hard to implement and probably harder to optimize to get good performance out of it. but i'm no language designer or vm guru, maybe i'm wrong: it's actually pretty easy, only nobody had the idea yet? hrm, hrm.
ad B
already here: there may not be a bytecode compiler/vm, but you don't actually need one. afaik javascript is turing complete, so it should be possible to either:
create a translator from language X to javascript (e.g. coffeescript)
create an interpreter in javascript that interprets language X (e.g. brainfuck; a toy sketch follows below). yes, performance would be abysmal, but hey, can't have everything.
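a toy sketch of option (2): a tiny brainfuck interpreter written in plain javascript (illustrative only: no input command, no error handling, nothing clever):

    // minimal brainfuck interpreter: a tape of byte cells plus matched-bracket jumps.
    function runBrainfuck(src) {
      var tape = [], ptr = 0, out = "";
      var jumps = {}, stack = [];
      for (var i = 0; i < src.length; i++) {           // pre-match [ and ]
        if (src[i] === "[") stack.push(i);
        if (src[i] === "]") { var open = stack.pop(); jumps[open] = i; jumps[i] = open; }
      }
      for (var pc = 0; pc < src.length; pc++) {
        var c = src[pc], cell = tape[ptr] | 0;         // undefined cells read as 0
        if (c === "+") tape[ptr] = (cell + 1) & 255;
        else if (c === "-") tape[ptr] = (cell - 1) & 255;
        else if (c === ">") ptr++;
        else if (c === "<") ptr--;
        else if (c === ".") out += String.fromCharCode(cell);
        else if (c === "[" && cell === 0) pc = jumps[pc];
        else if (c === "]" && cell !== 0) pc = jumps[pc];
      }
      return out;
    }

    // prints "Hello World!" plus a newline:
    // runBrainfuck("++++++++++[>+++++++>++++++++++>+++>+<<<<-]>++.>+.+++++++..+++.>++.<<+++++++++++++++.>.+++.------.--------.>+.>.");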
ad C
what? there wasn't a point C in the first place!? because there isn't ... yet. google NACL. if anyone can do it, it's google. as soon google gets it working, your problems are solved. only, uh, it may never work, i don't know. the last time i read about it there were some unsolved security problems of the really tricky kind.
apart from that:
javascript's been there since ~1995 = 15 years. still, browser implementations differ today (although at least it's not insufferable anymore). so, if you start something new now, you might have a version working cross browser around 2035. at least a working subset. that only differs subtly. and needs compatibility libs and layers. no point in not trying to improve things though.
also, what about readable source code? i know a lot of companies would prefer not to serve their code as "kind-of" open source. personally, i'm pretty happy i'm able to read the source if i suspect something fishy or want to learn from it. hooray for source code!
Indeed. Silverlight is effectively just that - a client side .Net based VM.
There are some errors in your reasoning.
A standard virtual machine in a standard browser will never be standard. We have 4 browsers, and IE has conflicting interests with regard to 'standard'. The three others are evolving fast but adoption rate of new technologies is slow. What about browsers on phones, small devices, ...
The integration of JS in the different browsers and its past history leads you to underestimate the power of JS. You plead for a standard, but disapprove of JS because the standard didn't work out in the early years.
As told by others, JS is not the same as AIR/.NET/... and the like. JS in its current incarnation perfectly fits its goals.
In the long term, Perl and Ruby could well replace javascript. Yet the adoption of those languages is slow and it is known that they will never take over JS.
How do you define best? Best for the browser, or best for the developer? (Plus ECMAScript is different than Javascript, but that is a technicality.)
I find that JavaScript can be powerful and elegant at the same time. Unfortunately, most developers I have met treat it like a necessary evil instead of a real programming language.
Some of the features I enjoy (a short sketch follows the list):
treating functions as first-class citizens
being able to add and remove functions on any object at any time (not often useful, but mind-blowing when it is)
it is a dynamic language.
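A tiny illustration of the first two points (the object and function names are made up for the example):

    // Functions are ordinary values, and objects can be reshaped at runtime.
    var greeter = {};
    greeter.hello = function (name) { return 'hello, ' + name; }; // add a method on the fly
    var fn = greeter.hello;       // pass the function around like any other value
    console.log(fn('world'));     // "hello, world"
    delete greeter.hello;         // ...and take the method away again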
It's fun to deal with and it is established. Enjoy it while it is around, because while it may not be the "best" for client scripting, it is certainly pleasant.
I do agree it is frustrating to build dynamic pages because of browser incompatibilities, but that can be mitigated by UI libraries (an example of what such a library hides follows). That should not be held against JavaScript itself any more than Swing should be held against Java.
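For instance, this is the kind of branching a library like jQuery takes care of for you; the helper name addClick is invented for the illustration:

    // Attaching a click handler across old and new browsers by hand:
    function addClick(el, handler) {
      if (el.addEventListener) {
        el.addEventListener('click', handler, false); // W3C / standards browsers
      } else if (el.attachEvent) {
        el.attachEvent('onclick', handler);           // legacy IE
      }
    }
    // With jQuery the same intent is simply: $(element).click(handler);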
JavaScript is the browser's standard virtual machine. For instance, OCaml and Haskell now both have compilers that can output JavaScript. The limitation is not JavaScript the language; the limitation is the set of browser objects accessible via JavaScript, and the access-control model used to ensure you can safely run JavaScript without compromising your machine. The current access controls are so poor that JavaScript is only allowed very limited access to browser objects, for safety reasons. The Harmony project is looking to fix that.
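To give a flavour of what "JavaScript as a compile target" means: a curried OCaml function such as let add x y = x + y tends to come out of a compile-to-JS backend in roughly this shape (real compiler output is more involved; this is only the general idea):

    // Approximate shape of compiled output for:  let add x y = x + y
    var add = function (x) {
      return function (y) {
        return x + y;
      };
    };
    console.log(add(2)(3)); // 5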
It's a cool idea. Why not take it a step further?
Write the HTML parser and layout engine (all the complicated bits in the browser, really) in the same VM language
Publish the engine to the web
Serve the page with a declaration of which layout engine to use, and its URL
Then we can add features to browsers without having to push new browsers out to every client - the relevant new bits would be loaded dynamically from the web. We could also publish new versions of HTML without all the ridiculous complexity of maintaining backwards compatibility with everything that's ever worked in a browser - compatibility is the responsibility of the page author. We also get to experiment with markup languages other than HTML. And, of course, we can write fancy JIT compilers into the engines, so that you can script your webpages in any language you want.
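Nothing like this exists today, but as a purely hypothetical sketch of the "declare your engine" idea (the attribute name and URL are invented for illustration):

    // Hypothetical bootstrap: the page names its own layout engine and a tiny
    // stub fetches it; the engine script would then take over rendering.
    var engineUrl = document.documentElement.getAttribute('data-layout-engine');
    if (engineUrl) {
      var s = document.createElement('script');
      s.src = engineUrl;            // e.g. "https://example.com/engines/v2.js" (made up)
      document.head.appendChild(s);
    }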
I would welcome any language besides JavaScript as a possible scripting language.
What would be cool is being able to use languages other than JavaScript. Java would probably not be a great fit inside a script tag, but languages like Haskell, Clojure, Scala, Ruby or Groovy would be beneficial.
I came across Rubyscript a while ago ...
http://almaer.com/blog/running-ruby-in-the-browser-via-script-typetextruby and http://code.google.com/p/ruby-in-browser/
Still experimental and in progress, but it looks promising.
For .NET I just found: http://www.silverlight.net/learn/dynamic-languages/ I only just discovered the site, but it looks interesting too. It even works from my Apple Mac.
I don't know how well the above work as alternatives to JavaScript, but they look pretty cool at first glance. Potentially, this would allow one to use any Java or .NET framework natively from the browser - within the browser's sandbox.
As for safety, if the language runs inside the JVM (or the .NET engine, for that matter), the VM will take care of security, so we don't have to worry about that - at least no more than we should for anything that runs inside the browser.
Probably, but to do so we'd need to get the major browsers to support them. IE support would be the hardest to get. JavaScript is used because it is the only thing you can count on being available.
The vast majority of the devs I've spoken to about ECMAScript et al. end up admitting that the problem isn't the scripting language, it's the ridiculous HTML DOM that the browser exposes. Conflating the DOM and the scripting language is a common source of pain and frustration regarding ECMAScript. Also, don't forget that IIS can use JScript for server-side scripting, and things like Rhino allow you to build free-standing apps in ECMAScript. Try working in one of these environments with ECMAScript for a while, and see if your opinion changes.
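A quick way to see that separation: the same ECMAScript runs unchanged in a standalone engine with no DOM in sight (the snippet assumes a Node- or browser-style console; in Rhino's shell you would use print instead):

    // Pure ECMAScript, no browser and no DOM involved:
    var total = [1, 2, 3, 4].reduce(function (sum, n) { return sum + n; }, 0);
    console.log(total); // 10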
This kind of despair has been going around for some time. I'd suggest you edit this to include, or repost with, specific issues. You may be pleasantly surprised by some of the relief you get.
An old site, but still a great place to start: Douglas Crockford's site.
Well, we already have VBScript, don't we? Wait, only IE supports it!
The same goes for your nice idea of a VM. What if I script my page in Lua and your browser doesn't have the parser to convert it to bytecode? Of course, we could imagine a script tag accepting a file of bytecode; that would even be quite efficient.
But experience shows it is hard to bring something new to the Web: it would take years to adopt a radical change like this. How many browsers support SVG or CSS3?
Besides, I don't see what you find "dirty" in JS. It can be ugly when coded by amateurs propagating bad practice copied from elsewhere, but masters have shown it can be an elegant language too. A bit like Perl: it often looks like an obfuscated language, but it can be made perfectly readable.
Check this out: http://www.visitmix.com/Labs/Gestalt/ - it lets you use Python or Ruby, as long as the user has Silverlight installed.
This is a very good question.
The problem is not only JS itself; it is also the lack of good free IDEs for developing larger programs in JS. I know only one that is free: Eclipse. The other good one is Microsoft's Visual Studio, but it is not free.
Why should it be free? If web browser vendors want to replace desktop apps with online apps (and they do), then they have to give us, the programmers, good dev tools. You can't maintain 50,000 lines of JavaScript using a simple text editor, JSLint and the built-in Google Chrome debugger. Unless you're a masochist.
When Borland made an IDE for Turbo Pascal 4.0 in 1987, it was a revolution in programming. 24 years have passed since. Shamefully, in the year 2011 many programmers still don't use code completion, syntax checking and proper debuggers, probably because there are so few good IDEs.
It's in the interest of web browser vendors to make proper (FREE) tools for programmers if they want us to build the applications with which they can fight Windows, Linux, MacOS, iOS, Symbian, etc.
Realistically, JavaScript is the only language that browsers will use for a long time, so while it would be very nice to use other languages, I can't see it happening.
The "standardised VM" you talk of would be very large and would need to be adopted by all major browsers, and most sites would just continue using JavaScript anyway, since it's more suited to websites than many other languages.
You would have to sandbox each programming language in this VM and reduce the amount of access each language has to the system, requiring a lot of changes to the languages and the removal or reimplementation of many features; JavaScript was designed with this in mind, and has been for a long time.
Maybe you're looking for Google's Native Client.
In a sense, having a more expressive language like Javascript in the browser instead of something more general like Java bytecode has meant a more open web.
I think this is not such an easy issue. We can say that we're stuck with JS, but is it really so bad with jQuery, Prototype, script.aculo.us, MooTools and all the other fantastic libraries?
Remember, JS is lightweight, even more so with V8, TraceMonkey and SquirrelFish, the new JavaScript engines used in modern browsers.
It is also proven: yes, we know it has problems, but most of these have been sorted out, like the early security problems. Imagine allowing your browser to run Ruby code, or anything else; the security sandbox would have to be built from scratch. And you know what? The Python folks have already failed twice at it.
I think JavaScript is going to be revised and improved over time, just like HTML and CSS are. The process may be long, but not everything is possible in this world.
I don't think you "understand the pragmatic issue that JavaScript is simply what we have to work with now". Actually, it is a very powerful language. You had your Java applet in the browser for years, and where is it now?
Anyhow, you don't need to "get dirty" to work on the client. For example, try GWT.
... you mean...
Java and Java applet
Flash and Adobe AIR
etc..
In general, any RIA framework can fill your needs, but for every one there is a price to pay for using it (e.g. a runtime that must be available in the browser, and/or proprietary technology, and/or fewer options than a pure desktop app).
http://en.wikipedia.org/wiki/List_of_rich_internet_application_frameworks
For developing for the Web in a non-web language, you have GWT: develop in Java, compile to JavaScript.
Because they all already have VMs with bytecode interpreters, and the bytecode is all different too: Chakra (IE), SpiderMonkey (Firefox), SquirrelFish (Safari), Carakan (Opera).
Sorry, I think Chrome (V8) compiles straight down to IA-32 machine code.
Well, considering all browsers already use a VM, I don't think it would be that difficult to introduce a VM language for the web.
I think it would greatly help, for a few reasons:
1. since the server compiles the code, the amount of data sent is smaller and the client doesn't waste time compiling it.
2. since the server can compile the code ahead of time and store it, unlike the client, which tries to waste as little time as possible quickly compiling the JS, it can apply better code optimizations.
3. compiling a language to bytecode is far easier than transpiling it to JS.
As a final note (as someone already said in another comment), HTML and CSS are also parsed down to simpler internal forms - not sure whether that counts as bytecode - but you could likewise send precompiled HTML and CSS from the server to the client, which would reduce parse and fetch times.
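For what it's worth, WebAssembly is the closest existing realization of this idea: the server ships a precompiled binary module plus a small JavaScript loader. A minimal sketch, with a made-up URL and export name:

    // Fetch and instantiate a precompiled WebAssembly module
    // ("/compiled/app.wasm" and the "main" export are invented for the example).
    WebAssembly.instantiateStreaming(fetch('/compiled/app.wasm'))
      .then(function (result) {
        console.log(result.instance.exports.main());
      });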
IMO, JavaScript, the language, is not the problem. JavaScript is actually quite an expressive and powerful language. I think it gets a bad rep because it doesn't have classical OO features, but for me, the more I get into the prototypal groove, the more I like it.
The problem, as I see it, is the flaky and inconsistent implementations across the many browsers we are forced to support on the web. JavaScript libraries like jQuery go a long way towards mitigating that dirty feeling.
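A small taste of that prototypal style (the object names are made up for the example):

    // Delegation instead of classes: dog has no speak method of its own,
    // it delegates to animal through its prototype link.
    var animal = { speak: function () { return this.sound; } };
    var dog = Object.create(animal);
    dog.sound = 'woof';
    console.log(dog.speak()); // "woof"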
